00:00:00.001 Started by upstream project "autotest-per-patch" build number 127198
00:00:00.001 originally caused by:
00:00:00.001 Started by user sys_sgci
00:00:00.079 Checking out git https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool into /var/jenkins_home/workspace/crypto-phy-autotest_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4 to read jbp/jenkins/jjb-config/jobs/autotest-downstream/autotest-phy.groovy
00:00:02.634 The recommended git tool is: git
00:00:02.635 using credential 00000000-0000-0000-0000-000000000002
00:00:02.638 > git rev-parse --resolve-git-dir /var/jenkins_home/workspace/crypto-phy-autotest_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4/jbp/.git # timeout=10
00:00:02.646 Fetching changes from the remote Git repository
00:00:02.649 > git config remote.origin.url https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool # timeout=10
00:00:02.659 Using shallow fetch with depth 1
00:00:02.659 Fetching upstream changes from https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool
00:00:02.659 > git --version # timeout=10
00:00:02.670 > git --version # 'git version 2.39.2'
00:00:02.670 using GIT_ASKPASS to set credentials SPDKCI HTTPS Credentials
00:00:02.681 Setting http proxy: proxy-dmz.intel.com:911
00:00:02.681 > git fetch --tags --force --progress --depth=1 -- https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool refs/heads/master # timeout=5
00:00:08.347 > git rev-parse origin/FETCH_HEAD^{commit} # timeout=10
00:00:08.359 > git rev-parse FETCH_HEAD^{commit} # timeout=10
00:00:08.371 Checking out Revision 4313f32deecbb7108199ebd1913b403a3005dece (FETCH_HEAD)
00:00:08.371 > git config core.sparsecheckout # timeout=10
00:00:08.381 > git read-tree -mu HEAD # timeout=10
00:00:08.400 > git checkout -f 4313f32deecbb7108199ebd1913b403a3005dece # timeout=5
00:00:08.424 Commit message: "packer: Add bios builder"
00:00:08.425 > git rev-list --no-walk 4313f32deecbb7108199ebd1913b403a3005dece # timeout=10
00:00:08.525 [Pipeline] Start of Pipeline
00:00:08.558 [Pipeline] library
00:00:08.566 Loading library shm_lib@master
00:00:08.566 Library shm_lib@master is cached. Copying from home.
00:00:08.593 [Pipeline] node
00:00:08.606 Running on WFP50 in /var/jenkins/workspace/crypto-phy-autotest
00:00:08.608 [Pipeline] {
00:00:08.616 [Pipeline] catchError
00:00:08.617 [Pipeline] {
00:00:08.626 [Pipeline] wrap
00:00:08.632 [Pipeline] {
00:00:08.638 [Pipeline] stage
00:00:08.639 [Pipeline] { (Prologue)
00:00:08.814 [Pipeline] sh
00:00:09.098 + logger -p user.info -t JENKINS-CI
00:00:09.115 [Pipeline] echo
00:00:09.117 Node: WFP50
00:00:09.122 [Pipeline] sh
00:00:09.416 [Pipeline] setCustomBuildProperty
00:00:09.425 [Pipeline] echo
00:00:09.427 Cleanup processes
00:00:09.432 [Pipeline] sh
00:00:09.713 + sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk
00:00:09.713 969807 sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk
00:00:09.723 [Pipeline] sh
00:00:10.002 ++ sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk
00:00:10.002 ++ grep -v 'sudo pgrep'
00:00:10.002 ++ awk '{print $1}'
00:00:10.002 + sudo kill -9
00:00:10.002 + true
00:00:10.015 [Pipeline] cleanWs
00:00:10.022 [WS-CLEANUP] Deleting project workspace...
00:00:10.022 [WS-CLEANUP] Deferred wipeout is used...
00:00:10.028 [WS-CLEANUP] done
00:00:10.031 [Pipeline] setCustomBuildProperty
00:00:10.042 [Pipeline] sh
00:00:10.318 + sudo git config --global --replace-all safe.directory '*'
00:00:10.403 [Pipeline] httpRequest
00:00:10.432 [Pipeline] echo
00:00:10.433 Sorcerer 10.211.164.101 is alive
00:00:10.441 [Pipeline] httpRequest
00:00:10.446 HttpMethod: GET
00:00:10.446 URL: http://10.211.164.101/packages/jbp_4313f32deecbb7108199ebd1913b403a3005dece.tar.gz
00:00:10.447 Sending request to url: http://10.211.164.101/packages/jbp_4313f32deecbb7108199ebd1913b403a3005dece.tar.gz
00:00:10.470 Response Code: HTTP/1.1 200 OK
00:00:10.470 Success: Status code 200 is in the accepted range: 200,404
00:00:10.470 Saving response body to /var/jenkins/workspace/crypto-phy-autotest/jbp_4313f32deecbb7108199ebd1913b403a3005dece.tar.gz
00:00:22.884 [Pipeline] sh
00:00:23.164 + tar --no-same-owner -xf jbp_4313f32deecbb7108199ebd1913b403a3005dece.tar.gz
00:00:23.178 [Pipeline] httpRequest
00:00:23.196 [Pipeline] echo
00:00:23.197 Sorcerer 10.211.164.101 is alive
00:00:23.205 [Pipeline] httpRequest
00:00:23.209 HttpMethod: GET
00:00:23.209 URL: http://10.211.164.101/packages/spdk_f6e944e961677d24766395ed00d60a4f2139f30f.tar.gz
00:00:23.210 Sending request to url: http://10.211.164.101/packages/spdk_f6e944e961677d24766395ed00d60a4f2139f30f.tar.gz
00:00:23.234 Response Code: HTTP/1.1 200 OK
00:00:23.235 Success: Status code 200 is in the accepted range: 200,404
00:00:23.235 Saving response body to /var/jenkins/workspace/crypto-phy-autotest/spdk_f6e944e961677d24766395ed00d60a4f2139f30f.tar.gz
00:03:14.960 [Pipeline] sh
00:03:15.237 + tar --no-same-owner -xf spdk_f6e944e961677d24766395ed00d60a4f2139f30f.tar.gz
00:03:19.438 [Pipeline] sh
00:03:19.720 + git -C spdk log --oneline -n5
00:03:19.720 f6e944e96 lib/reduce: if memory allocation fails, g_vol_count--.
00:03:19.720 764779691 bdev/compress: print error code information in load compress bdev
00:03:19.720 a7c420308 bdev/compress: release reduce vol resource when comp bdev fails to be created.
00:03:19.720 b8378f94e scripts/pkgdep: Set yum's skip_if_unavailable=True under rocky8
00:03:19.720 c2a77f51e module/bdev/nvme: add detach-monitor poller
00:03:19.732 [Pipeline] }
00:03:19.749 [Pipeline] // stage
00:03:19.759 [Pipeline] stage
00:03:19.762 [Pipeline] { (Prepare)
00:03:19.777 [Pipeline] writeFile
00:03:19.790 [Pipeline] sh
00:03:20.070 + logger -p user.info -t JENKINS-CI
00:03:20.082 [Pipeline] sh
00:03:20.365 + logger -p user.info -t JENKINS-CI
00:03:20.377 [Pipeline] sh
00:03:20.659 + cat autorun-spdk.conf
00:03:20.659 SPDK_RUN_FUNCTIONAL_TEST=1
00:03:20.659 SPDK_TEST_BLOCKDEV=1
00:03:20.659 SPDK_TEST_ISAL=1
00:03:20.659 SPDK_TEST_CRYPTO=1
00:03:20.659 SPDK_TEST_REDUCE=1
00:03:20.659 SPDK_TEST_VBDEV_COMPRESS=1
00:03:20.659 SPDK_RUN_UBSAN=1
00:03:20.659 SPDK_TEST_ACCEL=1
00:03:20.666 RUN_NIGHTLY=0
00:03:20.671 [Pipeline] readFile
00:03:20.699 [Pipeline] withEnv
00:03:20.701 [Pipeline] {
00:03:20.717 [Pipeline] sh
00:03:21.001 + set -ex
00:03:21.001 + [[ -f /var/jenkins/workspace/crypto-phy-autotest/autorun-spdk.conf ]]
00:03:21.001 + source /var/jenkins/workspace/crypto-phy-autotest/autorun-spdk.conf
00:03:21.001 ++ SPDK_RUN_FUNCTIONAL_TEST=1
00:03:21.001 ++ SPDK_TEST_BLOCKDEV=1
00:03:21.001 ++ SPDK_TEST_ISAL=1
00:03:21.001 ++ SPDK_TEST_CRYPTO=1
00:03:21.001 ++ SPDK_TEST_REDUCE=1
00:03:21.001 ++ SPDK_TEST_VBDEV_COMPRESS=1
00:03:21.001 ++ SPDK_RUN_UBSAN=1
00:03:21.001 ++ SPDK_TEST_ACCEL=1
00:03:21.001 ++ RUN_NIGHTLY=0
00:03:21.001 + case $SPDK_TEST_NVMF_NICS in
00:03:21.001 + DRIVERS=
00:03:21.001 + [[ -n '' ]]
00:03:21.001 + exit 0
00:03:21.045 [Pipeline] }
00:03:21.065 [Pipeline] // withEnv
00:03:21.070 [Pipeline] }
00:03:21.089 [Pipeline] // stage
00:03:21.100 [Pipeline] catchError
00:03:21.102 [Pipeline] {
00:03:21.119 [Pipeline] timeout
00:03:21.119 Timeout set to expire in 1 hr 0 min
00:03:21.121 [Pipeline] {
00:03:21.136 [Pipeline] stage
00:03:21.139 [Pipeline] { (Tests)
00:03:21.153 [Pipeline] sh
00:03:21.437 + jbp/jenkins/jjb-config/jobs/scripts/autoruner.sh /var/jenkins/workspace/crypto-phy-autotest
00:03:21.437 ++ readlink -f /var/jenkins/workspace/crypto-phy-autotest
00:03:21.437 + DIR_ROOT=/var/jenkins/workspace/crypto-phy-autotest
00:03:21.437 + [[ -n /var/jenkins/workspace/crypto-phy-autotest ]]
00:03:21.437 + DIR_SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
00:03:21.437 + DIR_OUTPUT=/var/jenkins/workspace/crypto-phy-autotest/output
00:03:21.437 + [[ -d /var/jenkins/workspace/crypto-phy-autotest/spdk ]]
00:03:21.437 + [[ ! -d /var/jenkins/workspace/crypto-phy-autotest/output ]]
00:03:21.437 + mkdir -p /var/jenkins/workspace/crypto-phy-autotest/output
00:03:21.437 + [[ -d /var/jenkins/workspace/crypto-phy-autotest/output ]]
00:03:21.437 + [[ crypto-phy-autotest == pkgdep-* ]]
00:03:21.437 + cd /var/jenkins/workspace/crypto-phy-autotest
00:03:21.437 + source /etc/os-release
00:03:21.437 ++ NAME='Fedora Linux'
00:03:21.437 ++ VERSION='38 (Cloud Edition)'
00:03:21.437 ++ ID=fedora
00:03:21.437 ++ VERSION_ID=38
00:03:21.437 ++ VERSION_CODENAME=
00:03:21.437 ++ PLATFORM_ID=platform:f38
00:03:21.437 ++ PRETTY_NAME='Fedora Linux 38 (Cloud Edition)'
00:03:21.437 ++ ANSI_COLOR='0;38;2;60;110;180'
00:03:21.437 ++ LOGO=fedora-logo-icon
00:03:21.437 ++ CPE_NAME=cpe:/o:fedoraproject:fedora:38
00:03:21.437 ++ HOME_URL=https://fedoraproject.org/
00:03:21.438 ++ DOCUMENTATION_URL=https://docs.fedoraproject.org/en-US/fedora/f38/system-administrators-guide/
00:03:21.438 ++ SUPPORT_URL=https://ask.fedoraproject.org/
00:03:21.438 ++ BUG_REPORT_URL=https://bugzilla.redhat.com/
00:03:21.438 ++ REDHAT_BUGZILLA_PRODUCT=Fedora
00:03:21.438 ++ REDHAT_BUGZILLA_PRODUCT_VERSION=38
00:03:21.438 ++ REDHAT_SUPPORT_PRODUCT=Fedora
00:03:21.438 ++ REDHAT_SUPPORT_PRODUCT_VERSION=38
00:03:21.438 ++ SUPPORT_END=2024-05-14
00:03:21.438 ++ VARIANT='Cloud Edition'
00:03:21.438 ++ VARIANT_ID=cloud
00:03:21.438 + uname -a
00:03:21.438 Linux spdk-wfp-50 6.7.0-68.fc38.x86_64 #1 SMP PREEMPT_DYNAMIC Mon Jan 15 00:59:40 UTC 2024 x86_64 GNU/Linux
00:03:21.438 + sudo /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh status
00:03:24.729 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5
00:03:24.729 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5
00:03:24.729 Hugepages
00:03:24.729 node hugesize free / total
00:03:24.729 node0 1048576kB 0 / 0
00:03:24.729 node0 2048kB 0 / 0
00:03:24.729 node1 1048576kB 0 / 0
00:03:24.729 node1 2048kB 0 / 0
00:03:24.729
00:03:24.729 Type BDF Vendor Device NUMA Driver Device Block devices
00:03:24.729 I/OAT 0000:00:04.0 8086 2021 0 ioatdma - -
00:03:24.729 I/OAT 0000:00:04.1 8086 2021 0 ioatdma - -
00:03:24.729 I/OAT 0000:00:04.2 8086 2021 0 ioatdma - -
00:03:24.729 I/OAT 0000:00:04.3 8086 2021 0 ioatdma - -
00:03:24.729 I/OAT 0000:00:04.4 8086 2021 0 ioatdma - -
00:03:24.729 I/OAT 0000:00:04.5 8086 2021 0 ioatdma - -
00:03:24.729 I/OAT 0000:00:04.6 8086 2021 0 ioatdma - -
00:03:24.729 I/OAT 0000:00:04.7 8086 2021 0 ioatdma - -
00:03:24.729 NVMe 0000:5e:00.0 8086 0b60 0 nvme nvme0 nvme0n1
00:03:24.729 I/OAT 0000:80:04.0 8086 2021 1 ioatdma - -
00:03:24.729 I/OAT 0000:80:04.1 8086 2021 1 ioatdma - -
00:03:24.729 I/OAT 0000:80:04.2 8086 2021 1 ioatdma - -
00:03:24.729 I/OAT 0000:80:04.3 8086 2021 1 ioatdma - -
00:03:24.729 I/OAT 0000:80:04.4 8086 2021 1 ioatdma - -
00:03:24.729 I/OAT 0000:80:04.5 8086 2021 1 ioatdma - -
00:03:24.729 I/OAT 0000:80:04.6 8086 2021 1 ioatdma - -
00:03:24.729 I/OAT 0000:80:04.7 8086 2021 1 ioatdma - -
00:03:24.729 VMD 0000:85:05.5 8086 201d 1 vfio-pci - -
00:03:24.729 VMD 0000:d7:05.5 8086 201d 1 vfio-pci - -
00:03:24.729 + rm -f /tmp/spdk-ld-path
00:03:24.729 + source autorun-spdk.conf
00:03:24.729 ++ SPDK_RUN_FUNCTIONAL_TEST=1
00:03:24.729 ++ SPDK_TEST_BLOCKDEV=1
00:03:24.729 ++ SPDK_TEST_ISAL=1
00:03:24.729 ++ SPDK_TEST_CRYPTO=1
00:03:24.729 ++ SPDK_TEST_REDUCE=1
00:03:24.729 ++ SPDK_TEST_VBDEV_COMPRESS=1
00:03:24.729 ++ SPDK_RUN_UBSAN=1
00:03:24.729 ++ SPDK_TEST_ACCEL=1
00:03:24.729 ++ RUN_NIGHTLY=0
00:03:24.729 + (( SPDK_TEST_NVME_CMB == 1 || SPDK_TEST_NVME_PMR == 1 ))
00:03:24.729 + [[ -n '' ]]
00:03:24.729 + sudo git config --global --add safe.directory /var/jenkins/workspace/crypto-phy-autotest/spdk
00:03:24.729 + for M in /var/spdk/build-*-manifest.txt
00:03:24.729 + [[ -f /var/spdk/build-pkg-manifest.txt ]]
00:03:24.729 + cp /var/spdk/build-pkg-manifest.txt /var/jenkins/workspace/crypto-phy-autotest/output/
00:03:24.729 + for M in /var/spdk/build-*-manifest.txt
00:03:24.729 + [[ -f /var/spdk/build-repo-manifest.txt ]]
00:03:24.729 + cp /var/spdk/build-repo-manifest.txt /var/jenkins/workspace/crypto-phy-autotest/output/
00:03:24.729 ++ uname
00:03:24.729 + [[ Linux == \L\i\n\u\x ]]
00:03:24.729 + sudo dmesg -T
00:03:24.729 + sudo dmesg --clear
00:03:24.729 + dmesg_pid=971311
00:03:24.729 + [[ Fedora Linux == FreeBSD ]]
00:03:24.729 + export UNBIND_ENTIRE_IOMMU_GROUP=yes
00:03:24.729 + UNBIND_ENTIRE_IOMMU_GROUP=yes
00:03:24.729 + [[ -e /var/spdk/dependencies/vhost/spdk_test_image.qcow2 ]]
00:03:24.729 + [[ -x /usr/src/fio-static/fio ]]
00:03:24.729 + export FIO_BIN=/usr/src/fio-static/fio
00:03:24.729 + FIO_BIN=/usr/src/fio-static/fio
00:03:24.729 + sudo dmesg -Tw
00:03:24.729 + [[ '' == \/\v\a\r\/\j\e\n\k\i\n\s\/\w\o\r\k\s\p\a\c\e\/\c\r\y\p\t\o\-\p\h\y\-\a\u\t\o\t\e\s\t\/\q\e\m\u\_\v\f\i\o\/* ]]
00:03:24.729 + [[ ! -v VFIO_QEMU_BIN ]]
00:03:24.729 + [[ -e /usr/local/qemu/vfio-user-latest ]]
00:03:24.729 + export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64
00:03:24.729 + VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64
00:03:24.729 + [[ -e /usr/local/qemu/vanilla-latest ]]
00:03:24.729 + export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64
00:03:24.729 + QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64
00:03:24.729 + spdk/autorun.sh /var/jenkins/workspace/crypto-phy-autotest/autorun-spdk.conf
00:03:24.729 Test configuration:
00:03:24.729 SPDK_RUN_FUNCTIONAL_TEST=1
00:03:24.729 SPDK_TEST_BLOCKDEV=1
00:03:24.729 SPDK_TEST_ISAL=1
00:03:24.729 SPDK_TEST_CRYPTO=1
00:03:24.729 SPDK_TEST_REDUCE=1
00:03:24.729 SPDK_TEST_VBDEV_COMPRESS=1
00:03:24.729 SPDK_RUN_UBSAN=1
00:03:24.729 SPDK_TEST_ACCEL=1
00:03:24.729 RUN_NIGHTLY=0
05:31:39 -- common/autobuild_common.sh@15 -- $ source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh
00:03:24.729 05:31:39 -- scripts/common.sh@508 -- $ [[ -e /bin/wpdk_common.sh ]]
00:03:24.729 05:31:39 -- scripts/common.sh@516 -- $ [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]]
00:03:24.729 05:31:39 -- scripts/common.sh@517 -- $ source /etc/opt/spdk-pkgdep/paths/export.sh
00:03:24.729 05:31:39 -- paths/export.sh@2 -- $ PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:03:24.729 05:31:39 -- paths/export.sh@3 -- $ PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:03:24.729 05:31:39 -- paths/export.sh@4 -- $ PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:03:24.729 05:31:39 -- paths/export.sh@5 -- $ export PATH
00:03:24.729 05:31:39 -- paths/export.sh@6 -- $ echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:03:24.730 05:31:39 -- common/autobuild_common.sh@446 -- $ out=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output
00:03:24.730 05:31:39 -- common/autobuild_common.sh@447 -- $ date +%s
00:03:24.730 05:31:39 -- common/autobuild_common.sh@447 -- $ mktemp -dt spdk_1721964699.XXXXXX
00:03:24.730 05:31:39 -- common/autobuild_common.sh@447 -- $ SPDK_WORKSPACE=/tmp/spdk_1721964699.9oSJo7
00:03:24.730 05:31:39 -- common/autobuild_common.sh@449 -- $ [[ -n '' ]]
00:03:24.730 05:31:39 -- common/autobuild_common.sh@453 -- $ '[' -n '' ']'
00:03:24.730 05:31:39 -- common/autobuild_common.sh@456 -- $ scanbuild_exclude='--exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/'
00:03:24.730 05:31:39 -- common/autobuild_common.sh@460 -- $ scanbuild_exclude+=' --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/xnvme --exclude /tmp'
00:03:24.730 05:31:39 -- common/autobuild_common.sh@462 -- $ scanbuild='scan-build -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/scan-build-tmp --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/ --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/xnvme --exclude /tmp --status-bugs'
00:03:24.730 05:31:39 -- common/autobuild_common.sh@463 -- $ get_config_params
00:03:24.730 05:31:39 -- common/autotest_common.sh@396 -- $ xtrace_disable
00:03:24.730 05:31:39 -- common/autotest_common.sh@10 -- $ set +x
00:03:24.730 05:31:39 -- common/autobuild_common.sh@463 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --with-vbdev-compress --with-dpdk-compressdev --with-crypto --enable-ubsan --enable-coverage --with-ublk'
00:03:24.730 05:31:39 -- common/autobuild_common.sh@465 -- $ start_monitor_resources
00:03:24.730 05:31:39 -- pm/common@17 -- $ local monitor
00:03:24.730 05:31:39 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:03:24.730 05:31:39 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:03:24.730 05:31:39 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:03:24.730 05:31:39 -- pm/common@21 -- $ date +%s
00:03:24.730 05:31:39 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:03:24.730 05:31:39 -- pm/common@21 -- $ date +%s
00:03:24.730 05:31:39 -- pm/common@25 -- $ sleep 1
00:03:24.730 05:31:39 -- pm/common@21 -- $ date +%s
00:03:24.730 05:31:39 -- pm/common@21 -- $ date +%s
00:03:24.730 05:31:39 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1721964699
00:03:24.730 05:31:39 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1721964699
00:03:24.730 05:31:39 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1721964699
00:03:24.730 05:31:39 -- pm/common@21 -- $ sudo -E /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1721964699
00:03:24.989 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1721964699_collect-vmstat.pm.log
00:03:24.989 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1721964699_collect-cpu-load.pm.log
00:03:24.989 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1721964699_collect-cpu-temp.pm.log
00:03:24.989 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1721964699_collect-bmc-pm.bmc.pm.log
00:03:25.927 05:31:40 -- common/autobuild_common.sh@466 -- $ trap stop_monitor_resources EXIT
00:03:25.927 05:31:40 -- spdk/autobuild.sh@11 -- $ SPDK_TEST_AUTOBUILD=
00:03:25.927 05:31:40 -- spdk/autobuild.sh@12 -- $ umask 022
00:03:25.927 05:31:40 -- spdk/autobuild.sh@13 -- $ cd /var/jenkins/workspace/crypto-phy-autotest/spdk
00:03:25.927 05:31:40 -- spdk/autobuild.sh@16 -- $ date -u
00:03:25.927 Fri Jul 26 03:31:40 AM UTC 2024
00:03:25.927 05:31:40 -- spdk/autobuild.sh@17 -- $ git describe --tags
00:03:25.927 v24.09-pre-305-gf6e944e96
00:03:25.927 05:31:40 -- spdk/autobuild.sh@19 -- $ '[' 0 -eq 1 ']'
00:03:25.927 05:31:40 -- spdk/autobuild.sh@23 -- $ '[' 1 -eq 1 ']'
00:03:25.927 05:31:40 -- spdk/autobuild.sh@24 -- $ run_test ubsan echo 'using ubsan'
00:03:25.927 05:31:40 -- common/autotest_common.sh@1099 -- $ '[' 3 -le 1 ']'
00:03:25.927 05:31:40 -- common/autotest_common.sh@1105 -- $ xtrace_disable
00:03:25.927 05:31:40 -- common/autotest_common.sh@10 -- $ set +x
00:03:25.927 ************************************
00:03:25.927 START TEST ubsan
00:03:25.927 ************************************
00:03:25.927 05:31:40 ubsan -- common/autotest_common.sh@1123 -- $ echo 'using ubsan'
00:03:25.927 using ubsan
00:03:25.927
00:03:25.927 real 0m0.001s
00:03:25.927 user 0m0.000s
00:03:25.927 sys 0m0.000s
00:03:25.927 05:31:40 ubsan -- common/autotest_common.sh@1124 -- $ xtrace_disable
00:03:25.927 05:31:40 ubsan -- common/autotest_common.sh@10 -- $ set +x
00:03:25.927 ************************************
00:03:25.927 END TEST ubsan
00:03:25.927 ************************************
00:03:25.927 05:31:40 -- common/autotest_common.sh@1142 -- $ return 0
00:03:25.927 05:31:40 -- spdk/autobuild.sh@27 -- $ '[' -n '' ']'
00:03:25.927 05:31:40 -- spdk/autobuild.sh@31 -- $ case "$SPDK_TEST_AUTOBUILD" in
00:03:25.927 05:31:40 -- spdk/autobuild.sh@47 -- $ [[ 0 -eq 1 ]]
00:03:25.927 05:31:40 -- spdk/autobuild.sh@51 -- $ [[ 0 -eq 1 ]]
00:03:25.927 05:31:40 -- spdk/autobuild.sh@55 -- $ [[ -n '' ]]
00:03:25.927 05:31:40 -- spdk/autobuild.sh@57 -- $ [[ 0 -eq 1 ]]
00:03:25.927 05:31:40 -- spdk/autobuild.sh@59 -- $ [[ 0 -eq 1 ]]
00:03:25.927 05:31:40 -- spdk/autobuild.sh@62 -- $ [[ 0 -eq 1 ]]
00:03:25.927 05:31:40 -- spdk/autobuild.sh@67 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/configure --enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --with-vbdev-compress --with-dpdk-compressdev --with-crypto --enable-ubsan --enable-coverage --with-ublk --with-shared
00:03:26.187 Using default SPDK env in /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk
00:03:26.187 Using default DPDK in /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build
00:03:26.446 Using 'verbs' RDMA provider
00:03:42.714 Configuring ISA-L (logfile: /var/jenkins/workspace/crypto-phy-autotest/spdk/.spdk-isal.log)...done.
00:03:57.599 Configuring ISA-L-crypto (logfile: /var/jenkins/workspace/crypto-phy-autotest/spdk/.spdk-isal-crypto.log)...done.
00:03:57.599 Creating mk/config.mk...done.
00:03:57.599 Creating mk/cc.flags.mk...done.
00:03:57.599 Type 'make' to build.
00:03:57.599 05:32:10 -- spdk/autobuild.sh@69 -- $ run_test make make -j72
00:03:57.599 05:32:10 -- common/autotest_common.sh@1099 -- $ '[' 3 -le 1 ']'
00:03:57.599 05:32:10 -- common/autotest_common.sh@1105 -- $ xtrace_disable
00:03:57.599 05:32:10 -- common/autotest_common.sh@10 -- $ set +x
00:03:57.599 ************************************
00:03:57.599 START TEST make
00:03:57.599 ************************************
00:03:57.599 05:32:10 make -- common/autotest_common.sh@1123 -- $ make -j72
00:03:57.599 make[1]: Nothing to be done for 'all'.
00:04:36.348 The Meson build system
00:04:36.348 Version: 1.3.1
00:04:36.348 Source dir: /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk
00:04:36.348 Build dir: /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build-tmp
00:04:36.348 Build type: native build
00:04:36.348 Program cat found: YES (/usr/bin/cat)
00:04:36.348 Project name: DPDK
00:04:36.348 Project version: 24.03.0
00:04:36.348 C compiler for the host machine: cc (gcc 13.2.1 "cc (GCC) 13.2.1 20231011 (Red Hat 13.2.1-4)")
00:04:36.348 C linker for the host machine: cc ld.bfd 2.39-16
00:04:36.348 Host machine cpu family: x86_64
00:04:36.348 Host machine cpu: x86_64
00:04:36.348 Message: ## Building in Developer Mode ##
00:04:36.348 Program pkg-config found: YES (/usr/bin/pkg-config)
00:04:36.348 Program check-symbols.sh found: YES (/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/buildtools/check-symbols.sh)
00:04:36.348 Program options-ibverbs-static.sh found: YES (/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/buildtools/options-ibverbs-static.sh)
00:04:36.348 Program python3 found: YES (/usr/bin/python3)
00:04:36.348 Program cat found: YES (/usr/bin/cat)
00:04:36.348 Compiler for C supports arguments -march=native: YES
00:04:36.348 Checking for size of "void *" : 8
00:04:36.348 Checking for size of "void *" : 8 (cached)
00:04:36.348 Compiler for C supports link arguments -Wl,--undefined-version: NO
00:04:36.348 Library m found: YES
00:04:36.348 Library numa found: YES
00:04:36.348 Has header "numaif.h" : YES
00:04:36.348 Library fdt found: NO
00:04:36.348 Library execinfo found: NO
00:04:36.348 Has header "execinfo.h" : YES
00:04:36.348 Found pkg-config: YES (/usr/bin/pkg-config) 1.8.0
00:04:36.348 Run-time dependency libarchive found: NO (tried pkgconfig)
00:04:36.348 Run-time dependency libbsd found: NO (tried pkgconfig)
00:04:36.348 Run-time dependency jansson found: NO (tried pkgconfig)
00:04:36.348 Run-time dependency openssl found: YES 3.0.9
00:04:36.348 Run-time dependency libpcap found: YES 1.10.4
00:04:36.348 Has header "pcap.h" with dependency libpcap: YES
00:04:36.348 Compiler for C supports arguments -Wcast-qual: YES
00:04:36.348 Compiler for C supports arguments -Wdeprecated: YES
00:04:36.348 Compiler for C supports arguments -Wformat: YES
00:04:36.348 Compiler for C supports arguments -Wformat-nonliteral: NO
00:04:36.348 Compiler for C supports arguments -Wformat-security: NO
00:04:36.348 Compiler for C supports arguments -Wmissing-declarations: YES
00:04:36.348 Compiler for C supports arguments -Wmissing-prototypes: YES
00:04:36.348 Compiler for C supports arguments -Wnested-externs: YES
00:04:36.348 Compiler for C supports arguments -Wold-style-definition: YES
00:04:36.348 Compiler for C supports arguments -Wpointer-arith: YES
00:04:36.348 Compiler for C supports arguments -Wsign-compare: YES
00:04:36.348 Compiler for C supports arguments -Wstrict-prototypes: YES
00:04:36.348 Compiler for C supports arguments -Wundef: YES
00:04:36.348 Compiler for C supports arguments -Wwrite-strings: YES
00:04:36.348 Compiler for C supports arguments -Wno-address-of-packed-member: YES
00:04:36.348 Compiler for C supports arguments -Wno-packed-not-aligned: YES
00:04:36.348 Compiler for C supports arguments -Wno-missing-field-initializers: YES
00:04:36.348 Compiler for C supports arguments -Wno-zero-length-bounds: YES
00:04:36.348 Program objdump found: YES (/usr/bin/objdump)
00:04:36.348 Compiler for C supports arguments -mavx512f: YES
00:04:36.348 Checking if "AVX512 checking" compiles: YES
00:04:36.348 Fetching value of define "__SSE4_2__" : 1
00:04:36.348 Fetching value of define "__AES__" : 1
00:04:36.348 Fetching value of define "__AVX__" : 1
00:04:36.348 Fetching value of define "__AVX2__" : 1
00:04:36.348 Fetching value of define "__AVX512BW__" : 1
00:04:36.348 Fetching value of define "__AVX512CD__" : 1
00:04:36.348 Fetching value of define "__AVX512DQ__" : 1
00:04:36.348 Fetching value of define "__AVX512F__" : 1
00:04:36.348 Fetching value of define "__AVX512VL__" : 1
00:04:36.348 Fetching value of define "__PCLMUL__" : 1
00:04:36.348 Fetching value of define "__RDRND__" : 1
00:04:36.348 Fetching value of define "__RDSEED__" : 1
00:04:36.348 Fetching value of define "__VPCLMULQDQ__" : (undefined)
00:04:36.348 Fetching value of define "__znver1__" : (undefined)
00:04:36.348 Fetching value of define "__znver2__" : (undefined)
00:04:36.348 Fetching value of define "__znver3__" : (undefined)
00:04:36.348 Fetching value of define "__znver4__" : (undefined)
00:04:36.348 Compiler for C supports arguments -Wno-format-truncation: YES
00:04:36.348 Message: lib/log: Defining dependency "log"
00:04:36.348 Message: lib/kvargs: Defining dependency "kvargs"
00:04:36.348 Message: lib/telemetry: Defining dependency "telemetry"
00:04:36.348 Checking for function "getentropy" : NO
00:04:36.348 Message: lib/eal: Defining dependency "eal"
00:04:36.348 Message: lib/ring: Defining dependency "ring"
00:04:36.348 Message: lib/rcu: Defining dependency "rcu"
00:04:36.348 Message: lib/mempool: Defining dependency "mempool"
00:04:36.348 Message: lib/mbuf: Defining dependency "mbuf"
00:04:36.348 Fetching value of define "__PCLMUL__" : 1 (cached)
00:04:36.348 Fetching value of define "__AVX512F__" : 1 (cached)
00:04:36.348 Fetching value of define "__AVX512BW__" : 1 (cached)
00:04:36.348 Fetching value of define "__AVX512DQ__" : 1 (cached)
00:04:36.348 Fetching value of define "__AVX512VL__" : 1 (cached)
00:04:36.348 Fetching value of define "__VPCLMULQDQ__" : (undefined) (cached)
00:04:36.348 Compiler for C supports arguments -mpclmul: YES
00:04:36.348 Compiler for C supports arguments -maes: YES
00:04:36.348 Compiler for C supports arguments -mavx512f: YES (cached)
00:04:36.348 Compiler for C supports arguments -mavx512bw: YES
00:04:36.348 Compiler for C supports arguments -mavx512dq: YES
00:04:36.348 Compiler for C supports arguments -mavx512vl: YES
00:04:36.348 Compiler for C supports arguments -mvpclmulqdq: YES
00:04:36.348 Compiler for C supports arguments -mavx2: YES
00:04:36.348 Compiler for C supports arguments -mavx: YES
00:04:36.348 Message: lib/net: Defining dependency "net"
00:04:36.348 Message: lib/meter: Defining dependency "meter"
00:04:36.348 Message: lib/ethdev: Defining dependency "ethdev"
00:04:36.348 Message: lib/pci: Defining dependency "pci"
00:04:36.348 Message: lib/cmdline: Defining dependency "cmdline"
00:04:36.348 Message: lib/hash: Defining dependency "hash"
00:04:36.348 Message: lib/timer: Defining dependency "timer"
00:04:36.348 Message: lib/compressdev: Defining dependency "compressdev"
00:04:36.348 Message: lib/cryptodev: Defining dependency "cryptodev"
00:04:36.348 Message: lib/dmadev: Defining dependency "dmadev"
00:04:36.348 Compiler for C supports arguments -Wno-cast-qual: YES
00:04:36.348 Message: lib/power: Defining dependency "power"
00:04:36.348 Message: lib/reorder: Defining dependency "reorder"
00:04:36.348 Message: lib/security: Defining dependency "security"
00:04:36.348 Has header "linux/userfaultfd.h" : YES
00:04:36.348 Has header "linux/vduse.h" : YES
00:04:36.348 Message: lib/vhost: Defining dependency "vhost"
00:04:36.348 Compiler for C supports arguments -Wno-format-truncation: YES (cached)
00:04:36.348 Message: drivers/bus/auxiliary: Defining dependency "bus_auxiliary"
00:04:36.348 Message: drivers/bus/pci: Defining dependency "bus_pci"
00:04:36.348 Message: drivers/bus/vdev: Defining dependency "bus_vdev"
00:04:36.348 Compiler for C supports arguments -std=c11: YES
00:04:36.348 Compiler for C supports arguments -Wno-strict-prototypes: YES
00:04:36.348 Compiler for C supports arguments -D_BSD_SOURCE: YES
00:04:36.348 Compiler for C supports arguments -D_DEFAULT_SOURCE: YES
00:04:36.348 Compiler for C supports arguments -D_XOPEN_SOURCE=600: YES
00:04:36.348 Run-time dependency libmlx5 found: YES 1.24.44.0
00:04:36.348 Run-time dependency libibverbs found: YES 1.14.44.0
00:04:36.348 Library mtcr_ul found: NO
00:04:36.348 Header "infiniband/verbs.h" has symbol "IBV_FLOW_SPEC_ESP" with dependencies libmlx5, libibverbs: YES
00:04:36.348 Header "infiniband/verbs.h" has symbol "IBV_RX_HASH_IPSEC_SPI" with dependencies libmlx5, libibverbs: YES
00:04:36.348 Header "infiniband/verbs.h" has symbol "IBV_ACCESS_RELAXED_ORDERING " with dependencies libmlx5, libibverbs: YES
00:04:36.348 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_CQE_RES_FORMAT_CSUM_STRIDX" with dependencies libmlx5, libibverbs: YES
00:04:36.348 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_CONTEXT_MASK_TUNNEL_OFFLOADS" with dependencies libmlx5, libibverbs: YES
00:04:36.348 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_CONTEXT_FLAGS_MPW_ALLOWED" with dependencies libmlx5, libibverbs: YES
00:04:36.348 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_CONTEXT_FLAGS_CQE_128B_COMP" with dependencies libmlx5, libibverbs: YES
00:04:36.348 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_CQ_INIT_ATTR_FLAGS_CQE_PAD" with dependencies libmlx5, libibverbs: YES
00:04:36.348 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_create_flow_action_packet_reformat" with dependencies libmlx5, libibverbs: YES
00:04:36.349 Header "infiniband/verbs.h" has symbol "IBV_FLOW_SPEC_MPLS" with dependencies libmlx5, libibverbs: YES
00:04:36.349 Header "infiniband/verbs.h" has symbol "IBV_WQ_FLAGS_PCI_WRITE_END_PADDING" with dependencies libmlx5, libibverbs: YES
00:04:36.349 Header "infiniband/verbs.h" has symbol "IBV_WQ_FLAG_RX_END_PADDING" with dependencies libmlx5, libibverbs: NO
00:04:36.349 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_query_devx_port" with dependencies libmlx5, libibverbs: NO
00:04:36.349 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_query_port" with dependencies libmlx5, libibverbs: YES
00:04:36.349 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_dest_ib_port" with dependencies libmlx5, libibverbs: YES
00:04:37.343 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_devx_obj_create" with dependencies libmlx5, libibverbs: YES
00:04:37.343 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_FLOW_ACTION_COUNTERS_DEVX" with dependencies libmlx5, libibverbs: YES
00:04:37.343 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_FLOW_ACTION_DEFAULT_MISS" with dependencies libmlx5, libibverbs: YES
00:04:37.343 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_devx_obj_query_async" with dependencies libmlx5, libibverbs: YES
00:04:37.343 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_devx_qp_query" with dependencies libmlx5, libibverbs: YES
00:04:37.343 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_pp_alloc" with dependencies libmlx5, libibverbs: YES
00:04:37.343 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_dest_devx_tir" with dependencies libmlx5, libibverbs: YES
00:04:37.343 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_devx_get_event" with dependencies libmlx5, libibverbs: YES
00:04:37.343 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_flow_meter" with dependencies libmlx5, libibverbs: YES
00:04:37.343 Header "infiniband/mlx5dv.h" has symbol "MLX5_MMAP_GET_NC_PAGES_CMD" with dependencies libmlx5, libibverbs: YES
00:04:37.343 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_DR_DOMAIN_TYPE_NIC_RX" with dependencies libmlx5, libibverbs: YES
00:04:37.343 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_DR_DOMAIN_TYPE_FDB" with dependencies libmlx5, libibverbs: YES
00:04:37.343 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_push_vlan" with dependencies libmlx5, libibverbs: YES
00:04:37.344 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_alloc_var" with dependencies libmlx5, libibverbs: YES
00:04:37.344 Header "infiniband/mlx5dv.h" has symbol "MLX5_OPCODE_ENHANCED_MPSW" with dependencies libmlx5, libibverbs: NO
00:04:37.344 Header "infiniband/mlx5dv.h" has symbol "MLX5_OPCODE_SEND_EN" with dependencies libmlx5, libibverbs: NO
00:04:37.344 Header "infiniband/mlx5dv.h" has symbol "MLX5_OPCODE_WAIT" with dependencies libmlx5, libibverbs: NO
00:04:37.344 Header "infiniband/mlx5dv.h" has symbol "MLX5_OPCODE_ACCESS_ASO" with dependencies libmlx5, libibverbs: NO
00:04:37.344 Header "linux/if_link.h" has symbol "IFLA_NUM_VF" with dependencies libmlx5, libibverbs: YES
00:04:37.344 Header "linux/if_link.h" has symbol "IFLA_EXT_MASK" with dependencies libmlx5, libibverbs: YES
00:04:37.344 Header "linux/if_link.h" has symbol "IFLA_PHYS_SWITCH_ID" with dependencies libmlx5, libibverbs: YES
00:04:37.344 Header "linux/if_link.h" has symbol "IFLA_PHYS_PORT_NAME" with dependencies libmlx5, libibverbs: YES
00:04:37.344 Header "rdma/rdma_netlink.h" has symbol "RDMA_NL_NLDEV" with dependencies libmlx5, libibverbs: YES
00:04:37.344 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_CMD_GET" with dependencies libmlx5, libibverbs: YES
00:04:37.344 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_CMD_PORT_GET" with dependencies libmlx5, libibverbs: YES
00:04:37.344 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_ATTR_DEV_INDEX" with dependencies libmlx5, libibverbs: YES
00:04:37.344 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_ATTR_DEV_NAME" with dependencies libmlx5, libibverbs: YES
00:04:37.344 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_ATTR_PORT_INDEX" with dependencies libmlx5, libibverbs: YES
00:04:37.344 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_ATTR_PORT_STATE" with dependencies libmlx5, libibverbs: YES
00:04:37.344 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_ATTR_NDEV_INDEX" with dependencies libmlx5, libibverbs: YES
00:04:37.344 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dump_dr_domain" with dependencies libmlx5, libibverbs: YES
00:04:37.344 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_flow_sampler" with dependencies libmlx5, libibverbs: YES
00:04:37.344 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_domain_set_reclaim_device_memory" with dependencies libmlx5, libibverbs: YES
00:04:37.344 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_dest_array" with dependencies libmlx5, libibverbs: YES
00:04:37.344 Header "linux/devlink.h" has symbol "DEVLINK_GENL_NAME" with dependencies libmlx5, libibverbs: YES
00:04:37.344 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_aso" with dependencies libmlx5, libibverbs: YES
00:04:37.344 Header "infiniband/verbs.h" has symbol "INFINIBAND_VERBS_H" with dependencies libmlx5, libibverbs: YES
00:04:37.344 Header "infiniband/mlx5dv.h" has symbol "MLX5_WQE_UMR_CTRL_FLAG_INLINE" with dependencies libmlx5, libibverbs: YES
00:04:37.344 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dump_dr_rule" with dependencies libmlx5, libibverbs: YES
00:04:37.344 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_DR_ACTION_FLAGS_ASO_CT_DIRECTION_INITIATOR" with dependencies libmlx5, libibverbs: YES
00:04:37.344 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_domain_allow_duplicate_rules" with dependencies libmlx5, libibverbs: YES
00:04:37.344 Header "infiniband/verbs.h" has symbol "ibv_reg_mr_iova" with dependencies libmlx5, libibverbs: YES
00:04:37.344 Header "infiniband/verbs.h" has symbol "ibv_import_device" with dependencies libmlx5, libibverbs: YES
00:04:37.344 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_dest_root_table" with dependencies libmlx5, libibverbs: YES
00:04:37.344 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_create_steering_anchor" with dependencies libmlx5, libibverbs: YES
00:04:37.344 Header "infiniband/verbs.h" has symbol "ibv_is_fork_initialized" with dependencies libmlx5, libibverbs: YES
00:04:37.344 Checking whether type "struct mlx5dv_sw_parsing_caps" has member "sw_parsing_offloads" with dependencies libmlx5, libibverbs: YES
00:04:37.344 Checking whether type "struct ibv_counter_set_init_attr" has member "counter_set_id" with dependencies libmlx5, libibverbs: NO
00:04:37.344 Checking whether type "struct ibv_counters_init_attr" has member "comp_mask" with dependencies libmlx5, libibverbs: YES
00:04:37.344 Checking whether type "struct mlx5dv_devx_uar" has member "mmap_off" with dependencies libmlx5, libibverbs: YES
00:04:37.344 Checking whether type "struct mlx5dv_flow_matcher_attr" has member "ft_type" with dependencies libmlx5, libibverbs: YES
00:04:37.344 Configuring mlx5_autoconf.h using configuration
00:04:37.344 Message: drivers/common/mlx5: Defining dependency "common_mlx5"
00:04:37.344 Run-time dependency libcrypto found: YES 3.0.9
00:04:37.344 Library IPSec_MB found: YES
00:04:37.344 Fetching value of define "IMB_VERSION_STR" : "1.5.0"
00:04:37.344 Message: drivers/common/qat: Defining dependency "common_qat"
00:04:37.344 Message: drivers/mempool/ring: Defining dependency "mempool_ring"
00:04:37.344 Message: Disabling raw/* drivers: missing internal dependency "rawdev"
00:04:37.344 Library IPSec_MB found: YES
00:04:37.344 Fetching value of define "IMB_VERSION_STR" : "1.5.0" (cached)
00:04:37.344 Message: drivers/crypto/ipsec_mb: Defining dependency "crypto_ipsec_mb"
00:04:37.344 Compiler for C supports arguments -std=c11: YES (cached)
00:04:37.344 Compiler for C supports arguments -Wno-strict-prototypes: YES (cached)
00:04:37.344 Compiler for C supports arguments -D_BSD_SOURCE: YES (cached)
00:04:37.344 Compiler for C supports arguments -D_DEFAULT_SOURCE: YES (cached)
00:04:37.344 Compiler for C supports arguments -D_XOPEN_SOURCE=600: YES (cached)
00:04:37.344 Message: drivers/crypto/mlx5: Defining dependency "crypto_mlx5"
00:04:37.344 Run-time dependency libisal found: NO (tried pkgconfig)
00:04:37.344 Library libisal found: NO
00:04:37.344 Message: drivers/compress/isal: Defining dependency "compress_isal"
00:04:37.344 Compiler for C supports arguments -std=c11: YES (cached)
00:04:37.344 Compiler for C supports arguments -Wno-strict-prototypes: YES (cached)
00:04:37.344 Compiler for C supports arguments -D_BSD_SOURCE: YES (cached)
00:04:37.344 Compiler for C supports arguments -D_DEFAULT_SOURCE: YES (cached)
00:04:37.344 Compiler for C supports arguments -D_XOPEN_SOURCE=600: YES (cached)
00:04:37.344 Message: drivers/compress/mlx5: Defining dependency "compress_mlx5"
00:04:37.344 Message: Disabling regex/* drivers: missing internal dependency "regexdev"
00:04:37.344 Message: Disabling ml/* drivers: missing internal dependency "mldev"
00:04:37.344 Message: Disabling event/* drivers: missing internal dependency "eventdev"
00:04:37.344 Message: Disabling baseband/* drivers: missing internal dependency "bbdev"
00:04:37.344 Message: Disabling gpu/* drivers: missing internal dependency "gpudev"
00:04:37.344 Program doxygen found: YES (/usr/bin/doxygen)
00:04:37.344 Configuring doxy-api-html.conf using configuration
00:04:37.344 Configuring doxy-api-man.conf using configuration
00:04:37.344 Program mandb found: YES (/usr/bin/mandb)
00:04:37.344 Program sphinx-build found: NO
00:04:37.344 Configuring rte_build_config.h using configuration
00:04:37.344 Message:
00:04:37.344 =================
00:04:37.344 Applications Enabled
00:04:37.344 =================
00:04:37.344
00:04:37.344 apps:
00:04:37.344
00:04:37.344
00:04:37.344 Message:
00:04:37.344 =================
00:04:37.344 Libraries Enabled
00:04:37.344 =================
00:04:37.344
00:04:37.344 libs:
00:04:37.344 log, kvargs, telemetry, eal, ring, rcu, mempool, mbuf,
00:04:37.344 net, meter, ethdev, pci, cmdline, hash, timer, compressdev,
00:04:37.344 cryptodev, dmadev, power, reorder, security, vhost,
00:04:37.344
00:04:37.344 Message:
00:04:37.344 ===============
00:04:37.344 Drivers Enabled
00:04:37.344 ===============
00:04:37.344
00:04:37.344 common:
00:04:37.344 mlx5, qat,
00:04:37.344 bus:
00:04:37.344 auxiliary, pci, vdev,
00:04:37.344 mempool:
00:04:37.344 ring,
00:04:37.344 dma:
00:04:37.344
00:04:37.344 net:
00:04:37.344
00:04:37.344 crypto:
00:04:37.344 ipsec_mb, mlx5,
00:04:37.344 compress:
00:04:37.344 isal, mlx5,
00:04:37.344 vdpa:
00:04:37.344
00:04:37.344
00:04:37.344 Message:
00:04:37.344 =================
00:04:37.344 Content Skipped
00:04:37.344 =================
00:04:37.344
00:04:37.344 apps:
00:04:37.344 dumpcap: explicitly disabled via build config
00:04:37.344 graph: explicitly disabled via build config
00:04:37.344 pdump: explicitly disabled via build config
00:04:37.344 proc-info: explicitly disabled via build config
00:04:37.344 test-acl: explicitly disabled via build config
00:04:37.344 test-bbdev: explicitly disabled via build config
00:04:37.344 test-cmdline: explicitly disabled via build config
00:04:37.344 test-compress-perf: explicitly disabled via build config
00:04:37.344 test-crypto-perf: explicitly disabled via build config
00:04:37.344 test-dma-perf: explicitly disabled via build config
00:04:37.344 test-eventdev: explicitly disabled via build config
00:04:37.344 test-fib: explicitly disabled via build config
00:04:37.345 test-flow-perf: explicitly disabled via build config
00:04:37.345 test-gpudev: explicitly disabled via build config
00:04:37.345 test-mldev: explicitly disabled via build config
00:04:37.345 test-pipeline: explicitly disabled via build config
00:04:37.345 test-pmd: explicitly disabled via build config
00:04:37.345 test-regex: explicitly disabled via build config
00:04:37.345 test-sad: explicitly disabled via build config
00:04:37.345 test-security-perf: explicitly disabled via build config
00:04:37.345
00:04:37.345 libs:
00:04:37.345 argparse: explicitly disabled via build config
00:04:37.345 metrics: explicitly disabled via build config
00:04:37.345 acl: explicitly disabled via build config
00:04:37.345 bbdev: explicitly disabled via build config
00:04:37.345 bitratestats: explicitly disabled via build config
00:04:37.345 bpf: explicitly disabled via build config
00:04:37.345 cfgfile: explicitly disabled via build config
00:04:37.345 distributor: explicitly disabled via build config
00:04:37.345 efd: explicitly disabled via build config
00:04:37.345 eventdev: explicitly disabled via build config
00:04:37.345 dispatcher: explicitly disabled via build config
00:04:37.345 gpudev: explicitly disabled via build config
00:04:37.345 gro: explicitly disabled via build config
00:04:37.345 gso: explicitly disabled via build config
00:04:37.345 ip_frag: explicitly disabled via build config
00:04:37.345 jobstats: explicitly disabled via build config
00:04:37.345 latencystats: explicitly disabled via build config
00:04:37.345 lpm: explicitly disabled via build config
00:04:37.345 member: explicitly disabled via build config
00:04:37.345 pcapng: explicitly disabled via build config
00:04:37.345 rawdev: explicitly disabled via build config
00:04:37.345 regexdev: explicitly disabled via build config
00:04:37.345 mldev: explicitly disabled via build config
00:04:37.345 rib: explicitly disabled via build config
00:04:37.345 sched: explicitly disabled via build config
00:04:37.345 stack: explicitly disabled via build config
00:04:37.345 ipsec: explicitly disabled via build config
00:04:37.345 pdcp: explicitly disabled via build config
00:04:37.345 fib: explicitly disabled via build config
00:04:37.345 port: explicitly disabled via build config
00:04:37.345 pdump: explicitly disabled via build config
00:04:37.345 table: explicitly disabled via build config
00:04:37.345 pipeline: explicitly disabled via build config
00:04:37.345 graph: explicitly disabled via build config
00:04:37.345 node: explicitly disabled via build config
00:04:37.345
00:04:37.345 drivers:
00:04:37.345 common/cpt: not in enabled drivers build config
00:04:37.345 common/dpaax: not in enabled drivers build config
00:04:37.345 common/iavf: not in enabled drivers build config
00:04:37.345 common/idpf: not in enabled drivers build config
00:04:37.345 common/ionic: not in enabled drivers build config
00:04:37.345 common/mvep: not in enabled drivers build config
00:04:37.345 common/octeontx: not in enabled drivers build config
00:04:37.345 bus/cdx: not in enabled drivers build config
00:04:37.345 bus/dpaa: not in enabled drivers build config
00:04:37.345 bus/fslmc: not in enabled drivers build config
00:04:37.345 bus/ifpga: not in enabled drivers build config
00:04:37.345 bus/platform: not in enabled drivers build config
00:04:37.345 bus/uacce: not in enabled drivers build config
00:04:37.345 bus/vmbus: not in enabled drivers build config
00:04:37.345 common/cnxk: not in enabled drivers build config
00:04:37.345 common/nfp: not in enabled drivers build config
00:04:37.345 common/nitrox: not in enabled drivers build config
00:04:37.345 common/sfc_efx: not in enabled drivers build config
00:04:37.345 mempool/bucket: not in enabled drivers build config
00:04:37.345 mempool/cnxk: not in enabled drivers build config
00:04:37.345 mempool/dpaa: not in enabled drivers build config
00:04:37.345 mempool/dpaa2: not in enabled drivers build config
00:04:37.345 mempool/octeontx: not in enabled drivers build config
00:04:37.345 mempool/stack: not in enabled drivers build config
00:04:37.345 dma/cnxk: not in enabled drivers build config
00:04:37.345 dma/dpaa: not in enabled drivers build config
00:04:37.345 dma/dpaa2: not in enabled drivers build config
00:04:37.345 dma/hisilicon: not in enabled drivers build config
00:04:37.345 dma/idxd: not in enabled drivers build config
00:04:37.345 dma/ioat: not in enabled drivers build config
00:04:37.345 dma/skeleton: not in enabled drivers build config
00:04:37.345 net/af_packet: not in enabled drivers build config
00:04:37.345 net/af_xdp: not in enabled drivers build config
00:04:37.345 net/ark: not in enabled drivers build config
00:04:37.345 net/atlantic: not in enabled drivers build config
00:04:37.345 net/avp: not in enabled drivers build config
00:04:37.345 net/axgbe: not in enabled drivers build config
00:04:37.345 net/bnx2x: not in enabled drivers build config
00:04:37.345 net/bnxt: not in enabled drivers build config
00:04:37.345 net/bonding: not in enabled drivers build config
00:04:37.345 net/cnxk: not in enabled drivers build config
00:04:37.345 net/cpfl: not in enabled drivers build config
00:04:37.345 net/cxgbe: not in enabled drivers build config
00:04:37.345 net/dpaa: not in enabled drivers build config
00:04:37.345 net/dpaa2: not in enabled drivers build config
00:04:37.345 net/e1000: not in enabled drivers build config
00:04:37.345 net/ena: not in enabled drivers build config
00:04:37.345 net/enetc: not in enabled drivers build config
00:04:37.345 net/enetfec: not in enabled drivers build config
00:04:37.345 net/enic: not in enabled drivers build config
00:04:37.345 net/failsafe: not in enabled drivers build config
00:04:37.345 net/fm10k: not in enabled drivers build config
00:04:37.345 net/gve: not in enabled drivers build config
00:04:37.345 net/hinic: not in enabled drivers build config
00:04:37.345 net/hns3: not in enabled drivers build config
00:04:37.345 net/i40e: not in enabled drivers build config
00:04:37.345 net/iavf: not in enabled drivers build config
00:04:37.345 net/ice: not in enabled drivers build config
00:04:37.345 net/idpf: not in enabled drivers build config
00:04:37.345 net/igc: not in enabled drivers build config
00:04:37.345 net/ionic: not in enabled drivers build config
00:04:37.345 net/ipn3ke: not in enabled drivers build config
00:04:37.345 net/ixgbe: not in enabled drivers build config
00:04:37.345 net/mana: not in enabled drivers build config
00:04:37.345 net/memif: not in enabled drivers build config
00:04:37.345 net/mlx4: not in enabled drivers build config
00:04:37.345 net/mlx5: not in enabled drivers build config
00:04:37.345 net/mvneta: not in enabled drivers build config
00:04:37.345 net/mvpp2: not in enabled drivers build config
00:04:37.345 net/netvsc: not in enabled drivers build config
00:04:37.345 net/nfb: not in enabled drivers build config
00:04:37.345 net/nfp: not in enabled drivers build config
00:04:37.345 net/ngbe: not in enabled drivers build config
00:04:37.345 net/null: not in enabled drivers build config
00:04:37.345 net/octeontx: not in enabled drivers build config
00:04:37.345 net/octeon_ep: not in enabled drivers build config
00:04:37.345 net/pcap: not in enabled drivers build config
00:04:37.345 net/pfe: not in enabled drivers build config
00:04:37.345 net/qede: not in enabled drivers build config
00:04:37.345 net/ring: not in enabled drivers build config
00:04:37.345 net/sfc: not in enabled drivers build config
00:04:37.345 net/softnic: not in enabled drivers build config
00:04:37.345 net/tap: not in enabled drivers build config
00:04:37.345 net/thunderx: not in enabled drivers build config
00:04:37.345 net/txgbe: not in enabled drivers build config
00:04:37.345 net/vdev_netvsc: not in enabled drivers build config
00:04:37.345 net/vhost: not in enabled drivers build config
00:04:37.345 net/virtio: not in enabled drivers build config
00:04:37.345 net/vmxnet3: not in enabled drivers build config
00:04:37.345 raw/*: missing internal dependency, "rawdev"
00:04:37.345 crypto/armv8: not in enabled drivers build config
00:04:37.345 crypto/bcmfs: not in enabled drivers build config
00:04:37.345 crypto/caam_jr: not in enabled drivers build config
00:04:37.345 crypto/ccp: not in enabled drivers build config
00:04:37.345 crypto/cnxk: not in enabled drivers build config
00:04:37.345 crypto/dpaa_sec: not in enabled drivers build config
00:04:37.345 crypto/dpaa2_sec: not in enabled drivers build config
00:04:37.345 crypto/mvsam: not in enabled drivers build config
00:04:37.345 crypto/nitrox: not in enabled drivers build config
00:04:37.345 crypto/null: not in enabled drivers build config
00:04:37.345 crypto/octeontx: not in enabled drivers build config
00:04:37.345 crypto/openssl: not in enabled drivers build config
00:04:37.345 crypto/scheduler: not in enabled drivers build config
00:04:37.345 crypto/uadk: not in enabled drivers build config
00:04:37.345 crypto/virtio: not in enabled drivers build config
00:04:37.345 compress/nitrox: not in enabled drivers build config
00:04:37.345 compress/octeontx: not in enabled drivers build config
00:04:37.345 compress/zlib: not in enabled drivers build config
00:04:37.345 regex/*: missing internal dependency, "regexdev"
00:04:37.345 ml/*: missing internal dependency, "mldev"
00:04:37.345 vdpa/ifc: not in enabled drivers build config
00:04:37.345 vdpa/mlx5: not in enabled drivers build config
00:04:37.345 vdpa/nfp: not in enabled drivers build config
00:04:37.345 vdpa/sfc: not in enabled drivers build config
00:04:37.345 event/*: missing internal dependency, "eventdev"
00:04:37.345 baseband/*: missing internal dependency, "bbdev"
00:04:37.345 gpu/*: missing internal dependency, "gpudev"
00:04:37.345
00:04:37.345
00:04:37.605 Build targets in project: 115
00:04:37.605
00:04:37.605 DPDK 24.03.0
00:04:37.605
00:04:37.605 User defined options
00:04:37.605 buildtype : debug
00:04:37.605 default_library : shared
00:04:37.605 libdir : lib
00:04:37.605 prefix : /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build
00:04:37.605 c_args : -Wno-stringop-overflow -fcommon -Wno-stringop-overread -Wno-array-bounds -I/var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib -DNO_COMPAT_IMB_API_053 -I/var/jenkins/workspace/crypto-phy-autotest/spdk/isa-l -I/var/jenkins/workspace/crypto-phy-autotest/spdk/isalbuild -fPIC -Werror
00:04:37.605 c_link_args : -L/var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib -L/var/jenkins/workspace/crypto-phy-autotest/spdk/isa-l/.libs -lisal
00:04:37.605 cpu_instruction_set: native
00:04:37.605 disable_apps : test-sad,test-acl,test-dma-perf,test-pipeline,test-compress-perf,test-fib,test-flow-perf,test-crypto-perf,test-bbdev,test-eventdev,pdump,test-mldev,test-cmdline,graph,test-security-perf,test-pmd,test,proc-info,test-regex,dumpcap,test-gpudev
00:04:37.605 disable_libs : port,sched,rib,node,ipsec,distributor,gro,eventdev,pdcp,acl,member,latencystats,efd,stack,regexdev,rawdev,bpf,metrics,gpudev,pipeline,pdump,table,fib,dispatcher,mldev,gso,cfgfile,bitratestats,ip_frag,graph,lpm,jobstats,argparse,pcapng,bbdev
00:04:37.605 enable_docs : false
00:04:37.605 enable_drivers : bus,bus/pci,bus/vdev,mempool/ring,crypto/qat,compress/qat,common/qat,common/mlx5,bus/auxiliary,crypto,crypto/aesni_mb,crypto/mlx5,crypto/ipsec_mb,compress,compress/isal,compress/mlx5
00:04:37.605 enable_kmods : false
00:04:37.605 max_lcores : 128
00:04:37.605 tests : false
00:04:37.605
00:04:37.605 Found ninja-1.11.1.git.kitware.jobserver-1 at /usr/local/bin/ninja
00:04:38.178 ninja: Entering directory `/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build-tmp'
00:04:38.178 [1/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_hypervisor.c.o
00:04:38.178 [2/378] Compiling C object lib/librte_eal.a.p/eal_common_rte_version.c.o
00:04:38.178 [3/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_errno.c.o
00:04:38.178 [4/378] Compiling C object lib/librte_log.a.p/log_log_linux.c.o
00:04:38.178 [5/378] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry_data.c.o
00:04:38.178 [6/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_cpuflags.c.o
00:04:38.178 [7/378] Compiling C object lib/librte_eal.a.p/eal_x86_rte_spinlock.c.o
00:04:38.178 [8/378] Compiling C object lib/librte_kvargs.a.p/kvargs_rte_kvargs.c.o
00:04:38.439 [9/378] Compiling C object lib/librte_eal.a.p/eal_common_rte_reciprocal.c.o
00:04:38.439 [10/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_hexdump.c.o
00:04:38.439 [11/378] Compiling C object lib/librte_log.a.p/log_log.c.o
00:04:38.439 [12/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_cpuflags.c.o
00:04:38.439 [13/378] Linking static target lib/librte_kvargs.a
00:04:38.439 [14/378] Linking static target lib/librte_log.a
00:04:38.439 [15/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_uuid.c.o
00:04:38.439 [16/378] Compiling C object lib/librte_eal.a.p/eal_unix_eal_debug.c.o
00:04:38.439 [17/378] Compiling C object lib/librte_eal.a.p/eal_x86_rte_hypervisor.c.o
00:04:38.439 [18/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_string_fns.c.o
00:04:38.439 [19/378] Compiling C object lib/librte_eal.a.p/eal_x86_rte_cpuflags.c.o
00:04:38.698 [20/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_config.c.o
00:04:38.698 [21/378] Compiling C object lib/librte_eal.a.p/eal_common_hotplug_mp.c.o
00:04:38.698 [22/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memalloc.c.o
00:04:38.698 [23/378] Compiling C object lib/librte_eal.a.p/eal_unix_eal_file.c.o
00:04:38.698 [24/378] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_timer.c.o
00:04:38.698 [25/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_hugepage_info.c.o
00:04:38.698 [26/378] Compiling C object lib/librte_eal.a.p/eal_common_rte_random.c.o
00:04:38.698 [27/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_timer.c.o
00:04:38.698 [28/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_bus.c.o
00:04:38.698 [29/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_launch.c.o
00:04:38.698 [30/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_mcfg.c.o
00:04:38.698 [31/378] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry.c.o
00:04:38.961 [32/378] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_thread.c.o
00:04:38.961 [33/378] Compiling C object lib/librte_eal.a.p/eal_common_malloc_heap.c.o
00:04:38.961 [34/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_lcore.c.o
00:04:38.961 [35/378] Compiling C object lib/librte_net.a.p/net_rte_net_crc.c.o
00:04:38.961 [36/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_class.c.o
00:04:38.961 [37/378] Compiling C object lib/librte_mempool.a.p/mempool_mempool_trace_points.c.o
00:04:38.961 [38/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_interrupts.c.o
00:04:38.961 [39/378] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry_legacy.c.o
00:04:38.961 [40/378] Generating lib/kvargs.sym_chk with a custom command (wrapped by meson to capture output)
00:04:38.961 [41/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_debug.c.o
00:04:38.961 [42/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_vfio_mp_sync.c.o
00:04:38.961 [43/378] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool_ops.c.o
00:04:38.961 [44/378] Compiling C object lib/librte_eal.a.p/eal_unix_rte_thread.c.o
00:04:38.961 [45/378] Linking static target lib/librte_telemetry.a
00:04:38.961 [46/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_timer.c.o
00:04:38.961 [47/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_points.c.o
00:04:38.961 [48/378] Compiling C object lib/librte_eal.a.p/eal_common_malloc_elem.c.o
00:04:38.961 [49/378] Compiling C object lib/librte_eal.a.p/eal_common_rte_keepalive.c.o
00:04:38.961 [50/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_ctf.c.o
00:04:38.961 [51/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_dev.c.o
00:04:38.961 [52/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_alarm.c.o
00:04:38.961 [53/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_tailqs.c.o
00:04:38.961 [54/378] Compiling C object lib/librte_eal.a.p/eal_unix_eal_firmware.c.o
00:04:38.961 [55/378] Compiling C object lib/librte_eal.a.p/eal_x86_rte_cycles.c.o
00:04:38.961 [56/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_thread.c.o
00:04:38.961 [57/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_thread.c.o
00:04:38.961 [58/378] Compiling C object lib/librte_eal.a.p/eal_x86_rte_power_intrinsics.c.o
00:04:38.961 [59/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_dynmem.c.o
00:04:38.961 [60/378] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_memory.c.o
00:04:38.961 [61/378] Compiling C object lib/librte_eal.a.p/eal_common_malloc_mp.c.o
00:04:38.961 [62/378] Compiling C object lib/librte_eal.a.p/eal_unix_eal_filesystem.c.o
00:04:38.961 [63/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_utils.c.o
00:04:38.961 [64/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_lcore.c.o
00:04:38.961 [65/378] Compiling C object lib/librte_ring.a.p/ring_rte_ring.c.o
00:04:38.961 [66/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memzone.c.o
00:04:38.961 [67/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace.c.o
00:04:38.961 [68/378] Linking static target lib/librte_ring.a
00:04:38.961 [69/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_dev.c.o
00:04:38.961 [70/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memory.c.o
00:04:38.961 [71/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_vt100.c.o
00:04:38.961 [72/378] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_pool_ops.c.o
00:04:38.961 [73/378] Compiling C object lib/librte_eal.a.p/eal_common_rte_service.c.o
00:04:38.961 [74/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_proc.c.o
00:04:38.961 [75/378] Compiling C object lib/librte_pci.a.p/pci_rte_pci.c.o
00:04:38.961 [76/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_fbarray.c.o
00:04:38.961 [77/378] Compiling C object lib/net/libnet_crc_avx512_lib.a.p/net_crc_avx512.c.o
00:04:38.961 [78/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_string.c.o
00:04:38.961 [79/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_cirbuf.c.o
00:04:38.961 [80/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_num.c.o
00:04:38.961 [81/378] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool_ops_default.c.o
00:04:38.961 [82/378] Linking static target lib/librte_pci.a
00:04:38.961 [83/378] Linking static target lib/net/libnet_crc_avx512_lib.a
00:04:38.961 [84/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_os_unix.c.o
00:04:38.961 [85/378] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_ptype.c.o
00:04:38.961 [86/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_socket.c.o
00:04:38.961 [87/378] Compiling C object lib/librte_eal.a.p/eal_common_rte_malloc.c.o
00:04:38.961 [88/378] Compiling C object lib/librte_hash.a.p/hash_rte_hash_crc.c.o
00:04:38.961 [89/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_portlist.c.o
00:04:38.961 [90/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse.c.o
00:04:38.961 [91/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline.c.o
00:04:38.961 [92/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_memory.c.o
00:04:39.229 [93/378] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_dyn.c.o
00:04:39.229 [94/378] Compiling C object lib/librte_net.a.p/net_rte_net.c.o
00:04:39.229 [95/378] Compiling C object lib/librte_net.a.p/net_rte_ether.c.o
00:04:39.229 [96/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_rdline.c.o
00:04:39.229 [97/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_interrupts.c.o
00:04:39.229 [98/378] Compiling C object lib/librte_power.a.p/power_power_kvm_vm.c.o
00:04:39.229 [99/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_memalloc.c.o
00:04:39.229 [100/378] Compiling C object lib/librte_power.a.p/power_power_common.c.o
00:04:39.229 [101/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_vfio.c.o
00:04:39.229 [102/378] Compiling C object lib/librte_power.a.p/power_guest_channel.c.o
00:04:39.229 [103/378] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool.c.o
00:04:39.229 [104/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_options.c.o
00:04:39.229 [105/378] Linking static target lib/librte_mempool.a
00:04:39.229 [106/378] Compiling C object lib/librte_rcu.a.p/rcu_rte_rcu_qsbr.c.o
00:04:39.229 [107/378] Compiling C object lib/librte_net.a.p/net_rte_arp.c.o
00:04:39.229 [108/378] Linking static target lib/librte_rcu.a
00:04:39.229 [109/378] Compiling C object lib/librte_vhost.a.p/vhost_fd_man.c.o
00:04:39.229 [110/378] Generating lib/log.sym_chk with a custom command (wrapped by meson to capture output)
00:04:39.229 [111/378] Compiling C object drivers/libtmp_rte_bus_auxiliary.a.p/bus_auxiliary_auxiliary_params.c.o
00:04:39.229 [112/378] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_params.c.o
00:04:39.229 [113/378] Compiling C object lib/librte_meter.a.p/meter_rte_meter.c.o
00:04:39.229 [114/378] Compiling C object lib/librte_net.a.p/net_net_crc_sse.c.o
00:04:39.229 [115/378] Linking static target lib/librte_meter.a
00:04:39.229 [116/378] Linking target lib/librte_log.so.24.1
00:04:39.229 [117/378] Linking static target lib/librte_net.a
00:04:39.499 [118/378] Compiling C object drivers/libtmp_rte_bus_vdev.a.p/bus_vdev_vdev_params.c.o
00:04:39.499 [119/378] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_profile.c.o
00:04:39.499 [120/378] Generating lib/pci.sym_chk with a custom command (wrapped by meson to capture output)
00:04:39.499 [121/378] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_private.c.o
00:04:39.499 [122/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_etheraddr.c.o
00:04:39.499 [123/378] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_driver.c.o
00:04:39.499 [124/378] Generating lib/ring.sym_chk with a custom command (wrapped by meson to capture output)
00:04:39.499 [125/378] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf.c.o
00:04:39.499 [126/378] Linking static target lib/librte_mbuf.a
00:04:39.499 [127/378] Generating symbol file lib/librte_log.so.24.1.p/librte_log.so.24.1.symbols
00:04:39.499 [128/378] Compiling C object lib/librte_hash.a.p/hash_rte_thash_gfni.c.o
00:04:39.499 [129/378] Compiling C object lib/librte_power.a.p/power_rte_power_uncore.c.o
00:04:39.499 [130/378] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8636.c.o
00:04:39.499 [131/378] Compiling C object lib/librte_hash.a.p/hash_rte_thash.c.o
00:04:39.499 [132/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_ipaddr.c.o
00:04:39.499 [133/378] Compiling C object lib/librte_hash.a.p/hash_rte_fbk_hash.c.o
00:04:39.499 [134/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_devargs.c.o
00:04:39.499 [135/378] Linking static target lib/librte_cmdline.a
00:04:39.499 [136/378] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_linux_ethtool.c.o
00:04:39.499 [137/378] Linking target lib/librte_kvargs.so.24.1
00:04:39.761 [138/378] Compiling C object lib/librte_power.a.p/power_rte_power.c.o
00:04:39.761 [139/378] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8472.c.o
00:04:39.761 [140/378] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_class_eth.c.o
00:04:39.761 [141/378] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_telemetry.c.o
00:04:39.761 [142/378] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_compressdev_pmd.c.o
00:04:39.761 [143/378] Compiling C object lib/librte_dmadev.a.p/dmadev_rte_dmadev_trace_points.c.o
00:04:39.761 [144/378] Compiling C object lib/librte_timer.a.p/timer_rte_timer.c.o
00:04:39.761 [145/378] Compiling C object lib/librte_cryptodev.a.p/cryptodev_cryptodev_pmd.c.o
00:04:39.761 [146/378] Linking static target lib/librte_timer.a
00:04:39.761 [147/378] Compiling C object lib/librte_power.a.p/power_power_cppc_cpufreq.c.o
00:04:39.761 [148/378] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8079.c.o
00:04:39.761 [149/378] Generating lib/telemetry.sym_chk with a custom command (wrapped by meson to capture output)
00:04:39.761 [150/378] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_compressdev.c.o
00:04:39.761 [151/378] Compiling C object lib/librte_power.a.p/power_power_intel_uncore.c.o
00:04:39.761 [152/378] Generating lib/rcu.sym_chk with a custom command (wrapped by meson to capture output)
00:04:39.761 [153/378] Compiling C object lib/librte_power.a.p/power_power_acpi_cpufreq.c.o
00:04:39.761 [154/378] Compiling C object lib/librte_power.a.p/power_power_amd_pstate_cpufreq.c.o
00:04:39.761 [155/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_qat_logs.c.o
00:04:39.761 [156/378] Compiling C object drivers/libtmp_rte_bus_auxiliary.a.p/bus_auxiliary_linux_auxiliary.c.o
00:04:39.761 [157/378] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_common.c.o
00:04:39.761 [158/378] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev_cman.c.o
00:04:39.761 [159/378] Linking target lib/librte_telemetry.so.24.1
00:04:39.761 [160/378] Compiling C object drivers/libtmp_rte_bus_auxiliary.a.p/bus_auxiliary_auxiliary_common.c.o
00:04:39.761 [161/378] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_tm.c.o
00:04:39.761 [162/378] Generating lib/meter.sym_chk with a custom command (wrapped by meson to capture output)
00:04:39.761 [163/378] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev_telemetry.c.o
00:04:39.761 [164/378] Compiling C object
lib/librte_power.a.p/power_power_pstate_cpufreq.c.o 00:04:39.761 [165/378] Linking static target drivers/libtmp_rte_bus_auxiliary.a 00:04:39.761 [166/378] Compiling C object lib/librte_cryptodev.a.p/cryptodev_cryptodev_trace_points.c.o 00:04:39.761 [167/378] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_comp.c.o 00:04:39.761 [168/378] Compiling C object lib/librte_dmadev.a.p/dmadev_rte_dmadev.c.o 00:04:39.761 [169/378] Linking static target lib/librte_compressdev.a 00:04:39.761 [170/378] Linking static target lib/librte_dmadev.a 00:04:39.761 [171/378] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_common_uio.c.o 00:04:39.761 [172/378] Generating lib/net.sym_chk with a custom command (wrapped by meson to capture output) 00:04:39.762 [173/378] Compiling C object lib/librte_vhost.a.p/vhost_vdpa.c.o 00:04:39.762 [174/378] Compiling C object lib/librte_vhost.a.p/vhost_iotlb.c.o 00:04:39.762 [175/378] Generating symbol file lib/librte_kvargs.so.24.1.p/librte_kvargs.so.24.1.symbols 00:04:39.762 [176/378] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_common.c.o 00:04:40.020 [177/378] Compiling C object lib/librte_vhost.a.p/vhost_socket.c.o 00:04:40.020 [178/378] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci.c.o 00:04:40.020 [179/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal.c.o 00:04:40.020 [180/378] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci_uio.c.o 00:04:40.020 [181/378] Compiling C object lib/librte_power.a.p/power_rte_power_pmd_mgmt.c.o 00:04:40.020 [182/378] Compiling C object drivers/libtmp_rte_bus_vdev.a.p/bus_vdev_vdev.c.o 00:04:40.020 [183/378] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_mtr.c.o 00:04:40.020 [184/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_linux_mlx5_glue.c.o 00:04:40.020 [185/378] Linking static target lib/librte_power.a 00:04:40.020 [186/378] Compiling C object lib/librte_reorder.a.p/reorder_rte_reorder.c.o 
00:04:40.020 [187/378] Linking static target drivers/libtmp_rte_bus_vdev.a 00:04:40.020 [188/378] Compiling C object lib/librte_vhost.a.p/vhost_virtio_net_ctrl.c.o 00:04:40.020 [189/378] Linking static target lib/librte_eal.a 00:04:40.020 [190/378] Linking static target lib/librte_reorder.a 00:04:40.020 [191/378] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_trace_points.c.o 00:04:40.020 [192/378] Generating symbol file lib/librte_telemetry.so.24.1.p/librte_telemetry.so.24.1.symbols 00:04:40.020 [193/378] Compiling C object lib/librte_security.a.p/security_rte_security.c.o 00:04:40.020 [194/378] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci_vfio.c.o 00:04:40.020 [195/378] Compiling C object lib/librte_vhost.a.p/vhost_vduse.c.o 00:04:40.020 [196/378] Linking static target lib/librte_security.a 00:04:40.020 [197/378] Linking static target drivers/libtmp_rte_bus_pci.a 00:04:40.020 [198/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_malloc.c.o 00:04:40.280 [199/378] Generating drivers/rte_bus_auxiliary.pmd.c with a custom command 00:04:40.280 [200/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common_devx.c.o 00:04:40.280 [201/378] Compiling C object drivers/librte_bus_auxiliary.so.24.1.p/meson-generated_.._rte_bus_auxiliary.pmd.c.o 00:04:40.280 [202/378] Compiling C object drivers/librte_bus_auxiliary.a.p/meson-generated_.._rte_bus_auxiliary.pmd.c.o 00:04:40.280 [203/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common_mp.c.o 00:04:40.280 [204/378] Linking static target drivers/librte_bus_auxiliary.a 00:04:40.280 [205/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_linux_mlx5_common_auxiliary.c.o 00:04:40.280 [206/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen2.c.o 00:04:40.280 [207/378] Compiling C object lib/librte_vhost.a.p/vhost_vhost_user.c.o 00:04:40.280 [208/378] Compiling C object 
lib/librte_vhost.a.p/vhost_vhost.c.o 00:04:40.280 [209/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen5.c.o 00:04:40.280 [210/378] Compiling C object lib/librte_hash.a.p/hash_rte_cuckoo_hash.c.o 00:04:40.280 [211/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_qat_common.c.o 00:04:40.280 [212/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common_pci.c.o 00:04:40.280 [213/378] Linking static target lib/librte_hash.a 00:04:40.280 [214/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_linux_mlx5_common_os.c.o 00:04:40.280 [215/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_dev_qat_comp_pmd_gen1.c.o 00:04:40.280 [216/378] Generating lib/timer.sym_chk with a custom command (wrapped by meson to capture output) 00:04:40.280 [217/378] Generating drivers/rte_bus_vdev.pmd.c with a custom command 00:04:40.280 [218/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen3.c.o 00:04:40.280 [219/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common.c.o 00:04:40.280 [220/378] Generating lib/mempool.sym_chk with a custom command (wrapped by meson to capture output) 00:04:40.280 [221/378] Compiling C object drivers/librte_bus_vdev.so.24.1.p/meson-generated_.._rte_bus_vdev.pmd.c.o 00:04:40.280 [222/378] Compiling C object drivers/librte_bus_vdev.a.p/meson-generated_.._rte_bus_vdev.pmd.c.o 00:04:40.280 [223/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_qat_pf2vf.c.o 00:04:40.280 [224/378] Linking static target drivers/librte_bus_vdev.a 00:04:40.539 [225/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen4.c.o 00:04:40.539 [226/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common_utils.c.o 00:04:40.539 [227/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_dev_qat_comp_pmd_gen3.c.o 00:04:40.539 
[228/378] Generating drivers/rte_bus_pci.pmd.c with a custom command 00:04:40.539 [229/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen1.c.o 00:04:40.539 [230/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_dev_qat_comp_pmd_gen2.c.o 00:04:40.539 [231/378] Compiling C object drivers/librte_bus_pci.so.24.1.p/meson-generated_.._rte_bus_pci.pmd.c.o 00:04:40.539 [232/378] Compiling C object drivers/librte_bus_pci.a.p/meson-generated_.._rte_bus_pci.pmd.c.o 00:04:40.539 [233/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_linux_mlx5_common_verbs.c.o 00:04:40.539 [234/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen_lce.c.o 00:04:40.539 [235/378] Linking static target drivers/librte_bus_pci.a 00:04:40.539 [236/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_qat_comp_pmd.c.o 00:04:40.539 [237/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_dev_qat_comp_pmd_gen4.c.o 00:04:40.539 [238/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_dev_qat_comp_pmd_gen5.c.o 00:04:40.539 [239/378] Generating lib/reorder.sym_chk with a custom command (wrapped by meson to capture output) 00:04:40.539 [240/378] Generating lib/mbuf.sym_chk with a custom command (wrapped by meson to capture output) 00:04:40.539 [241/378] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_ipsec_mb_private.c.o 00:04:40.539 [242/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_qat_sym.c.o 00:04:40.539 [243/378] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_flow.c.o 00:04:40.539 [244/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_asym_pmd_gen1.c.o 00:04:40.539 [245/378] Compiling C object drivers/libtmp_rte_mempool_ring.a.p/mempool_ring_rte_mempool_ring.c.o 00:04:40.539 [246/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_qat_crypto.c.o 
00:04:40.539 [247/378] Linking static target drivers/libtmp_rte_mempool_ring.a 00:04:40.539 [248/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_crypto_pmd_gen2.c.o 00:04:40.539 [249/378] Generating drivers/rte_bus_auxiliary.sym_chk with a custom command (wrapped by meson to capture output) 00:04:40.539 [250/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_linux_mlx5_nl.c.o 00:04:40.539 [251/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common_mr.c.o 00:04:40.539 [252/378] Compiling C object lib/librte_cryptodev.a.p/cryptodev_rte_cryptodev.c.o 00:04:40.539 [253/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_qat_qp.c.o 00:04:40.539 [254/378] Generating lib/dmadev.sym_chk with a custom command (wrapped by meson to capture output) 00:04:40.539 [255/378] Generating lib/compressdev.sym_chk with a custom command (wrapped by meson to capture output) 00:04:40.539 [256/378] Linking static target lib/librte_cryptodev.a 00:04:40.539 [257/378] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_ipsec_mb_ops.c.o 00:04:40.539 [258/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_crypto_pmd_gen5.c.o 00:04:40.539 [259/378] Generating lib/security.sym_chk with a custom command (wrapped by meson to capture output) 00:04:40.539 [260/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_crypto_pmd_gen_lce.c.o 00:04:40.797 [261/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_qat_comp.c.o 00:04:40.797 [262/378] Compiling C object drivers/libtmp_rte_compress_isal.a.p/compress_isal_isal_compress_pmd_ops.c.o 00:04:40.797 [263/378] Generating drivers/rte_bus_vdev.sym_chk with a custom command (wrapped by meson to capture output) 00:04:40.797 [264/378] Generating drivers/rte_mempool_ring.pmd.c with a custom command 00:04:40.797 [265/378] Compiling C object 
drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_crypto_pmd_gen4.c.o 00:04:40.797 [266/378] Compiling C object drivers/libtmp_rte_compress_mlx5.a.p/compress_mlx5_mlx5_compress.c.o 00:04:40.797 [267/378] Compiling C object drivers/librte_mempool_ring.a.p/meson-generated_.._rte_mempool_ring.pmd.c.o 00:04:40.797 [268/378] Compiling C object drivers/librte_mempool_ring.so.24.1.p/meson-generated_.._rte_mempool_ring.pmd.c.o 00:04:40.797 [269/378] Linking static target drivers/libtmp_rte_compress_mlx5.a 00:04:40.797 [270/378] Linking static target drivers/librte_mempool_ring.a 00:04:40.797 [271/378] Compiling C object drivers/libtmp_rte_crypto_mlx5.a.p/crypto_mlx5_mlx5_crypto_dek.c.o 00:04:40.797 [272/378] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_aesni_gcm.c.o 00:04:40.798 [273/378] Generating lib/power.sym_chk with a custom command (wrapped by meson to capture output) 00:04:40.798 [274/378] Compiling C object drivers/libtmp_rte_crypto_mlx5.a.p/crypto_mlx5_mlx5_crypto.c.o 00:04:40.798 [275/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_qat_sym_session.c.o 00:04:40.798 [276/378] Compiling C object drivers/libtmp_rte_crypto_mlx5.a.p/crypto_mlx5_mlx5_crypto_xts.c.o 00:04:40.798 [277/378] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_kasumi.c.o 00:04:41.056 [278/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_devx_cmds.c.o 00:04:41.056 [279/378] Linking static target drivers/libtmp_rte_common_mlx5.a 00:04:41.056 [280/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_qat_device.c.o 00:04:41.056 [281/378] Compiling C object drivers/libtmp_rte_compress_isal.a.p/compress_isal_isal_compress_pmd.c.o 00:04:41.056 [282/378] Generating lib/cmdline.sym_chk with a custom command (wrapped by meson to capture output) 00:04:41.056 [283/378] Linking static target drivers/libtmp_rte_compress_isal.a 00:04:41.056 [284/378] Compiling C object 
drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_aesni_mb.c.o 00:04:41.056 [285/378] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_chacha_poly.c.o 00:04:41.056 [286/378] Compiling C object drivers/libtmp_rte_crypto_mlx5.a.p/crypto_mlx5_mlx5_crypto_gcm.c.o 00:04:41.056 [287/378] Generating drivers/rte_compress_mlx5.pmd.c with a custom command 00:04:41.056 [288/378] Linking static target drivers/libtmp_rte_crypto_mlx5.a 00:04:41.056 [289/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_sym_pmd_gen1.c.o 00:04:41.056 [290/378] Compiling C object drivers/librte_compress_mlx5.so.24.1.p/meson-generated_.._rte_compress_mlx5.pmd.c.o 00:04:41.056 [291/378] Compiling C object drivers/librte_compress_mlx5.a.p/meson-generated_.._rte_compress_mlx5.pmd.c.o 00:04:41.056 [292/378] Linking static target drivers/librte_compress_mlx5.a 00:04:41.056 [293/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_crypto_pmd_gen3.c.o 00:04:41.056 [294/378] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_snow3g.c.o 00:04:41.056 [295/378] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev.c.o 00:04:41.316 [296/378] Generating drivers/rte_compress_isal.pmd.c with a custom command 00:04:41.316 [297/378] Linking static target lib/librte_ethdev.a 00:04:41.316 [298/378] Compiling C object drivers/librte_compress_isal.a.p/meson-generated_.._rte_compress_isal.pmd.c.o 00:04:41.316 [299/378] Generating drivers/rte_common_mlx5.pmd.c with a custom command 00:04:41.316 [300/378] Compiling C object drivers/librte_compress_isal.so.24.1.p/meson-generated_.._rte_compress_isal.pmd.c.o 00:04:41.316 [301/378] Linking static target drivers/librte_compress_isal.a 00:04:41.316 [302/378] Compiling C object drivers/librte_common_mlx5.so.24.1.p/meson-generated_.._rte_common_mlx5.pmd.c.o 00:04:41.316 [303/378] Generating lib/hash.sym_chk with a custom command (wrapped by meson to capture 
output) 00:04:41.316 [304/378] Compiling C object drivers/librte_common_mlx5.a.p/meson-generated_.._rte_common_mlx5.pmd.c.o 00:04:41.316 [305/378] Generating drivers/rte_crypto_mlx5.pmd.c with a custom command 00:04:41.316 [306/378] Linking static target drivers/librte_common_mlx5.a 00:04:41.316 [307/378] Compiling C object lib/librte_vhost.a.p/vhost_vhost_crypto.c.o 00:04:41.316 [308/378] Generating drivers/rte_bus_pci.sym_chk with a custom command (wrapped by meson to capture output) 00:04:41.316 [309/378] Compiling C object drivers/librte_crypto_mlx5.a.p/meson-generated_.._rte_crypto_mlx5.pmd.c.o 00:04:41.316 [310/378] Compiling C object drivers/librte_crypto_mlx5.so.24.1.p/meson-generated_.._rte_crypto_mlx5.pmd.c.o 00:04:41.316 [311/378] Linking static target drivers/librte_crypto_mlx5.a 00:04:41.884 [312/378] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_zuc.c.o 00:04:41.884 [313/378] Linking static target drivers/libtmp_rte_crypto_ipsec_mb.a 00:04:41.884 [314/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_qat_asym.c.o 00:04:41.884 [315/378] Linking static target drivers/libtmp_rte_common_qat.a 00:04:41.884 [316/378] Generating drivers/rte_crypto_ipsec_mb.pmd.c with a custom command 00:04:42.143 [317/378] Compiling C object drivers/librte_crypto_ipsec_mb.a.p/meson-generated_.._rte_crypto_ipsec_mb.pmd.c.o 00:04:42.143 [318/378] Compiling C object drivers/librte_crypto_ipsec_mb.so.24.1.p/meson-generated_.._rte_crypto_ipsec_mb.pmd.c.o 00:04:42.143 [319/378] Linking static target drivers/librte_crypto_ipsec_mb.a 00:04:42.143 [320/378] Generating drivers/rte_common_qat.pmd.c with a custom command 00:04:42.143 [321/378] Compiling C object drivers/librte_common_qat.a.p/meson-generated_.._rte_common_qat.pmd.c.o 00:04:42.143 [322/378] Compiling C object drivers/librte_common_qat.so.24.1.p/meson-generated_.._rte_common_qat.pmd.c.o 00:04:42.402 [323/378] Linking static target drivers/librte_common_qat.a 00:04:42.402 
[324/378] Compiling C object lib/librte_vhost.a.p/vhost_virtio_net.c.o 00:04:42.660 [325/378] Linking static target lib/librte_vhost.a 00:04:42.661 [326/378] Generating lib/cryptodev.sym_chk with a custom command (wrapped by meson to capture output) 00:04:45.197 [327/378] Generating lib/vhost.sym_chk with a custom command (wrapped by meson to capture output) 00:04:47.099 [328/378] Generating drivers/rte_common_mlx5.sym_chk with a custom command (wrapped by meson to capture output) 00:04:51.293 [329/378] Generating lib/ethdev.sym_chk with a custom command (wrapped by meson to capture output) 00:04:53.195 [330/378] Generating lib/eal.sym_chk with a custom command (wrapped by meson to capture output) 00:04:53.195 [331/378] Linking target lib/librte_eal.so.24.1 00:04:53.195 [332/378] Generating symbol file lib/librte_eal.so.24.1.p/librte_eal.so.24.1.symbols 00:04:53.195 [333/378] Linking target lib/librte_meter.so.24.1 00:04:53.195 [334/378] Linking target lib/librte_ring.so.24.1 00:04:53.195 [335/378] Linking target lib/librte_pci.so.24.1 00:04:53.195 [336/378] Linking target lib/librte_timer.so.24.1 00:04:53.195 [337/378] Linking target drivers/librte_bus_vdev.so.24.1 00:04:53.195 [338/378] Linking target lib/librte_dmadev.so.24.1 00:04:53.195 [339/378] Linking target drivers/librte_bus_auxiliary.so.24.1 00:04:53.454 [340/378] Generating symbol file drivers/librte_bus_vdev.so.24.1.p/librte_bus_vdev.so.24.1.symbols 00:04:53.454 [341/378] Generating symbol file lib/librte_ring.so.24.1.p/librte_ring.so.24.1.symbols 00:04:53.454 [342/378] Generating symbol file lib/librte_meter.so.24.1.p/librte_meter.so.24.1.symbols 00:04:53.454 [343/378] Generating symbol file lib/librte_timer.so.24.1.p/librte_timer.so.24.1.symbols 00:04:53.454 [344/378] Generating symbol file lib/librte_dmadev.so.24.1.p/librte_dmadev.so.24.1.symbols 00:04:53.454 [345/378] Generating symbol file lib/librte_pci.so.24.1.p/librte_pci.so.24.1.symbols 00:04:53.454 [346/378] Generating symbol file 
drivers/librte_bus_auxiliary.so.24.1.p/librte_bus_auxiliary.so.24.1.symbols 00:04:53.454 [347/378] Linking target lib/librte_rcu.so.24.1 00:04:53.454 [348/378] Linking target lib/librte_mempool.so.24.1 00:04:53.454 [349/378] Linking target drivers/librte_bus_pci.so.24.1 00:04:53.454 [350/378] Generating symbol file lib/librte_rcu.so.24.1.p/librte_rcu.so.24.1.symbols 00:04:53.454 [351/378] Generating symbol file lib/librte_mempool.so.24.1.p/librte_mempool.so.24.1.symbols 00:04:53.454 [352/378] Generating symbol file drivers/librte_bus_pci.so.24.1.p/librte_bus_pci.so.24.1.symbols 00:04:53.731 [353/378] Linking target lib/librte_mbuf.so.24.1 00:04:53.731 [354/378] Linking target drivers/librte_mempool_ring.so.24.1 00:04:53.731 [355/378] Generating symbol file lib/librte_mbuf.so.24.1.p/librte_mbuf.so.24.1.symbols 00:04:53.731 [356/378] Linking target lib/librte_net.so.24.1 00:04:53.731 [357/378] Linking target lib/librte_compressdev.so.24.1 00:04:53.731 [358/378] Linking target lib/librte_reorder.so.24.1 00:04:53.731 [359/378] Linking target lib/librte_cryptodev.so.24.1 00:04:53.993 [360/378] Generating symbol file lib/librte_compressdev.so.24.1.p/librte_compressdev.so.24.1.symbols 00:04:53.993 [361/378] Generating symbol file lib/librte_net.so.24.1.p/librte_net.so.24.1.symbols 00:04:53.993 [362/378] Generating symbol file lib/librte_cryptodev.so.24.1.p/librte_cryptodev.so.24.1.symbols 00:04:53.993 [363/378] Linking target drivers/librte_compress_isal.so.24.1 00:04:53.993 [364/378] Linking target lib/librte_hash.so.24.1 00:04:53.993 [365/378] Linking target lib/librte_security.so.24.1 00:04:53.993 [366/378] Linking target lib/librte_cmdline.so.24.1 00:04:53.993 [367/378] Linking target lib/librte_ethdev.so.24.1 00:04:54.251 [368/378] Generating symbol file lib/librte_security.so.24.1.p/librte_security.so.24.1.symbols 00:04:54.251 [369/378] Generating symbol file lib/librte_ethdev.so.24.1.p/librte_ethdev.so.24.1.symbols 00:04:54.251 [370/378] Linking target 
lib/librte_power.so.24.1 00:04:54.251 [371/378] Generating symbol file lib/librte_hash.so.24.1.p/librte_hash.so.24.1.symbols 00:04:54.251 [372/378] Linking target drivers/librte_common_mlx5.so.24.1 00:04:54.251 [373/378] Linking target lib/librte_vhost.so.24.1 00:04:54.510 [374/378] Linking target drivers/librte_common_qat.so.24.1 00:04:54.510 [375/378] Linking target drivers/librte_crypto_ipsec_mb.so.24.1 00:04:54.510 [376/378] Generating symbol file drivers/librte_common_mlx5.so.24.1.p/librte_common_mlx5.so.24.1.symbols 00:04:54.769 [377/378] Linking target drivers/librte_compress_mlx5.so.24.1 00:04:54.769 [378/378] Linking target drivers/librte_crypto_mlx5.so.24.1 00:04:54.769 INFO: autodetecting backend as ninja 00:04:54.769 INFO: calculating backend command to run: /usr/local/bin/ninja -C /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build-tmp -j 72 00:04:56.157 CC lib/ut/ut.o 00:04:56.157 CC lib/log/log.o 00:04:56.157 CC lib/log/log_flags.o 00:04:56.157 CC lib/log/log_deprecated.o 00:04:56.157 CC lib/ut_mock/mock.o 00:04:56.157 LIB libspdk_ut.a 00:04:56.157 SO libspdk_ut.so.2.0 00:04:56.157 LIB libspdk_ut_mock.a 00:04:56.158 SO libspdk_ut_mock.so.6.0 00:04:56.158 SYMLINK libspdk_ut.so 00:04:56.158 SYMLINK libspdk_ut_mock.so 00:04:56.158 LIB libspdk_log.a 00:04:56.415 SO libspdk_log.so.7.0 00:04:56.415 SYMLINK libspdk_log.so 00:04:56.674 CC lib/ioat/ioat.o 00:04:56.674 CC lib/util/bit_array.o 00:04:56.674 CC lib/util/base64.o 00:04:56.674 CC lib/util/cpuset.o 00:04:56.674 CC lib/util/crc16.o 00:04:56.674 CC lib/util/crc32.o 00:04:56.674 CC lib/util/crc32c.o 00:04:56.674 CC lib/util/crc32_ieee.o 00:04:56.674 CC lib/util/crc64.o 00:04:56.674 CC lib/util/dif.o 00:04:56.674 CC lib/dma/dma.o 00:04:56.674 CC lib/util/fd.o 00:04:56.674 CC lib/util/file.o 00:04:56.674 CC lib/util/fd_group.o 00:04:56.674 CC lib/util/iov.o 00:04:56.674 CC lib/util/hexlify.o 00:04:56.674 CC lib/util/math.o 00:04:56.674 CC lib/util/net.o 00:04:56.674 CC lib/util/pipe.o 00:04:56.674 
CC lib/util/strerror_tls.o 00:04:56.674 CXX lib/trace_parser/trace.o 00:04:56.674 CC lib/util/string.o 00:04:56.674 CC lib/util/uuid.o 00:04:56.674 CC lib/util/xor.o 00:04:56.674 CC lib/util/zipf.o 00:04:56.936 CC lib/vfio_user/host/vfio_user.o 00:04:56.936 CC lib/vfio_user/host/vfio_user_pci.o 00:04:56.936 LIB libspdk_dma.a 00:04:56.936 SO libspdk_dma.so.4.0 00:04:57.251 SYMLINK libspdk_dma.so 00:04:57.251 LIB libspdk_util.a 00:04:57.251 LIB libspdk_vfio_user.a 00:04:57.251 SO libspdk_vfio_user.so.5.0 00:04:57.251 SO libspdk_util.so.10.0 00:04:57.251 LIB libspdk_ioat.a 00:04:57.251 SYMLINK libspdk_vfio_user.so 00:04:57.251 SO libspdk_ioat.so.7.0 00:04:57.510 SYMLINK libspdk_ioat.so 00:04:57.510 SYMLINK libspdk_util.so 00:04:57.770 LIB libspdk_trace_parser.a 00:04:57.770 SO libspdk_trace_parser.so.5.0 00:04:57.770 SYMLINK libspdk_trace_parser.so 00:04:58.028 CC lib/conf/conf.o 00:04:58.028 CC lib/vmd/vmd.o 00:04:58.028 CC lib/vmd/led.o 00:04:58.028 CC lib/rdma_utils/rdma_utils.o 00:04:58.028 CC lib/reduce/reduce.o 00:04:58.028 CC lib/env_dpdk/env.o 00:04:58.028 CC lib/json/json_parse.o 00:04:58.028 CC lib/env_dpdk/memory.o 00:04:58.028 CC lib/rdma_provider/common.o 00:04:58.028 CC lib/json/json_util.o 00:04:58.028 CC lib/env_dpdk/pci.o 00:04:58.028 CC lib/rdma_provider/rdma_provider_verbs.o 00:04:58.028 CC lib/json/json_write.o 00:04:58.028 CC lib/env_dpdk/init.o 00:04:58.028 CC lib/env_dpdk/threads.o 00:04:58.028 CC lib/env_dpdk/pci_ioat.o 00:04:58.028 CC lib/env_dpdk/pci_vmd.o 00:04:58.028 CC lib/env_dpdk/pci_virtio.o 00:04:58.028 CC lib/env_dpdk/pci_idxd.o 00:04:58.028 CC lib/env_dpdk/pci_event.o 00:04:58.028 CC lib/env_dpdk/sigbus_handler.o 00:04:58.028 CC lib/env_dpdk/pci_dpdk.o 00:04:58.028 CC lib/env_dpdk/pci_dpdk_2207.o 00:04:58.028 CC lib/env_dpdk/pci_dpdk_2211.o 00:04:58.028 CC lib/idxd/idxd.o 00:04:58.028 CC lib/idxd/idxd_user.o 00:04:58.028 CC lib/idxd/idxd_kernel.o 00:04:58.287 LIB libspdk_conf.a 00:04:58.287 LIB libspdk_rdma_provider.a 00:04:58.287 SO 
libspdk_conf.so.6.0 00:04:58.287 SO libspdk_rdma_provider.so.6.0 00:04:58.287 LIB libspdk_json.a 00:04:58.287 LIB libspdk_rdma_utils.a 00:04:58.287 SYMLINK libspdk_conf.so 00:04:58.287 SO libspdk_rdma_utils.so.1.0 00:04:58.287 SO libspdk_json.so.6.0 00:04:58.287 SYMLINK libspdk_rdma_provider.so 00:04:58.545 SYMLINK libspdk_rdma_utils.so 00:04:58.545 SYMLINK libspdk_json.so 00:04:58.545 LIB libspdk_idxd.a 00:04:58.545 LIB libspdk_vmd.a 00:04:58.545 SO libspdk_idxd.so.12.0 00:04:58.802 LIB libspdk_reduce.a 00:04:58.802 SO libspdk_vmd.so.6.0 00:04:58.802 SYMLINK libspdk_idxd.so 00:04:58.802 SO libspdk_reduce.so.6.1 00:04:58.802 CC lib/jsonrpc/jsonrpc_server_tcp.o 00:04:58.802 CC lib/jsonrpc/jsonrpc_server.o 00:04:58.802 CC lib/jsonrpc/jsonrpc_client_tcp.o 00:04:58.802 CC lib/jsonrpc/jsonrpc_client.o 00:04:58.802 SYMLINK libspdk_vmd.so 00:04:58.802 SYMLINK libspdk_reduce.so 00:04:59.059 LIB libspdk_jsonrpc.a 00:04:59.059 SO libspdk_jsonrpc.so.6.0 00:04:59.059 SYMLINK libspdk_jsonrpc.so 00:04:59.317 LIB libspdk_env_dpdk.a 00:04:59.575 SO libspdk_env_dpdk.so.15.0 00:04:59.575 CC lib/rpc/rpc.o 00:04:59.575 SYMLINK libspdk_env_dpdk.so 00:04:59.834 LIB libspdk_rpc.a 00:04:59.834 SO libspdk_rpc.so.6.0 00:04:59.834 SYMLINK libspdk_rpc.so 00:05:00.092 CC lib/trace/trace.o 00:05:00.092 CC lib/trace/trace_flags.o 00:05:00.092 CC lib/notify/notify.o 00:05:00.092 CC lib/trace/trace_rpc.o 00:05:00.092 CC lib/keyring/keyring.o 00:05:00.092 CC lib/notify/notify_rpc.o 00:05:00.092 CC lib/keyring/keyring_rpc.o 00:05:00.351 LIB libspdk_notify.a 00:05:00.351 SO libspdk_notify.so.6.0 00:05:00.351 LIB libspdk_keyring.a 00:05:00.351 LIB libspdk_trace.a 00:05:00.351 SO libspdk_keyring.so.1.0 00:05:00.610 SO libspdk_trace.so.10.0 00:05:00.610 SYMLINK libspdk_notify.so 00:05:00.610 SYMLINK libspdk_keyring.so 00:05:00.610 SYMLINK libspdk_trace.so 00:05:00.869 CC lib/sock/sock.o 00:05:00.869 CC lib/sock/sock_rpc.o 00:05:00.869 CC lib/thread/thread.o 00:05:00.869 CC lib/thread/iobuf.o 
00:05:01.436 LIB libspdk_sock.a 00:05:01.436 SO libspdk_sock.so.10.0 00:05:01.436 SYMLINK libspdk_sock.so 00:05:01.694 CC lib/nvme/nvme_ctrlr_cmd.o 00:05:01.694 CC lib/nvme/nvme_ctrlr.o 00:05:01.694 CC lib/nvme/nvme_fabric.o 00:05:01.694 CC lib/nvme/nvme_ns_cmd.o 00:05:01.694 CC lib/nvme/nvme_ns.o 00:05:01.694 CC lib/nvme/nvme_qpair.o 00:05:01.694 CC lib/nvme/nvme_pcie_common.o 00:05:01.694 CC lib/nvme/nvme_pcie.o 00:05:01.694 CC lib/nvme/nvme.o 00:05:01.694 CC lib/nvme/nvme_ctrlr_ocssd_cmd.o 00:05:01.694 CC lib/nvme/nvme_quirks.o 00:05:01.694 CC lib/nvme/nvme_transport.o 00:05:01.694 CC lib/nvme/nvme_discovery.o 00:05:01.954 CC lib/nvme/nvme_ns_ocssd_cmd.o 00:05:01.954 CC lib/nvme/nvme_tcp.o 00:05:01.954 CC lib/nvme/nvme_opal.o 00:05:01.954 CC lib/nvme/nvme_io_msg.o 00:05:01.954 CC lib/nvme/nvme_poll_group.o 00:05:01.954 CC lib/nvme/nvme_zns.o 00:05:01.954 CC lib/nvme/nvme_stubs.o 00:05:01.954 CC lib/nvme/nvme_auth.o 00:05:01.954 CC lib/nvme/nvme_cuse.o 00:05:01.954 CC lib/nvme/nvme_rdma.o 00:05:01.954 LIB libspdk_thread.a 00:05:01.954 SO libspdk_thread.so.10.1 00:05:02.213 SYMLINK libspdk_thread.so 00:05:02.473 CC lib/virtio/virtio_vhost_user.o 00:05:02.473 CC lib/virtio/virtio_vfio_user.o 00:05:02.473 CC lib/virtio/virtio.o 00:05:02.473 CC lib/virtio/virtio_pci.o 00:05:02.473 CC lib/init/json_config.o 00:05:02.473 CC lib/init/subsystem.o 00:05:02.473 CC lib/init/subsystem_rpc.o 00:05:02.473 CC lib/blob/request.o 00:05:02.473 CC lib/init/rpc.o 00:05:02.473 CC lib/blob/blobstore.o 00:05:02.473 CC lib/blob/zeroes.o 00:05:02.473 CC lib/blob/blob_bs_dev.o 00:05:02.473 CC lib/accel/accel.o 00:05:02.473 CC lib/accel/accel_sw.o 00:05:02.473 CC lib/accel/accel_rpc.o 00:05:02.731 LIB libspdk_init.a 00:05:02.731 SO libspdk_init.so.5.0 00:05:02.731 LIB libspdk_virtio.a 00:05:02.731 SO libspdk_virtio.so.7.0 00:05:02.991 SYMLINK libspdk_init.so 00:05:02.991 SYMLINK libspdk_virtio.so 00:05:03.251 CC lib/event/reactor.o 00:05:03.251 CC lib/event/app.o 00:05:03.251 CC 
lib/event/log_rpc.o 00:05:03.251 CC lib/event/app_rpc.o 00:05:03.251 CC lib/event/scheduler_static.o 00:05:03.510 LIB libspdk_accel.a 00:05:03.510 SO libspdk_accel.so.16.0 00:05:03.510 SYMLINK libspdk_accel.so 00:05:03.770 LIB libspdk_event.a 00:05:03.770 SO libspdk_event.so.14.0 00:05:03.770 SYMLINK libspdk_event.so 00:05:04.029 CC lib/bdev/bdev.o 00:05:04.029 CC lib/bdev/part.o 00:05:04.029 CC lib/bdev/bdev_rpc.o 00:05:04.029 CC lib/bdev/bdev_zone.o 00:05:04.029 CC lib/bdev/scsi_nvme.o 00:05:04.968 LIB libspdk_blob.a 00:05:04.968 SO libspdk_blob.so.11.0 00:05:04.968 SYMLINK libspdk_blob.so 00:05:05.227 LIB libspdk_nvme.a 00:05:05.227 CC lib/blobfs/blobfs.o 00:05:05.227 CC lib/blobfs/tree.o 00:05:05.227 SO libspdk_nvme.so.13.1 00:05:05.227 CC lib/lvol/lvol.o 00:05:05.487 SYMLINK libspdk_nvme.so 00:05:06.056 LIB libspdk_blobfs.a 00:05:06.056 SO libspdk_blobfs.so.10.0 00:05:06.315 LIB libspdk_lvol.a 00:05:06.315 SYMLINK libspdk_blobfs.so 00:05:06.315 SO libspdk_lvol.so.10.0 00:05:06.315 SYMLINK libspdk_lvol.so 00:05:08.225 LIB libspdk_bdev.a 00:05:08.225 SO libspdk_bdev.so.16.0 00:05:08.484 SYMLINK libspdk_bdev.so 00:05:08.754 CC lib/scsi/dev.o 00:05:08.754 CC lib/scsi/lun.o 00:05:08.754 CC lib/scsi/scsi.o 00:05:08.754 CC lib/scsi/port.o 00:05:08.754 CC lib/scsi/scsi_pr.o 00:05:08.754 CC lib/scsi/scsi_bdev.o 00:05:08.754 CC lib/scsi/task.o 00:05:08.754 CC lib/scsi/scsi_rpc.o 00:05:08.754 CC lib/nvmf/ctrlr_discovery.o 00:05:08.754 CC lib/nvmf/ctrlr.o 00:05:08.754 CC lib/nbd/nbd_rpc.o 00:05:08.754 CC lib/nvmf/ctrlr_bdev.o 00:05:08.754 CC lib/nbd/nbd.o 00:05:08.754 CC lib/nvmf/subsystem.o 00:05:08.754 CC lib/nvmf/nvmf.o 00:05:08.754 CC lib/ftl/ftl_core.o 00:05:08.754 CC lib/nvmf/nvmf_rpc.o 00:05:08.754 CC lib/ftl/ftl_init.o 00:05:08.754 CC lib/nvmf/transport.o 00:05:08.754 CC lib/ublk/ublk.o 00:05:08.754 CC lib/ftl/ftl_layout.o 00:05:08.754 CC lib/ublk/ublk_rpc.o 00:05:08.754 CC lib/nvmf/tcp.o 00:05:08.754 CC lib/ftl/ftl_debug.o 00:05:08.754 CC lib/nvmf/stubs.o 
00:05:08.754 CC lib/ftl/ftl_io.o 00:05:08.754 CC lib/ftl/ftl_l2p.o 00:05:08.754 CC lib/nvmf/mdns_server.o 00:05:08.754 CC lib/ftl/ftl_sb.o 00:05:08.754 CC lib/nvmf/rdma.o 00:05:08.754 CC lib/ftl/ftl_l2p_flat.o 00:05:08.754 CC lib/nvmf/auth.o 00:05:08.754 CC lib/ftl/ftl_nv_cache.o 00:05:08.754 CC lib/ftl/ftl_band.o 00:05:08.754 CC lib/ftl/ftl_band_ops.o 00:05:08.754 CC lib/ftl/ftl_writer.o 00:05:08.754 CC lib/ftl/ftl_rq.o 00:05:08.754 CC lib/ftl/ftl_reloc.o 00:05:08.754 CC lib/ftl/ftl_p2l.o 00:05:08.754 CC lib/ftl/ftl_l2p_cache.o 00:05:08.754 CC lib/ftl/mngt/ftl_mngt.o 00:05:08.754 CC lib/ftl/mngt/ftl_mngt_bdev.o 00:05:08.754 CC lib/ftl/mngt/ftl_mngt_startup.o 00:05:08.754 CC lib/ftl/mngt/ftl_mngt_shutdown.o 00:05:08.754 CC lib/ftl/mngt/ftl_mngt_md.o 00:05:08.754 CC lib/ftl/mngt/ftl_mngt_misc.o 00:05:08.754 CC lib/ftl/mngt/ftl_mngt_ioch.o 00:05:08.754 CC lib/ftl/mngt/ftl_mngt_l2p.o 00:05:08.754 CC lib/ftl/mngt/ftl_mngt_band.o 00:05:08.754 CC lib/ftl/mngt/ftl_mngt_self_test.o 00:05:08.754 CC lib/ftl/mngt/ftl_mngt_p2l.o 00:05:08.754 CC lib/ftl/mngt/ftl_mngt_upgrade.o 00:05:08.754 CC lib/ftl/mngt/ftl_mngt_recovery.o 00:05:08.754 CC lib/ftl/utils/ftl_md.o 00:05:08.754 CC lib/ftl/utils/ftl_conf.o 00:05:08.754 CC lib/ftl/utils/ftl_bitmap.o 00:05:08.754 CC lib/ftl/utils/ftl_mempool.o 00:05:08.754 CC lib/ftl/utils/ftl_property.o 00:05:08.754 CC lib/ftl/utils/ftl_layout_tracker_bdev.o 00:05:08.754 CC lib/ftl/upgrade/ftl_layout_upgrade.o 00:05:08.754 CC lib/ftl/upgrade/ftl_sb_upgrade.o 00:05:08.754 CC lib/ftl/upgrade/ftl_band_upgrade.o 00:05:08.754 CC lib/ftl/upgrade/ftl_p2l_upgrade.o 00:05:08.754 CC lib/ftl/upgrade/ftl_chunk_upgrade.o 00:05:08.754 CC lib/ftl/upgrade/ftl_sb_v5.o 00:05:08.754 CC lib/ftl/upgrade/ftl_trim_upgrade.o 00:05:08.754 CC lib/ftl/upgrade/ftl_sb_v3.o 00:05:08.754 CC lib/ftl/nvc/ftl_nvc_dev.o 00:05:08.754 CC lib/ftl/nvc/ftl_nvc_bdev_vss.o 00:05:08.754 CC lib/ftl/ftl_trace.o 00:05:08.754 CC lib/ftl/base/ftl_base_dev.o 00:05:08.754 CC 
lib/ftl/base/ftl_base_bdev.o 00:05:09.322 LIB libspdk_nbd.a 00:05:09.322 SO libspdk_nbd.so.7.0 00:05:09.322 LIB libspdk_scsi.a 00:05:09.580 SYMLINK libspdk_nbd.so 00:05:09.580 SO libspdk_scsi.so.9.0 00:05:09.580 LIB libspdk_ublk.a 00:05:09.580 SYMLINK libspdk_scsi.so 00:05:09.580 SO libspdk_ublk.so.3.0 00:05:09.840 SYMLINK libspdk_ublk.so 00:05:09.840 CC lib/vhost/vhost.o 00:05:09.840 CC lib/vhost/vhost_rpc.o 00:05:09.840 CC lib/iscsi/conn.o 00:05:09.840 CC lib/iscsi/md5.o 00:05:09.840 CC lib/iscsi/init_grp.o 00:05:09.840 CC lib/iscsi/iscsi.o 00:05:09.840 CC lib/vhost/vhost_scsi.o 00:05:09.840 CC lib/iscsi/param.o 00:05:09.840 CC lib/vhost/vhost_blk.o 00:05:09.840 CC lib/vhost/rte_vhost_user.o 00:05:09.840 CC lib/iscsi/portal_grp.o 00:05:09.840 CC lib/iscsi/tgt_node.o 00:05:09.840 CC lib/iscsi/iscsi_subsystem.o 00:05:10.098 CC lib/iscsi/iscsi_rpc.o 00:05:10.098 CC lib/iscsi/task.o 00:05:10.098 LIB libspdk_ftl.a 00:05:10.098 SO libspdk_ftl.so.9.0 00:05:10.356 SYMLINK libspdk_ftl.so 00:05:11.292 LIB libspdk_vhost.a 00:05:11.292 SO libspdk_vhost.so.8.0 00:05:11.292 SYMLINK libspdk_vhost.so 00:05:11.292 LIB libspdk_iscsi.a 00:05:11.292 SO libspdk_iscsi.so.8.0 00:05:11.550 SYMLINK libspdk_iscsi.so 00:05:12.486 LIB libspdk_nvmf.a 00:05:12.486 SO libspdk_nvmf.so.19.0 00:05:12.486 SYMLINK libspdk_nvmf.so 00:05:13.054 CC module/env_dpdk/env_dpdk_rpc.o 00:05:13.313 CC module/scheduler/dynamic/scheduler_dynamic.o 00:05:13.313 CC module/blob/bdev/blob_bdev.o 00:05:13.313 CC module/scheduler/dpdk_governor/dpdk_governor.o 00:05:13.313 CC module/keyring/linux/keyring.o 00:05:13.313 CC module/accel/ioat/accel_ioat_rpc.o 00:05:13.313 CC module/accel/ioat/accel_ioat.o 00:05:13.313 CC module/keyring/file/keyring.o 00:05:13.313 CC module/keyring/linux/keyring_rpc.o 00:05:13.313 CC module/keyring/file/keyring_rpc.o 00:05:13.313 CC module/accel/dpdk_cryptodev/accel_dpdk_cryptodev_rpc.o 00:05:13.313 CC module/accel/iaa/accel_iaa.o 00:05:13.313 CC 
module/accel/dpdk_cryptodev/accel_dpdk_cryptodev.o 00:05:13.313 CC module/accel/iaa/accel_iaa_rpc.o 00:05:13.313 LIB libspdk_env_dpdk_rpc.a 00:05:13.313 CC module/sock/posix/posix.o 00:05:13.313 CC module/accel/error/accel_error.o 00:05:13.313 CC module/accel/error/accel_error_rpc.o 00:05:13.313 CC module/scheduler/gscheduler/gscheduler.o 00:05:13.313 CC module/accel/dpdk_compressdev/accel_dpdk_compressdev.o 00:05:13.313 CC module/accel/dpdk_compressdev/accel_dpdk_compressdev_rpc.o 00:05:13.313 CC module/accel/dsa/accel_dsa_rpc.o 00:05:13.313 CC module/accel/dsa/accel_dsa.o 00:05:13.313 SO libspdk_env_dpdk_rpc.so.6.0 00:05:13.313 SYMLINK libspdk_env_dpdk_rpc.so 00:05:13.573 LIB libspdk_keyring_linux.a 00:05:13.573 LIB libspdk_keyring_file.a 00:05:13.573 LIB libspdk_scheduler_dpdk_governor.a 00:05:13.573 LIB libspdk_scheduler_gscheduler.a 00:05:13.573 LIB libspdk_scheduler_dynamic.a 00:05:13.573 SO libspdk_keyring_linux.so.1.0 00:05:13.573 SO libspdk_scheduler_dpdk_governor.so.4.0 00:05:13.573 SO libspdk_keyring_file.so.1.0 00:05:13.573 LIB libspdk_accel_ioat.a 00:05:13.573 SO libspdk_scheduler_gscheduler.so.4.0 00:05:13.573 LIB libspdk_accel_error.a 00:05:13.573 SO libspdk_scheduler_dynamic.so.4.0 00:05:13.573 LIB libspdk_accel_iaa.a 00:05:13.573 SO libspdk_accel_ioat.so.6.0 00:05:13.573 SO libspdk_accel_error.so.2.0 00:05:13.573 SYMLINK libspdk_keyring_linux.so 00:05:13.573 SYMLINK libspdk_scheduler_dpdk_governor.so 00:05:13.573 SO libspdk_accel_iaa.so.3.0 00:05:13.573 SYMLINK libspdk_keyring_file.so 00:05:13.573 SYMLINK libspdk_scheduler_dynamic.so 00:05:13.573 SYMLINK libspdk_scheduler_gscheduler.so 00:05:13.573 LIB libspdk_accel_dsa.a 00:05:13.573 SYMLINK libspdk_accel_ioat.so 00:05:13.573 SYMLINK libspdk_accel_error.so 00:05:13.573 SYMLINK libspdk_accel_iaa.so 00:05:13.573 SO libspdk_accel_dsa.so.5.0 00:05:13.832 SYMLINK libspdk_accel_dsa.so 00:05:13.832 LIB libspdk_blob_bdev.a 00:05:13.832 SO libspdk_blob_bdev.so.11.0 00:05:14.090 SYMLINK libspdk_blob_bdev.so 
00:05:14.090 LIB libspdk_sock_posix.a 00:05:14.090 SO libspdk_sock_posix.so.6.0 00:05:14.090 LIB libspdk_accel_dpdk_compressdev.a 00:05:14.090 SO libspdk_accel_dpdk_compressdev.so.3.0 00:05:14.090 SYMLINK libspdk_sock_posix.so 00:05:14.404 SYMLINK libspdk_accel_dpdk_compressdev.so 00:05:14.404 CC module/blobfs/bdev/blobfs_bdev_rpc.o 00:05:14.404 CC module/blobfs/bdev/blobfs_bdev.o 00:05:14.404 CC module/bdev/delay/vbdev_delay.o 00:05:14.404 CC module/bdev/delay/vbdev_delay_rpc.o 00:05:14.404 CC module/bdev/gpt/gpt.o 00:05:14.404 CC module/bdev/malloc/bdev_malloc_rpc.o 00:05:14.404 CC module/bdev/malloc/bdev_malloc.o 00:05:14.404 CC module/bdev/aio/bdev_aio.o 00:05:14.404 CC module/bdev/gpt/vbdev_gpt.o 00:05:14.404 CC module/bdev/ftl/bdev_ftl_rpc.o 00:05:14.404 CC module/bdev/zone_block/vbdev_zone_block.o 00:05:14.404 CC module/bdev/ftl/bdev_ftl.o 00:05:14.404 CC module/bdev/aio/bdev_aio_rpc.o 00:05:14.404 CC module/bdev/zone_block/vbdev_zone_block_rpc.o 00:05:14.404 CC module/bdev/null/bdev_null.o 00:05:14.404 CC module/bdev/null/bdev_null_rpc.o 00:05:14.404 CC module/bdev/crypto/vbdev_crypto.o 00:05:14.404 CC module/bdev/crypto/vbdev_crypto_rpc.o 00:05:14.404 CC module/bdev/iscsi/bdev_iscsi_rpc.o 00:05:14.404 CC module/bdev/iscsi/bdev_iscsi.o 00:05:14.404 CC module/bdev/passthru/vbdev_passthru.o 00:05:14.404 CC module/bdev/compress/vbdev_compress_rpc.o 00:05:14.404 CC module/bdev/compress/vbdev_compress.o 00:05:14.404 CC module/bdev/passthru/vbdev_passthru_rpc.o 00:05:14.404 CC module/bdev/split/vbdev_split.o 00:05:14.404 CC module/bdev/lvol/vbdev_lvol.o 00:05:14.404 CC module/bdev/split/vbdev_split_rpc.o 00:05:14.404 CC module/bdev/lvol/vbdev_lvol_rpc.o 00:05:14.404 CC module/bdev/nvme/bdev_nvme.o 00:05:14.404 CC module/bdev/nvme/bdev_nvme_rpc.o 00:05:14.404 CC module/bdev/nvme/nvme_rpc.o 00:05:14.404 CC module/bdev/error/vbdev_error.o 00:05:14.404 CC module/bdev/raid/bdev_raid.o 00:05:14.404 CC module/bdev/error/vbdev_error_rpc.o 00:05:14.404 CC 
module/bdev/raid/bdev_raid_rpc.o 00:05:14.404 CC module/bdev/nvme/bdev_mdns_client.o 00:05:14.404 CC module/bdev/nvme/vbdev_opal.o 00:05:14.404 CC module/bdev/raid/raid0.o 00:05:14.404 CC module/bdev/raid/bdev_raid_sb.o 00:05:14.405 CC module/bdev/raid/raid1.o 00:05:14.405 CC module/bdev/nvme/vbdev_opal_rpc.o 00:05:14.405 CC module/bdev/nvme/bdev_nvme_cuse_rpc.o 00:05:14.405 CC module/bdev/raid/concat.o 00:05:14.405 CC module/bdev/virtio/bdev_virtio_scsi.o 00:05:14.405 CC module/bdev/virtio/bdev_virtio_blk.o 00:05:14.405 CC module/bdev/virtio/bdev_virtio_rpc.o 00:05:14.684 LIB libspdk_blobfs_bdev.a 00:05:14.684 SO libspdk_blobfs_bdev.so.6.0 00:05:14.684 LIB libspdk_bdev_split.a 00:05:14.684 LIB libspdk_bdev_error.a 00:05:14.684 LIB libspdk_bdev_malloc.a 00:05:14.684 LIB libspdk_bdev_null.a 00:05:14.684 LIB libspdk_bdev_gpt.a 00:05:14.942 LIB libspdk_bdev_ftl.a 00:05:14.942 SO libspdk_bdev_error.so.6.0 00:05:14.942 SO libspdk_bdev_split.so.6.0 00:05:14.942 SO libspdk_bdev_malloc.so.6.0 00:05:14.942 SO libspdk_bdev_null.so.6.0 00:05:14.942 SO libspdk_bdev_gpt.so.6.0 00:05:14.942 LIB libspdk_accel_dpdk_cryptodev.a 00:05:14.942 LIB libspdk_bdev_passthru.a 00:05:14.942 SYMLINK libspdk_blobfs_bdev.so 00:05:14.942 SO libspdk_bdev_ftl.so.6.0 00:05:14.942 SO libspdk_bdev_passthru.so.6.0 00:05:14.942 LIB libspdk_bdev_aio.a 00:05:14.942 SO libspdk_accel_dpdk_cryptodev.so.3.0 00:05:14.942 SYMLINK libspdk_bdev_error.so 00:05:14.942 SYMLINK libspdk_bdev_null.so 00:05:14.942 SYMLINK libspdk_bdev_split.so 00:05:14.942 LIB libspdk_bdev_delay.a 00:05:14.942 SYMLINK libspdk_bdev_malloc.so 00:05:14.942 SYMLINK libspdk_bdev_gpt.so 00:05:14.942 SO libspdk_bdev_aio.so.6.0 00:05:14.942 SYMLINK libspdk_bdev_ftl.so 00:05:14.942 SYMLINK libspdk_bdev_passthru.so 00:05:14.942 LIB libspdk_bdev_zone_block.a 00:05:14.942 SO libspdk_bdev_delay.so.6.0 00:05:14.942 LIB libspdk_bdev_iscsi.a 00:05:14.942 SYMLINK libspdk_accel_dpdk_cryptodev.so 00:05:14.942 LIB libspdk_bdev_crypto.a 00:05:14.942 SO 
libspdk_bdev_zone_block.so.6.0 00:05:14.942 LIB libspdk_bdev_compress.a 00:05:14.942 SO libspdk_bdev_iscsi.so.6.0 00:05:14.942 SYMLINK libspdk_bdev_aio.so 00:05:14.942 SO libspdk_bdev_crypto.so.6.0 00:05:14.942 SYMLINK libspdk_bdev_delay.so 00:05:14.942 SO libspdk_bdev_compress.so.6.0 00:05:15.199 SYMLINK libspdk_bdev_zone_block.so 00:05:15.199 SYMLINK libspdk_bdev_iscsi.so 00:05:15.199 SYMLINK libspdk_bdev_crypto.so 00:05:15.199 SYMLINK libspdk_bdev_compress.so 00:05:15.199 LIB libspdk_bdev_lvol.a 00:05:15.199 SO libspdk_bdev_lvol.so.6.0 00:05:15.199 SYMLINK libspdk_bdev_lvol.so 00:05:15.534 LIB libspdk_bdev_raid.a 00:05:15.534 SO libspdk_bdev_raid.so.6.0 00:05:15.793 LIB libspdk_bdev_virtio.a 00:05:15.793 SYMLINK libspdk_bdev_raid.so 00:05:15.793 SO libspdk_bdev_virtio.so.6.0 00:05:15.793 SYMLINK libspdk_bdev_virtio.so 00:05:16.728 LIB libspdk_bdev_nvme.a 00:05:16.986 SO libspdk_bdev_nvme.so.7.0 00:05:16.986 SYMLINK libspdk_bdev_nvme.so 00:05:17.921 CC module/event/subsystems/sock/sock.o 00:05:17.921 CC module/event/subsystems/iobuf/iobuf.o 00:05:17.921 CC module/event/subsystems/iobuf/iobuf_rpc.o 00:05:17.921 CC module/event/subsystems/keyring/keyring.o 00:05:17.921 CC module/event/subsystems/scheduler/scheduler.o 00:05:17.921 CC module/event/subsystems/vmd/vmd_rpc.o 00:05:17.921 CC module/event/subsystems/vmd/vmd.o 00:05:17.921 CC module/event/subsystems/vhost_blk/vhost_blk.o 00:05:17.921 LIB libspdk_event_keyring.a 00:05:17.921 LIB libspdk_event_sock.a 00:05:17.921 LIB libspdk_event_scheduler.a 00:05:17.921 SO libspdk_event_keyring.so.1.0 00:05:17.921 LIB libspdk_event_iobuf.a 00:05:17.921 LIB libspdk_event_vmd.a 00:05:17.921 LIB libspdk_event_vhost_blk.a 00:05:17.921 SO libspdk_event_sock.so.5.0 00:05:17.921 SO libspdk_event_scheduler.so.4.0 00:05:17.921 SO libspdk_event_vmd.so.6.0 00:05:17.921 SO libspdk_event_iobuf.so.3.0 00:05:17.921 SO libspdk_event_vhost_blk.so.3.0 00:05:17.921 SYMLINK libspdk_event_keyring.so 00:05:17.921 SYMLINK libspdk_event_sock.so 
00:05:17.921 SYMLINK libspdk_event_vmd.so 00:05:17.921 SYMLINK libspdk_event_vhost_blk.so 00:05:17.921 SYMLINK libspdk_event_iobuf.so 00:05:18.180 SYMLINK libspdk_event_scheduler.so 00:05:18.438 CC module/event/subsystems/accel/accel.o 00:05:18.438 LIB libspdk_event_accel.a 00:05:18.696 SO libspdk_event_accel.so.6.0 00:05:18.696 SYMLINK libspdk_event_accel.so 00:05:18.954 CC module/event/subsystems/bdev/bdev.o 00:05:19.213 LIB libspdk_event_bdev.a 00:05:19.213 SO libspdk_event_bdev.so.6.0 00:05:19.213 SYMLINK libspdk_event_bdev.so 00:05:19.779 CC module/event/subsystems/nvmf/nvmf_rpc.o 00:05:19.779 CC module/event/subsystems/nvmf/nvmf_tgt.o 00:05:19.779 CC module/event/subsystems/scsi/scsi.o 00:05:19.779 CC module/event/subsystems/nbd/nbd.o 00:05:19.779 CC module/event/subsystems/ublk/ublk.o 00:05:19.779 LIB libspdk_event_nbd.a 00:05:19.779 LIB libspdk_event_ublk.a 00:05:19.779 LIB libspdk_event_scsi.a 00:05:19.779 SO libspdk_event_nbd.so.6.0 00:05:19.779 SO libspdk_event_ublk.so.3.0 00:05:19.779 SO libspdk_event_scsi.so.6.0 00:05:20.038 LIB libspdk_event_nvmf.a 00:05:20.038 SYMLINK libspdk_event_nbd.so 00:05:20.038 SYMLINK libspdk_event_ublk.so 00:05:20.038 SYMLINK libspdk_event_scsi.so 00:05:20.038 SO libspdk_event_nvmf.so.6.0 00:05:20.038 SYMLINK libspdk_event_nvmf.so 00:05:20.296 CC module/event/subsystems/vhost_scsi/vhost_scsi.o 00:05:20.296 CC module/event/subsystems/iscsi/iscsi.o 00:05:20.555 LIB libspdk_event_vhost_scsi.a 00:05:20.555 SO libspdk_event_vhost_scsi.so.3.0 00:05:20.555 LIB libspdk_event_iscsi.a 00:05:20.555 SO libspdk_event_iscsi.so.6.0 00:05:20.555 SYMLINK libspdk_event_vhost_scsi.so 00:05:20.555 SYMLINK libspdk_event_iscsi.so 00:05:20.813 SO libspdk.so.6.0 00:05:20.813 SYMLINK libspdk.so 00:05:21.383 CC app/trace_record/trace_record.o 00:05:21.383 CC test/rpc_client/rpc_client_test.o 00:05:21.383 TEST_HEADER include/spdk/accel.h 00:05:21.383 TEST_HEADER include/spdk/accel_module.h 00:05:21.383 CXX app/trace/trace.o 00:05:21.383 TEST_HEADER 
include/spdk/base64.h 00:05:21.383 TEST_HEADER include/spdk/barrier.h 00:05:21.383 TEST_HEADER include/spdk/assert.h 00:05:21.383 TEST_HEADER include/spdk/bdev_module.h 00:05:21.383 TEST_HEADER include/spdk/bdev.h 00:05:21.383 CC app/spdk_nvme_perf/perf.o 00:05:21.383 TEST_HEADER include/spdk/bdev_zone.h 00:05:21.383 CC app/spdk_lspci/spdk_lspci.o 00:05:21.383 TEST_HEADER include/spdk/bit_array.h 00:05:21.383 TEST_HEADER include/spdk/bit_pool.h 00:05:21.383 TEST_HEADER include/spdk/blob_bdev.h 00:05:21.383 TEST_HEADER include/spdk/blobfs_bdev.h 00:05:21.383 TEST_HEADER include/spdk/blobfs.h 00:05:21.383 TEST_HEADER include/spdk/conf.h 00:05:21.383 CC app/spdk_top/spdk_top.o 00:05:21.383 TEST_HEADER include/spdk/blob.h 00:05:21.383 TEST_HEADER include/spdk/config.h 00:05:21.383 TEST_HEADER include/spdk/cpuset.h 00:05:21.383 CC app/spdk_nvme_identify/identify.o 00:05:21.383 TEST_HEADER include/spdk/crc16.h 00:05:21.383 TEST_HEADER include/spdk/crc64.h 00:05:21.383 TEST_HEADER include/spdk/crc32.h 00:05:21.383 TEST_HEADER include/spdk/dif.h 00:05:21.383 TEST_HEADER include/spdk/dma.h 00:05:21.383 TEST_HEADER include/spdk/endian.h 00:05:21.383 CC app/spdk_nvme_discover/discovery_aer.o 00:05:21.383 TEST_HEADER include/spdk/env_dpdk.h 00:05:21.383 TEST_HEADER include/spdk/env.h 00:05:21.383 TEST_HEADER include/spdk/event.h 00:05:21.383 TEST_HEADER include/spdk/fd_group.h 00:05:21.383 TEST_HEADER include/spdk/fd.h 00:05:21.383 TEST_HEADER include/spdk/file.h 00:05:21.383 TEST_HEADER include/spdk/ftl.h 00:05:21.383 TEST_HEADER include/spdk/gpt_spec.h 00:05:21.383 TEST_HEADER include/spdk/hexlify.h 00:05:21.383 TEST_HEADER include/spdk/histogram_data.h 00:05:21.383 TEST_HEADER include/spdk/idxd.h 00:05:21.383 TEST_HEADER include/spdk/idxd_spec.h 00:05:21.384 TEST_HEADER include/spdk/init.h 00:05:21.384 TEST_HEADER include/spdk/ioat.h 00:05:21.384 TEST_HEADER include/spdk/ioat_spec.h 00:05:21.384 TEST_HEADER include/spdk/iscsi_spec.h 00:05:21.384 TEST_HEADER 
include/spdk/json.h 00:05:21.384 TEST_HEADER include/spdk/jsonrpc.h 00:05:21.384 TEST_HEADER include/spdk/keyring.h 00:05:21.384 TEST_HEADER include/spdk/keyring_module.h 00:05:21.384 TEST_HEADER include/spdk/likely.h 00:05:21.384 TEST_HEADER include/spdk/log.h 00:05:21.384 TEST_HEADER include/spdk/lvol.h 00:05:21.384 TEST_HEADER include/spdk/memory.h 00:05:21.384 TEST_HEADER include/spdk/nbd.h 00:05:21.384 TEST_HEADER include/spdk/mmio.h 00:05:21.384 TEST_HEADER include/spdk/net.h 00:05:21.384 TEST_HEADER include/spdk/notify.h 00:05:21.384 TEST_HEADER include/spdk/nvme.h 00:05:21.384 TEST_HEADER include/spdk/nvme_intel.h 00:05:21.384 TEST_HEADER include/spdk/nvme_ocssd.h 00:05:21.384 TEST_HEADER include/spdk/nvme_ocssd_spec.h 00:05:21.384 TEST_HEADER include/spdk/nvme_spec.h 00:05:21.384 TEST_HEADER include/spdk/nvme_zns.h 00:05:21.384 TEST_HEADER include/spdk/nvmf_cmd.h 00:05:21.384 TEST_HEADER include/spdk/nvmf_fc_spec.h 00:05:21.384 TEST_HEADER include/spdk/nvmf.h 00:05:21.384 TEST_HEADER include/spdk/nvmf_spec.h 00:05:21.384 TEST_HEADER include/spdk/nvmf_transport.h 00:05:21.384 TEST_HEADER include/spdk/opal.h 00:05:21.384 TEST_HEADER include/spdk/opal_spec.h 00:05:21.384 TEST_HEADER include/spdk/pipe.h 00:05:21.384 TEST_HEADER include/spdk/pci_ids.h 00:05:21.384 TEST_HEADER include/spdk/queue.h 00:05:21.384 TEST_HEADER include/spdk/rpc.h 00:05:21.384 TEST_HEADER include/spdk/reduce.h 00:05:21.384 TEST_HEADER include/spdk/scheduler.h 00:05:21.384 TEST_HEADER include/spdk/scsi.h 00:05:21.384 TEST_HEADER include/spdk/scsi_spec.h 00:05:21.384 TEST_HEADER include/spdk/sock.h 00:05:21.384 TEST_HEADER include/spdk/stdinc.h 00:05:21.384 CC app/spdk_dd/spdk_dd.o 00:05:21.384 TEST_HEADER include/spdk/string.h 00:05:21.384 TEST_HEADER include/spdk/thread.h 00:05:21.384 TEST_HEADER include/spdk/trace.h 00:05:21.384 TEST_HEADER include/spdk/trace_parser.h 00:05:21.384 CC examples/interrupt_tgt/interrupt_tgt.o 00:05:21.384 TEST_HEADER include/spdk/ublk.h 00:05:21.384 
TEST_HEADER include/spdk/tree.h 00:05:21.384 TEST_HEADER include/spdk/util.h 00:05:21.384 TEST_HEADER include/spdk/uuid.h 00:05:21.384 TEST_HEADER include/spdk/version.h 00:05:21.384 TEST_HEADER include/spdk/vfio_user_pci.h 00:05:21.384 TEST_HEADER include/spdk/vfio_user_spec.h 00:05:21.384 TEST_HEADER include/spdk/vhost.h 00:05:21.384 TEST_HEADER include/spdk/vmd.h 00:05:21.384 TEST_HEADER include/spdk/zipf.h 00:05:21.384 CXX test/cpp_headers/accel.o 00:05:21.384 TEST_HEADER include/spdk/xor.h 00:05:21.384 CXX test/cpp_headers/accel_module.o 00:05:21.384 CXX test/cpp_headers/assert.o 00:05:21.384 CXX test/cpp_headers/barrier.o 00:05:21.384 CXX test/cpp_headers/base64.o 00:05:21.384 CXX test/cpp_headers/bdev.o 00:05:21.384 CXX test/cpp_headers/bdev_module.o 00:05:21.384 CXX test/cpp_headers/bdev_zone.o 00:05:21.384 CXX test/cpp_headers/bit_array.o 00:05:21.384 CC app/nvmf_tgt/nvmf_main.o 00:05:21.384 CXX test/cpp_headers/blobfs_bdev.o 00:05:21.384 CXX test/cpp_headers/blob_bdev.o 00:05:21.384 CXX test/cpp_headers/bit_pool.o 00:05:21.384 CC app/iscsi_tgt/iscsi_tgt.o 00:05:21.384 CXX test/cpp_headers/blob.o 00:05:21.384 CXX test/cpp_headers/blobfs.o 00:05:21.384 CXX test/cpp_headers/conf.o 00:05:21.384 CXX test/cpp_headers/config.o 00:05:21.384 CXX test/cpp_headers/cpuset.o 00:05:21.384 CXX test/cpp_headers/crc16.o 00:05:21.384 CXX test/cpp_headers/crc32.o 00:05:21.384 CXX test/cpp_headers/crc64.o 00:05:21.384 CXX test/cpp_headers/dif.o 00:05:21.384 CXX test/cpp_headers/dma.o 00:05:21.384 CXX test/cpp_headers/env_dpdk.o 00:05:21.384 CXX test/cpp_headers/endian.o 00:05:21.384 CXX test/cpp_headers/env.o 00:05:21.384 CXX test/cpp_headers/fd_group.o 00:05:21.384 CXX test/cpp_headers/event.o 00:05:21.384 CXX test/cpp_headers/file.o 00:05:21.384 CXX test/cpp_headers/fd.o 00:05:21.384 CXX test/cpp_headers/ftl.o 00:05:21.384 CXX test/cpp_headers/hexlify.o 00:05:21.384 CXX test/cpp_headers/gpt_spec.o 00:05:21.384 CXX test/cpp_headers/histogram_data.o 00:05:21.384 CXX 
test/cpp_headers/idxd.o 00:05:21.384 CXX test/cpp_headers/init.o 00:05:21.384 CXX test/cpp_headers/ioat.o 00:05:21.384 CXX test/cpp_headers/iscsi_spec.o 00:05:21.384 CXX test/cpp_headers/ioat_spec.o 00:05:21.384 CXX test/cpp_headers/idxd_spec.o 00:05:21.384 CXX test/cpp_headers/json.o 00:05:21.384 CXX test/cpp_headers/jsonrpc.o 00:05:21.384 CXX test/cpp_headers/keyring.o 00:05:21.384 CC app/spdk_tgt/spdk_tgt.o 00:05:21.384 CC test/thread/poller_perf/poller_perf.o 00:05:21.384 CC test/env/pci/pci_ut.o 00:05:21.384 CC test/env/vtophys/vtophys.o 00:05:21.384 CXX test/cpp_headers/keyring_module.o 00:05:21.384 CC test/app/histogram_perf/histogram_perf.o 00:05:21.384 CC examples/util/zipf/zipf.o 00:05:21.384 CC test/env/memory/memory_ut.o 00:05:21.384 CC test/env/env_dpdk_post_init/env_dpdk_post_init.o 00:05:21.384 CC test/app/jsoncat/jsoncat.o 00:05:21.384 CC examples/ioat/verify/verify.o 00:05:21.384 CC examples/ioat/perf/perf.o 00:05:21.384 CC app/fio/nvme/fio_plugin.o 00:05:21.384 CC test/app/stub/stub.o 00:05:21.384 CC test/app/bdev_svc/bdev_svc.o 00:05:21.652 CC test/dma/test_dma/test_dma.o 00:05:21.652 CC app/fio/bdev/fio_plugin.o 00:05:21.652 CC test/env/mem_callbacks/mem_callbacks.o 00:05:21.652 LINK spdk_lspci 00:05:21.652 LINK spdk_nvme_discover 00:05:21.652 LINK rpc_client_test 00:05:21.652 CC test/app/fuzz/nvme_fuzz/nvme_fuzz.o 00:05:21.923 LINK histogram_perf 00:05:21.923 LINK vtophys 00:05:21.923 LINK interrupt_tgt 00:05:21.923 LINK jsoncat 00:05:21.923 LINK stub 00:05:21.923 CXX test/cpp_headers/likely.o 00:05:21.923 CXX test/cpp_headers/log.o 00:05:21.923 CXX test/cpp_headers/lvol.o 00:05:21.923 LINK poller_perf 00:05:21.923 LINK iscsi_tgt 00:05:21.923 LINK spdk_trace_record 00:05:21.923 CXX test/cpp_headers/memory.o 00:05:21.923 LINK zipf 00:05:21.923 CXX test/cpp_headers/mmio.o 00:05:21.923 CXX test/cpp_headers/nbd.o 00:05:21.923 CXX test/cpp_headers/net.o 00:05:21.923 CXX test/cpp_headers/notify.o 00:05:21.923 CXX test/cpp_headers/nvme.o 00:05:21.923 
CXX test/cpp_headers/nvme_intel.o 00:05:21.923 LINK env_dpdk_post_init 00:05:21.923 LINK nvmf_tgt 00:05:21.923 CXX test/cpp_headers/nvme_ocssd_spec.o 00:05:21.923 CXX test/cpp_headers/nvme_ocssd.o 00:05:21.923 CXX test/cpp_headers/nvme_spec.o 00:05:21.923 CXX test/cpp_headers/nvme_zns.o 00:05:21.923 CXX test/cpp_headers/nvmf_cmd.o 00:05:21.923 CXX test/cpp_headers/nvmf_fc_spec.o 00:05:21.923 CXX test/cpp_headers/nvmf.o 00:05:21.923 CXX test/cpp_headers/nvmf_spec.o 00:05:21.923 CXX test/cpp_headers/nvmf_transport.o 00:05:21.923 LINK spdk_trace 00:05:21.923 CC test/app/fuzz/vhost_fuzz/vhost_fuzz_rpc.o 00:05:21.923 CC test/app/fuzz/vhost_fuzz/vhost_fuzz.o 00:05:21.923 CXX test/cpp_headers/opal.o 00:05:21.923 CXX test/cpp_headers/opal_spec.o 00:05:21.923 CXX test/cpp_headers/pci_ids.o 00:05:21.923 CC test/app/fuzz/iscsi_fuzz/iscsi_fuzz.o 00:05:21.923 LINK spdk_tgt 00:05:21.923 LINK verify 00:05:21.923 CXX test/cpp_headers/pipe.o 00:05:21.923 CXX test/cpp_headers/queue.o 00:05:21.923 CXX test/cpp_headers/reduce.o 00:05:21.923 CXX test/cpp_headers/rpc.o 00:05:21.923 CXX test/cpp_headers/scheduler.o 00:05:21.923 LINK ioat_perf 00:05:21.923 CXX test/cpp_headers/scsi.o 00:05:21.923 CXX test/cpp_headers/scsi_spec.o 00:05:21.923 CXX test/cpp_headers/sock.o 00:05:21.923 CXX test/cpp_headers/stdinc.o 00:05:21.923 CXX test/cpp_headers/string.o 00:05:21.923 CXX test/cpp_headers/thread.o 00:05:21.923 CXX test/cpp_headers/trace.o 00:05:21.923 LINK bdev_svc 00:05:21.923 CXX test/cpp_headers/trace_parser.o 00:05:21.923 CXX test/cpp_headers/tree.o 00:05:21.923 CXX test/cpp_headers/ublk.o 00:05:21.923 CXX test/cpp_headers/util.o 00:05:22.185 CXX test/cpp_headers/uuid.o 00:05:22.185 CXX test/cpp_headers/vfio_user_pci.o 00:05:22.185 CXX test/cpp_headers/version.o 00:05:22.185 CXX test/cpp_headers/vfio_user_spec.o 00:05:22.185 CXX test/cpp_headers/vhost.o 00:05:22.185 CXX test/cpp_headers/vmd.o 00:05:22.185 CXX test/cpp_headers/xor.o 00:05:22.185 CXX test/cpp_headers/zipf.o 00:05:22.185 
LINK spdk_dd 00:05:22.444 LINK pci_ut 00:05:22.444 LINK test_dma 00:05:22.444 LINK nvme_fuzz 00:05:22.444 CC test/event/event_perf/event_perf.o 00:05:22.444 CC test/event/reactor/reactor.o 00:05:22.444 CC test/event/reactor_perf/reactor_perf.o 00:05:22.702 CC test/event/app_repeat/app_repeat.o 00:05:22.702 CC examples/vmd/lsvmd/lsvmd.o 00:05:22.702 CC examples/idxd/perf/perf.o 00:05:22.702 LINK spdk_bdev 00:05:22.702 CC examples/vmd/led/led.o 00:05:22.702 CC examples/sock/hello_world/hello_sock.o 00:05:22.702 CC app/vhost/vhost.o 00:05:22.702 CC examples/thread/thread/thread_ex.o 00:05:22.702 LINK spdk_nvme 00:05:22.702 CC test/event/scheduler/scheduler.o 00:05:22.702 LINK reactor_perf 00:05:22.702 LINK reactor 00:05:22.702 LINK event_perf 00:05:22.703 LINK lsvmd 00:05:22.703 LINK mem_callbacks 00:05:22.703 LINK spdk_nvme_perf 00:05:22.703 LINK led 00:05:22.703 LINK app_repeat 00:05:22.703 LINK vhost_fuzz 00:05:22.703 LINK spdk_nvme_identify 00:05:22.962 LINK hello_sock 00:05:22.962 LINK vhost 00:05:22.962 LINK scheduler 00:05:22.962 LINK spdk_top 00:05:22.962 LINK thread 00:05:22.962 LINK idxd_perf 00:05:22.962 CC test/nvme/overhead/overhead.o 00:05:22.962 CC test/nvme/reset/reset.o 00:05:22.962 CC test/nvme/compliance/nvme_compliance.o 00:05:22.962 CC test/nvme/fused_ordering/fused_ordering.o 00:05:22.962 CC test/nvme/cuse/cuse.o 00:05:22.962 CC test/nvme/err_injection/err_injection.o 00:05:22.962 CC test/nvme/simple_copy/simple_copy.o 00:05:22.962 CC test/nvme/sgl/sgl.o 00:05:22.962 CC test/nvme/startup/startup.o 00:05:22.962 CC test/nvme/doorbell_aers/doorbell_aers.o 00:05:22.962 CC test/nvme/reserve/reserve.o 00:05:22.962 CC test/nvme/e2edp/nvme_dp.o 00:05:22.962 CC test/nvme/aer/aer.o 00:05:22.962 CC test/nvme/boot_partition/boot_partition.o 00:05:22.962 CC test/nvme/connect_stress/connect_stress.o 00:05:22.962 CC test/nvme/fdp/fdp.o 00:05:23.220 CC test/accel/dif/dif.o 00:05:23.220 CC test/blobfs/mkfs/mkfs.o 00:05:23.220 CC test/lvol/esnap/esnap.o 
00:05:23.220 LINK doorbell_aers 00:05:23.220 LINK memory_ut 00:05:23.220 LINK startup 00:05:23.220 LINK boot_partition 00:05:23.220 LINK reserve 00:05:23.220 LINK err_injection 00:05:23.220 LINK connect_stress 00:05:23.220 LINK fused_ordering 00:05:23.220 LINK simple_copy 00:05:23.477 CC examples/nvme/nvme_manage/nvme_manage.o 00:05:23.477 CC examples/nvme/abort/abort.o 00:05:23.477 CC examples/nvme/hotplug/hotplug.o 00:05:23.477 CC examples/nvme/hello_world/hello_world.o 00:05:23.477 CC examples/nvme/arbitration/arbitration.o 00:05:23.477 CC examples/nvme/reconnect/reconnect.o 00:05:23.477 CC examples/nvme/pmr_persistence/pmr_persistence.o 00:05:23.477 CC examples/nvme/cmb_copy/cmb_copy.o 00:05:23.477 LINK sgl 00:05:23.477 LINK reset 00:05:23.477 LINK mkfs 00:05:23.477 LINK fdp 00:05:23.477 LINK overhead 00:05:23.477 LINK nvme_dp 00:05:23.477 LINK aer 00:05:23.477 LINK nvme_compliance 00:05:23.477 CC examples/accel/perf/accel_perf.o 00:05:23.478 CC examples/blob/cli/blobcli.o 00:05:23.478 LINK hello_world 00:05:23.478 LINK pmr_persistence 00:05:23.478 LINK cmb_copy 00:05:23.478 LINK dif 00:05:23.735 CC examples/blob/hello_world/hello_blob.o 00:05:23.735 LINK hotplug 00:05:23.735 LINK arbitration 00:05:23.735 LINK reconnect 00:05:23.735 LINK abort 00:05:23.735 LINK nvme_manage 00:05:23.993 LINK hello_blob 00:05:23.994 LINK accel_perf 00:05:23.994 LINK iscsi_fuzz 00:05:23.994 LINK blobcli 00:05:23.994 LINK cuse 00:05:24.252 CC test/bdev/bdevio/bdevio.o 00:05:24.512 CC examples/bdev/hello_world/hello_bdev.o 00:05:24.512 CC examples/bdev/bdevperf/bdevperf.o 00:05:24.770 LINK bdevio 00:05:24.770 LINK hello_bdev 00:05:25.338 LINK bdevperf 00:05:25.910 CC examples/nvmf/nvmf/nvmf.o 00:05:26.169 LINK nvmf 00:05:28.700 LINK esnap 00:05:28.700 00:05:28.700 real 1m32.667s 00:05:28.700 user 17m25.579s 00:05:28.700 sys 4m22.938s 00:05:28.700 05:33:43 make -- common/autotest_common.sh@1124 -- $ xtrace_disable 00:05:28.700 05:33:43 make -- common/autotest_common.sh@10 -- $ set +x 
00:05:28.700 ************************************ 00:05:28.700 END TEST make 00:05:28.700 ************************************ 00:05:28.700 05:33:43 -- common/autotest_common.sh@1142 -- $ return 0 00:05:28.700 05:33:43 -- spdk/autobuild.sh@1 -- $ stop_monitor_resources 00:05:28.700 05:33:43 -- pm/common@29 -- $ signal_monitor_resources TERM 00:05:28.700 05:33:43 -- pm/common@40 -- $ local monitor pid pids signal=TERM 00:05:28.700 05:33:43 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:05:28.700 05:33:43 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-cpu-load.pid ]] 00:05:28.700 05:33:43 -- pm/common@44 -- $ pid=971358 00:05:28.700 05:33:43 -- pm/common@50 -- $ kill -TERM 971358 00:05:28.700 05:33:43 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:05:28.700 05:33:43 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-vmstat.pid ]] 00:05:28.700 05:33:43 -- pm/common@44 -- $ pid=971360 00:05:28.700 05:33:43 -- pm/common@50 -- $ kill -TERM 971360 00:05:28.700 05:33:43 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:05:28.700 05:33:43 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-cpu-temp.pid ]] 00:05:28.700 05:33:43 -- pm/common@44 -- $ pid=971362 00:05:28.700 05:33:43 -- pm/common@50 -- $ kill -TERM 971362 00:05:28.700 05:33:43 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:05:28.700 05:33:43 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-bmc-pm.pid ]] 00:05:28.700 05:33:43 -- pm/common@44 -- $ pid=971401 00:05:28.700 05:33:43 -- pm/common@50 -- $ sudo -E kill -TERM 971401 00:05:28.957 05:33:43 -- spdk/autotest.sh@25 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:05:28.957 05:33:43 -- nvmf/common.sh@7 -- # uname -s 00:05:28.957 05:33:43 -- nvmf/common.sh@7 -- # [[ 
Linux == FreeBSD ]] 00:05:28.957 05:33:43 -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:05:28.957 05:33:43 -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:05:28.957 05:33:43 -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:05:28.957 05:33:43 -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:05:28.957 05:33:43 -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:05:28.957 05:33:43 -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:05:28.957 05:33:43 -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:05:28.957 05:33:43 -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:05:28.957 05:33:43 -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:05:28.957 05:33:43 -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809e2efd-7f71-e711-906e-0017a4403562 00:05:28.957 05:33:43 -- nvmf/common.sh@18 -- # NVME_HOSTID=809e2efd-7f71-e711-906e-0017a4403562 00:05:28.957 05:33:43 -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:05:28.957 05:33:43 -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:05:28.957 05:33:43 -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:05:28.957 05:33:43 -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:05:28.957 05:33:43 -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:05:28.957 05:33:43 -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:05:28.957 05:33:43 -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:05:28.957 05:33:43 -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:05:28.957 05:33:43 -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:28.957 05:33:43 -- paths/export.sh@3 -- # 
PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:28.958 05:33:43 -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:28.958 05:33:43 -- paths/export.sh@5 -- # export PATH 00:05:28.958 05:33:43 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:28.958 05:33:43 -- nvmf/common.sh@47 -- # : 0 00:05:28.958 05:33:43 -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:05:28.958 05:33:43 -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:05:28.958 05:33:43 -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:05:28.958 05:33:43 -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:05:28.958 05:33:43 -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:05:28.958 05:33:43 -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:05:28.958 05:33:43 -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:05:28.958 05:33:43 -- nvmf/common.sh@51 -- # have_pci_nics=0 00:05:28.958 05:33:43 -- spdk/autotest.sh@27 -- # '[' 0 -ne 0 ']' 00:05:28.958 05:33:43 -- spdk/autotest.sh@32 -- # uname -s 00:05:28.958 05:33:43 -- spdk/autotest.sh@32 -- # '[' Linux = Linux ']' 00:05:28.958 05:33:43 -- spdk/autotest.sh@33 -- # old_core_pattern='|/usr/lib/systemd/systemd-coredump %P %u %g %s %t %c %h' 00:05:28.958 05:33:43 -- spdk/autotest.sh@34 -- # mkdir -p 
/var/jenkins/workspace/crypto-phy-autotest/spdk/../output/coredumps 00:05:28.958 05:33:43 -- spdk/autotest.sh@39 -- # echo '|/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/core-collector.sh %P %s %t' 00:05:28.958 05:33:43 -- spdk/autotest.sh@40 -- # echo /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/coredumps 00:05:28.958 05:33:43 -- spdk/autotest.sh@44 -- # modprobe nbd 00:05:28.958 05:33:43 -- spdk/autotest.sh@46 -- # type -P udevadm 00:05:28.958 05:33:43 -- spdk/autotest.sh@46 -- # udevadm=/usr/sbin/udevadm 00:05:28.958 05:33:43 -- spdk/autotest.sh@48 -- # udevadm_pid=1038564 00:05:28.958 05:33:43 -- spdk/autotest.sh@47 -- # /usr/sbin/udevadm monitor --property 00:05:28.958 05:33:43 -- spdk/autotest.sh@53 -- # start_monitor_resources 00:05:28.958 05:33:43 -- pm/common@17 -- # local monitor 00:05:28.958 05:33:43 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:05:28.958 05:33:43 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:05:28.958 05:33:43 -- pm/common@21 -- # date +%s 00:05:28.958 05:33:43 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:05:28.958 05:33:43 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:05:28.958 05:33:43 -- pm/common@21 -- # date +%s 00:05:28.958 05:33:43 -- pm/common@25 -- # sleep 1 00:05:28.958 05:33:43 -- pm/common@21 -- # date +%s 00:05:28.958 05:33:43 -- pm/common@21 -- # date +%s 00:05:28.958 05:33:43 -- pm/common@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1721964823 00:05:28.958 05:33:43 -- pm/common@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1721964823 00:05:28.958 05:33:43 -- pm/common@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp 
-d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1721964823 00:05:28.958 05:33:43 -- pm/common@21 -- # sudo -E /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1721964823 00:05:28.958 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autotest.sh.1721964823_collect-vmstat.pm.log 00:05:28.958 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autotest.sh.1721964823_collect-cpu-load.pm.log 00:05:28.958 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autotest.sh.1721964823_collect-cpu-temp.pm.log 00:05:28.958 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autotest.sh.1721964823_collect-bmc-pm.bmc.pm.log 00:05:29.894 05:33:44 -- spdk/autotest.sh@55 -- # trap 'autotest_cleanup || :; exit 1' SIGINT SIGTERM EXIT 00:05:29.894 05:33:44 -- spdk/autotest.sh@57 -- # timing_enter autotest 00:05:29.894 05:33:44 -- common/autotest_common.sh@722 -- # xtrace_disable 00:05:29.894 05:33:44 -- common/autotest_common.sh@10 -- # set +x 00:05:29.894 05:33:44 -- spdk/autotest.sh@59 -- # create_test_list 00:05:29.894 05:33:44 -- common/autotest_common.sh@746 -- # xtrace_disable 00:05:29.894 05:33:44 -- common/autotest_common.sh@10 -- # set +x 00:05:30.152 05:33:44 -- spdk/autotest.sh@61 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/autotest.sh 00:05:30.152 05:33:44 -- spdk/autotest.sh@61 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk 00:05:30.152 05:33:44 -- spdk/autotest.sh@61 -- # src=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:05:30.152 05:33:44 -- spdk/autotest.sh@62 -- # out=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:05:30.152 05:33:44 -- spdk/autotest.sh@63 -- # cd /var/jenkins/workspace/crypto-phy-autotest/spdk 
00:05:30.152 05:33:44 -- spdk/autotest.sh@65 -- # freebsd_update_contigmem_mod 00:05:30.152 05:33:44 -- common/autotest_common.sh@1455 -- # uname 00:05:30.152 05:33:44 -- common/autotest_common.sh@1455 -- # '[' Linux = FreeBSD ']' 00:05:30.152 05:33:44 -- spdk/autotest.sh@66 -- # freebsd_set_maxsock_buf 00:05:30.152 05:33:44 -- common/autotest_common.sh@1475 -- # uname 00:05:30.152 05:33:44 -- common/autotest_common.sh@1475 -- # [[ Linux = FreeBSD ]] 00:05:30.152 05:33:44 -- spdk/autotest.sh@71 -- # grep CC_TYPE mk/cc.mk 00:05:30.152 05:33:44 -- spdk/autotest.sh@71 -- # CC_TYPE=CC_TYPE=gcc 00:05:30.152 05:33:44 -- spdk/autotest.sh@72 -- # hash lcov 00:05:30.152 05:33:44 -- spdk/autotest.sh@72 -- # [[ CC_TYPE=gcc == *\c\l\a\n\g* ]] 00:05:30.152 05:33:44 -- spdk/autotest.sh@80 -- # export 'LCOV_OPTS= 00:05:30.152 --rc lcov_branch_coverage=1 00:05:30.152 --rc lcov_function_coverage=1 00:05:30.152 --rc genhtml_branch_coverage=1 00:05:30.152 --rc genhtml_function_coverage=1 00:05:30.152 --rc genhtml_legend=1 00:05:30.152 --rc geninfo_all_blocks=1 00:05:30.152 ' 00:05:30.152 05:33:44 -- spdk/autotest.sh@80 -- # LCOV_OPTS=' 00:05:30.152 --rc lcov_branch_coverage=1 00:05:30.152 --rc lcov_function_coverage=1 00:05:30.152 --rc genhtml_branch_coverage=1 00:05:30.152 --rc genhtml_function_coverage=1 00:05:30.152 --rc genhtml_legend=1 00:05:30.152 --rc geninfo_all_blocks=1 00:05:30.152 ' 00:05:30.152 05:33:44 -- spdk/autotest.sh@81 -- # export 'LCOV=lcov 00:05:30.152 --rc lcov_branch_coverage=1 00:05:30.152 --rc lcov_function_coverage=1 00:05:30.152 --rc genhtml_branch_coverage=1 00:05:30.152 --rc genhtml_function_coverage=1 00:05:30.152 --rc genhtml_legend=1 00:05:30.152 --rc geninfo_all_blocks=1 00:05:30.152 --no-external' 00:05:30.152 05:33:44 -- spdk/autotest.sh@81 -- # LCOV='lcov 00:05:30.152 --rc lcov_branch_coverage=1 00:05:30.152 --rc lcov_function_coverage=1 00:05:30.152 --rc genhtml_branch_coverage=1 00:05:30.152 --rc genhtml_function_coverage=1 00:05:30.152 --rc 
genhtml_legend=1 00:05:30.152 --rc geninfo_all_blocks=1 00:05:30.153 --no-external' 00:05:30.153 05:33:44 -- spdk/autotest.sh@83 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -v 00:05:30.153 lcov: LCOV version 1.14 00:05:30.153 05:33:44 -- spdk/autotest.sh@85 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -c -i -t Baseline -d /var/jenkins/workspace/crypto-phy-autotest/spdk -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_base.info 00:05:48.299 /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/nvme/nvme_stubs.gcno:no functions found 00:05:48.299 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/nvme/nvme_stubs.gcno 00:06:00.499 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/accel.gcno:no functions found 00:06:00.499 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/accel.gcno 00:06:00.499 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/accel_module.gcno:no functions found 00:06:00.499 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/accel_module.gcno 00:06:00.499 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/assert.gcno:no functions found 00:06:00.499 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/assert.gcno 00:06:00.499 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/base64.gcno:no functions found 00:06:00.499 geninfo: WARNING: GCOV did not produce any data for 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/base64.gcno 00:06:00.499 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev.gcno:no functions found 00:06:00.499 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev.gcno 00:06:00.499 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/barrier.gcno:no functions found 00:06:00.499 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/barrier.gcno 00:06:00.499 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev_zone.gcno:no functions found 00:06:00.499 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev_zone.gcno 00:06:00.499 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev_module.gcno:no functions found 00:06:00.499 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev_module.gcno 00:06:00.499 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bit_array.gcno:no functions found 00:06:00.499 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bit_array.gcno 00:06:00.499 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blob_bdev.gcno:no functions found 00:06:00.499 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blob_bdev.gcno 00:06:00.499 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blobfs_bdev.gcno:no functions found 00:06:00.499 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blobfs_bdev.gcno 00:06:00.499 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/conf.gcno:no functions found 00:06:00.499 geninfo: 
WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/conf.gcno 00:06:00.499 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bit_pool.gcno:no functions found 00:06:00.499 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bit_pool.gcno 00:06:00.499 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/cpuset.gcno:no functions found 00:06:00.499 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/cpuset.gcno 00:06:00.499 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blobfs.gcno:no functions found 00:06:00.499 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blobfs.gcno 00:06:00.499 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc16.gcno:no functions found 00:06:00.499 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc16.gcno 00:06:00.499 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc32.gcno:no functions found 00:06:00.499 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc32.gcno 00:06:00.499 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/config.gcno:no functions found 00:06:00.499 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/config.gcno 00:06:00.499 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/dif.gcno:no functions found 00:06:00.499 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/dif.gcno 00:06:00.499 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc64.gcno:no functions found 00:06:00.499 geninfo: 
WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc64.gcno 00:06:00.499 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/env.gcno:no functions found 00:06:00.499 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/env.gcno 00:06:00.499 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/env_dpdk.gcno:no functions found 00:06:00.499 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/env_dpdk.gcno 00:06:00.499 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/dma.gcno:no functions found 00:06:00.499 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/dma.gcno 00:06:00.499 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/endian.gcno:no functions found 00:06:00.499 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/endian.gcno 00:06:00.499 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/event.gcno:no functions found 00:06:00.499 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/event.gcno 00:06:00.499 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/file.gcno:no functions found 00:06:00.499 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/file.gcno 00:06:00.499 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/hexlify.gcno:no functions found 00:06:00.499 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/hexlify.gcno 00:06:00.499 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/fd.gcno:no functions found 00:06:00.499 geninfo: WARNING: 
GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/fd.gcno 00:06:00.499 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ftl.gcno:no functions found 00:06:00.500 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ftl.gcno 00:06:00.500 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blob.gcno:no functions found 00:06:00.500 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blob.gcno 00:06:00.500 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/iscsi_spec.gcno:no functions found 00:06:00.500 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/iscsi_spec.gcno 00:06:00.500 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/idxd.gcno:no functions found 00:06:00.500 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/idxd.gcno 00:06:00.500 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/gpt_spec.gcno:no functions found 00:06:00.500 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/gpt_spec.gcno 00:06:00.500 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/histogram_data.gcno:no functions found 00:06:00.500 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/histogram_data.gcno 00:06:00.500 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/jsonrpc.gcno:no functions found 00:06:00.500 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/jsonrpc.gcno 00:06:00.500 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/init.gcno:no functions found 00:06:00.500 
geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/init.gcno 00:06:00.500 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/fd_group.gcno:no functions found 00:06:00.500 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/fd_group.gcno 00:06:00.500 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/json.gcno:no functions found 00:06:00.500 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/json.gcno 00:06:00.500 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/keyring.gcno:no functions found 00:06:00.500 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/keyring.gcno 00:06:00.500 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ioat.gcno:no functions found 00:06:00.500 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ioat.gcno 00:06:00.500 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/idxd_spec.gcno:no functions found 00:06:00.500 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/idxd_spec.gcno 00:06:00.500 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ioat_spec.gcno:no functions found 00:06:00.500 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ioat_spec.gcno 00:06:00.500 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/keyring_module.gcno:no functions found 00:06:00.500 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/keyring_module.gcno 00:06:00.500 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/likely.gcno:no functions found 00:06:00.500 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/likely.gcno 00:06:00.500 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/log.gcno:no functions found 00:06:00.500 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/log.gcno 00:06:00.500 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/lvol.gcno:no functions found 00:06:00.500 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/lvol.gcno 00:06:00.500 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/mmio.gcno:no functions found 00:06:00.500 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/mmio.gcno 00:06:00.500 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/memory.gcno:no functions found 00:06:00.500 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/memory.gcno 00:06:00.500 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nbd.gcno:no functions found 00:06:00.500 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nbd.gcno 00:06:00.500 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/notify.gcno:no functions found 00:06:00.500 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/notify.gcno 00:06:00.500 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/net.gcno:no functions found 00:06:00.500 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/net.gcno 00:06:00.500 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_intel.gcno:no functions found 00:06:00.500 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_intel.gcno 00:06:00.500 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_ocssd_spec.gcno:no functions found 00:06:00.500 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_ocssd_spec.gcno 00:06:00.500 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_ocssd.gcno:no functions found 00:06:00.500 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_ocssd.gcno 00:06:00.500 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme.gcno:no functions found 00:06:00.500 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme.gcno 00:06:00.500 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_spec.gcno:no functions found 00:06:00.500 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_spec.gcno 00:06:00.500 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_zns.gcno:no functions found 00:06:00.500 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_zns.gcno 00:06:00.500 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_cmd.gcno:no functions found 00:06:00.500 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_cmd.gcno 00:06:00.500 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf.gcno:no functions found 00:06:00.500 geninfo: WARNING: GCOV did not produce any data for 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf.gcno 00:06:00.500 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_fc_spec.gcno:no functions found 00:06:00.500 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_fc_spec.gcno 00:06:00.500 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_spec.gcno:no functions found 00:06:00.500 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_spec.gcno 00:06:00.500 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_transport.gcno:no functions found 00:06:00.500 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_transport.gcno 00:06:00.500 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/pci_ids.gcno:no functions found 00:06:00.500 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/pci_ids.gcno 00:06:00.500 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/opal_spec.gcno:no functions found 00:06:00.500 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/opal_spec.gcno 00:06:00.500 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/opal.gcno:no functions found 00:06:00.500 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/opal.gcno 00:06:00.500 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/pipe.gcno:no functions found 00:06:00.500 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/pipe.gcno 00:06:00.500 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/reduce.gcno:no functions found 00:06:00.500 geninfo: 
WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/reduce.gcno 00:06:00.500 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/queue.gcno:no functions found 00:06:00.500 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/queue.gcno 00:06:00.500 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scheduler.gcno:no functions found 00:06:00.500 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scheduler.gcno 00:06:00.500 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/rpc.gcno:no functions found 00:06:00.500 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/rpc.gcno 00:06:00.500 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scsi.gcno:no functions found 00:06:00.500 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scsi.gcno 00:06:00.500 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scsi_spec.gcno:no functions found 00:06:00.500 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scsi_spec.gcno 00:06:00.500 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/sock.gcno:no functions found 00:06:00.500 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/sock.gcno 00:06:00.500 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/string.gcno:no functions found 00:06:00.500 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/string.gcno 00:06:00.500 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/stdinc.gcno:no functions found 00:06:00.500 
geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/stdinc.gcno 00:06:00.500 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/thread.gcno:no functions found 00:06:00.500 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/thread.gcno 00:06:00.500 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/trace.gcno:no functions found 00:06:00.500 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/trace.gcno 00:06:00.500 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/trace_parser.gcno:no functions found 00:06:00.500 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/trace_parser.gcno 00:06:00.500 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/tree.gcno:no functions found 00:06:00.500 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/tree.gcno 00:06:00.500 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ublk.gcno:no functions found 00:06:00.501 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ublk.gcno 00:06:00.501 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/util.gcno:no functions found 00:06:00.501 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/util.gcno 00:06:00.501 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/uuid.gcno:no functions found 00:06:00.501 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/uuid.gcno 00:06:00.501 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vfio_user_pci.gcno:no functions found 
00:06:00.501 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vfio_user_pci.gcno 00:06:00.501 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/version.gcno:no functions found 00:06:00.501 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/version.gcno 00:06:00.501 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vhost.gcno:no functions found 00:06:00.501 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vhost.gcno 00:06:00.501 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vfio_user_spec.gcno:no functions found 00:06:00.501 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vfio_user_spec.gcno 00:06:00.501 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/xor.gcno:no functions found 00:06:00.501 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/xor.gcno 00:06:00.501 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vmd.gcno:no functions found 00:06:00.501 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vmd.gcno 00:06:00.501 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/zipf.gcno:no functions found 00:06:00.501 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/zipf.gcno 00:06:05.764 05:34:19 -- spdk/autotest.sh@89 -- # timing_enter pre_cleanup 00:06:05.764 05:34:19 -- common/autotest_common.sh@722 -- # xtrace_disable 00:06:05.764 05:34:19 -- common/autotest_common.sh@10 -- # set +x 00:06:05.764 05:34:19 -- spdk/autotest.sh@91 -- # rm -f 00:06:05.764 05:34:19 -- spdk/autotest.sh@94 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:06:08.291 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:06:08.291 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:06:08.291 0000:5e:00.0 (8086 0b60): Already using the nvme driver 00:06:08.291 0000:00:04.7 (8086 2021): Already using the ioatdma driver 00:06:08.291 0000:00:04.6 (8086 2021): Already using the ioatdma driver 00:06:08.550 0000:00:04.5 (8086 2021): Already using the ioatdma driver 00:06:08.550 0000:00:04.4 (8086 2021): Already using the ioatdma driver 00:06:08.550 0000:00:04.3 (8086 2021): Already using the ioatdma driver 00:06:08.550 0000:00:04.2 (8086 2021): Already using the ioatdma driver 00:06:08.550 0000:00:04.1 (8086 2021): Already using the ioatdma driver 00:06:08.550 0000:00:04.0 (8086 2021): Already using the ioatdma driver 00:06:08.550 0000:80:04.7 (8086 2021): Already using the ioatdma driver 00:06:08.550 0000:80:04.6 (8086 2021): Already using the ioatdma driver 00:06:08.550 0000:80:04.5 (8086 2021): Already using the ioatdma driver 00:06:08.550 0000:80:04.4 (8086 2021): Already using the ioatdma driver 00:06:08.808 0000:80:04.3 (8086 2021): Already using the ioatdma driver 00:06:08.808 0000:80:04.2 (8086 2021): Already using the ioatdma driver 00:06:08.808 0000:80:04.1 (8086 2021): Already using the ioatdma driver 00:06:08.808 0000:80:04.0 (8086 2021): Already using the ioatdma driver 00:06:08.808 05:34:23 -- spdk/autotest.sh@96 -- # get_zoned_devs 00:06:08.808 05:34:23 -- common/autotest_common.sh@1669 -- # zoned_devs=() 00:06:08.808 05:34:23 -- common/autotest_common.sh@1669 -- # local -gA zoned_devs 00:06:08.808 05:34:23 -- common/autotest_common.sh@1670 -- # local nvme bdf 00:06:08.808 05:34:23 -- common/autotest_common.sh@1672 -- # for nvme in /sys/block/nvme* 00:06:08.808 05:34:23 -- common/autotest_common.sh@1673 -- # is_block_zoned nvme0n1 00:06:08.808 05:34:23 -- 
common/autotest_common.sh@1662 -- # local device=nvme0n1 00:06:08.808 05:34:23 -- common/autotest_common.sh@1664 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:06:08.808 05:34:23 -- common/autotest_common.sh@1665 -- # [[ none != none ]] 00:06:08.808 05:34:23 -- spdk/autotest.sh@98 -- # (( 0 > 0 )) 00:06:08.808 05:34:23 -- spdk/autotest.sh@110 -- # for dev in /dev/nvme*n!(*p*) 00:06:08.808 05:34:23 -- spdk/autotest.sh@112 -- # [[ -z '' ]] 00:06:08.808 05:34:23 -- spdk/autotest.sh@113 -- # block_in_use /dev/nvme0n1 00:06:08.808 05:34:23 -- scripts/common.sh@378 -- # local block=/dev/nvme0n1 pt 00:06:08.808 05:34:23 -- scripts/common.sh@387 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/spdk-gpt.py /dev/nvme0n1 00:06:08.808 No valid GPT data, bailing 00:06:08.808 05:34:23 -- scripts/common.sh@391 -- # blkid -s PTTYPE -o value /dev/nvme0n1 00:06:08.808 05:34:23 -- scripts/common.sh@391 -- # pt= 00:06:08.808 05:34:23 -- scripts/common.sh@392 -- # return 1 00:06:08.808 05:34:23 -- spdk/autotest.sh@114 -- # dd if=/dev/zero of=/dev/nvme0n1 bs=1M count=1 00:06:08.808 1+0 records in 00:06:08.808 1+0 records out 00:06:08.808 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.00578283 s, 181 MB/s 00:06:08.808 05:34:23 -- spdk/autotest.sh@118 -- # sync 00:06:08.808 05:34:23 -- spdk/autotest.sh@120 -- # xtrace_disable_per_cmd reap_spdk_processes 00:06:08.808 05:34:23 -- common/autotest_common.sh@22 -- # eval 'reap_spdk_processes 12> /dev/null' 00:06:08.808 05:34:23 -- common/autotest_common.sh@22 -- # reap_spdk_processes 00:06:14.071 05:34:28 -- spdk/autotest.sh@124 -- # uname -s 00:06:14.071 05:34:28 -- spdk/autotest.sh@124 -- # '[' Linux = Linux ']' 00:06:14.071 05:34:28 -- spdk/autotest.sh@125 -- # run_test setup.sh /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/test-setup.sh 00:06:14.071 05:34:28 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:14.071 05:34:28 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:14.071 05:34:28 -- 
common/autotest_common.sh@10 -- # set +x 00:06:14.071 ************************************ 00:06:14.071 START TEST setup.sh 00:06:14.071 ************************************ 00:06:14.071 05:34:28 setup.sh -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/test-setup.sh 00:06:14.071 * Looking for test storage... 00:06:14.071 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup 00:06:14.071 05:34:28 setup.sh -- setup/test-setup.sh@10 -- # uname -s 00:06:14.071 05:34:28 setup.sh -- setup/test-setup.sh@10 -- # [[ Linux == Linux ]] 00:06:14.071 05:34:28 setup.sh -- setup/test-setup.sh@12 -- # run_test acl /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/acl.sh 00:06:14.071 05:34:28 setup.sh -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:14.071 05:34:28 setup.sh -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:14.071 05:34:28 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:06:14.071 ************************************ 00:06:14.071 START TEST acl 00:06:14.071 ************************************ 00:06:14.071 05:34:28 setup.sh.acl -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/acl.sh 00:06:14.071 * Looking for test storage... 
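The `START TEST` / `END TEST` banners above come from the autotest `run_test` wrapper, which brackets each nested suite with a banner and reports its wall-clock time (the `real`/`user`/`sys` lines later in this log). A minimal sketch of that pattern, assuming nothing beyond bash; the function name and banner style mirror the log, but this is a reconstruction, not the actual `autotest_common.sh`:

```shell
#!/usr/bin/env bash
# Minimal reconstruction of the run_test banner/timing pattern seen in
# the trace: print START/END banners around the command and report how
# long it took. Not the actual SPDK implementation.
run_test() {
    local name=$1; shift
    local banner
    banner=$(printf '*%.0s' {1..36})   # the row of asterisks in the log
    echo "$banner"
    echo "START TEST $name"
    echo "$banner"
    local start=$SECONDS rc=0
    "$@" || rc=$?
    echo "$banner"
    echo "END TEST $name ($((SECONDS - start))s, rc=$rc)"
    echo "$banner"
    return $rc
}
```

Called as `run_test setup.sh /path/to/test-setup.sh`, this produces exactly the banner framing visible around each suite in the trace.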
00:06:14.071 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup 00:06:14.071 05:34:28 setup.sh.acl -- setup/acl.sh@10 -- # get_zoned_devs 00:06:14.071 05:34:28 setup.sh.acl -- common/autotest_common.sh@1669 -- # zoned_devs=() 00:06:14.071 05:34:28 setup.sh.acl -- common/autotest_common.sh@1669 -- # local -gA zoned_devs 00:06:14.071 05:34:28 setup.sh.acl -- common/autotest_common.sh@1670 -- # local nvme bdf 00:06:14.071 05:34:28 setup.sh.acl -- common/autotest_common.sh@1672 -- # for nvme in /sys/block/nvme* 00:06:14.071 05:34:28 setup.sh.acl -- common/autotest_common.sh@1673 -- # is_block_zoned nvme0n1 00:06:14.071 05:34:28 setup.sh.acl -- common/autotest_common.sh@1662 -- # local device=nvme0n1 00:06:14.071 05:34:28 setup.sh.acl -- common/autotest_common.sh@1664 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:06:14.071 05:34:28 setup.sh.acl -- common/autotest_common.sh@1665 -- # [[ none != none ]] 00:06:14.071 05:34:28 setup.sh.acl -- setup/acl.sh@12 -- # devs=() 00:06:14.071 05:34:28 setup.sh.acl -- setup/acl.sh@12 -- # declare -a devs 00:06:14.071 05:34:28 setup.sh.acl -- setup/acl.sh@13 -- # drivers=() 00:06:14.071 05:34:28 setup.sh.acl -- setup/acl.sh@13 -- # declare -A drivers 00:06:14.071 05:34:28 setup.sh.acl -- setup/acl.sh@51 -- # setup reset 00:06:14.071 05:34:28 setup.sh.acl -- setup/common.sh@9 -- # [[ reset == output ]] 00:06:14.071 05:34:28 setup.sh.acl -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:06:18.255 05:34:32 setup.sh.acl -- setup/acl.sh@52 -- # collect_setup_devs 00:06:18.255 05:34:32 setup.sh.acl -- setup/acl.sh@16 -- # local dev driver 00:06:18.255 05:34:32 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:06:18.255 05:34:32 setup.sh.acl -- setup/acl.sh@15 -- # setup output status 00:06:18.255 05:34:32 setup.sh.acl -- setup/common.sh@9 -- # [[ output == output ]] 00:06:18.255 05:34:32 setup.sh.acl -- setup/common.sh@10 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh status 00:06:21.538 05:34:36 setup.sh.acl -- setup/acl.sh@19 -- # [[ (8086 == *:*:*.* ]] 00:06:21.538 05:34:36 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:06:21.538 05:34:36 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:06:21.538 05:34:36 setup.sh.acl -- setup/acl.sh@19 -- # [[ (8086 == *:*:*.* ]] 00:06:21.538 05:34:36 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:06:21.538 05:34:36 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:06:21.538 Hugepages 00:06:21.538 node hugesize free / total 00:06:21.538 05:34:36 setup.sh.acl -- setup/acl.sh@19 -- # [[ 1048576kB == *:*:*.* ]] 00:06:21.538 05:34:36 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:06:21.538 05:34:36 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:06:21.538 05:34:36 setup.sh.acl -- setup/acl.sh@19 -- # [[ 2048kB == *:*:*.* ]] 00:06:21.538 05:34:36 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:06:21.538 05:34:36 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:06:21.538 05:34:36 setup.sh.acl -- setup/acl.sh@19 -- # [[ 1048576kB == *:*:*.* ]] 00:06:21.538 05:34:36 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:06:21.538 05:34:36 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:06:21.538 00:06:21.538 Type BDF Vendor Device NUMA Driver Device Block devices 00:06:21.538 05:34:36 setup.sh.acl -- setup/acl.sh@19 -- # [[ 2048kB == *:*:*.* ]] 00:06:21.538 05:34:36 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:06:21.538 05:34:36 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:06:21.538 05:34:36 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.0 == *:*:*.* ]] 00:06:21.538 05:34:36 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:06:21.538 05:34:36 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:06:21.538 05:34:36 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:06:21.538 
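The `read -r _ dev _ _ _ driver _` loop being traced here walks the `setup.sh status` table (`Type BDF Vendor Device NUMA Driver ...`), keeping only rows whose second field looks like a PCI BDF (`*:*:*.*`), whose driver is `nvme`, and which are not listed in `PCI_BLOCKED`. A hedged sketch of that filter; `collect_nvme_devs` and the sample table in the test are assumptions for illustration, not SPDK code:

```shell
#!/usr/bin/env bash
# Sketch of the device-collection loop traced above: parse a
# "Type BDF Vendor Device NUMA Driver ..." table from stdin, skip
# non-BDF rows (headers, hugepage lines), skip non-nvme drivers, and
# honor a PCI_BLOCKED list. Reconstruction for illustration only.
collect_nvme_devs() {
    local devs=() _ dev driver
    while read -r _ dev _ _ _ driver _; do
        [[ $dev == *:*:*.* ]] || continue              # not a PCI BDF row
        [[ $driver == nvme ]] || continue              # only NVMe-bound
        [[ ${PCI_BLOCKED:-} == *"$dev"* ]] && continue # blocked device
        devs+=("$dev")
    done
    printf '%s\n' "${devs[@]}"
}
```

The hugepage rows (`1048576kB`, `2048kB`) and the header line fail the `*:*:*.*` test, which is why the trace shows them hitting `continue` at `acl.sh@19` while `0000:5e:00.0` falls through to `devs+=("$dev")`.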
05:34:36 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.1 == *:*:*.* ]] 00:06:21.538 05:34:36 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:06:21.538 05:34:36 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:06:21.538 05:34:36 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:06:21.538 05:34:36 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.2 == *:*:*.* ]] 00:06:21.538 05:34:36 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:06:21.538 05:34:36 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:06:21.538 05:34:36 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:06:21.538 05:34:36 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.3 == *:*:*.* ]] 00:06:21.538 05:34:36 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:06:21.538 05:34:36 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:06:21.538 05:34:36 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:06:21.538 05:34:36 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.4 == *:*:*.* ]] 00:06:21.538 05:34:36 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:06:21.538 05:34:36 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:06:21.538 05:34:36 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:06:21.538 05:34:36 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.5 == *:*:*.* ]] 00:06:21.538 05:34:36 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:06:21.538 05:34:36 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:06:21.538 05:34:36 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:06:21.538 05:34:36 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.6 == *:*:*.* ]] 00:06:21.538 05:34:36 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:06:21.538 05:34:36 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:06:21.538 05:34:36 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:06:21.538 05:34:36 setup.sh.acl -- setup/acl.sh@19 -- 
# [[ 0000:00:04.7 == *:*:*.* ]] 00:06:21.538 05:34:36 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:06:21.538 05:34:36 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:06:21.538 05:34:36 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:06:21.538 05:34:36 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:5e:00.0 == *:*:*.* ]] 00:06:21.538 05:34:36 setup.sh.acl -- setup/acl.sh@20 -- # [[ nvme == nvme ]] 00:06:21.538 05:34:36 setup.sh.acl -- setup/acl.sh@21 -- # [[ '' == *\0\0\0\0\:\5\e\:\0\0\.\0* ]] 00:06:21.538 05:34:36 setup.sh.acl -- setup/acl.sh@22 -- # devs+=("$dev") 00:06:21.538 05:34:36 setup.sh.acl -- setup/acl.sh@22 -- # drivers["$dev"]=nvme 00:06:21.538 05:34:36 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:06:21.538 05:34:36 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.0 == *:*:*.* ]] 00:06:21.538 05:34:36 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:06:21.538 05:34:36 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:06:21.538 05:34:36 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:06:21.538 05:34:36 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.1 == *:*:*.* ]] 00:06:21.538 05:34:36 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:06:21.538 05:34:36 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:06:21.538 05:34:36 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:06:21.538 05:34:36 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.2 == *:*:*.* ]] 00:06:21.538 05:34:36 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:06:21.538 05:34:36 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:06:21.538 05:34:36 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:06:21.538 05:34:36 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.3 == *:*:*.* ]] 00:06:21.538 05:34:36 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:06:21.538 05:34:36 setup.sh.acl -- setup/acl.sh@20 -- # continue 
00:06:21.538 05:34:36 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:06:21.538 05:34:36 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.4 == *:*:*.* ]] 00:06:21.538 05:34:36 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:06:21.538 05:34:36 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:06:21.538 05:34:36 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:06:21.538 05:34:36 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.5 == *:*:*.* ]] 00:06:21.538 05:34:36 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:06:21.538 05:34:36 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:06:21.538 05:34:36 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:06:21.538 05:34:36 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.6 == *:*:*.* ]] 00:06:21.538 05:34:36 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:06:21.538 05:34:36 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:06:21.538 05:34:36 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:06:21.538 05:34:36 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.7 == *:*:*.* ]] 00:06:21.538 05:34:36 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:06:21.538 05:34:36 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:06:21.538 05:34:36 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:06:21.539 05:34:36 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:85:05.5 == *:*:*.* ]] 00:06:21.539 05:34:36 setup.sh.acl -- setup/acl.sh@20 -- # [[ vfio-pci == nvme ]] 00:06:21.539 05:34:36 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:06:21.539 05:34:36 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:06:21.539 05:34:36 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:d7:05.5 == *:*:*.* ]] 00:06:21.539 05:34:36 setup.sh.acl -- setup/acl.sh@20 -- # [[ vfio-pci == nvme ]] 00:06:21.539 05:34:36 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:06:21.539 05:34:36 setup.sh.acl -- 
setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:06:21.539 05:34:36 setup.sh.acl -- setup/acl.sh@24 -- # (( 1 > 0 )) 00:06:21.539 05:34:36 setup.sh.acl -- setup/acl.sh@54 -- # run_test denied denied 00:06:21.539 05:34:36 setup.sh.acl -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:21.539 05:34:36 setup.sh.acl -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:21.539 05:34:36 setup.sh.acl -- common/autotest_common.sh@10 -- # set +x 00:06:21.539 ************************************ 00:06:21.539 START TEST denied 00:06:21.539 ************************************ 00:06:21.539 05:34:36 setup.sh.acl.denied -- common/autotest_common.sh@1123 -- # denied 00:06:21.539 05:34:36 setup.sh.acl.denied -- setup/acl.sh@38 -- # PCI_BLOCKED=' 0000:5e:00.0' 00:06:21.539 05:34:36 setup.sh.acl.denied -- setup/acl.sh@38 -- # setup output config 00:06:21.539 05:34:36 setup.sh.acl.denied -- setup/acl.sh@39 -- # grep 'Skipping denied controller at 0000:5e:00.0' 00:06:21.539 05:34:36 setup.sh.acl.denied -- setup/common.sh@9 -- # [[ output == output ]] 00:06:21.539 05:34:36 setup.sh.acl.denied -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:06:25.799 0000:5e:00.0 (8086 0b60): Skipping denied controller at 0000:5e:00.0 00:06:25.799 05:34:40 setup.sh.acl.denied -- setup/acl.sh@40 -- # verify 0000:5e:00.0 00:06:25.799 05:34:40 setup.sh.acl.denied -- setup/acl.sh@28 -- # local dev driver 00:06:25.799 05:34:40 setup.sh.acl.denied -- setup/acl.sh@30 -- # for dev in "$@" 00:06:25.799 05:34:40 setup.sh.acl.denied -- setup/acl.sh@31 -- # [[ -e /sys/bus/pci/devices/0000:5e:00.0 ]] 00:06:25.799 05:34:40 setup.sh.acl.denied -- setup/acl.sh@32 -- # readlink -f /sys/bus/pci/devices/0000:5e:00.0/driver 00:06:25.799 05:34:40 setup.sh.acl.denied -- setup/acl.sh@32 -- # driver=/sys/bus/pci/drivers/nvme 00:06:25.799 05:34:40 setup.sh.acl.denied -- setup/acl.sh@33 -- # [[ nvme == \n\v\m\e ]] 00:06:25.799 05:34:40 setup.sh.acl.denied 
-- setup/acl.sh@41 -- # setup reset 00:06:25.799 05:34:40 setup.sh.acl.denied -- setup/common.sh@9 -- # [[ reset == output ]] 00:06:25.799 05:34:40 setup.sh.acl.denied -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:06:31.065 00:06:31.065 real 0m8.606s 00:06:31.065 user 0m2.826s 00:06:31.065 sys 0m5.085s 00:06:31.065 05:34:45 setup.sh.acl.denied -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:31.065 05:34:45 setup.sh.acl.denied -- common/autotest_common.sh@10 -- # set +x 00:06:31.065 ************************************ 00:06:31.065 END TEST denied 00:06:31.065 ************************************ 00:06:31.065 05:34:45 setup.sh.acl -- common/autotest_common.sh@1142 -- # return 0 00:06:31.065 05:34:45 setup.sh.acl -- setup/acl.sh@55 -- # run_test allowed allowed 00:06:31.065 05:34:45 setup.sh.acl -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:31.065 05:34:45 setup.sh.acl -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:31.065 05:34:45 setup.sh.acl -- common/autotest_common.sh@10 -- # set +x 00:06:31.065 ************************************ 00:06:31.065 START TEST allowed 00:06:31.065 ************************************ 00:06:31.065 05:34:45 setup.sh.acl.allowed -- common/autotest_common.sh@1123 -- # allowed 00:06:31.065 05:34:45 setup.sh.acl.allowed -- setup/acl.sh@45 -- # PCI_ALLOWED=0000:5e:00.0 00:06:31.065 05:34:45 setup.sh.acl.allowed -- setup/acl.sh@45 -- # setup output config 00:06:31.065 05:34:45 setup.sh.acl.allowed -- setup/common.sh@9 -- # [[ output == output ]] 00:06:31.065 05:34:45 setup.sh.acl.allowed -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:06:31.066 05:34:45 setup.sh.acl.allowed -- setup/acl.sh@46 -- # grep -E '0000:5e:00.0 .*: nvme -> .*' 00:06:36.330 0000:5e:00.0 (8086 0b60): nvme -> vfio-pci 00:06:36.589 05:34:51 setup.sh.acl.allowed -- setup/acl.sh@47 -- # verify 00:06:36.589 05:34:51 
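Both the `denied` and `allowed` tests above end in a verify step: resolve the driver a BDF is bound to through the `driver` symlink under `/sys/bus/pci/devices` and compare it against the expected one (`nvme` before the reset, `vfio-pci` after). A sketch of that resolution, with the sysfs root made a parameter so it can be exercised against a fake tree; `pci_driver` and the root parameter are assumptions for illustration, not SPDK's `verify()`:

```shell
#!/usr/bin/env bash
# Sketch of the driver-verification step traced above: the driver a PCI
# function is bound to is the basename of the devices/<bdf>/driver
# symlink. The sysfs root is parameterized for testing; real use would
# pass /sys/bus/pci/devices. Reconstruction, not SPDK's code.
pci_driver() {
    local bdf=$1 root=${2:-/sys/bus/pci/devices}
    [[ -e $root/$bdf/driver ]] || return 1   # device absent or unbound
    local link
    link=$(readlink -f "$root/$bdf/driver")
    echo "${link##*/}"                       # e.g. nvme or vfio-pci
}
```

This is the same `readlink -f /sys/bus/pci/devices/0000:5e:00.0/driver` call visible in the `denied` trace, which resolved to `/sys/bus/pci/drivers/nvme`.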
setup.sh.acl.allowed -- setup/acl.sh@28 -- # local dev driver 00:06:36.589 05:34:51 setup.sh.acl.allowed -- setup/acl.sh@48 -- # setup reset 00:06:36.589 05:34:51 setup.sh.acl.allowed -- setup/common.sh@9 -- # [[ reset == output ]] 00:06:36.589 05:34:51 setup.sh.acl.allowed -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:06:40.781 00:06:40.781 real 0m10.276s 00:06:40.781 user 0m2.818s 00:06:40.781 sys 0m5.278s 00:06:40.781 05:34:55 setup.sh.acl.allowed -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:40.781 05:34:55 setup.sh.acl.allowed -- common/autotest_common.sh@10 -- # set +x 00:06:40.781 ************************************ 00:06:40.781 END TEST allowed 00:06:40.781 ************************************ 00:06:40.781 05:34:55 setup.sh.acl -- common/autotest_common.sh@1142 -- # return 0 00:06:40.781 00:06:40.781 real 0m26.840s 00:06:40.781 user 0m8.481s 00:06:40.781 sys 0m15.775s 00:06:40.781 05:34:55 setup.sh.acl -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:40.781 05:34:55 setup.sh.acl -- common/autotest_common.sh@10 -- # set +x 00:06:40.781 ************************************ 00:06:40.781 END TEST acl 00:06:40.781 ************************************ 00:06:40.781 05:34:55 setup.sh -- common/autotest_common.sh@1142 -- # return 0 00:06:40.781 05:34:55 setup.sh -- setup/test-setup.sh@13 -- # run_test hugepages /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/hugepages.sh 00:06:40.781 05:34:55 setup.sh -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:40.781 05:34:55 setup.sh -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:40.781 05:34:55 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:06:40.781 ************************************ 00:06:40.781 START TEST hugepages 00:06:40.781 ************************************ 00:06:40.781 05:34:55 setup.sh.hugepages -- common/autotest_common.sh@1123 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/hugepages.sh 00:06:40.781 * Looking for test storage... 00:06:40.781 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup 00:06:40.781 05:34:55 setup.sh.hugepages -- setup/hugepages.sh@10 -- # nodes_sys=() 00:06:40.781 05:34:55 setup.sh.hugepages -- setup/hugepages.sh@10 -- # declare -a nodes_sys 00:06:40.781 05:34:55 setup.sh.hugepages -- setup/hugepages.sh@12 -- # declare -i default_hugepages=0 00:06:40.781 05:34:55 setup.sh.hugepages -- setup/hugepages.sh@13 -- # declare -i no_nodes=0 00:06:40.781 05:34:55 setup.sh.hugepages -- setup/hugepages.sh@14 -- # declare -i nr_hugepages=0 00:06:40.781 05:34:55 setup.sh.hugepages -- setup/hugepages.sh@16 -- # get_meminfo Hugepagesize 00:06:40.781 05:34:55 setup.sh.hugepages -- setup/common.sh@17 -- # local get=Hugepagesize 00:06:40.781 05:34:55 setup.sh.hugepages -- setup/common.sh@18 -- # local node= 00:06:40.781 05:34:55 setup.sh.hugepages -- setup/common.sh@19 -- # local var val 00:06:40.781 05:34:55 setup.sh.hugepages -- setup/common.sh@20 -- # local mem_f mem 00:06:40.781 05:34:55 setup.sh.hugepages -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:06:40.782 05:34:55 setup.sh.hugepages -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:06:40.782 05:34:55 setup.sh.hugepages -- setup/common.sh@25 -- # [[ -n '' ]] 00:06:40.782 05:34:55 setup.sh.hugepages -- setup/common.sh@28 -- # mapfile -t mem 00:06:40.782 05:34:55 setup.sh.hugepages -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:06:40.782 05:34:55 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:06:40.782 05:34:55 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:06:40.782 05:34:55 setup.sh.hugepages -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293484 kB' 'MemFree: 76932648 kB' 'MemAvailable: 80228532 kB' 'Buffers: 12176 kB' 'Cached: 9319008 kB' 'SwapCached: 0 kB' 'Active: 6382012 kB' 'Inactive: 3456348 
kB' 'Active(anon): 5988376 kB' 'Inactive(anon): 0 kB' 'Active(file): 393636 kB' 'Inactive(file): 3456348 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 510520 kB' 'Mapped: 172148 kB' 'Shmem: 5481200 kB' 'KReclaimable: 198640 kB' 'Slab: 514424 kB' 'SReclaimable: 198640 kB' 'SUnreclaim: 315784 kB' 'KernelStack: 16224 kB' 'PageTables: 8024 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 52438192 kB' 'Committed_AS: 7419396 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200888 kB' 'VmallocChunk: 0 kB' 'Percpu: 52480 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 2048' 'HugePages_Free: 2048' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 4194304 kB' 'DirectMap4k: 779684 kB' 'DirectMap2M: 10430464 kB' 'DirectMap1G: 90177536 kB' 00:06:40.782 05:34:55 setup.sh.hugepages -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:06:40.782 05:34:55 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:06:40.782 05:34:55 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:06:40.782 05:34:55 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:06:40.782 05:34:55 setup.sh.hugepages -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:06:40.782 05:34:55 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:06:40.782 05:34:55 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:06:40.782 05:34:55 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:06:40.782 05:34:55 setup.sh.hugepages -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:06:40.782 05:34:55 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:06:40.782 
05:34:55 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:06:40.782 05:34:55 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:06:40.782 05:34:55 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:06:40.782 05:34:55 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:06:40.782 05:34:55 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:06:40.782 05:34:55 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:06:40.782 05:34:55 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:06:40.782 05:34:55 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:06:40.782 05:34:55 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:06:40.782 05:34:55 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:06:40.782 05:34:55 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:06:40.782 05:34:55 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:06:40.782 05:34:55 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:06:40.782 05:34:55 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:06:40.782 05:34:55 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:06:40.782 05:34:55 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:06:40.782 05:34:55 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:06:40.782 05:34:55 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:06:40.782 05:34:55 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:06:40.782 05:34:55 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:06:40.782 05:34:55 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:06:40.782 05:34:55 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:06:40.782 05:34:55 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Active(anon) == 
\H\u\g\e\p\a\g\e\s\i\z\e ]] 00:06:40.782 05:34:55 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:06:40.782 05:34:55 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:06:40.782 05:34:55 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:06:40.782 05:34:55 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:06:40.782 05:34:55 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:06:40.782 05:34:55 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:06:40.782 05:34:55 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:06:40.782 05:34:55 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:06:40.782 05:34:55 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:06:40.782 05:34:55 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:06:40.782 05:34:55 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:06:40.782 05:34:55 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:06:40.782 05:34:55 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:06:40.782 05:34:55 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:06:40.782 05:34:55 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:06:40.782 05:34:55 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:06:40.782 05:34:55 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:06:40.782 05:34:55 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:06:40.782 05:34:55 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:06:40.782 05:34:55 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:06:40.782 05:34:55 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:06:40.782 05:34:55 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:06:40.782 05:34:55 setup.sh.hugepages -- 
setup/common.sh@31 -- # read -r var val _ 00:06:40.782 05:34:55 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:06:40.782 05:34:55 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:06:40.782 05:34:55 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:06:40.782 05:34:55 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:06:40.782 05:34:55 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:06:40.782 05:34:55 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:06:40.782 05:34:55 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:06:40.782 05:34:55 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:06:40.782 05:34:55 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:06:40.782 05:34:55 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:06:40.782 05:34:55 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:06:40.782 05:34:55 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:06:40.782 05:34:55 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:06:40.782 05:34:55 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:06:40.782 05:34:55 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:06:40.782 05:34:55 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:06:40.782 05:34:55 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:06:40.782 05:34:55 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:06:40.782 05:34:55 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:06:40.782 05:34:55 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:06:40.782 05:34:55 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:06:40.782 05:34:55 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:06:40.782 
05:34:55 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:06:40.782 05:34:55 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:06:40.782 05:34:55 setup.sh.hugepages -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:06:40.782 05:34:55 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:06:40.782 05:34:55 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:06:40.782 05:34:55 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:06:40.782 05:34:55 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:06:40.782 05:34:55 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:06:40.782 05:34:55 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:06:40.782 05:34:55 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:06:40.782 05:34:55 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:06:40.782 05:34:55 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:06:40.782 05:34:55 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:06:40.782 05:34:55 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:06:40.782 05:34:55 setup.sh.hugepages -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:06:40.783 05:34:55 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:06:40.783 05:34:55 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:06:40.783 05:34:55 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:06:40.783 05:34:55 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:06:40.783 05:34:55 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:06:40.783 05:34:55 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:06:40.783 05:34:55 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:06:40.783 05:34:55 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SReclaimable == 
\H\u\g\e\p\a\g\e\s\i\z\e ]] 00:06:40.783 05:34:55 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:06:40.783 05:34:55 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:06:40.783 05:34:55 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:06:40.783 05:34:55 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:06:40.783 05:34:55 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:06:40.783 05:34:55 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:06:40.783 05:34:55 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:06:40.783 05:34:55 setup.sh.hugepages -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:06:40.783 05:34:55 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:06:40.783 05:34:55 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:06:40.783 05:34:55 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:06:40.783 05:34:55 setup.sh.hugepages -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:06:40.783 05:34:55 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:06:40.783 05:34:55 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:06:40.783 05:34:55 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:06:40.783 05:34:55 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:06:40.783 05:34:55 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:06:40.783 05:34:55 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:06:40.783 05:34:55 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:06:40.783 05:34:55 setup.sh.hugepages -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:06:40.783 05:34:55 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:06:40.783 05:34:55 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:06:40.783 05:34:55 setup.sh.hugepages -- 
setup/common.sh@31 -- # read -r var val _ 00:06:40.783 05:34:55 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:06:40.783 05:34:55 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:06:40.783 05:34:55 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:06:40.783 05:34:55 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:06:40.783 05:34:55 setup.sh.hugepages -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:06:40.783 05:34:55 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:06:40.783 05:34:55 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:06:40.783 05:34:55 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:06:40.783 05:34:55 setup.sh.hugepages -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:06:40.783 05:34:55 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:06:40.783 05:34:55 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:06:40.783 05:34:55 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:06:40.783 05:34:55 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:06:40.783 05:34:55 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:06:40.783 05:34:55 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:06:40.783 05:34:55 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:06:40.783 05:34:55 setup.sh.hugepages -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:06:40.783 05:34:55 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:06:40.783 05:34:55 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:06:40.783 05:34:55 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:06:40.783 05:34:55 setup.sh.hugepages -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:06:40.783 05:34:55 setup.sh.hugepages -- setup/common.sh@32 -- # 
continue 00:06:40.783 05:34:55 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:06:40.783 05:34:55 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:06:40.783 05:34:55 setup.sh.hugepages -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:06:40.783 05:34:55 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:06:40.783 05:34:55 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:06:40.783 05:34:55 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:06:40.783 05:34:55 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:06:40.783 05:34:55 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:06:40.783 05:34:55 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:06:40.783 05:34:55 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:06:40.783 05:34:55 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:06:40.783 05:34:55 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:06:40.783 05:34:55 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:06:40.783 05:34:55 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:06:40.783 05:34:55 setup.sh.hugepages -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:06:40.783 05:34:55 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:06:40.783 05:34:55 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:06:40.783 05:34:55 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:06:40.783 05:34:55 setup.sh.hugepages -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:06:40.783 05:34:55 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:06:40.783 05:34:55 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:06:40.783 05:34:55 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:06:40.783 05:34:55 setup.sh.hugepages -- 
setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:06:40.783 05:34:55 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:06:40.783 05:34:55 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:06:40.783 05:34:55 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:06:40.783 05:34:55 setup.sh.hugepages -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:06:40.783 05:34:55 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:06:40.783 05:34:55 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:06:40.783 05:34:55 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:06:40.783 05:34:55 setup.sh.hugepages -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:06:40.783 05:34:55 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:06:40.783 05:34:55 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:06:40.783 05:34:55 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:06:40.783 05:34:55 setup.sh.hugepages -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:06:40.783 05:34:55 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:06:40.783 05:34:55 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:06:40.783 05:34:55 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:06:40.783 05:34:55 setup.sh.hugepages -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:06:40.783 05:34:55 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:06:40.783 05:34:55 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:06:40.783 05:34:55 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:06:40.783 05:34:55 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:06:40.783 05:34:55 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:06:40.783 05:34:55 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 
00:06:40.783 05:34:55 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:06:40.783 05:34:55 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:06:40.783 05:34:55 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:06:40.783 05:34:55 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:06:40.783 05:34:55 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:06:40.783 05:34:55 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:06:40.783 05:34:55 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:06:40.783 05:34:55 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:06:40.783 05:34:55 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:06:40.783 05:34:55 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:06:40.783 05:34:55 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:06:40.783 05:34:55 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:06:40.783 05:34:55 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:06:40.783 05:34:55 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:06:40.783 05:34:55 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:06:40.783 05:34:55 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:06:40.783 05:34:55 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:06:40.783 05:34:55 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Hugepagesize == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:06:40.783 05:34:55 setup.sh.hugepages -- setup/common.sh@33 -- # echo 2048 00:06:40.783 05:34:55 setup.sh.hugepages -- setup/common.sh@33 -- # return 0 00:06:40.783 05:34:55 setup.sh.hugepages -- setup/hugepages.sh@16 -- # default_hugepages=2048 00:06:40.783 05:34:55 setup.sh.hugepages -- setup/hugepages.sh@17 -- # 
default_huge_nr=/sys/kernel/mm/hugepages/hugepages-2048kB/nr_hugepages 00:06:40.783 05:34:55 setup.sh.hugepages -- setup/hugepages.sh@18 -- # global_huge_nr=/proc/sys/vm/nr_hugepages 00:06:40.783 05:34:55 setup.sh.hugepages -- setup/hugepages.sh@21 -- # unset -v HUGE_EVEN_ALLOC 00:06:40.783 05:34:55 setup.sh.hugepages -- setup/hugepages.sh@22 -- # unset -v HUGEMEM 00:06:40.784 05:34:55 setup.sh.hugepages -- setup/hugepages.sh@23 -- # unset -v HUGENODE 00:06:40.784 05:34:55 setup.sh.hugepages -- setup/hugepages.sh@24 -- # unset -v NRHUGE 00:06:40.784 05:34:55 setup.sh.hugepages -- setup/hugepages.sh@207 -- # get_nodes 00:06:40.784 05:34:55 setup.sh.hugepages -- setup/hugepages.sh@27 -- # local node 00:06:40.784 05:34:55 setup.sh.hugepages -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:06:40.784 05:34:55 setup.sh.hugepages -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:06:40.784 05:34:55 setup.sh.hugepages -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:06:40.784 05:34:55 setup.sh.hugepages -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:06:40.784 05:34:55 setup.sh.hugepages -- setup/hugepages.sh@32 -- # no_nodes=2 00:06:40.784 05:34:55 setup.sh.hugepages -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:06:40.784 05:34:55 setup.sh.hugepages -- setup/hugepages.sh@208 -- # clear_hp 00:06:40.784 05:34:55 setup.sh.hugepages -- setup/hugepages.sh@37 -- # local node hp 00:06:40.784 05:34:55 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:06:40.784 05:34:55 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:06:40.784 05:34:55 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:06:40.784 05:34:55 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:06:40.784 05:34:55 setup.sh.hugepages -- 
setup/hugepages.sh@41 -- # echo 0 00:06:40.784 05:34:55 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:06:40.784 05:34:55 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:06:40.784 05:34:55 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:06:40.784 05:34:55 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:06:40.784 05:34:55 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:06:40.784 05:34:55 setup.sh.hugepages -- setup/hugepages.sh@45 -- # export CLEAR_HUGE=yes 00:06:40.784 05:34:55 setup.sh.hugepages -- setup/hugepages.sh@45 -- # CLEAR_HUGE=yes 00:06:40.784 05:34:55 setup.sh.hugepages -- setup/hugepages.sh@210 -- # run_test default_setup default_setup 00:06:40.784 05:34:55 setup.sh.hugepages -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:40.784 05:34:55 setup.sh.hugepages -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:40.784 05:34:55 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:06:41.043 ************************************ 00:06:41.043 START TEST default_setup 00:06:41.043 ************************************ 00:06:41.043 05:34:55 setup.sh.hugepages.default_setup -- common/autotest_common.sh@1123 -- # default_setup 00:06:41.043 05:34:55 setup.sh.hugepages.default_setup -- setup/hugepages.sh@136 -- # get_test_nr_hugepages 2097152 0 00:06:41.043 05:34:55 setup.sh.hugepages.default_setup -- setup/hugepages.sh@49 -- # local size=2097152 00:06:41.043 05:34:55 setup.sh.hugepages.default_setup -- setup/hugepages.sh@50 -- # (( 2 > 1 )) 00:06:41.043 05:34:55 setup.sh.hugepages.default_setup -- setup/hugepages.sh@51 -- # shift 00:06:41.043 05:34:55 setup.sh.hugepages.default_setup -- setup/hugepages.sh@52 -- # node_ids=('0') 00:06:41.043 05:34:55 setup.sh.hugepages.default_setup -- setup/hugepages.sh@52 -- # local node_ids 
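
The xtrace above shows `setup/common.sh` walking `/proc/meminfo` one key at a time with `IFS=': ' read -r var val _`, hitting `continue` on every non-matching key until `Hugepagesize` matches and the value (`2048`) is echoed back. The same pattern can be sketched as a standalone helper; the function name and the optional file argument are illustrative and not part of the SPDK scripts:

```shell
#!/usr/bin/env bash
# Sketch of the key scan seen in the log: split each meminfo line on
# ': ' and spaces, compare the key, and print the value on a match.
get_meminfo_value() {
    local get=$1 var val _
    # Default to /proc/meminfo; a file argument makes the sketch testable.
    while IFS=': ' read -r var val _; do
        if [[ $var == "$get" ]]; then
            echo "$val"
            return 0
        fi
    done < "${2:-/proc/meminfo}"
    return 1
}
```

Under this reading, `default_hugepages=2048` in the log is just the result of `get_meminfo_value Hugepagesize` (in kB), which the script then uses to pick `/sys/kernel/mm/hugepages/hugepages-2048kB/nr_hugepages` before `clear_hp` echoes `0` into each node's per-size `nr_hugepages` file.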
00:06:41.043 05:34:55 setup.sh.hugepages.default_setup -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:06:41.043 05:34:55 setup.sh.hugepages.default_setup -- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:06:41.043 05:34:55 setup.sh.hugepages.default_setup -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0 00:06:41.043 05:34:55 setup.sh.hugepages.default_setup -- setup/hugepages.sh@62 -- # user_nodes=('0') 00:06:41.043 05:34:55 setup.sh.hugepages.default_setup -- setup/hugepages.sh@62 -- # local user_nodes 00:06:41.043 05:34:55 setup.sh.hugepages.default_setup -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:06:41.043 05:34:55 setup.sh.hugepages.default_setup -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:06:41.043 05:34:55 setup.sh.hugepages.default_setup -- setup/hugepages.sh@67 -- # nodes_test=() 00:06:41.043 05:34:55 setup.sh.hugepages.default_setup -- setup/hugepages.sh@67 -- # local -g nodes_test 00:06:41.043 05:34:55 setup.sh.hugepages.default_setup -- setup/hugepages.sh@69 -- # (( 1 > 0 )) 00:06:41.043 05:34:55 setup.sh.hugepages.default_setup -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}" 00:06:41.043 05:34:55 setup.sh.hugepages.default_setup -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=1024 00:06:41.043 05:34:55 setup.sh.hugepages.default_setup -- setup/hugepages.sh@73 -- # return 0 00:06:41.043 05:34:55 setup.sh.hugepages.default_setup -- setup/hugepages.sh@137 -- # setup output 00:06:41.043 05:34:55 setup.sh.hugepages.default_setup -- setup/common.sh@9 -- # [[ output == output ]] 00:06:41.043 05:34:55 setup.sh.hugepages.default_setup -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:06:45.236 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:06:45.236 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:06:45.236 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:06:45.236 0000:00:04.6 
(8086 2021): ioatdma -> vfio-pci 00:06:45.236 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:06:45.236 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:06:45.236 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:06:45.236 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:06:45.236 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:06:45.236 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:06:45.236 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:06:45.236 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:06:45.236 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:06:45.236 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:06:45.236 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:06:45.236 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:06:45.236 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:06:45.236 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:06:47.143 0000:5e:00.0 (8086 0b60): nvme -> vfio-pci 00:06:47.143 05:35:01 setup.sh.hugepages.default_setup -- setup/hugepages.sh@138 -- # verify_nr_hugepages 00:06:47.143 05:35:01 setup.sh.hugepages.default_setup -- setup/hugepages.sh@89 -- # local node 00:06:47.143 05:35:01 setup.sh.hugepages.default_setup -- setup/hugepages.sh@90 -- # local sorted_t 00:06:47.143 05:35:01 setup.sh.hugepages.default_setup -- setup/hugepages.sh@91 -- # local sorted_s 00:06:47.143 05:35:01 setup.sh.hugepages.default_setup -- setup/hugepages.sh@92 -- # local surp 00:06:47.143 05:35:01 setup.sh.hugepages.default_setup -- setup/hugepages.sh@93 -- # local resv 00:06:47.143 05:35:01 setup.sh.hugepages.default_setup -- setup/hugepages.sh@94 -- # local anon 00:06:47.143 05:35:01 setup.sh.hugepages.default_setup -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:06:47.143 05:35:01 setup.sh.hugepages.default_setup -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:06:47.143 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@17 -- # local get=AnonHugePages 00:06:47.143 05:35:01 
setup.sh.hugepages.default_setup -- setup/common.sh@18 -- # local node= 00:06:47.143 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@19 -- # local var val 00:06:47.143 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@20 -- # local mem_f mem 00:06:47.143 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:06:47.143 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:06:47.143 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@25 -- # [[ -n '' ]] 00:06:47.143 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@28 -- # mapfile -t mem 00:06:47.143 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:06:47.143 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:06:47.143 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:06:47.143 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293484 kB' 'MemFree: 79061144 kB' 'MemAvailable: 82357012 kB' 'Buffers: 12176 kB' 'Cached: 9319128 kB' 'SwapCached: 0 kB' 'Active: 6401696 kB' 'Inactive: 3456348 kB' 'Active(anon): 6008060 kB' 'Inactive(anon): 0 kB' 'Active(file): 393636 kB' 'Inactive(file): 3456348 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 529672 kB' 'Mapped: 172256 kB' 'Shmem: 5481320 kB' 'KReclaimable: 198608 kB' 'Slab: 513404 kB' 'SReclaimable: 198608 kB' 'SUnreclaim: 314796 kB' 'KernelStack: 16384 kB' 'PageTables: 8508 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486768 kB' 'Committed_AS: 7443284 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201000 kB' 'VmallocChunk: 0 kB' 'Percpu: 52480 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 
kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 779684 kB' 'DirectMap2M: 10430464 kB' 'DirectMap1G: 90177536 kB' 00:06:47.143 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:47.143 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:06:47.143 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:06:47.143 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:06:47.143 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:47.143 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:06:47.143 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:06:47.143 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:06:47.143 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:47.143 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:06:47.143 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:06:47.143 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:06:47.143 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:47.143 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:06:47.143 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:06:47.143 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 
00:06:47.143 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:47.143 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:06:47.143 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:06:47.143 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:06:47.143 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:47.143 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:06:47.143 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:06:47.143 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:06:47.143 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:47.143 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:06:47.143 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:06:47.143 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:06:47.143 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:47.143 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:06:47.143 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:06:47.143 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:06:47.143 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:47.143 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:06:47.143 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:06:47.143 05:35:01 
setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:06:47.143 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:47.143 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:06:47.143 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:06:47.143 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:06:47.143 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:47.143 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:06:47.143 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:06:47.143 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:06:47.143 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:47.143 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:06:47.143 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:06:47.143 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:06:47.143 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:47.143 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:06:47.143 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:06:47.143 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:06:47.143 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:47.143 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:06:47.143 05:35:01 
setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:06:47.143 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:06:47.143 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:47.143 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:06:47.143 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:06:47.143 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:06:47.143 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:47.143 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:06:47.143 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:06:47.143 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:06:47.143 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:47.143 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:06:47.143 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:06:47.143 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:06:47.143 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:47.143 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:06:47.143 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:06:47.143 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:06:47.143 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:47.144 05:35:01 setup.sh.hugepages.default_setup -- 
setup/common.sh@32 -- # continue 00:06:47.144 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:06:47.144 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:06:47.144 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:47.144 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:06:47.144 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:06:47.144 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:06:47.144 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:47.144 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:06:47.144 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:06:47.144 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:06:47.144 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:47.144 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:06:47.144 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:06:47.144 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:06:47.144 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:47.144 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:06:47.144 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:06:47.144 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:06:47.144 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 
00:06:47.144 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:06:47.144 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:06:47.144 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:06:47.144 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:47.144 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:06:47.144 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:06:47.144 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:06:47.144 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:47.144 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:06:47.144 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:06:47.144 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:06:47.144 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:47.144 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:06:47.144 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:06:47.144 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:06:47.144 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:47.144 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:06:47.144 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:06:47.144 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:06:47.144 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@32 
-- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:47.144 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:06:47.144 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:06:47.144 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:06:47.144 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:47.144 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:06:47.144 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:06:47.144 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:06:47.144 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:47.144 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:06:47.144 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:06:47.144 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:06:47.144 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:47.144 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:06:47.144 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:06:47.144 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:06:47.144 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:47.144 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:06:47.144 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:06:47.144 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:06:47.144 
05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:06:47.144 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue
00:06:47.144 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': '
00:06:47.144 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _
00:06:47.144 [... identical compare/continue/IFS/read trace repeated for Committed_AS, VmallocTotal, VmallocUsed, VmallocChunk, Percpu and HardwareCorrupted ...]
00:06:47.144 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:06:47.144 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 0
00:06:47.144 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # return 0
00:06:47.144 05:35:01 setup.sh.hugepages.default_setup -- setup/hugepages.sh@97 -- # anon=0
00:06:47.144 05:35:01 setup.sh.hugepages.default_setup -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp
00:06:47.144 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@17 -- # local get=HugePages_Surp
00:06:47.144 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@18 -- # local node=
00:06:47.144 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@19 -- # local var val
00:06:47.144 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@20 -- # local mem_f mem
00:06:47.144 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:06:47.144 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:06:47.144 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@25 -- # [[ -n '' ]]
00:06:47.144 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@28 -- # mapfile -t mem
00:06:47.144 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:06:47.144 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': '
00:06:47.144 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _
00:06:47.145 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293484 kB' 'MemFree: 79062380 kB' 'MemAvailable: 82358248 kB' 'Buffers: 12176 kB' 'Cached: 9319128 kB' 'SwapCached: 0 kB' 'Active: 6402116 kB' 'Inactive: 3456348 kB' 'Active(anon): 6008480 kB' 'Inactive(anon): 0 kB' 'Active(file): 393636 kB' 'Inactive(file): 3456348 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 530036 kB' 'Mapped: 172256 kB' 'Shmem: 5481320 kB' 'KReclaimable: 198608 kB' 'Slab: 513404 kB' 'SReclaimable: 198608 kB' 'SUnreclaim: 314796 kB' 'KernelStack: 16384 kB' 'PageTables: 8748 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486768 kB' 'Committed_AS: 7443304 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201016 kB' 'VmallocChunk: 0 kB' 'Percpu: 52480 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 779684 kB' 'DirectMap2M: 10430464 kB' 'DirectMap1G: 90177536 kB'
00:06:47.145 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:06:47.145 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue
00:06:47.145 [... identical compare/continue/IFS/read trace repeated for every /proc/meminfo key from MemFree through HugePages_Rsvd ...]
00:06:47.146 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:06:47.146 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 0
00:06:47.146 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # return 0
00:06:47.146 05:35:01 setup.sh.hugepages.default_setup -- setup/hugepages.sh@99 -- # surp=0
00:06:47.146 05:35:01 setup.sh.hugepages.default_setup -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd
00:06:47.146 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:06:47.146 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@18 -- # local node=
00:06:47.146 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@19 -- # local var val
00:06:47.146 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@20 -- # local mem_f mem
00:06:47.146 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:06:47.146 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:06:47.146 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@25 -- # [[ -n '' ]]
00:06:47.146 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@28 -- # mapfile -t mem
00:06:47.146 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:06:47.146 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': '
00:06:47.146 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _
00:06:47.147 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293484 kB' 'MemFree: 79060516 kB' 'MemAvailable: 82356384 kB' 'Buffers: 12176 kB' 'Cached: 9319148 kB' 'SwapCached: 0 kB' 'Active: 6401724 kB' 'Inactive: 3456348 kB' 'Active(anon): 6008088 kB' 'Inactive(anon): 0 kB' 'Active(file): 393636 kB' 'Inactive(file): 3456348 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 530056 kB' 'Mapped: 172180 kB' 'Shmem: 5481340 kB' 'KReclaimable: 198608 kB' 'Slab: 513360 kB' 'SReclaimable: 198608 kB' 'SUnreclaim: 314752 kB' 'KernelStack: 16560 kB' 'PageTables: 8744 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486768 kB' 'Committed_AS: 7443324 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201064 kB' 'VmallocChunk: 0 kB' 'Percpu: 52480 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 779684 kB' 'DirectMap2M: 10430464 kB' 'DirectMap1G: 90177536 kB'
00:06:47.147 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:06:47.147 05:35:01 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue
00:06:47.147 [... identical compare/continue/IFS/read trace repeated for each subsequent key; the trace continues past PageTables at this chunk boundary ...]
IFS=': ' 00:06:47.148 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:06:47.148 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:47.148 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:06:47.149 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:06:47.149 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:06:47.149 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:47.149 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:06:47.149 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:06:47.149 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:06:47.149 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:47.149 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:06:47.149 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:06:47.149 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:06:47.149 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:47.149 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:06:47.149 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:06:47.149 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:06:47.149 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:47.149 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:06:47.149 
05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:06:47.149 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:06:47.149 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:47.149 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:06:47.149 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:06:47.149 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:06:47.149 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:47.149 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:06:47.149 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:06:47.149 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:06:47.149 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:47.149 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:06:47.149 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:06:47.149 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:06:47.149 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:47.149 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:06:47.149 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:06:47.149 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:06:47.149 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:47.149 05:35:02 
setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:06:47.149 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:06:47.149 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:06:47.149 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:47.149 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:06:47.149 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:06:47.149 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:06:47.149 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:47.149 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:06:47.149 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:06:47.149 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:06:47.149 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:47.149 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:06:47.149 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:06:47.149 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:06:47.149 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:47.149 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:06:47.149 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:06:47.149 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:06:47.149 05:35:02 setup.sh.hugepages.default_setup -- 
setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:47.149 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:06:47.149 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:06:47.149 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:06:47.149 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:47.149 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:06:47.149 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:06:47.149 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:06:47.149 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:47.149 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:06:47.149 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:06:47.149 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:06:47.149 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:47.149 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:06:47.149 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:06:47.149 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:06:47.149 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:47.149 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:06:47.149 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:06:47.149 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r 
var val _ 00:06:47.149 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:47.149 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:06:47.149 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:06:47.149 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:06:47.149 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:47.149 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:06:47.149 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:06:47.149 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:06:47.149 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:47.149 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 0 00:06:47.149 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # return 0 00:06:47.149 05:35:02 setup.sh.hugepages.default_setup -- setup/hugepages.sh@100 -- # resv=0 00:06:47.149 05:35:02 setup.sh.hugepages.default_setup -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:06:47.149 nr_hugepages=1024 00:06:47.149 05:35:02 setup.sh.hugepages.default_setup -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:06:47.149 resv_hugepages=0 00:06:47.149 05:35:02 setup.sh.hugepages.default_setup -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:06:47.149 surplus_hugepages=0 00:06:47.149 05:35:02 setup.sh.hugepages.default_setup -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:06:47.149 anon_hugepages=0 00:06:47.149 05:35:02 setup.sh.hugepages.default_setup -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:06:47.149 05:35:02 setup.sh.hugepages.default_setup -- 
setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:06:47.149 05:35:02 setup.sh.hugepages.default_setup -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:06:47.149 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@17 -- # local get=HugePages_Total 00:06:47.149 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@18 -- # local node= 00:06:47.149 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@19 -- # local var val 00:06:47.149 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@20 -- # local mem_f mem 00:06:47.149 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:06:47.149 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:06:47.149 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@25 -- # [[ -n '' ]] 00:06:47.149 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@28 -- # mapfile -t mem 00:06:47.149 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:06:47.149 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:06:47.149 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:06:47.150 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293484 kB' 'MemFree: 79059092 kB' 'MemAvailable: 82354960 kB' 'Buffers: 12176 kB' 'Cached: 9319172 kB' 'SwapCached: 0 kB' 'Active: 6401404 kB' 'Inactive: 3456348 kB' 'Active(anon): 6007768 kB' 'Inactive(anon): 0 kB' 'Active(file): 393636 kB' 'Inactive(file): 3456348 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 529668 kB' 'Mapped: 172188 kB' 'Shmem: 5481364 kB' 'KReclaimable: 198608 kB' 'Slab: 513360 kB' 'SReclaimable: 198608 kB' 'SUnreclaim: 314752 kB' 'KernelStack: 16400 
kB' 'PageTables: 8876 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486768 kB' 'Committed_AS: 7443348 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201080 kB' 'VmallocChunk: 0 kB' 'Percpu: 52480 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 779684 kB' 'DirectMap2M: 10430464 kB' 'DirectMap1G: 90177536 kB' 00:06:47.150 [log condensed: setup/common.sh@31-32 loop iterates over each /proc/meminfo key (MemTotal … Unaccepted), matching none against HugePages_Total and issuing `continue` for each] 00:06:47.412 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:47.412 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 1024 00:06:47.412 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # return 0 00:06:47.412 05:35:02 setup.sh.hugepages.default_setup -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:06:47.412 05:35:02 setup.sh.hugepages.default_setup -- setup/hugepages.sh@112 -- # get_nodes 00:06:47.412 05:35:02 setup.sh.hugepages.default_setup -- setup/hugepages.sh@27 -- # local node 00:06:47.412 05:35:02 setup.sh.hugepages.default_setup -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:06:47.412 05:35:02 setup.sh.hugepages.default_setup -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:06:47.412 05:35:02 setup.sh.hugepages.default_setup -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:06:47.412 05:35:02 setup.sh.hugepages.default_setup -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0 00:06:47.412 05:35:02 setup.sh.hugepages.default_setup -- setup/hugepages.sh@32 -- # no_nodes=2 00:06:47.412 05:35:02 setup.sh.hugepages.default_setup -- setup/hugepages.sh@33 -- # (( 
no_nodes > 0 )) 00:06:47.412 05:35:02 setup.sh.hugepages.default_setup -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:06:47.412 05:35:02 setup.sh.hugepages.default_setup -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:06:47.412 05:35:02 setup.sh.hugepages.default_setup -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:06:47.412 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@17 -- # local get=HugePages_Surp 00:06:47.412 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@18 -- # local node=0 00:06:47.412 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@19 -- # local var val 00:06:47.412 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@20 -- # local mem_f mem 00:06:47.412 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:06:47.412 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:06:47.412 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:06:47.412 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@28 -- # mapfile -t mem 00:06:47.412 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:06:47.412 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:06:47.412 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:06:47.412 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48069884 kB' 'MemFree: 36435068 kB' 'MemUsed: 11634816 kB' 'SwapCached: 0 kB' 'Active: 5468768 kB' 'Inactive: 3267584 kB' 'Active(anon): 5252768 kB' 'Inactive(anon): 0 kB' 'Active(file): 216000 kB' 'Inactive(file): 3267584 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 8373420 kB' 'Mapped: 145100 kB' 'AnonPages: 366120 kB' 
'Shmem: 4889836 kB' 'KernelStack: 10440 kB' 'PageTables: 5964 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 142964 kB' 'Slab: 341480 kB' 'SReclaimable: 142964 kB' 'SUnreclaim: 198516 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:06:47.412 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:47.412 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:06:47.412 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:06:47.412 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:06:47.412 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:47.412 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:06:47.412 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:06:47.412 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:06:47.412 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:47.412 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:06:47.412 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:06:47.412 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:06:47.412 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:47.412 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:06:47.412 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:06:47.412 
05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:06:47.412 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:47.412 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:06:47.412 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:06:47.412 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:06:47.412 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:47.413 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:06:47.413 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:06:47.413 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:06:47.413 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:47.413 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:06:47.413 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:06:47.413 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:06:47.413 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:47.413 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:06:47.413 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:06:47.413 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:06:47.413 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:47.413 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:06:47.413 05:35:02 
setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:06:47.413 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:06:47.413 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:47.413 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:06:47.413 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:06:47.413 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:06:47.413 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:47.413 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:06:47.413 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:06:47.413 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:06:47.413 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:47.413 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:06:47.413 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:06:47.413 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:06:47.413 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:47.413 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:06:47.413 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:06:47.413 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:06:47.413 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:47.413 05:35:02 
setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:06:47.413 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:06:47.413 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:06:47.413 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:47.413 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:06:47.413 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:06:47.413 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:06:47.413 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:47.413 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:06:47.413 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:06:47.413 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:06:47.413 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:47.413 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:06:47.413 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:06:47.413 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:06:47.413 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:47.413 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:06:47.413 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:06:47.413 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:06:47.413 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KernelStack 
== \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:47.413 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:06:47.413 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:06:47.413 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:06:47.413 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:47.413 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:06:47.413 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:06:47.413 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:06:47.413 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:47.413 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:06:47.413 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:06:47.413 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:06:47.413 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:47.413 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:06:47.413 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:06:47.413 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:06:47.413 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:47.413 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:06:47.413 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:06:47.413 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:06:47.413 05:35:02 
setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:47.413 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:06:47.413 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:06:47.413 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:06:47.413 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:47.413 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:06:47.413 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:06:47.413 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:06:47.413 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:47.413 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:06:47.413 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:06:47.413 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:06:47.413 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:47.413 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:06:47.413 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:06:47.413 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:06:47.413 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:47.413 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:06:47.413 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:06:47.413 05:35:02 setup.sh.hugepages.default_setup 
-- setup/common.sh@31 -- # read -r var val _ 00:06:47.413 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:47.413 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:06:47.413 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:06:47.413 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:06:47.413 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:47.413 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:06:47.413 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:06:47.413 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:06:47.413 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:47.413 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:06:47.413 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:06:47.413 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:06:47.413 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:47.413 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:06:47.414 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:06:47.414 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:06:47.414 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:47.414 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:06:47.414 05:35:02 setup.sh.hugepages.default_setup -- 
setup/common.sh@31 -- # IFS=': ' 00:06:47.414 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:06:47.414 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:47.414 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:06:47.414 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:06:47.414 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:06:47.414 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:47.414 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:06:47.414 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:06:47.414 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:06:47.414 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:47.414 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:06:47.414 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:06:47.414 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:06:47.414 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:47.414 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 0 00:06:47.414 05:35:02 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # return 0 00:06:47.414 05:35:02 setup.sh.hugepages.default_setup -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:06:47.414 05:35:02 setup.sh.hugepages.default_setup -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:06:47.414 05:35:02 setup.sh.hugepages.default_setup -- 
setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:06:47.414 05:35:02 setup.sh.hugepages.default_setup -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:06:47.414 05:35:02 setup.sh.hugepages.default_setup -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024' 00:06:47.414 node0=1024 expecting 1024 00:06:47.414 05:35:02 setup.sh.hugepages.default_setup -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]] 00:06:47.414 00:06:47.414 real 0m6.379s 00:06:47.414 user 0m1.557s 00:06:47.414 sys 0m2.640s 00:06:47.414 05:35:02 setup.sh.hugepages.default_setup -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:47.414 05:35:02 setup.sh.hugepages.default_setup -- common/autotest_common.sh@10 -- # set +x 00:06:47.414 ************************************ 00:06:47.414 END TEST default_setup 00:06:47.414 ************************************ 00:06:47.414 05:35:02 setup.sh.hugepages -- common/autotest_common.sh@1142 -- # return 0 00:06:47.414 05:35:02 setup.sh.hugepages -- setup/hugepages.sh@211 -- # run_test per_node_1G_alloc per_node_1G_alloc 00:06:47.414 05:35:02 setup.sh.hugepages -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:47.414 05:35:02 setup.sh.hugepages -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:47.414 05:35:02 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:06:47.414 ************************************ 00:06:47.414 START TEST per_node_1G_alloc 00:06:47.414 ************************************ 00:06:47.414 05:35:02 setup.sh.hugepages.per_node_1G_alloc -- common/autotest_common.sh@1123 -- # per_node_1G_alloc 00:06:47.414 05:35:02 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@143 -- # local IFS=, 00:06:47.414 05:35:02 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@145 -- # get_test_nr_hugepages 1048576 0 1 00:06:47.414 05:35:02 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@49 -- # local size=1048576 00:06:47.414 05:35:02 
setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@50 -- # (( 3 > 1 )) 00:06:47.414 05:35:02 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@51 -- # shift 00:06:47.414 05:35:02 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@52 -- # node_ids=('0' '1') 00:06:47.414 05:35:02 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@52 -- # local node_ids 00:06:47.414 05:35:02 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:06:47.414 05:35:02 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=512 00:06:47.414 05:35:02 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0 1 00:06:47.414 05:35:02 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@62 -- # user_nodes=('0' '1') 00:06:47.414 05:35:02 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:06:47.414 05:35:02 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=512 00:06:47.414 05:35:02 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:06:47.414 05:35:02 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:06:47.414 05:35:02 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:06:47.414 05:35:02 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@69 -- # (( 2 > 0 )) 00:06:47.414 05:35:02 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}" 00:06:47.414 05:35:02 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=512 00:06:47.414 05:35:02 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}" 00:06:47.414 05:35:02 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=512 00:06:47.414 05:35:02 
setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@73 -- # return 0 00:06:47.414 05:35:02 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@146 -- # NRHUGE=512 00:06:47.414 05:35:02 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@146 -- # HUGENODE=0,1 00:06:47.414 05:35:02 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@146 -- # setup output 00:06:47.414 05:35:02 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:06:47.414 05:35:02 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:06:50.764 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:06:50.764 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:06:50.764 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:06:50.764 0000:5e:00.0 (8086 0b60): Already using the vfio-pci driver 00:06:50.764 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:06:50.764 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:06:50.764 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:06:50.764 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:06:50.764 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:06:50.764 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:06:50.764 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:06:50.764 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:06:50.764 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:06:50.764 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:06:50.764 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:06:50.764 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:06:50.764 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:06:50.764 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 
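The `NRHUGE=512 HUGENODE=0,1` invocation of `scripts/setup.sh` above asks for 512 hugepages on each of NUMA nodes 0 and 1. A minimal sketch of that per-node allocation, assuming the stock kernel sysfs layout (the real `setup.sh` additionally handles hugetlbfs mounts and driver binding); the `base` argument is an illustrative parameter, not part of the script:

```shell
# Sketch: write the requested page count into each node's 2 MiB hugepage
# counter. With the real sysfs, call it as:
#   allocate_hugepages /sys/devices/system/node 512 0 1   (as root)
allocate_hugepages() {
    local base=$1 nrhuge=$2 node
    shift 2
    for node in "$@"; do
        # each NUMA node exposes per-size hugepage knobs under sysfs
        echo "$nrhuge" > "$base/node$node/hugepages/hugepages-2048kB/nr_hugepages"
    done
}
```

The kernel may allocate fewer pages than requested if memory is fragmented, which is why the trace afterwards re-reads the counters to verify `node0=1024 expecting 1024`.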
00:06:50.764 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:06:50.764 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@147 -- # nr_hugepages=1024 00:06:50.764 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@147 -- # verify_nr_hugepages 00:06:50.764 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@89 -- # local node 00:06:50.764 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@90 -- # local sorted_t 00:06:50.764 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@91 -- # local sorted_s 00:06:50.764 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@92 -- # local surp 00:06:50.764 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@93 -- # local resv 00:06:50.764 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@94 -- # local anon 00:06:50.764 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:06:50.764 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:06:50.764 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:06:50.764 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node= 00:06:50.764 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val 00:06:50.764 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:06:50.764 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:06:50.764 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:06:50.764 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:06:50.764 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem 
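The `get_meminfo` trace above follows a fixed pattern: pick `/proc/meminfo` or, when a node is given, `/sys/devices/system/node/node<N>/meminfo`; strip the `Node <id>` prefix those per-node files carry; then `IFS=': ' read -r var val _` through the lines until the requested key matches. A condensed sketch of that logic, with a `file` argument added here purely for testability (the real `setup/common.sh` helper takes a key and an optional node number):

```shell
# Sketch of the get_meminfo loop seen in the xtrace: split each meminfo
# line on ': ' and print the value for the requested field.
get_meminfo() {
    local get=$1 file=${2:-/proc/meminfo} var val _
    # Per-node meminfo files prefix every line with "Node <id> "; drop that,
    # then split exactly as the trace's `read -r var val _` does.
    while IFS=': ' read -r var val _; do
        if [ "$var" = "$get" ]; then
            echo "$val"
            break
        fi
    done < <(sed 's/^Node [0-9]* //' "$file")
}
```

With this shape, `get_meminfo HugePages_Total` yields the bare count (units like `kB` land in the discarded `_` field), which is what lets the script compare it directly against `nr_hugepages + surp + resv`.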
00:06:50.764 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:06:50.764 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.764 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.764 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293484 kB' 'MemFree: 79059528 kB' 'MemAvailable: 82355396 kB' 'Buffers: 12176 kB' 'Cached: 9319264 kB' 'SwapCached: 0 kB' 'Active: 6398888 kB' 'Inactive: 3456348 kB' 'Active(anon): 6005252 kB' 'Inactive(anon): 0 kB' 'Active(file): 393636 kB' 'Inactive(file): 3456348 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 526932 kB' 'Mapped: 171136 kB' 'Shmem: 5481456 kB' 'KReclaimable: 198608 kB' 'Slab: 512828 kB' 'SReclaimable: 198608 kB' 'SUnreclaim: 314220 kB' 'KernelStack: 16176 kB' 'PageTables: 7944 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486768 kB' 'Committed_AS: 7429380 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201000 kB' 'VmallocChunk: 0 kB' 'Percpu: 52480 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 779684 kB' 'DirectMap2M: 10430464 kB' 'DirectMap1G: 90177536 kB' 00:06:50.764 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:50.764 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.764 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # 
IFS=': ' 00:06:50.764 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.764 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:50.764 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.764 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.764 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.764 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:50.764 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.764 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.764 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.764 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:50.764 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.764 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.764 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.764 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:50.764 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.764 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.764 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.764 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:50.764 05:35:05 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.764 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.764 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.764 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:50.765 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.765 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.765 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.765 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:50.765 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.765 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.765 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.765 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:50.765 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.765 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.765 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.765 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:50.765 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.765 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.765 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.765 
05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:50.765 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.765 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.765 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.765 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:50.765 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.765 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.765 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.765 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:50.765 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.765 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.765 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.765 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:50.765 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.765 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.765 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.765 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:50.765 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.765 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:06:50.765 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.765 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:50.765 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.765 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.765 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.765 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:50.765 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.765 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.765 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.765 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:50.765 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.765 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.765 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.765 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:50.765 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.765 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.765 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.765 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:50.765 05:35:05 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.765 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.765 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.765 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:50.765 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.765 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.765 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.765 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:50.765 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.765 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.765 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.765 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:50.765 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.765 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.765 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.765 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:50.765 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.765 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.765 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.765 05:35:05 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:50.765 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.765 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.765 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.765 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:50.765 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.765 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.765 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.765 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:50.765 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.765 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.765 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.765 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:50.765 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.765 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.765 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.765 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:50.765 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.765 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # 
IFS=': ' 00:06:50.765 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.765 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:50.765 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.765 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.765 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.765 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:50.765 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.765 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.765 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.765 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:50.765 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.765 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.765 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.765 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:50.765 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.766 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.766 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.766 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:50.766 05:35:05 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.766 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.766 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.766 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:50.766 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.766 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.766 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.766 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:50.766 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.766 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.766 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.766 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:50.766 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.766 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.766 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.766 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:50.766 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.766 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.766 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.766 
05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:50.766 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.766 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.766 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.766 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:50.766 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.766 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.766 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.766 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:50.766 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 0 00:06:50.766 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:06:50.766 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@97 -- # anon=0 00:06:50.766 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:06:50.766 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:06:50.766 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node= 00:06:50.766 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val 00:06:50.766 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:06:50.766 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:06:50.766 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e 
/sys/devices/system/node/node/meminfo ]] 00:06:50.766 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:06:50.766 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:06:50.766 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:06:50.766 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293484 kB' 'MemFree: 79060124 kB' 'MemAvailable: 82355992 kB' 'Buffers: 12176 kB' 'Cached: 9319268 kB' 'SwapCached: 0 kB' 'Active: 6398412 kB' 'Inactive: 3456348 kB' 'Active(anon): 6004776 kB' 'Inactive(anon): 0 kB' 'Active(file): 393636 kB' 'Inactive(file): 3456348 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 526512 kB' 'Mapped: 171128 kB' 'Shmem: 5481460 kB' 'KReclaimable: 198608 kB' 'Slab: 512792 kB' 'SReclaimable: 198608 kB' 'SUnreclaim: 314184 kB' 'KernelStack: 16144 kB' 'PageTables: 7848 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486768 kB' 'Committed_AS: 7429400 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200968 kB' 'VmallocChunk: 0 kB' 'Percpu: 52480 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 779684 kB' 'DirectMap2M: 10430464 kB' 'DirectMap1G: 90177536 kB' 00:06:50.766 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.766 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.766 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:50.766 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.766 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.766 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.766 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:50.766 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.766 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.766 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.766 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:50.766 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.766 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.766 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.766 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:50.766 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.766 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.766 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.766 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:50.766 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.766 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.766 05:35:05 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.766 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:50.766 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.766 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.766 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.766 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:50.766 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.766 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.766 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.766 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:50.766 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.766 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.766 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.766 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:50.766 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.766 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.766 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.766 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:50.766 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@32 -- # continue 00:06:50.766 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.766 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.766 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:50.766 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.766 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.766 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.766 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:50.766 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.766 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.766 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.766 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:50.766 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.766 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.766 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.766 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:50.767 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.767 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.767 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.767 05:35:05 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:50.767 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.767 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.767 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.767 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:50.767 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.767 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.767 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.767 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:50.767 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.767 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.767 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.767 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:50.767 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.767 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.767 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.767 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:50.767 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.767 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # 
IFS=': ' 00:06:50.767 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.767 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:50.767 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.767 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.767 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.767 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:50.767 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.767 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.767 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.767 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:50.767 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.767 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.767 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.767 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:50.767 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.767 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.767 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.767 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:50.767 05:35:05 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.767 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.767 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.767 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:50.767 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.767 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.767 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.767 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:50.767 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.767 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.767 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.767 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:50.767 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.767 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.767 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.767 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:50.767 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.767 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.767 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.767 
05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:50.767 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.767 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.767 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.767 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:50.767 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.767 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.767 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.767 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:50.767 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.767 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.767 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.767 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:50.767 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.767 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.767 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.767 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:50.767 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.767 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:06:50.767 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.767 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:50.767 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.767 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.767 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.767 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:50.767 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.767 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.767 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.767 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:50.767 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.767 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.767 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.767 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:50.767 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.767 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.767 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.767 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 
00:06:50.767 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.767 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.767 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.767 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:50.767 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.767 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.767 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.767 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:50.767 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.767 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.767 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.767 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:50.767 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.767 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.767 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.767 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:50.767 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.768 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.768 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- 
# read -r var val _ 00:06:50.768 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:50.768 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.768 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.768 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.768 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:50.768 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.768 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.768 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.768 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:50.768 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.768 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.768 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.768 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:50.768 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.768 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.768 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.768 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:50.768 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.768 05:35:05 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.768 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.768 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:50.768 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.768 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.768 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.768 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:50.768 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.768 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.768 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.768 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:50.768 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.768 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.768 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.768 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:50.768 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.768 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.768 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.768 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ 
HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:50.768 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 0 00:06:50.768 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:06:50.768 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@99 -- # surp=0 00:06:50.768 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:06:50.768 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:06:50.768 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node= 00:06:50.768 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val 00:06:50.768 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:06:50.768 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:06:50.768 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:06:50.768 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:06:50.768 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:06:50.768 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:06:50.768 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.768 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.768 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293484 kB' 'MemFree: 79060124 kB' 'MemAvailable: 82355992 kB' 'Buffers: 12176 kB' 'Cached: 9319284 kB' 'SwapCached: 0 kB' 'Active: 6398788 kB' 'Inactive: 3456348 kB' 'Active(anon): 6005152 kB' 'Inactive(anon): 0 kB' 'Active(file): 393636 kB' 'Inactive(file): 3456348 
kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 526880 kB' 'Mapped: 171128 kB' 'Shmem: 5481476 kB' 'KReclaimable: 198608 kB' 'Slab: 512792 kB' 'SReclaimable: 198608 kB' 'SUnreclaim: 314184 kB' 'KernelStack: 16176 kB' 'PageTables: 7960 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486768 kB' 'Committed_AS: 7432156 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200984 kB' 'VmallocChunk: 0 kB' 'Percpu: 52480 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 779684 kB' 'DirectMap2M: 10430464 kB' 'DirectMap1G: 90177536 kB' 00:06:50.768 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:50.768 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.768 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.768 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.768 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:50.768 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.768 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.768 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.768 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:50.768 05:35:05 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.768 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.768 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.768 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:50.768 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.768 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.768 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.768 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:50.768 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.768 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.768 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.768 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:50.768 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.768 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.768 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.768 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:50.768 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.768 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.768 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.768 05:35:05 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:50.768 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.768 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.768 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.768 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:50.768 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.768 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.768 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.768 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:50.768 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.768 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.769 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.769 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:50.769 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.769 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.769 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.769 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:50.769 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.769 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:06:50.769 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.769 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:50.769 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.769 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.769 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.769 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:50.769 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.769 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.769 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.769 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:50.769 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.769 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.769 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.769 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:50.769 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.769 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.769 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.769 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:50.769 05:35:05 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.769 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.769 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.769 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:50.769 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.769 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.769 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.769 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:50.769 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.769 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.769 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.769 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:50.769 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.769 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.769 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.769 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:50.769 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.769 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.769 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.769 
05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:50.769 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.769 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.769 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.769 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:50.769 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.769 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.769 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.769 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:50.769 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.769 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.769 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.769 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:50.769 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.769 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.769 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.769 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:50.769 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.769 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:06:50.769 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.769 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:50.769 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.769 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.769 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.769 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:50.769 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.769 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.769 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.769 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:50.769 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.769 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.769 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.770 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:50.770 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.770 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.770 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.770 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 
00:06:50.770 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.770 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.770 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.770 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:50.770 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.770 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.770 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.770 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:50.770 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.770 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.770 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.770 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:50.770 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.770 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.770 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.770 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:50.770 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.770 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.770 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read 
-r var val _ 00:06:50.770 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:50.770 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.770 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.770 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.770 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:50.770 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.770 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.770 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.770 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:50.770 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.770 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.770 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.770 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:50.770 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.770 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.770 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.770 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:50.770 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.770 05:35:05 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.770 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.770 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:50.770 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.770 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.770 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.770 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:50.770 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.770 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.770 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.770 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:50.770 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.770 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.770 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.770 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:50.770 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.770 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.770 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.770 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ 
FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:50.770 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.770 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.770 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.770 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:50.770 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.770 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.770 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.770 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:50.770 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.770 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.770 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.770 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:50.770 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.770 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.770 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.770 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:50.770 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.770 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.770 05:35:05 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.770 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:50.770 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.770 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.770 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.770 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:50.770 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 0 00:06:50.770 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:06:50.770 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@100 -- # resv=0 00:06:50.770 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:06:50.770 nr_hugepages=1024 00:06:50.770 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:06:50.770 resv_hugepages=0 00:06:50.770 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:06:50.770 surplus_hugepages=0 00:06:50.770 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:06:50.770 anon_hugepages=0 00:06:50.770 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:06:50.770 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:06:50.770 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:06:50.770 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 
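The long `[[ key == \H\u\g\e... ]] / continue` trace above is `setup/common.sh`'s `get_meminfo` walking /proc/meminfo one field at a time until it hits the requested counter, then echoing its value (here yielding `surp=0`, `resv=0`, `nr_hugepages=1024`). The following is a minimal sketch of that pattern reconstructed only from the trace, not copied from the actual SPDK source; the function name and second argument are assumptions for illustration.

```shell
#!/usr/bin/env bash
# Sketch of the meminfo scan seen in the trace: split each line on ': ',
# skip non-matching keys, print the value of the requested one.
get_meminfo_sketch() {
    local get=$1 mem_f=${2:-/proc/meminfo} var val _
    while IFS=': ' read -r var val _; do
        # Mirrors the repeated "[[ X == ... ]] / continue" lines in the log.
        [[ $var == "$get" ]] || continue
        echo "$val"
        return 0
    done < "$mem_f"
    return 1
}

# Demonstrate against a fake meminfo file instead of the live system:
printf '%s\n' 'MemTotal: 92293484 kB' 'HugePages_Total: 1024' \
    'HugePages_Rsvd: 0' > /tmp/meminfo.fake
get_meminfo_sketch HugePages_Total /tmp/meminfo.fake   # prints 1024
get_meminfo_sketch HugePages_Rsvd /tmp/meminfo.fake    # prints 0
```

Because the trace runs with `set -x`, every skipped field produces four lines (`[[ ... ]]`, `continue`, `IFS=': '`, `read -r var val _`), which is why a single lookup spans dozens of log lines.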
00:06:50.770 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node= 00:06:50.771 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val 00:06:50.771 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:06:50.771 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:06:50.771 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:06:50.771 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:06:50.771 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:06:50.771 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:06:50.771 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.771 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.771 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293484 kB' 'MemFree: 79061684 kB' 'MemAvailable: 82357552 kB' 'Buffers: 12176 kB' 'Cached: 9319308 kB' 'SwapCached: 0 kB' 'Active: 6398452 kB' 'Inactive: 3456348 kB' 'Active(anon): 6004816 kB' 'Inactive(anon): 0 kB' 'Active(file): 393636 kB' 'Inactive(file): 3456348 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 526516 kB' 'Mapped: 171128 kB' 'Shmem: 5481500 kB' 'KReclaimable: 198608 kB' 'Slab: 512792 kB' 'SReclaimable: 198608 kB' 'SUnreclaim: 314184 kB' 'KernelStack: 16144 kB' 'PageTables: 7860 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486768 kB' 'Committed_AS: 7429444 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200920 kB' 'VmallocChunk: 0 
kB' 'Percpu: 52480 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 779684 kB' 'DirectMap2M: 10430464 kB' 'DirectMap1G: 90177536 kB' 00:06:50.771 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:50.771 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.771 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.771 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.771 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:50.771 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.771 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.771 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.771 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:50.771 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.771 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.771 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.771 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:50.771 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.771 05:35:05 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.771 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.771 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:50.771 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.771 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.771 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.771 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:50.771 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.771 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.771 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.771 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:50.771 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.771 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.771 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.771 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:50.771 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.771 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.771 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.771 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == 
\H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:50.771 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.771 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.771 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.771 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:50.771 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.771 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.771 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.771 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:50.771 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.771 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.771 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.771 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:50.771 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.771 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.771 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.771 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:50.771 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.771 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.771 05:35:05 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.771 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:50.771 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.771 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.771 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.771 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:50.771 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.771 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.771 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.771 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:50.771 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.771 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.771 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.771 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:50.771 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.771 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.771 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.771 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:50.771 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@32 -- # continue 00:06:50.771 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.771 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.771 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:50.771 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.771 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.771 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.771 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:50.771 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.771 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.771 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.771 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:50.771 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.771 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.771 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.771 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:50.771 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.771 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.771 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.771 05:35:05 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:50.771 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.772 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.772 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.772 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:50.772 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.772 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.772 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.772 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:50.772 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.772 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.772 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.772 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:50.772 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.772 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.772 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.772 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:50.772 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.772 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:06:50.772 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.772 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:50.772 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.772 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.772 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.772 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:50.772 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.772 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.772 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.772 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:50.772 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.772 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.772 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.772 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:50.772 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.772 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.772 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.772 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 
00:06:50.772 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.772 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.772 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.772 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:50.772 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.772 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.772 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.772 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:50.772 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.772 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.772 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.772 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:50.772 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.772 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.772 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.772 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:50.772 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.772 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.772 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:06:50.772 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:50.772 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.772 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.772 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.772 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:50.772 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.772 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.772 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.772 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:50.772 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.772 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.772 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.772 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:50.772 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.772 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.772 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.772 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:50.772 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # 
continue 00:06:50.772 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.772 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.772 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:50.772 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.772 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.772 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.772 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:50.772 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.772 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.772 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.772 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:50.772 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.772 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.772 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.772 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:50.772 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.772 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.772 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.772 05:35:05 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:50.772 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.772 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.772 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.772 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:50.772 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.772 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.772 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.772 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:50.772 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.772 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.772 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.772 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:50.772 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 1024 00:06:50.772 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:06:50.772 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:06:50.772 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@112 -- # get_nodes 00:06:50.772 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@27 -- # local node 00:06:50.772 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- 
setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:06:50.772 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:06:50.772 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:06:50.772 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:06:50.772 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@32 -- # no_nodes=2 00:06:50.773 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:06:50.773 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:06:50.773 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:06:50.773 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:06:50.773 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:06:50.773 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node=0 00:06:50.773 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val 00:06:50.773 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:06:50.773 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:06:50.773 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:06:50.773 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:06:50.773 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:06:50.773 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # 
mem=("${mem[@]#Node +([0-9]) }") 00:06:50.773 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.773 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.773 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48069884 kB' 'MemFree: 37486016 kB' 'MemUsed: 10583868 kB' 'SwapCached: 0 kB' 'Active: 5468576 kB' 'Inactive: 3267584 kB' 'Active(anon): 5252576 kB' 'Inactive(anon): 0 kB' 'Active(file): 216000 kB' 'Inactive(file): 3267584 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 8373424 kB' 'Mapped: 144896 kB' 'AnonPages: 365904 kB' 'Shmem: 4889840 kB' 'KernelStack: 10216 kB' 'PageTables: 5244 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 142964 kB' 'Slab: 340964 kB' 'SReclaimable: 142964 kB' 'SUnreclaim: 198000 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:06:50.773 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:50.773 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.773 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.773 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.773 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:50.773 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.773 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.773 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:06:50.773 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:50.773 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.773 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.773 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.773 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:50.773 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.773 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.773 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.773 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:50.773 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.773 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.773 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.773 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:50.773 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.773 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.773 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.773 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:50.773 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.773 05:35:05 setup.sh.hugepages.per_node_1G_alloc 
-- setup/common.sh@31 -- # IFS=': ' 00:06:50.773 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.773 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:50.773 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.773 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.773 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.773 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:50.773 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.773 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.773 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.773 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:50.773 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.773 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.773 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.773 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:50.773 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.773 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.773 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.773 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 
00:06:50.773 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.773 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.773 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.773 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:50.773 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.773 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.773 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.773 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:50.773 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.773 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.773 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.773 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:50.773 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.773 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.773 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.773 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:50.773 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.773 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.773 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:06:50.773 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:50.773 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.773 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.773 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.773 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:50.773 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.773 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.773 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.773 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:50.773 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.773 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.773 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.773 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:50.773 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.773 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.773 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.773 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:50.773 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.773 05:35:05 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.773 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.773 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:50.773 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.773 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.773 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.773 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:50.774 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.774 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.774 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.774 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:50.774 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.774 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.774 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.774 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:50.774 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.774 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.774 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.774 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Slab == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:50.774 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.774 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.774 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.774 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:50.774 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.774 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.774 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.774 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:50.774 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.774 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.774 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.774 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:50.774 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.774 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.774 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.774 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:50.774 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.774 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.774 05:35:05 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.774 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:50.774 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.774 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.774 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.774 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:50.774 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.774 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.774 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.774 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:50.774 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.774 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.774 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.774 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:50.774 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.774 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.774 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.774 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:50.774 05:35:05 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.774 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.774 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.774 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:50.774 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.774 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.774 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.774 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:50.774 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 0 00:06:50.774 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:06:50.774 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:06:50.774 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:06:50.774 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:06:50.774 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1 00:06:50.774 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:06:50.774 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node=1 00:06:50.774 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val 00:06:50.774 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:06:50.774 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@22 -- # mem_f=/proc/meminfo 00:06:50.774 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:06:50.774 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:06:50.774 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:06:50.774 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:06:50.774 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.774 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 44223600 kB' 'MemFree: 41576140 kB' 'MemUsed: 2647460 kB' 'SwapCached: 0 kB' 'Active: 930484 kB' 'Inactive: 188764 kB' 'Active(anon): 752848 kB' 'Inactive(anon): 0 kB' 'Active(file): 177636 kB' 'Inactive(file): 188764 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 958104 kB' 'Mapped: 26732 kB' 'AnonPages: 161220 kB' 'Shmem: 591704 kB' 'KernelStack: 5928 kB' 'PageTables: 2632 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 55644 kB' 'Slab: 171828 kB' 'SReclaimable: 55644 kB' 'SUnreclaim: 116184 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:06:50.774 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.774 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:50.774 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.774 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.774 05:35:05 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.774 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:50.774 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.775 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.775 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.775 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:50.775 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.775 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.775 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.775 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:50.775 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:50.775 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:50.775 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:50.775 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:51.034 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:51.034 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:51.034 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:51.034 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:51.034 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@32 -- # continue 00:06:51.034 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:51.034 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:51.034 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:51.034 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:51.034 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:51.034 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:51.034 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:51.034 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:51.035 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:51.035 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:51.035 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:51.035 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:51.035 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:51.035 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:51.035 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:51.035 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:51.035 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:51.035 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:51.035 05:35:05 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:51.035 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:51.035 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:51.035 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:51.035 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:51.035 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:51.035 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:51.035 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:51.035 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:51.035 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:51.035 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:51.035 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:51.035 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:51.035 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:51.035 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:51.035 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:51.035 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:51.035 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:51.035 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- 
# IFS=': ' 00:06:51.035 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:51.035 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:51.035 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:51.035 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:51.035 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:51.035 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:51.035 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:51.035 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:51.035 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:51.035 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:51.035 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:51.035 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:51.035 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:51.035 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:51.035 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:51.035 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:51.035 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:51.035 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:51.035 05:35:05 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:51.035 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:51.035 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:51.035 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:51.035 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:51.035 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:51.035 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:51.035 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:51.035 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:51.035 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:51.035 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:51.035 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:51.035 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:51.035 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:51.035 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:51.035 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:51.035 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:51.035 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:51.035 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:06:51.035 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:51.035 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:51.035 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:51.035 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:51.035 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:51.035 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:51.035 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:51.035 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:51.035 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:51.035 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:51.035 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:51.035 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:51.035 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:51.035 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:51.035 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:51.035 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:51.035 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:51.035 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:51.035 05:35:05 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:51.035 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:51.035 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:51.035 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:51.035 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:51.035 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:51.035 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:51.035 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:51.035 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:51.035 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:51.035 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:51.035 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:51.035 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:51.035 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:51.035 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:51.035 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:51.035 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:51.035 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:51.035 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ 
Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:51.035 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:51.035 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:51.035 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:51.035 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:51.035 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:51.035 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:51.036 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:51.036 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:51.036 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:06:51.036 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:51.036 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:51.036 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:51.036 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 0 00:06:51.036 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:06:51.036 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:06:51.036 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:06:51.036 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:06:51.036 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@127 -- # 
sorted_s[nodes_sys[node]]=1 00:06:51.036 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512' 00:06:51.036 node0=512 expecting 512 00:06:51.036 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:06:51.036 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:06:51.036 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:06:51.036 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@128 -- # echo 'node1=512 expecting 512' 00:06:51.036 node1=512 expecting 512 00:06:51.036 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@130 -- # [[ 512 == \5\1\2 ]] 00:06:51.036 00:06:51.036 real 0m3.509s 00:06:51.036 user 0m1.267s 00:06:51.036 sys 0m2.264s 00:06:51.036 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:51.036 05:35:05 setup.sh.hugepages.per_node_1G_alloc -- common/autotest_common.sh@10 -- # set +x 00:06:51.036 ************************************ 00:06:51.036 END TEST per_node_1G_alloc 00:06:51.036 ************************************ 00:06:51.036 05:35:05 setup.sh.hugepages -- common/autotest_common.sh@1142 -- # return 0 00:06:51.036 05:35:05 setup.sh.hugepages -- setup/hugepages.sh@212 -- # run_test even_2G_alloc even_2G_alloc 00:06:51.036 05:35:05 setup.sh.hugepages -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:51.036 05:35:05 setup.sh.hugepages -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:51.036 05:35:05 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:06:51.036 ************************************ 00:06:51.036 START TEST even_2G_alloc 00:06:51.036 ************************************ 00:06:51.036 05:35:05 setup.sh.hugepages.even_2G_alloc -- common/autotest_common.sh@1123 -- # even_2G_alloc 00:06:51.036 
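The trace above repeatedly exercises the `get_meminfo` helper from `setup/common.sh`: it reads meminfo-style `Key: value` lines with `IFS=': ' read -r var val _`, skipping every field until the requested one (here `HugePages_Surp`) matches. A minimal standalone sketch of that parsing pattern follows; the function name `get_field` and the demo file are illustrative, not the exact SPDK source, which additionally uses `mapfile` and strips the `Node N ` prefix from the per-node sysfs copy (`/sys/devices/system/node/nodeN/meminfo`).

```shell
#!/usr/bin/env bash
# Sketch of the "IFS=': ' read -r var val _" scan visible in the xtrace:
# walk meminfo-style "Key: value [kB]" lines and print the requested value.
get_field() {
    local want=$1 file=$2 var val _
    while IFS=': ' read -r var val _; do
        # IFS=': ' splits on runs of ':' and ' ', so "HugePages_Surp: 0"
        # yields var=HugePages_Surp val=0 (the trailing "kB", if any, lands in _).
        [[ $var == "$want" ]] && { echo "$val"; return 0; }
    done < "$file"
    return 1
}

# Demo input shaped like the per-node hugepages dump in the log.
tmp=$(mktemp)
printf '%s\n' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' > "$tmp"
get_field HugePages_Surp "$tmp"   # → 0
rm -f "$tmp"
```

Because every non-matching key falls through to the next `read`, the xtrace logs one `[[ ... ]]` / `continue` pair per meminfo field before the match, which is why the loop body above appears dozens of times per `get_meminfo` call in this log.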
05:35:05 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@152 -- # get_test_nr_hugepages 2097152 00:06:51.036 05:35:05 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@49 -- # local size=2097152 00:06:51.036 05:35:05 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:06:51.036 05:35:05 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:06:51.036 05:35:05 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:06:51.036 05:35:05 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:06:51.036 05:35:05 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@62 -- # user_nodes=() 00:06:51.036 05:35:05 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:06:51.036 05:35:05 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:06:51.036 05:35:05 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:06:51.036 05:35:05 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:06:51.036 05:35:05 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:06:51.036 05:35:05 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:06:51.036 05:35:05 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@74 -- # (( 0 > 0 )) 00:06:51.036 05:35:05 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:06:51.036 05:35:05 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512 00:06:51.036 05:35:05 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@83 -- # : 512 00:06:51.036 05:35:05 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@84 -- # : 1 00:06:51.036 05:35:05 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:06:51.036 05:35:05 
setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512 00:06:51.036 05:35:05 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@83 -- # : 0 00:06:51.036 05:35:05 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@84 -- # : 0 00:06:51.036 05:35:05 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:06:51.036 05:35:05 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@153 -- # NRHUGE=1024 00:06:51.036 05:35:05 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@153 -- # HUGE_EVEN_ALLOC=yes 00:06:51.036 05:35:05 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@153 -- # setup output 00:06:51.036 05:35:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:06:51.036 05:35:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:06:54.320 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:06:54.320 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:06:54.320 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:06:54.320 0000:5e:00.0 (8086 0b60): Already using the vfio-pci driver 00:06:54.320 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:06:54.320 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:06:54.320 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:06:54.320 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:06:54.321 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:06:54.321 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:06:54.321 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:06:54.321 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:06:54.321 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:06:54.321 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 
00:06:54.321 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:06:54.321 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:06:54.321 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:06:54.321 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:06:54.321 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:06:54.584 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@154 -- # verify_nr_hugepages 00:06:54.584 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@89 -- # local node 00:06:54.584 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@90 -- # local sorted_t 00:06:54.584 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@91 -- # local sorted_s 00:06:54.584 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@92 -- # local surp 00:06:54.584 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@93 -- # local resv 00:06:54.584 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@94 -- # local anon 00:06:54.584 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:06:54.584 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:06:54.584 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:06:54.584 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node= 00:06:54.584 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:06:54.584 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:06:54.584 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:06:54.584 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:06:54.584 05:35:09 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@25 -- # [[ -n '' ]] 00:06:54.584 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:06:54.584 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:06:54.584 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.584 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.584 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293484 kB' 'MemFree: 79040068 kB' 'MemAvailable: 82335936 kB' 'Buffers: 12176 kB' 'Cached: 9319416 kB' 'SwapCached: 0 kB' 'Active: 6399144 kB' 'Inactive: 3456348 kB' 'Active(anon): 6005508 kB' 'Inactive(anon): 0 kB' 'Active(file): 393636 kB' 'Inactive(file): 3456348 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 527608 kB' 'Mapped: 171072 kB' 'Shmem: 5481608 kB' 'KReclaimable: 198608 kB' 'Slab: 513204 kB' 'SReclaimable: 198608 kB' 'SUnreclaim: 314596 kB' 'KernelStack: 16144 kB' 'PageTables: 7800 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486768 kB' 'Committed_AS: 7430056 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200952 kB' 'VmallocChunk: 0 kB' 'Percpu: 52480 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 779684 kB' 'DirectMap2M: 10430464 kB' 'DirectMap1G: 90177536 kB' 00:06:54.584 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:54.584 05:35:09 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@32 -- # continue 00:06:54.584 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.584 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.584 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:54.584 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.584 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.584 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.584 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:54.584 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.584 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.584 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.584 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:54.584 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.584 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.584 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.584 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:54.584 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.584 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.584 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.584 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 
00:06:54.584 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.584 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.584 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.584 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:54.584 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.584 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.584 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.584 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:54.584 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.584 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.584 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.584 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:54.584 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.584 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.584 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.584 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:54.584 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.584 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.584 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.584 05:35:09 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:54.584 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.584 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.584 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.584 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:54.584 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.584 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.584 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.584 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:54.584 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.584 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.584 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.584 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:54.584 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.584 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.584 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.584 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:54.584 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.584 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.585 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:06:54.585 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:54.585 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.585 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.585 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.585 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:54.585 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.585 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.585 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.585 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:54.585 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.585 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.585 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.585 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:54.585 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.585 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.585 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.585 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:54.585 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.585 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.585 05:35:09 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:06:54.585 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:54.585 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.585 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.585 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.585 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:54.585 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.585 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.585 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.585 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:54.585 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.585 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.585 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.585 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:54.585 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.585 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.585 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.585 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:54.585 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.585 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.585 
05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.585 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:54.585 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.585 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.585 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.585 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:54.585 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.585 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.585 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.585 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:54.585 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.585 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.585 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.585 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:54.585 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.585 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.585 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.585 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:54.585 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.585 05:35:09 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.585 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.585 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:54.585 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.585 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.585 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.585 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:54.585 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.585 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.585 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.585 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:54.585 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.585 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.585 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.585 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:54.585 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.585 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.585 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.585 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:54.585 05:35:09 setup.sh.hugepages.even_2G_alloc 
-- setup/common.sh@32 -- # continue 00:06:54.585 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.585 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.585 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:54.585 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.585 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.585 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.585 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:54.585 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.585 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.585 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.585 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:54.585 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.585 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.585 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.585 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:54.585 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.585 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.585 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.585 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == 
\A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:54.585 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.585 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.585 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.585 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:54.585 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0 00:06:54.585 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:06:54.585 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@97 -- # anon=0 00:06:54.585 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:06:54.585 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:06:54.585 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node= 00:06:54.585 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:06:54.585 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:06:54.585 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:06:54.585 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:06:54.585 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:06:54.585 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:06:54.586 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:06:54.586 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.586 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.586 05:35:09 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293484 kB' 'MemFree: 79040692 kB' 'MemAvailable: 82336560 kB' 'Buffers: 12176 kB' 'Cached: 9319428 kB' 'SwapCached: 0 kB' 'Active: 6399200 kB' 'Inactive: 3456348 kB' 'Active(anon): 6005564 kB' 'Inactive(anon): 0 kB' 'Active(file): 393636 kB' 'Inactive(file): 3456348 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 527252 kB' 'Mapped: 171044 kB' 'Shmem: 5481620 kB' 'KReclaimable: 198608 kB' 'Slab: 513280 kB' 'SReclaimable: 198608 kB' 'SUnreclaim: 314672 kB' 'KernelStack: 16192 kB' 'PageTables: 7992 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486768 kB' 'Committed_AS: 7430072 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200920 kB' 'VmallocChunk: 0 kB' 'Percpu: 52480 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 779684 kB' 'DirectMap2M: 10430464 kB' 'DirectMap1G: 90177536 kB' 00:06:54.586 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:54.586 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.586 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.586 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.586 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:54.586 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.586 05:35:09 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.586 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.586 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:54.586 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.586 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.586 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.586 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:54.586 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.586 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.586 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.586 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:54.586 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.586 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.586 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.586 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:54.586 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.586 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.586 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.586 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:54.586 05:35:09 setup.sh.hugepages.even_2G_alloc 
-- setup/common.sh@32 -- # continue 00:06:54.586 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.586 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.586 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:54.586 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.586 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.586 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.586 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:54.586 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.586 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.586 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.586 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:54.586 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.586 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.586 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.586 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:54.586 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.586 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.586 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.586 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:54.586 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.586 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.586 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.586 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:54.586 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.586 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.586 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.586 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:54.586 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.586 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.586 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.586 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:54.586 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.586 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.586 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.586 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:54.586 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.586 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.586 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.586 05:35:09 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:54.586 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.586 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.586 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.586 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:54.586 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.586 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.586 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.586 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:54.586 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.586 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.586 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.586 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:54.586 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.586 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.586 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.586 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:54.586 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.586 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.586 05:35:09 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:06:54.586 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31-32 -- # [repetitive xtrace elided: IFS=': '/read/continue loop over the remaining /proc/meminfo fields (Mapped through HugePages_Free), none matching HugePages_Surp] 00:06:54.587 05:35:09
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.587 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:54.587 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.587 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.587 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.587 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:54.588 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0 00:06:54.588 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:06:54.588 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@99 -- # surp=0 00:06:54.588 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:06:54.588 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:06:54.588 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node= 00:06:54.588 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:06:54.588 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:06:54.588 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:06:54.588 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:06:54.588 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:06:54.588 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:06:54.588 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:06:54.588 05:35:09 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.588 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.588 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293484 kB' 'MemFree: 79041304 kB' 'MemAvailable: 82337172 kB' 'Buffers: 12176 kB' 'Cached: 9319428 kB' 'SwapCached: 0 kB' 'Active: 6398876 kB' 'Inactive: 3456348 kB' 'Active(anon): 6005240 kB' 'Inactive(anon): 0 kB' 'Active(file): 393636 kB' 'Inactive(file): 3456348 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 526836 kB' 'Mapped: 171044 kB' 'Shmem: 5481620 kB' 'KReclaimable: 198608 kB' 'Slab: 513280 kB' 'SReclaimable: 198608 kB' 'SUnreclaim: 314672 kB' 'KernelStack: 16176 kB' 'PageTables: 7936 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486768 kB' 'Committed_AS: 7430092 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200920 kB' 'VmallocChunk: 0 kB' 'Percpu: 52480 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 779684 kB' 'DirectMap2M: 10430464 kB' 'DirectMap1G: 90177536 kB' 00:06:54.588 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:54.588 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.588 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.588 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.588 05:35:09 setup.sh.hugepages.even_2G_alloc 
-- setup/common.sh@31-32 -- # [repetitive xtrace elided: IFS=': '/read/continue loop over the remaining /proc/meminfo fields (MemFree through HugePages_Free), none matching HugePages_Rsvd] 00:06:54.590 05:35:09
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.590 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.590 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.590 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:54.590 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0 00:06:54.590 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:06:54.590 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@100 -- # resv=0 00:06:54.590 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:06:54.590 nr_hugepages=1024 00:06:54.590 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:06:54.590 resv_hugepages=0 00:06:54.590 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:06:54.590 surplus_hugepages=0 00:06:54.590 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:06:54.590 anon_hugepages=0 00:06:54.590 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:06:54.590 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:06:54.590 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:06:54.590 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:06:54.590 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node= 00:06:54.590 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:06:54.590 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:06:54.590 05:35:09 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:06:54.590 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:06:54.590 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:06:54.590 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:06:54.590 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:06:54.590 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.590 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.590 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293484 kB' 'MemFree: 79042004 kB' 'MemAvailable: 82337872 kB' 'Buffers: 12176 kB' 'Cached: 9319484 kB' 'SwapCached: 0 kB' 'Active: 6398628 kB' 'Inactive: 3456348 kB' 'Active(anon): 6004992 kB' 'Inactive(anon): 0 kB' 'Active(file): 393636 kB' 'Inactive(file): 3456348 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 526576 kB' 'Mapped: 171104 kB' 'Shmem: 5481676 kB' 'KReclaimable: 198608 kB' 'Slab: 513280 kB' 'SReclaimable: 198608 kB' 'SUnreclaim: 314672 kB' 'KernelStack: 16128 kB' 'PageTables: 7788 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486768 kB' 'Committed_AS: 7430116 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200920 kB' 'VmallocChunk: 0 kB' 'Percpu: 52480 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 
'DirectMap4k: 779684 kB' 'DirectMap2M: 10430464 kB' 'DirectMap1G: 90177536 kB' 00:06:54.590 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:54.590 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.590 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.590 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.590 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:54.590 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.590 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.590 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.590 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:54.590 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.590 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.590 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.590 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:54.590 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.590 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.590 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.590 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:54.590 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.590 05:35:09 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.590 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.590 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:54.590 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.590 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.590 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.590 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:54.590 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.590 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.590 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.590 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:54.590 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.590 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.590 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.590 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:54.590 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.590 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.590 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.590 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:54.590 05:35:09 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.590 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.590 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.590 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:54.590 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.590 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.590 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.590 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:54.590 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.590 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.590 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.590 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:54.590 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.590 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.590 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.590 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:54.590 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.590 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.590 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.590 05:35:09 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:54.591 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.591 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.591 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.591 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:54.591 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.591 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.591 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.591 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:54.591 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.591 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.591 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.591 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:54.591 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.591 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.591 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.591 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:54.591 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.591 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.591 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val 
_ 00:06:54.591 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:54.591 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.591 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.591 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.591 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:54.591 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.591 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.591 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.591 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:54.591 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.591 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.591 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.591 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:54.591 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.591 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.591 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.591 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:54.591 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.591 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.591 05:35:09 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.591 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:54.591 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.591 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.591 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.591 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:54.591 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.591 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.591 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.591 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:54.591 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.591 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.591 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.591 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:54.591 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.591 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.591 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.591 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:54.591 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.591 05:35:09 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.591 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.591 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:54.591 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.591 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.591 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.591 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:54.591 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.591 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.591 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.591 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:54.591 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.591 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.591 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.591 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:54.591 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.591 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.591 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.591 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:54.591 05:35:09 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.591 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.591 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.591 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:54.591 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.591 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.591 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.591 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:54.591 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.591 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.591 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.591 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:54.591 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.591 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.591 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.591 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:54.591 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.591 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.591 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.591 05:35:09 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:54.591 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.591 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.591 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.591 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:54.591 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.591 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.591 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.591 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:54.591 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.591 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.591 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.592 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:54.592 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.592 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.592 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.592 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:54.592 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.592 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.592 05:35:09 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:06:54.592 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:54.592 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.592 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.592 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.592 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:54.592 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.592 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.592 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.592 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:54.592 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.592 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.592 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.592 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:54.592 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.592 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.592 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.592 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:54.592 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.592 05:35:09 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:06:54.592 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.592 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:54.592 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 1024 00:06:54.592 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:06:54.592 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:06:54.592 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@112 -- # get_nodes 00:06:54.592 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@27 -- # local node 00:06:54.592 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:06:54.592 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:06:54.592 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:06:54.592 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:06:54.592 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@32 -- # no_nodes=2 00:06:54.592 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:06:54.592 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:06:54.592 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:06:54.592 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:06:54.592 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:06:54.592 05:35:09 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@18 -- # local node=0 00:06:54.592 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:06:54.592 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:06:54.592 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:06:54.592 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:06:54.592 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:06:54.592 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:06:54.592 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:06:54.592 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.592 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.592 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48069884 kB' 'MemFree: 37472984 kB' 'MemUsed: 10596900 kB' 'SwapCached: 0 kB' 'Active: 5467876 kB' 'Inactive: 3267584 kB' 'Active(anon): 5251876 kB' 'Inactive(anon): 0 kB' 'Active(file): 216000 kB' 'Inactive(file): 3267584 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 8373428 kB' 'Mapped: 144404 kB' 'AnonPages: 365220 kB' 'Shmem: 4889844 kB' 'KernelStack: 10200 kB' 'PageTables: 5216 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 142964 kB' 'Slab: 341464 kB' 'SReclaimable: 142964 kB' 'SUnreclaim: 198500 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:06:54.592 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:54.592 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.592 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.592 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.592 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:54.592 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.592 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.592 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.592 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:54.592 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.592 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.592 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.592 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:54.592 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.592 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.592 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.592 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:54.592 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.592 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.592 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.592 05:35:09 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:54.592 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.592 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.592 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.592 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:54.592 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.592 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.592 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.592 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:54.592 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.592 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.592 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.592 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:54.592 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.592 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.592 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.592 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:54.592 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.592 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.592 05:35:09 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.592 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:54.592 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.592 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.592 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.592 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:54.592 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.592 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.592 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.592 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:54.592 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.593 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.593 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.593 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:54.593 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.593 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.593 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.593 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:54.593 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.593 05:35:09 setup.sh.hugepages.even_2G_alloc 
-- setup/common.sh@31 -- # IFS=': ' 00:06:54.593 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.593 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:54.593 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.593 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.593 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.593 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:54.593 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.593 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.593 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.593 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:54.593 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.593 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.593 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.593 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:54.593 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.593 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.593 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.593 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:54.593 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # 
continue 00:06:54.593 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.593 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.593 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:54.593 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.593 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.593 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.593 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:54.593 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.593 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.593 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.593 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:54.593 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.593 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.593 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.593 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:54.593 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.593 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.593 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.593 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 
00:06:54.593 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.593 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.593 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.593 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:54.593 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.593 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.593 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.593 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:54.593 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.593 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.593 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.593 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:54.593 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.593 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.593 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.593 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:54.593 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.593 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.593 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.593 05:35:09 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:54.593 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.593 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.593 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.593 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:54.593 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.593 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.593 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.593 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:54.593 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.593 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.593 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.593 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:54.593 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.593 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.593 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.593 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:54.593 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.593 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.593 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 
-- # read -r var val _ 00:06:54.593 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:54.593 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.593 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.853 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.853 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:54.853 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.853 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.853 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.853 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:54.853 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0 00:06:54.853 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:06:54.853 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:06:54.853 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:06:54.853 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:06:54.853 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1 00:06:54.853 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:06:54.853 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node=1 00:06:54.853 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:06:54.853 05:35:09 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@20 -- # local mem_f mem 00:06:54.853 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:06:54.853 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:06:54.853 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:06:54.853 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:06:54.853 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:06:54.853 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.853 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 44223600 kB' 'MemFree: 41570420 kB' 'MemUsed: 2653180 kB' 'SwapCached: 0 kB' 'Active: 931132 kB' 'Inactive: 188764 kB' 'Active(anon): 753496 kB' 'Inactive(anon): 0 kB' 'Active(file): 177636 kB' 'Inactive(file): 188764 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 958260 kB' 'Mapped: 26700 kB' 'AnonPages: 161772 kB' 'Shmem: 591860 kB' 'KernelStack: 5944 kB' 'PageTables: 2628 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 55644 kB' 'Slab: 171816 kB' 'SReclaimable: 55644 kB' 'SUnreclaim: 116172 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:06:54.853 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.853 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:54.853 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.853 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # 
IFS=': ' 00:06:54.853 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.853 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:54.853 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.853 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.853 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.853 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:54.853 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.853 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.853 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.853 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:54.853 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.853 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.853 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.853 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:54.853 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.853 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.853 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.853 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:54.853 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.853 05:35:09 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.853 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.853 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:54.853 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.853 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.853 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.853 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:54.853 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.853 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.854 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.854 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:54.854 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.854 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.854 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.854 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:54.854 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.854 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.854 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.854 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:54.854 05:35:09 
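Editor's note for readability: the long runs of `read -r var val _` / `continue` records in this trace are a single loop in `setup/common.sh` scanning the per-node meminfo copy (`/sys/devices/system/node/node1/meminfo`) field by field until it reaches the requested key (`HugePages_Surp` here), then echoing its value. A minimal sketch of that scan, reconstructed from the xtrace rather than copied from the script (`get_meminfo_from` is a hypothetical helper name):

```shell
#!/usr/bin/env bash
# Sketch of the field scan visible in the trace: read each "Key: value kB"
# line with IFS=': ', skip non-matching keys, echo the value on a match.
get_meminfo_from() {
    local get=$1 file=$2 var val _
    while IFS=': ' read -r var val _; do
        [[ $var == "$get" ]] || continue
        echo "$val"
        return 0
    done < "$file"
    return 1
}

# Demo against a small sample in the same format as the per-node
# meminfo dump shown in the log:
sample=$(mktemp)
printf '%s\n' 'HugePages_Total: 512' 'HugePages_Free: 512' \
    'HugePages_Surp: 0' > "$sample"
get_meminfo_from HugePages_Surp "$sample"   # prints 0
rm -f "$sample"
```

The real helper additionally strips the `Node N ` prefix that the sysfs file carries on every line, which is what the `mem=("${mem[@]#Node +([0-9]) }")` record in the trace does before the scan starts.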
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.854 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.854 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.854 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:54.854 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.854 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.854 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.854 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:54.854 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.854 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.854 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.854 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:54.854 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.854 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.854 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.854 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:54.854 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.854 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.854 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.854 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mapped == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:54.854 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.854 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.854 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.854 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:54.854 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.854 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.854 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.854 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:54.854 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.854 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.854 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.854 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:54.854 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.854 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.854 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.854 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:54.854 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.854 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.854 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.854 05:35:09 
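Editor's note: the backslash-heavy patterns such as `\H\u\g\e\P\a\g\e\s\_\S\u\r\p` are not in the script source; they are how bash's xtrace prints an unquoted literal pattern on the right-hand side of `[[ ... == ... ]]`, escaping each character to show it was not quoted. A small illustration (not taken from `setup/common.sh`):

```shell
#!/usr/bin/env bash
# Illustration only: with xtrace on, the trace line bash emits for this
# comparison resembles the records above, with the pattern word escaped
# character by character. The comparison itself is a plain literal match.
set -x
var=HugePages_Surp
[[ $var == HugePages_Surp ]] && echo match
set +x
```

The trace output goes to stderr, which is why it interleaves with the test's stdout (`node0=512 expecting 512` and similar) throughout this log.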
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:54.854 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.854 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.854 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.854 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:54.854 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.854 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.854 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.854 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:54.854 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.854 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.854 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.854 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:54.854 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.854 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.854 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.854 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:54.854 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.854 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.854 05:35:09 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.854 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:54.854 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.854 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.854 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.854 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:54.854 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.854 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.854 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.854 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:54.854 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.854 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.854 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.854 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:54.854 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.854 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.854 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.854 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:54.854 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.854 05:35:09 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.854 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.854 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:54.854 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.854 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.854 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.854 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:54.854 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.854 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.854 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.854 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:54.854 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.854 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.854 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.854 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:54.854 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.854 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.854 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.854 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:54.854 05:35:09 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.854 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.854 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.854 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:54.854 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:06:54.854 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:54.854 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:54.854 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:54.854 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0 00:06:54.854 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:06:54.855 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:06:54.855 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:06:54.855 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:06:54.855 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:06:54.855 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512' 00:06:54.855 node0=512 expecting 512 00:06:54.855 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:06:54.855 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:06:54.855 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:06:54.855 05:35:09 
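Editor's note: the `sorted_t[nodes_test[node]]=1` assignments above use array indices as a set. If every node ended up with the same hugepage count, the set collapses to a single entry, which is what the `node0=512 expecting 512` / `node1=512 expecting 512` echoes reflect. A hedged sketch of that idiom (variable names mirror the trace; the surrounding verification is simplified):

```shell
#!/usr/bin/env bash
# Collect each node's page count as an array index; duplicate counts
# land on the same index, so a single-element array means all equal.
declare -a sorted_t=()
nodes_test=(512 512)
for node in "${!nodes_test[@]}"; do
    sorted_t[nodes_test[node]]=1
    echo "node$node=${nodes_test[node]} expecting ${nodes_test[node]}"
done
(( ${#sorted_t[@]} == 1 )) && echo 'all nodes equal'
```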
setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@128 -- # echo 'node1=512 expecting 512' 00:06:54.855 node1=512 expecting 512 00:06:54.855 05:35:09 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@130 -- # [[ 512 == \5\1\2 ]] 00:06:54.855 00:06:54.855 real 0m3.748s 00:06:54.855 user 0m1.404s 00:06:54.855 sys 0m2.415s 00:06:54.855 05:35:09 setup.sh.hugepages.even_2G_alloc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:54.855 05:35:09 setup.sh.hugepages.even_2G_alloc -- common/autotest_common.sh@10 -- # set +x 00:06:54.855 ************************************ 00:06:54.855 END TEST even_2G_alloc 00:06:54.855 ************************************ 00:06:54.855 05:35:09 setup.sh.hugepages -- common/autotest_common.sh@1142 -- # return 0 00:06:54.855 05:35:09 setup.sh.hugepages -- setup/hugepages.sh@213 -- # run_test odd_alloc odd_alloc 00:06:54.855 05:35:09 setup.sh.hugepages -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:54.855 05:35:09 setup.sh.hugepages -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:54.855 05:35:09 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:06:54.855 ************************************ 00:06:54.855 START TEST odd_alloc 00:06:54.855 ************************************ 00:06:54.855 05:35:09 setup.sh.hugepages.odd_alloc -- common/autotest_common.sh@1123 -- # odd_alloc 00:06:54.855 05:35:09 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@159 -- # get_test_nr_hugepages 2098176 00:06:54.855 05:35:09 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@49 -- # local size=2098176 00:06:54.855 05:35:09 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:06:54.855 05:35:09 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:06:54.855 05:35:09 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=1025 00:06:54.855 05:35:09 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@58 -- # 
get_test_nr_hugepages_per_node 00:06:54.855 05:35:09 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@62 -- # user_nodes=() 00:06:54.855 05:35:09 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:06:54.855 05:35:09 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=1025 00:06:54.855 05:35:09 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:06:54.855 05:35:09 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:06:54.855 05:35:09 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:06:54.855 05:35:09 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:06:54.855 05:35:09 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@74 -- # (( 0 > 0 )) 00:06:54.855 05:35:09 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:06:54.855 05:35:09 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512 00:06:54.855 05:35:09 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@83 -- # : 513 00:06:54.855 05:35:09 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@84 -- # : 1 00:06:54.855 05:35:09 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:06:54.855 05:35:09 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=513 00:06:54.855 05:35:09 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@83 -- # : 0 00:06:54.855 05:35:09 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@84 -- # : 0 00:06:54.855 05:35:09 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:06:54.855 05:35:09 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@160 -- # HUGEMEM=2049 00:06:54.855 05:35:09 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@160 -- # HUGE_EVEN_ALLOC=yes 00:06:54.855 05:35:09 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@160 -- # setup output 00:06:54.855 05:35:09 
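Editor's note: the `nodes_test[_no_nodes - 1]=512` then `=513` records above show `get_test_nr_hugepages_per_node` dividing the odd total of 1025 pages across 2 NUMA nodes, with the remainder absorbed by the last node processed (node 0 gets 513, node 1 gets 512). A sketch of that split, reconstructed from the trace (`split_hugepages` is a hypothetical name, not the script's):

```shell
#!/usr/bin/env bash
# Divide nr pages over no_nodes nodes, filling from the highest node
# index down; each step takes an even share of what remains, so any
# remainder rolls forward into the earlier (lower-indexed) nodes.
split_hugepages() {
    local nr=$1 no_nodes=$2
    local -a nodes_test=()
    local per
    while (( no_nodes > 0 )); do
        per=$(( nr / no_nodes ))       # even share for this node
        nodes_test[no_nodes - 1]=$per
        nr=$(( nr - per ))             # remainder rolls forward
        no_nodes=$(( no_nodes - 1 ))
    done
    echo "${nodes_test[@]}"
}

split_hugepages 1025 2    # prints "513 512"
```

With an even total (the `even_2G_alloc` test above), the same loop yields equal counts on both nodes, matching the `node0=512` / `node1=512` expectations in the trace.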
setup.sh.hugepages.odd_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:06:54.855 05:35:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:06:58.143 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:06:58.144 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:06:58.144 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:06:58.144 0000:5e:00.0 (8086 0b60): Already using the vfio-pci driver 00:06:58.144 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:06:58.144 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:06:58.144 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:06:58.144 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:06:58.144 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:06:58.144 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:06:58.144 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:06:58.144 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:06:58.144 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:06:58.144 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:06:58.144 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:06:58.144 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:06:58.144 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:06:58.144 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:06:58.144 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:06:58.144 05:35:12 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@161 -- # verify_nr_hugepages 00:06:58.144 05:35:12 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@89 -- # local node 00:06:58.144 05:35:12 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@90 -- # local sorted_t 00:06:58.144 05:35:12 setup.sh.hugepages.odd_alloc -- 
setup/hugepages.sh@91 -- # local sorted_s 00:06:58.144 05:35:12 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@92 -- # local surp 00:06:58.144 05:35:12 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@93 -- # local resv 00:06:58.144 05:35:12 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@94 -- # local anon 00:06:58.144 05:35:12 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:06:58.144 05:35:12 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:06:58.144 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:06:58.144 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node= 00:06:58.144 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:06:58.144 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:06:58.144 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:06:58.144 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:06:58.144 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:06:58.144 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:06:58.144 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:06:58.144 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.144 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.144 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293484 kB' 'MemFree: 79055332 kB' 'MemAvailable: 82351200 kB' 'Buffers: 12176 kB' 'Cached: 9319572 kB' 'SwapCached: 0 kB' 'Active: 6400816 kB' 'Inactive: 3456348 kB' 'Active(anon): 6007180 kB' 'Inactive(anon): 0 kB' 'Active(file): 393636 kB' 'Inactive(file): 
3456348 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 528736 kB' 'Mapped: 171220 kB' 'Shmem: 5481764 kB' 'KReclaimable: 198608 kB' 'Slab: 513108 kB' 'SReclaimable: 198608 kB' 'SUnreclaim: 314500 kB' 'KernelStack: 16128 kB' 'PageTables: 7864 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53485744 kB' 'Committed_AS: 7430460 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200936 kB' 'VmallocChunk: 0 kB' 'Percpu: 52480 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 779684 kB' 'DirectMap2M: 10430464 kB' 'DirectMap1G: 90177536 kB' 00:06:58.144 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:58.144 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.144 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.144 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.144 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:58.144 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.144 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.144 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.144 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:58.144 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 
00:06:58.144 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.144 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.144 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:58.144 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.144 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.144 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.144 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:58.144 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.144 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.144 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.144 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:58.144 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.144 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.144 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.144 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:58.144 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.144 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.144 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.144 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:58.144 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.144 05:35:12 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.144 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.144 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:58.144 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.144 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.144 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.144 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:58.144 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.144 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.144 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.144 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:58.144 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.144 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.144 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.144 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:58.144 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.144 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.144 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.144 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:58.144 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.144 05:35:12 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.144 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.144 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:58.144 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.144 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.144 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.144 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:58.144 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.144 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.144 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.144 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:58.144 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.144 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.144 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.144 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:58.144 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.144 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.144 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.144 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:58.144 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.145 05:35:12 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:06:58.145 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.145 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:58.145 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.145 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.145 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.145 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:58.145 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.145 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.145 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.145 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:58.145 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.145 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.145 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.145 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:58.145 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.145 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.145 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.145 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:58.145 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.145 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 
00:06:58.145 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.145 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:58.145 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.145 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.145 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.145 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:58.145 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.145 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.145 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.145 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:58.145 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.145 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.145 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.145 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:58.145 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.145 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.145 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.145 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:58.145 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.145 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.145 05:35:12 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.145 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:58.145 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.145 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.145 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.145 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:58.145 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.145 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.145 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.145 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:58.145 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.145 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.145 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.145 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:58.145 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.145 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.145 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.145 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:58.145 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.145 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.145 05:35:12 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.145 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:58.145 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.145 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.145 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.145 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:58.145 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.145 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.145 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.145 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:58.145 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.145 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.145 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.145 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:58.145 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.145 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.145 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.145 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:58.145 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.145 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.145 05:35:12 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.145 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:58.145 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.145 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.145 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.145 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:58.145 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.145 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.145 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.145 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:58.145 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:06:58.145 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:06:58.145 05:35:12 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@97 -- # anon=0 00:06:58.145 05:35:12 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:06:58.145 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:06:58.145 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node= 00:06:58.145 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:06:58.145 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:06:58.145 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:06:58.145 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:06:58.145 05:35:12 
setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:06:58.145 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:06:58.145 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:06:58.145 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.145 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.146 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293484 kB' 'MemFree: 79055884 kB' 'MemAvailable: 82351752 kB' 'Buffers: 12176 kB' 'Cached: 9319592 kB' 'SwapCached: 0 kB' 'Active: 6400180 kB' 'Inactive: 3456348 kB' 'Active(anon): 6006544 kB' 'Inactive(anon): 0 kB' 'Active(file): 393636 kB' 'Inactive(file): 3456348 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 528044 kB' 'Mapped: 171116 kB' 'Shmem: 5481784 kB' 'KReclaimable: 198608 kB' 'Slab: 513092 kB' 'SReclaimable: 198608 kB' 'SUnreclaim: 314484 kB' 'KernelStack: 16112 kB' 'PageTables: 7804 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53485744 kB' 'Committed_AS: 7430476 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200920 kB' 'VmallocChunk: 0 kB' 'Percpu: 52480 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 779684 kB' 'DirectMap2M: 10430464 kB' 'DirectMap1G: 90177536 kB' 00:06:58.146 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:58.146 05:35:12 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@32 -- # continue 00:06:58.146 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.146 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.146 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:58.146 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.146 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.146 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.146 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:58.146 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.146 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.146 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.146 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:58.146 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.146 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.146 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.146 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:58.146 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.146 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.146 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.146 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:58.146 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- 
# continue 00:06:58.146 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.146 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.146 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:58.146 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.146 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.146 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.146 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:58.146 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.146 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.146 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.146 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:58.146 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.146 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.146 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.146 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:58.146 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.146 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.146 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.146 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:58.146 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 
00:06:58.146 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.146 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.146 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:58.146 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.146 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.146 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.146 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:58.146 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.146 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.146 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.146 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:58.146 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.146 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.146 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.146 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:58.146 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.146 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.146 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.146 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:58.146 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.146 
05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.146 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.146 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:58.146 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.146 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.146 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.146 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:58.146 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.146 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.146 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.146 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:58.146 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.146 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.146 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.146 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:58.146 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.146 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.146 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.146 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:58.146 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.146 05:35:12 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.146 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.146 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:58.146 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.146 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.146 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.146 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:58.146 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.146 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.146 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.146 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:58.146 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.146 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.146 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.146 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:58.146 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.146 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.146 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.146 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:58.146 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.146 05:35:12 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.146 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.146 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:58.146 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.146 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.146 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.146 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:58.146 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.146 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.146 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.146 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:58.146 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.146 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.146 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.147 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:58.147 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.147 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.147 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.147 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:58.147 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.147 05:35:12 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.147 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.147 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:58.147 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.147 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.147 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.147 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:58.147 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.147 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.147 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.147 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:58.147 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.147 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.147 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.147 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:58.147 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.147 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.147 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.147 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:58.147 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.147 05:35:12 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.147 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.147 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:58.147 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.147 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.147 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.147 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:58.147 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.147 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.147 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.147 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:58.147 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.147 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.147 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.147 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:58.147 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.147 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.147 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.147 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:58.147 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.147 05:35:12 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.147 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.147 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:58.147 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.147 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.147 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.147 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:58.147 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.147 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.147 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.147 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:58.147 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.147 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.147 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.147 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:58.147 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.147 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.147 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.147 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:58.147 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.147 05:35:12 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.147 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.147 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:58.147 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.147 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.147 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.147 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:58.147 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.147 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.147 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.147 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:58.147 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.147 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.147 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.147 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:58.147 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.147 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.147 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.147 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:58.147 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.147 05:35:12 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.147 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.147 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:58.147 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:06:58.147 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:06:58.147 05:35:12 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@99 -- # surp=0 00:06:58.147 05:35:12 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:06:58.147 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:06:58.147 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node= 00:06:58.147 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:06:58.147 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:06:58.147 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:06:58.147 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:06:58.147 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:06:58.147 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:06:58.147 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:06:58.147 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.147 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.148 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293484 kB' 'MemFree: 79055884 kB' 'MemAvailable: 82351752 kB' 'Buffers: 12176 kB' 'Cached: 9319592 kB' 'SwapCached: 0 kB' 'Active: 6400552 kB' 'Inactive: 
3456348 kB' 'Active(anon): 6006916 kB' 'Inactive(anon): 0 kB' 'Active(file): 393636 kB' 'Inactive(file): 3456348 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 528456 kB' 'Mapped: 171116 kB' 'Shmem: 5481784 kB' 'KReclaimable: 198608 kB' 'Slab: 513092 kB' 'SReclaimable: 198608 kB' 'SUnreclaim: 314484 kB' 'KernelStack: 16128 kB' 'PageTables: 7860 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53485744 kB' 'Committed_AS: 7430496 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200920 kB' 'VmallocChunk: 0 kB' 'Percpu: 52480 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 779684 kB' 'DirectMap2M: 10430464 kB' 'DirectMap1G: 90177536 kB' 00:06:58.148 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:58.148 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.148 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.148 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.148 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:58.148 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.148 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.148 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.148 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemAvailable == 
\H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:58.148 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.148 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.148 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.148 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:58.148 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.148 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.148 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.148 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:58.148 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.148 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.148 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.148 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:58.148 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.148 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.148 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.148 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:58.148 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.148 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.148 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.148 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == 
\H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:58.148 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.148 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.148 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.148 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:58.148 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.148 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.148 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.148 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:58.148 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.148 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.148 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.148 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:58.148 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.148 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.148 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.148 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:58.148 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.148 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.148 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.148 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == 
\H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:58.148 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.148 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.148 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.148 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:58.148 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.148 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.148 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.148 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:58.148 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.148 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.148 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.148 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:58.148 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.148 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.148 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.148 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:58.148 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.148 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.148 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.148 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswapped == 
\H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:58.148 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.148 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.148 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.148 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:58.148 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.148 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.148 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.148 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:58.148 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.148 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.148 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.148 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:58.148 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.148 05:35:12 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.148 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.148 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:58.148 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.148 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.148 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.148 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d 
]] 00:06:58.148 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.148 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.148 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.148 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:58.149 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.149 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.149 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.149 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:58.149 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.149 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.149 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.149 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:58.149 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.149 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.149 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.149 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:58.149 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.149 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.149 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.149 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:58.149 
05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.149 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.149 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.149 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:58.149 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.149 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.149 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.149 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:58.149 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.149 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.149 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.149 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:58.149 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.149 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.149 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.149 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:58.149 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.149 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.149 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.149 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:58.149 05:35:13 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.149 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.149 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.149 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:58.149 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.149 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.149 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.149 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:58.149 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.149 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.149 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.149 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:58.149 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.149 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.149 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.149 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:58.149 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.149 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.149 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.149 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:58.149 05:35:13 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.149 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.149 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.149 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:58.149 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.149 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.149 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.149 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:58.149 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.149 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.149 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.149 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:58.149 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.149 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.149 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.149 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:58.149 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.149 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.149 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.149 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:58.149 05:35:13 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.149 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.149 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.149 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:58.149 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.149 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.149 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.149 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:58.149 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.149 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.149 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.149 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:58.149 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.149 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.149 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.149 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:58.149 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.149 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.149 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.149 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:58.149 05:35:13 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.149 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.149 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.149 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:58.149 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.149 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.149 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.149 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:58.149 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.149 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.149 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.149 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:58.149 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:06:58.149 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:06:58.149 05:35:13 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@100 -- # resv=0 00:06:58.149 05:35:13 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1025 00:06:58.149 nr_hugepages=1025 00:06:58.149 05:35:13 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:06:58.149 resv_hugepages=0 00:06:58.149 05:35:13 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:06:58.149 surplus_hugepages=0 00:06:58.149 05:35:13 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:06:58.149 anon_hugepages=0 00:06:58.150 
05:35:13 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@107 -- # (( 1025 == nr_hugepages + surp + resv )) 00:06:58.150 05:35:13 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@109 -- # (( 1025 == nr_hugepages )) 00:06:58.150 05:35:13 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:06:58.150 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:06:58.150 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node= 00:06:58.150 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:06:58.150 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:06:58.150 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:06:58.150 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:06:58.150 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:06:58.150 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:06:58.150 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:06:58.150 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.150 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.150 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293484 kB' 'MemFree: 79056348 kB' 'MemAvailable: 82352216 kB' 'Buffers: 12176 kB' 'Cached: 9319632 kB' 'SwapCached: 0 kB' 'Active: 6400220 kB' 'Inactive: 3456348 kB' 'Active(anon): 6006584 kB' 'Inactive(anon): 0 kB' 'Active(file): 393636 kB' 'Inactive(file): 3456348 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 528044 kB' 'Mapped: 171116 kB' 'Shmem: 5481824 kB' 
'KReclaimable: 198608 kB' 'Slab: 513092 kB' 'SReclaimable: 198608 kB' 'SUnreclaim: 314484 kB' 'KernelStack: 16112 kB' 'PageTables: 7804 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53485744 kB' 'Committed_AS: 7430516 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200920 kB' 'VmallocChunk: 0 kB' 'Percpu: 52480 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 779684 kB' 'DirectMap2M: 10430464 kB' 'DirectMap1G: 90177536 kB' 00:06:58.150 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:58.150 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.150 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.150 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.150 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:58.150 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.150 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.150 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.150 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:58.150 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.150 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.150 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.150 05:35:13 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:58.150 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.150 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.150 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.150 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:58.150 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.150 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.150 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.150 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:58.150 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.150 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.150 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.150 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:58.150 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.150 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.150 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.150 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:58.150 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.150 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.150 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.150 05:35:13 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:58.150 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.150 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.150 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.150 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:58.150 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.150 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.150 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.150 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:58.150 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.150 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.150 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.150 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:58.150 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.150 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.150 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.150 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:58.150 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.150 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.150 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.150 
05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:58.150 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.150 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.150 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.150 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:58.150 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.150 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.150 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.150 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:58.150 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.150 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.150 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.150 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:58.150 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.150 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.150 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.150 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:58.150 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.150 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.150 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.150 05:35:13 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:58.150 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.150 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.150 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.150 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:58.150 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.150 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.150 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.150 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:58.150 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.150 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.150 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.150 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:58.150 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.150 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.150 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.150 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:58.150 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.150 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.151 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.151 05:35:13 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:58.151 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.151 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.151 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.151 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:58.151 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.151 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.151 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.151 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:58.151 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.151 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.151 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.151 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:58.151 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.151 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.151 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.151 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:58.151 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.151 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.151 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.151 05:35:13 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:58.151 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.151 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.151 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.151 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:58.151 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.151 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.151 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.151 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:58.151 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.151 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.151 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.151 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:58.151 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.151 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.151 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.151 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:58.151 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.151 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.151 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.151 05:35:13 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:58.151 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.151 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.151 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.151 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:58.151 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.151 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.151 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.413 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:58.413 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.413 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.413 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.413 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:58.413 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.413 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.413 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.413 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:58.413 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.413 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.413 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.413 
05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:58.413 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.413 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.413 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.413 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:58.413 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.413 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.413 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.413 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:58.413 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.413 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.413 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.413 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:58.413 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.413 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.413 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.413 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:58.413 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.413 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.413 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:06:58.413 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:58.413 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.413 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.413 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.413 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:58.413 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.413 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.413 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.413 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:58.413 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.413 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.413 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.413 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:58.413 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.413 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.413 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.413 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:58.413 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.413 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.413 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:06:58.413 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:58.413 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 1025 00:06:58.413 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:06:58.413 05:35:13 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@110 -- # (( 1025 == nr_hugepages + surp + resv )) 00:06:58.413 05:35:13 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@112 -- # get_nodes 00:06:58.413 05:35:13 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@27 -- # local node 00:06:58.413 05:35:13 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:06:58.413 05:35:13 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:06:58.413 05:35:13 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:06:58.413 05:35:13 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=513 00:06:58.413 05:35:13 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@32 -- # no_nodes=2 00:06:58.413 05:35:13 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:06:58.413 05:35:13 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:06:58.413 05:35:13 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:06:58.413 05:35:13 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:06:58.413 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:06:58.413 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node=0 00:06:58.413 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:06:58.413 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local 
mem_f mem 00:06:58.413 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:06:58.413 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:06:58.413 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:06:58.413 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:06:58.413 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:06:58.413 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.413 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.413 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48069884 kB' 'MemFree: 37470224 kB' 'MemUsed: 10599660 kB' 'SwapCached: 0 kB' 'Active: 5467636 kB' 'Inactive: 3267584 kB' 'Active(anon): 5251636 kB' 'Inactive(anon): 0 kB' 'Active(file): 216000 kB' 'Inactive(file): 3267584 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 8373432 kB' 'Mapped: 144416 kB' 'AnonPages: 364968 kB' 'Shmem: 4889848 kB' 'KernelStack: 10200 kB' 'PageTables: 5120 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 142964 kB' 'Slab: 341404 kB' 'SReclaimable: 142964 kB' 'SUnreclaim: 198440 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:06:58.413 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:58.413 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.413 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.413 05:35:13 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:06:58.413 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:58.413 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.413 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.413 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.413 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:58.413 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.413 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.413 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.413 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:58.413 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.413 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.413 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.413 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:58.414 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.414 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.414 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.414 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:58.414 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.414 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.414 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # 
read -r var val _ 00:06:58.414 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:58.414 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.414 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.414 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.414 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:58.414 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.414 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.414 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.414 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:58.414 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.414 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.414 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.414 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:58.414 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.414 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.414 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.414 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:58.414 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.414 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.414 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # 
read -r var val _ 00:06:58.414 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:58.414 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.414 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.414 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.414 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:58.414 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.414 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.414 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.414 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:58.414 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.414 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.414 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.414 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:58.414 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.414 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.414 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.414 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:58.414 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.414 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.414 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:06:58.414 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:58.414 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.414 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.414 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.414 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:58.414 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.414 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.414 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.414 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:58.414 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.414 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.414 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.414 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:58.414 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.414 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.414 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.414 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:58.414 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.414 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.414 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.414 
05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:58.414 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.414 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.414 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.414 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:58.414 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.414 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.414 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.414 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:58.414 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.414 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.414 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.414 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:58.414 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.414 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.414 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.414 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:58.414 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.414 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.414 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.414 05:35:13 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:58.414 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.414 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.414 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.414 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:58.414 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.414 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.414 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.414 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:58.414 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.414 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.414 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.414 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:58.414 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.414 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.414 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.414 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:58.414 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.415 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.415 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.415 05:35:13 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:58.415 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.415 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.415 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.415 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:58.415 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.415 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.415 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.415 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:58.415 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.415 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.415 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.415 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:58.415 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.415 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.415 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.415 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:58.415 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.415 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.415 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.415 05:35:13 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:58.415 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:06:58.415 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:06:58.415 05:35:13 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:06:58.415 05:35:13 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:06:58.415 05:35:13 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:06:58.415 05:35:13 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1 00:06:58.415 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:06:58.415 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node=1 00:06:58.415 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:06:58.415 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:06:58.415 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:06:58.415 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:06:58.415 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:06:58.415 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:06:58.415 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:06:58.415 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.415 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 44223600 kB' 'MemFree: 41584136 kB' 'MemUsed: 2639464 kB' 'SwapCached: 0 kB' 'Active: 932632 kB' 'Inactive: 188764 kB' 'Active(anon): 754996 kB' 
'Inactive(anon): 0 kB' 'Active(file): 177636 kB' 'Inactive(file): 188764 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 958412 kB' 'Mapped: 26700 kB' 'AnonPages: 163080 kB' 'Shmem: 592012 kB' 'KernelStack: 5912 kB' 'PageTables: 2684 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 55644 kB' 'Slab: 171688 kB' 'SReclaimable: 55644 kB' 'SUnreclaim: 116044 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 513' 'HugePages_Free: 513' 'HugePages_Surp: 0' 00:06:58.415 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.415 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:58.415 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.415 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.415 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.415 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:58.415 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.415 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.415 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.415 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:58.415 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.415 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.415 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.415 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # 
[[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:58.415 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.415 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.415 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.415 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:58.415 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.415 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.415 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.415 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:58.415 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.415 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.415 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.415 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:58.415 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.415 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.415 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.415 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:58.415 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.415 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.415 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.415 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ 
Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:58.415 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.415 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.415 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.415 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:58.415 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.415 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.415 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.415 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:58.415 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.415 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.415 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.415 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:58.415 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.415 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.415 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.415 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:58.415 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.415 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.415 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.415 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:58.415 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.415 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.415 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.415 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:58.415 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.415 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.415 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.415 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:58.415 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.415 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.415 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.415 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:58.415 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.415 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.415 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.416 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:58.416 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.416 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.416 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.416 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:58.416 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.416 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.416 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.416 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:58.416 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.416 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.416 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.416 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:58.416 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.416 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.416 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.416 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:58.416 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.416 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.416 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.416 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:58.416 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.416 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.416 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.416 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:58.416 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.416 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.416 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.416 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:58.416 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.416 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.416 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.416 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:58.416 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.416 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.416 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.416 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:58.416 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.416 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.416 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.416 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:58.416 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.416 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.416 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.416 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:58.416 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.416 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.416 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.416 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:58.416 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.416 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.416 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.416 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:58.416 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.416 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.416 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.416 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:58.416 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.416 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.416 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.416 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:58.416 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.416 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.416 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.416 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unaccepted == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:58.416 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.416 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.416 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.416 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:58.416 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.416 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.416 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.416 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:58.416 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:06:58.416 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:58.416 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:58.416 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:58.416 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:06:58.416 05:35:13 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:06:58.416 05:35:13 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:06:58.416 05:35:13 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:06:58.416 05:35:13 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:06:58.416 05:35:13 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:06:58.416 05:35:13 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 513' 00:06:58.416 node0=512 expecting 
513 00:06:58.416 05:35:13 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:06:58.416 05:35:13 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:06:58.416 05:35:13 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:06:58.416 05:35:13 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@128 -- # echo 'node1=513 expecting 512' 00:06:58.416 node1=513 expecting 512 00:06:58.416 05:35:13 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@130 -- # [[ 512 513 == \5\1\2\ \5\1\3 ]] 00:06:58.416 00:06:58.416 real 0m3.523s 00:06:58.416 user 0m1.254s 00:06:58.416 sys 0m2.318s 00:06:58.416 05:35:13 setup.sh.hugepages.odd_alloc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:58.416 05:35:13 setup.sh.hugepages.odd_alloc -- common/autotest_common.sh@10 -- # set +x 00:06:58.416 ************************************ 00:06:58.416 END TEST odd_alloc 00:06:58.416 ************************************ 00:06:58.416 05:35:13 setup.sh.hugepages -- common/autotest_common.sh@1142 -- # return 0 00:06:58.416 05:35:13 setup.sh.hugepages -- setup/hugepages.sh@214 -- # run_test custom_alloc custom_alloc 00:06:58.416 05:35:13 setup.sh.hugepages -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:58.416 05:35:13 setup.sh.hugepages -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:58.416 05:35:13 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:06:58.416 ************************************ 00:06:58.416 START TEST custom_alloc 00:06:58.416 ************************************ 00:06:58.416 05:35:13 setup.sh.hugepages.custom_alloc -- common/autotest_common.sh@1123 -- # custom_alloc 00:06:58.416 05:35:13 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@167 -- # local IFS=, 00:06:58.417 05:35:13 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@169 -- # local node 00:06:58.417 05:35:13 setup.sh.hugepages.custom_alloc -- 
setup/hugepages.sh@170 -- # nodes_hp=() 00:06:58.417 05:35:13 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@170 -- # local nodes_hp 00:06:58.417 05:35:13 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@172 -- # local nr_hugepages=0 _nr_hugepages=0 00:06:58.417 05:35:13 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@174 -- # get_test_nr_hugepages 1048576 00:06:58.417 05:35:13 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@49 -- # local size=1048576 00:06:58.417 05:35:13 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:06:58.417 05:35:13 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:06:58.417 05:35:13 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=512 00:06:58.417 05:35:13 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:06:58.417 05:35:13 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # user_nodes=() 00:06:58.417 05:35:13 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:06:58.417 05:35:13 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=512 00:06:58.417 05:35:13 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:06:58.417 05:35:13 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:06:58.417 05:35:13 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:06:58.417 05:35:13 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:06:58.417 05:35:13 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@74 -- # (( 0 > 0 )) 00:06:58.417 05:35:13 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:06:58.417 05:35:13 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=256 00:06:58.417 05:35:13 setup.sh.hugepages.custom_alloc -- 
setup/hugepages.sh@83 -- # : 256 00:06:58.417 05:35:13 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@84 -- # : 1 00:06:58.417 05:35:13 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:06:58.417 05:35:13 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=256 00:06:58.417 05:35:13 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@83 -- # : 0 00:06:58.417 05:35:13 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@84 -- # : 0 00:06:58.417 05:35:13 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:06:58.417 05:35:13 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@175 -- # nodes_hp[0]=512 00:06:58.417 05:35:13 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@176 -- # (( 2 > 1 )) 00:06:58.417 05:35:13 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@177 -- # get_test_nr_hugepages 2097152 00:06:58.417 05:35:13 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@49 -- # local size=2097152 00:06:58.417 05:35:13 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:06:58.417 05:35:13 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:06:58.417 05:35:13 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:06:58.417 05:35:13 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:06:58.417 05:35:13 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # user_nodes=() 00:06:58.417 05:35:13 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:06:58.417 05:35:13 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:06:58.417 05:35:13 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:06:58.417 05:35:13 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:06:58.417 05:35:13 
setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:06:58.417 05:35:13 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:06:58.417 05:35:13 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@74 -- # (( 1 > 0 )) 00:06:58.417 05:35:13 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}" 00:06:58.417 05:35:13 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=512 00:06:58.417 05:35:13 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@78 -- # return 0 00:06:58.417 05:35:13 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@178 -- # nodes_hp[1]=1024 00:06:58.417 05:35:13 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@181 -- # for node in "${!nodes_hp[@]}" 00:06:58.417 05:35:13 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@182 -- # HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}") 00:06:58.417 05:35:13 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@183 -- # (( _nr_hugepages += nodes_hp[node] )) 00:06:58.417 05:35:13 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@181 -- # for node in "${!nodes_hp[@]}" 00:06:58.417 05:35:13 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@182 -- # HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}") 00:06:58.417 05:35:13 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@183 -- # (( _nr_hugepages += nodes_hp[node] )) 00:06:58.417 05:35:13 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@186 -- # get_test_nr_hugepages_per_node 00:06:58.417 05:35:13 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # user_nodes=() 00:06:58.417 05:35:13 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:06:58.417 05:35:13 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:06:58.417 05:35:13 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:06:58.417 05:35:13 
setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:06:58.417 05:35:13 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:06:58.417 05:35:13 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:06:58.417 05:35:13 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@74 -- # (( 2 > 0 )) 00:06:58.417 05:35:13 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}" 00:06:58.417 05:35:13 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=512 00:06:58.417 05:35:13 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}" 00:06:58.417 05:35:13 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=1024 00:06:58.417 05:35:13 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@78 -- # return 0 00:06:58.417 05:35:13 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@187 -- # HUGENODE='nodes_hp[0]=512,nodes_hp[1]=1024' 00:06:58.417 05:35:13 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@187 -- # setup output 00:06:58.417 05:35:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:06:58.417 05:35:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:07:02.616 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:07:02.617 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:07:02.617 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:07:02.617 0000:5e:00.0 (8086 0b60): Already using the vfio-pci driver 00:07:02.617 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:07:02.617 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:07:02.617 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:07:02.617 0000:00:04.3 (8086 2021): Already using 
the vfio-pci driver 00:07:02.617 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:07:02.617 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:07:02.617 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:07:02.617 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:07:02.617 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:07:02.617 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:07:02.617 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:07:02.617 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:07:02.617 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:07:02.617 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:07:02.617 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:07:02.617 05:35:16 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@188 -- # nr_hugepages=1536 00:07:02.617 05:35:16 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@188 -- # verify_nr_hugepages 00:07:02.617 05:35:16 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@89 -- # local node 00:07:02.617 05:35:16 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@90 -- # local sorted_t 00:07:02.617 05:35:16 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@91 -- # local sorted_s 00:07:02.617 05:35:16 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@92 -- # local surp 00:07:02.617 05:35:16 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@93 -- # local resv 00:07:02.617 05:35:16 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@94 -- # local anon 00:07:02.617 05:35:16 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:07:02.617 05:35:16 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:07:02.617 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:07:02.617 05:35:16 
setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node= 00:07:02.617 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:07:02.617 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:07:02.617 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:07:02.617 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:07:02.617 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:07:02.617 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:07:02.617 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:07:02.617 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.617 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.617 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293484 kB' 'MemFree: 77978632 kB' 'MemAvailable: 81274500 kB' 'Buffers: 12176 kB' 'Cached: 9319720 kB' 'SwapCached: 0 kB' 'Active: 6401492 kB' 'Inactive: 3456348 kB' 'Active(anon): 6007856 kB' 'Inactive(anon): 0 kB' 'Active(file): 393636 kB' 'Inactive(file): 3456348 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 529140 kB' 'Mapped: 171164 kB' 'Shmem: 5481912 kB' 'KReclaimable: 198608 kB' 'Slab: 513240 kB' 'SReclaimable: 198608 kB' 'SUnreclaim: 314632 kB' 'KernelStack: 16176 kB' 'PageTables: 8524 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 52962480 kB' 'Committed_AS: 7430996 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200968 kB' 'VmallocChunk: 0 kB' 'Percpu: 52480 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 
'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 779684 kB' 'DirectMap2M: 10430464 kB' 'DirectMap1G: 90177536 kB' 00:07:02.617 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:02.617 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.617 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.617 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.617 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:02.617 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.617 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.617 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.617 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:02.617 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.617 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.617 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.617 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:02.617 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.617 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.617 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.617 05:35:16 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:02.617 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.617 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.617 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.617 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:02.617 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.617 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.617 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.617 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:02.617 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.617 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.617 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.617 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:02.617 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.617 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.617 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.617 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:02.617 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.617 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.617 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var 
val _ 00:07:02.617 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:02.617 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.617 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.617 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.617 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:02.617 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.617 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.617 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.617 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:02.617 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.617 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.617 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.617 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:02.617 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.617 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.617 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.617 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:02.617 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.617 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.617 05:35:16 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.617 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:02.617 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.617 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.617 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.617 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:02.617 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.617 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.617 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.617 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:02.617 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.618 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.618 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.618 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:02.618 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.618 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.618 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.618 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:02.618 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.618 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 
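The wall of `[[ <key> == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]` / `continue` lines in this trace is `get_meminfo` scanning every `/proc/meminfo` key in turn until it reaches the one it was asked for. A minimal stand-alone sketch of that lookup, with a fabricated two-value sample standing in for the live `/proc/meminfo` (so it runs on any machine), might look like:

```shell
#!/bin/sh
# Hedged sketch of the key scan seen in the trace: read "Key: value"
# pairs and print the value for the requested key. The sample input
# below is a made-up excerpt, not this host's real /proc/meminfo.
sample='MemTotal: 92293484 kB
AnonHugePages: 0 kB
HugePages_Surp: 0'

get_meminfo() {
    printf '%s\n' "$sample" | while IFS=': ' read -r var val _; do
        # Skip every key that is not the one we want, mirroring the
        # repeated "continue" entries in the trace.
        [ "$var" = "$1" ] || continue
        printf '%s\n' "$val"
    done
}

get_meminfo AnonHugePages
```

The `IFS=': '` trick splits each line on the colon and following space in one `read`, which is why the trace shows `read -r var val _` rather than any explicit parsing.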
00:07:02.618 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.618 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:02.618 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.618 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.618 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.618 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:02.618 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.618 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.618 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.618 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:02.618 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.618 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.618 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.618 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:02.618 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.618 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.618 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.618 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:02.618 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.618 05:35:16 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:07:02.618 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.618 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:02.618 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.618 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.618 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.618 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:02.618 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.618 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.618 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.618 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:02.618 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.618 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.618 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.618 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:02.618 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.618 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.618 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.618 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:02.618 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.618 05:35:16 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.618 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.618 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:02.618 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.618 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.618 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.618 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:02.618 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.618 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.618 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.618 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:02.618 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.618 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.618 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.618 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:02.618 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.618 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.618 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.618 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:02.618 05:35:16 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@32 -- # continue 00:07:02.618 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.618 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.618 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:02.618 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.618 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.618 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.618 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:02.618 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.618 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.618 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.618 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:02.618 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.618 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.618 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.618 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:02.618 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.618 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.618 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.618 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:02.618 
05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.618 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.618 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.618 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:02.618 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.618 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.618 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.618 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:02.618 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0 00:07:02.618 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:07:02.618 05:35:16 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@97 -- # anon=0 00:07:02.618 05:35:16 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:07:02.618 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:07:02.618 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node= 00:07:02.618 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:07:02.618 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:07:02.618 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:07:02.618 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:07:02.618 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:07:02.618 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 
00:07:02.618 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:07:02.618 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.618 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.619 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293484 kB' 'MemFree: 77979136 kB' 'MemAvailable: 81275004 kB' 'Buffers: 12176 kB' 'Cached: 9319724 kB' 'SwapCached: 0 kB' 'Active: 6401072 kB' 'Inactive: 3456348 kB' 'Active(anon): 6007436 kB' 'Inactive(anon): 0 kB' 'Active(file): 393636 kB' 'Inactive(file): 3456348 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 528816 kB' 'Mapped: 171132 kB' 'Shmem: 5481916 kB' 'KReclaimable: 198608 kB' 'Slab: 513196 kB' 'SReclaimable: 198608 kB' 'SUnreclaim: 314588 kB' 'KernelStack: 16128 kB' 'PageTables: 7856 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 52962480 kB' 'Committed_AS: 7431012 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200936 kB' 'VmallocChunk: 0 kB' 'Percpu: 52480 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 779684 kB' 'DirectMap2M: 10430464 kB' 'DirectMap1G: 90177536 kB' 00:07:02.619 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:02.619 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.619 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.619 05:35:16 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.619 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:02.619 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.619 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.619 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.619 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:02.619 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.619 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.619 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.619 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:02.619 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.619 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.619 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.619 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:02.619 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.619 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.619 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.619 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:02.619 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.619 05:35:16 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:07:02.619 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.619 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:02.619 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.619 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.619 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.619 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:02.619 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.619 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.619 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.619 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:02.619 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.619 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.619 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.619 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:02.619 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.619 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.619 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.619 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:02.619 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 
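Both meminfo snapshots dumped in this trace report `HugePages_Total: 1536`, `HugePages_Free: 1536`, `Hugepagesize: 2048 kB` and `Hugetlb: 3145728 kB`, matching the `nr_hugepages=1536` the test requested. A quick consistency check over those reported values (numbers copied from the dumps; the arithmetic is done locally, nothing is read from the host):

```shell
#!/bin/sh
# Cross-check the hugepage accounting from the meminfo snapshots:
# total pages times page size should equal the Hugetlb figure, and
# every page should still be free before the allocation test runs.
hugepages_total=1536
hugepages_free=1536
hugepagesize_kb=2048
hugetlb_kb=3145728

[ $((hugepages_total * hugepagesize_kb)) -eq "$hugetlb_kb" ] &&
    [ "$hugepages_free" -eq "$hugepages_total" ] &&
    echo "hugepage accounting consistent"
```

This is the same invariant `verify_nr_hugepages` is checking in a roundabout way when it pulls `HugePages_Surp` and the per-node counters out of meminfo.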
00:07:02.619 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.619 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.619 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:02.619 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.619 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.619 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.619 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:02.619 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.619 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.619 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.619 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:02.619 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.619 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.619 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.619 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:02.619 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.619 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.619 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.619 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:02.619 05:35:16 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.619 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.619 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.619 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:02.619 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.619 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.619 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.619 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:02.619 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.619 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.619 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.619 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:02.619 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.619 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.619 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.619 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:02.619 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.619 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.619 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.619 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonPages == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:02.619 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.619 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.619 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.619 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:02.619 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.619 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.619 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.619 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:02.619 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.619 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.619 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.619 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:02.619 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.619 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.619 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.619 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:02.619 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.619 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.619 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.619 05:35:16 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:02.619 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.619 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.619 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.619 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:02.619 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.619 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.619 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.619 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:02.619 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.619 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.620 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.620 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:02.620 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.620 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.620 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.620 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:02.620 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.620 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.620 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:07:02.620 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:02.620 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.620 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.620 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.620 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:02.620 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.620 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.620 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.620 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:02.620 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.620 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.620 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.620 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:02.620 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.620 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.620 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.620 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:02.620 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.620 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.620 05:35:16 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.620 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:02.620 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.620 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.620 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.620 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:02.620 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.620 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.620 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.620 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:02.620 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.620 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.620 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.620 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:02.620 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.620 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.620 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.620 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:02.620 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.620 05:35:16 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:07:02.620 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.620 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:02.620 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.620 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.620 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.620 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:02.620 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.620 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.620 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.620 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:02.620 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.620 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.620 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.620 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:02.620 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.620 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.620 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.620 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:02.620 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # 
continue 00:07:02.620 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.620 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.620 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:02.620 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.620 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.620 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.620 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:02.620 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.620 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.620 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.620 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:02.620 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.620 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.620 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.620 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:02.620 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.620 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.620 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.620 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:02.620 05:35:16 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.620 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.620 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.620 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:02.620 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.620 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.620 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.620 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:02.620 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0 00:07:02.620 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:07:02.620 05:35:16 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@99 -- # surp=0 00:07:02.620 05:35:16 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:07:02.620 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:07:02.620 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node= 00:07:02.620 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:07:02.620 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:07:02.620 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:07:02.620 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:07:02.620 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:07:02.620 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 
00:07:02.620 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:07:02.620 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.620 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.621 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293484 kB' 'MemFree: 77979136 kB' 'MemAvailable: 81275004 kB' 'Buffers: 12176 kB' 'Cached: 9319740 kB' 'SwapCached: 0 kB' 'Active: 6401080 kB' 'Inactive: 3456348 kB' 'Active(anon): 6007444 kB' 'Inactive(anon): 0 kB' 'Active(file): 393636 kB' 'Inactive(file): 3456348 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 528816 kB' 'Mapped: 171132 kB' 'Shmem: 5481932 kB' 'KReclaimable: 198608 kB' 'Slab: 513196 kB' 'SReclaimable: 198608 kB' 'SUnreclaim: 314588 kB' 'KernelStack: 16128 kB' 'PageTables: 7856 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 52962480 kB' 'Committed_AS: 7431032 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200936 kB' 'VmallocChunk: 0 kB' 'Percpu: 52480 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 779684 kB' 'DirectMap2M: 10430464 kB' 'DirectMap1G: 90177536 kB' 00:07:02.621 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:02.621 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.621 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.621 05:35:16 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.621 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:02.621 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.621 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.621 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.621 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:02.621 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.621 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.621 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.621 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:02.621 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.621 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.621 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.621 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:02.621 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.621 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.621 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.621 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:02.621 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.621 05:35:16 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:07:02.621 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.621 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:02.621 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.621 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.621 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.621 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:02.621 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.621 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.621 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.621 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:02.621 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.621 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.621 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.621 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:02.621 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.621 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.621 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.621 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:02.621 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 
00:07:02.621 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.621 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.621 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:02.621 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.621 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.621 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.621 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:02.621 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.621 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.621 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.621 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:02.621 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.621 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.621 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.621 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:02.621 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.621 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.621 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.621 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:02.621 05:35:16 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.621 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.621 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.621 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:02.621 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.621 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.621 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.621 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:02.621 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.621 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.621 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.621 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:02.621 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.621 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.621 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.621 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:02.621 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.621 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.621 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.621 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonPages == 
\H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:02.621 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.621 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.621 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.621 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:02.622 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.622 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.622 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.622 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:02.622 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.622 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.622 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.622 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:02.622 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.622 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.622 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.622 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:02.622 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.622 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.622 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.622 05:35:16 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:02.622 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.622 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.622 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.622 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:02.622 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.622 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.622 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.622 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:02.622 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.622 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.622 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.622 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:02.622 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.622 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.622 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.622 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:02.622 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.622 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.622 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:07:02.622 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:02.622 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.622 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.622 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.622 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:02.622 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.622 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.622 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.622 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:02.622 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.622 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.622 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.622 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:02.622 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.622 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.622 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.622 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:02.622 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.622 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.622 05:35:16 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.622 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:02.622 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.622 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.622 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.622 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:02.622 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.622 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.622 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.622 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:02.622 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.622 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.622 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.622 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:02.622 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.622 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.622 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.622 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:02.622 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.622 05:35:16 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:07:02.622 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.622 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:02.622 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.622 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.622 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.622 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:02.622 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.622 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.622 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.622 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:02.622 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.622 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.622 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.622 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:02.622 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.622 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.622 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.622 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:02.622 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # 
continue 00:07:02.622 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.622 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.622 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:02.622 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.622 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.622 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.622 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:02.622 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.622 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.622 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.622 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:02.622 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.622 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.622 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.622 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:02.622 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.622 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.622 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.622 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:02.622 05:35:16 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.622 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.622 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.622 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:02.622 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0 00:07:02.622 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:07:02.623 05:35:16 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@100 -- # resv=0 00:07:02.623 05:35:16 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1536 00:07:02.623 nr_hugepages=1536 00:07:02.623 05:35:16 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:07:02.623 resv_hugepages=0 00:07:02.623 05:35:16 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:07:02.623 surplus_hugepages=0 00:07:02.623 05:35:16 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:07:02.623 anon_hugepages=0 00:07:02.623 05:35:16 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@107 -- # (( 1536 == nr_hugepages + surp + resv )) 00:07:02.623 05:35:16 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@109 -- # (( 1536 == nr_hugepages )) 00:07:02.623 05:35:16 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:07:02.623 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:07:02.623 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node= 00:07:02.623 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:07:02.623 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:07:02.623 05:35:16 
setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:07:02.623 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:07:02.623 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:07:02.623 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:07:02.623 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:07:02.623 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.623 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.623 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293484 kB' 'MemFree: 77979136 kB' 'MemAvailable: 81275004 kB' 'Buffers: 12176 kB' 'Cached: 9319764 kB' 'SwapCached: 0 kB' 'Active: 6401120 kB' 'Inactive: 3456348 kB' 'Active(anon): 6007484 kB' 'Inactive(anon): 0 kB' 'Active(file): 393636 kB' 'Inactive(file): 3456348 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 528816 kB' 'Mapped: 171132 kB' 'Shmem: 5481956 kB' 'KReclaimable: 198608 kB' 'Slab: 513196 kB' 'SReclaimable: 198608 kB' 'SUnreclaim: 314588 kB' 'KernelStack: 16128 kB' 'PageTables: 7856 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 52962480 kB' 'Committed_AS: 7431056 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200936 kB' 'VmallocChunk: 0 kB' 'Percpu: 52480 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 
779684 kB' 'DirectMap2M: 10430464 kB' 'DirectMap1G: 90177536 kB' 00:07:02.623 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:02.623 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.623 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.623 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.623 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:02.623 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.623 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.623 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.623 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:02.623 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.623 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.623 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.623 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:02.623 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.623 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.623 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.623 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:02.623 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.623 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- 
# IFS=': ' 00:07:02.623 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.623 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:02.623 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.623 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.623 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.623 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:02.623 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.623 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.623 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.623 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:02.623 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.623 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.623 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.623 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:02.623 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.623 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.623 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.623 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:02.623 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.623 05:35:16 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.623 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.623 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:02.623 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.623 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.623 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.623 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:02.623 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.623 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.623 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.623 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:02.623 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.623 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.623 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.623 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:02.623 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.623 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.623 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.623 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:02.623 05:35:16 setup.sh.hugepages.custom_alloc 
-- setup/common.sh@32 -- # continue 00:07:02.623 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.623 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.623 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:02.623 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.623 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.623 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.623 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:02.623 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.623 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.623 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.623 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:02.623 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.623 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.623 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.623 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:02.623 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.623 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.623 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.623 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:02.623 
05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.623 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.623 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.623 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:02.623 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.624 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.624 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.624 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:02.624 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.624 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.624 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.624 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:02.624 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.624 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.624 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.624 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:02.624 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.624 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.624 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.624 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Slab == 
\H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:02.624 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.624 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.624 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.624 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:02.624 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.624 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.624 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.624 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:02.624 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.624 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.624 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.624 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:02.624 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.624 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.624 05:35:16 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.624 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:02.624 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.624 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.624 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.624 05:35:17 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:02.624 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.624 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.624 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.624 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:02.624 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.624 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.624 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.624 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:02.624 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.624 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.624 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.624 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:02.624 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.624 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.624 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.624 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:02.624 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.624 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.624 05:35:17 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:07:02.624 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:02.624 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.624 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.624 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.624 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:02.624 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.624 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.624 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.624 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:02.624 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.624 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.624 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.624 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:02.624 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.624 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.624 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.624 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:02.624 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.624 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': 
' 00:07:02.624 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.624 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:02.624 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.624 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.624 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.624 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:02.624 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.624 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.624 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.624 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:02.624 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.624 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.624 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.624 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:02.624 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.624 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.624 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.624 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:02.624 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 
00:07:02.624 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.624 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.624 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:02.624 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.624 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.624 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.624 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:02.624 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.624 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.624 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.624 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:02.624 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.624 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.624 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.624 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:02.624 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.624 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.624 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.624 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:02.624 05:35:17 
setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 1536 00:07:02.624 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:07:02.624 05:35:17 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@110 -- # (( 1536 == nr_hugepages + surp + resv )) 00:07:02.624 05:35:17 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@112 -- # get_nodes 00:07:02.624 05:35:17 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@27 -- # local node 00:07:02.624 05:35:17 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:07:02.624 05:35:17 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:07:02.625 05:35:17 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:07:02.625 05:35:17 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:07:02.625 05:35:17 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@32 -- # no_nodes=2 00:07:02.625 05:35:17 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:07:02.625 05:35:17 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:07:02.625 05:35:17 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:07:02.625 05:35:17 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:07:02.625 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:07:02.625 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node=0 00:07:02.625 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:07:02.625 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:07:02.625 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # 
mem_f=/proc/meminfo 00:07:02.625 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:07:02.625 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:07:02.625 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:07:02.625 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:07:02.625 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.625 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.625 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48069884 kB' 'MemFree: 37473096 kB' 'MemUsed: 10596788 kB' 'SwapCached: 0 kB' 'Active: 5469568 kB' 'Inactive: 3267584 kB' 'Active(anon): 5253568 kB' 'Inactive(anon): 0 kB' 'Active(file): 216000 kB' 'Inactive(file): 3267584 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 8373448 kB' 'Mapped: 144428 kB' 'AnonPages: 366932 kB' 'Shmem: 4889864 kB' 'KernelStack: 10232 kB' 'PageTables: 5268 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 142964 kB' 'Slab: 341424 kB' 'SReclaimable: 142964 kB' 'SUnreclaim: 198460 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:07:02.625 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:02.625 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.625 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.625 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.625 
05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:02.625 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.625 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.625 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.625 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:02.625 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.625 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.625 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.625 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:02.625 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.625 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.625 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.625 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:02.625 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.625 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.625 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.625 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:02.625 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.625 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.625 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- 
# read -r var val _ 00:07:02.625 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:02.625 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.625 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.625 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.625 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:02.625 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.625 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.625 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.625 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:02.625 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.625 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.625 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.625 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:02.625 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.625 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.625 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.625 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:02.625 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.625 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.625 
05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.625 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:02.625 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.625 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.625 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.625 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:02.625 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.625 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.625 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.625 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:02.625 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.625 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.625 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.625 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:02.625 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.625 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.625 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.625 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:02.625 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.625 05:35:17 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:07:02.625 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.625 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:02.625 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.625 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.626 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.626 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:02.626 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0 00:07:02.626 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:07:02.626 05:35:17 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:07:02.626 05:35:17 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:07:02.626 05:35:17 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:07:02.626 05:35:17 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1 00:07:02.626 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:07:02.626 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node=1 00:07:02.626 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:07:02.626 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:07:02.626 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:07:02.626 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:07:02.626 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:07:02.626 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:07:02.626 05:35:17
setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:07:02.626 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.626 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.626 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 44223600 kB' 'MemFree: 40504528 kB' 'MemUsed: 3719072 kB' 'SwapCached: 0 kB' 'Active: 931596 kB' 'Inactive: 188764 kB' 'Active(anon): 753960 kB' 'Inactive(anon): 0 kB' 'Active(file): 177636 kB' 'Inactive(file): 188764 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 958532 kB' 'Mapped: 26704 kB' 'AnonPages: 161884 kB' 'Shmem: 592132 kB' 'KernelStack: 5896 kB' 'PageTables: 2588 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 55644 kB' 'Slab: 171772 kB' 'SReclaimable: 55644 kB' 'SUnreclaim: 116128 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:07:02.626 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:02.626 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.626 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.626 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.626 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:02.626 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.626 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.626 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:07:02.626 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:02.626 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:07:02.626 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:02.628 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:02.628 05:35:17
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:02.628 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0 00:07:02.628 05:35:17 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:07:02.628 05:35:17 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:07:02.628 05:35:17 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:07:02.628 05:35:17 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:07:02.628 05:35:17 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:07:02.628 05:35:17 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512' 00:07:02.628 node0=512 expecting 512 00:07:02.628 05:35:17 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:07:02.628 05:35:17 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:07:02.628 05:35:17 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:07:02.628 05:35:17 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@128 -- # echo 'node1=1024 expecting 1024' 00:07:02.628 node1=1024 expecting 1024 00:07:02.628 05:35:17 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@130 -- # [[ 512,1024 == \5\1\2\,\1\0\2\4 ]] 00:07:02.628 00:07:02.628 real 0m3.871s 00:07:02.628 user 0m1.494s 00:07:02.628 sys 0m2.472s 00:07:02.628 05:35:17 setup.sh.hugepages.custom_alloc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:02.628 05:35:17 setup.sh.hugepages.custom_alloc -- common/autotest_common.sh@10 -- # set +x 00:07:02.628 ************************************ 00:07:02.628 END TEST custom_alloc 00:07:02.628 ************************************ 00:07:02.628 05:35:17 setup.sh.hugepages -- 
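The custom_alloc trace above repeatedly walks /proc/meminfo (or a per-node /sys/devices/system/node/nodeN/meminfo file) looking up a single key such as HugePages_Surp via get_meminfo. A minimal standalone sketch of that kind of lookup, not SPDK's actual setup/common.sh implementation; the explicit file-path argument here is an assumption added for testability, where the real helper derives the path from a node id:

```shell
#!/usr/bin/env bash
# Sketch of a per-node meminfo key lookup in the spirit of the
# get_meminfo calls traced above (illustrative, not SPDK's code).
get_meminfo() {
    local get=$1 mem_f=${2:-/proc/meminfo}
    # Per-node meminfo files prefix every key with "Node <id> ";
    # strip that prefix, then print the value column of the key.
    sed -E 's/^Node [0-9]+ //' "$mem_f" | awk -v k="$get:" '$1 == k { print $2 }'
}
```

On the node-1 state shown in the dump above, `get_meminfo HugePages_Surp /sys/devices/system/node/node1/meminfo` would print `0`.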
common/autotest_common.sh@1142 -- # return 0 00:07:02.628 05:35:17 setup.sh.hugepages -- setup/hugepages.sh@215 -- # run_test no_shrink_alloc no_shrink_alloc 00:07:02.628 05:35:17 setup.sh.hugepages -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:07:02.628 05:35:17 setup.sh.hugepages -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:02.628 05:35:17 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:07:02.628 ************************************ 00:07:02.628 START TEST no_shrink_alloc 00:07:02.628 ************************************ 00:07:02.628 05:35:17 setup.sh.hugepages.no_shrink_alloc -- common/autotest_common.sh@1123 -- # no_shrink_alloc 00:07:02.628 05:35:17 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@195 -- # get_test_nr_hugepages 2097152 0 00:07:02.628 05:35:17 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@49 -- # local size=2097152 00:07:02.628 05:35:17 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@50 -- # (( 2 > 1 )) 00:07:02.628 05:35:17 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@51 -- # shift 00:07:02.628 05:35:17 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@52 -- # node_ids=('0') 00:07:02.628 05:35:17 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@52 -- # local node_ids 00:07:02.628 05:35:17 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:07:02.628 05:35:17 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:07:02.628 05:35:17 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0 00:07:02.628 05:35:17 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@62 -- # user_nodes=('0') 00:07:02.628 05:35:17 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:07:02.628 05:35:17 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:07:02.628 05:35:17 
setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:07:02.628 05:35:17 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:07:02.628 05:35:17 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:07:02.628 05:35:17 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@69 -- # (( 1 > 0 )) 00:07:02.628 05:35:17 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}" 00:07:02.628 05:35:17 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=1024 00:07:02.628 05:35:17 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@73 -- # return 0 00:07:02.628 05:35:17 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@198 -- # setup output 00:07:02.628 05:35:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:07:02.628 05:35:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:07:05.912 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:07:05.912 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:07:05.912 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:07:05.912 0000:5e:00.0 (8086 0b60): Already using the vfio-pci driver 00:07:05.912 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:07:05.912 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:07:05.912 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:07:05.912 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:07:05.912 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:07:05.912 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:07:05.912 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:07:05.912 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 
00:07:05.912 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:07:05.912 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:07:05.912 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:07:05.912 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:07:05.912 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:07:05.912 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:07:05.912 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:07:06.174 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@199 -- # verify_nr_hugepages 00:07:06.174 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@89 -- # local node 00:07:06.174 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@90 -- # local sorted_t 00:07:06.174 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@91 -- # local sorted_s 00:07:06.174 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@92 -- # local surp 00:07:06.174 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@93 -- # local resv 00:07:06.174 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@94 -- # local anon 00:07:06.174 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:07:06.174 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:07:06.174 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:07:06.174 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:07:06.174 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:07:06.174 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:07:06.174 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:07:06.174 05:35:20 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:07:06.174 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:07:06.174 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:07:06.174 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:07:06.174 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:06.174 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:06.174 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293484 kB' 'MemFree: 79015696 kB' 'MemAvailable: 82311564 kB' 'Buffers: 12176 kB' 'Cached: 9319876 kB' 'SwapCached: 0 kB' 'Active: 6409160 kB' 'Inactive: 3456348 kB' 'Active(anon): 6015524 kB' 'Inactive(anon): 0 kB' 'Active(file): 393636 kB' 'Inactive(file): 3456348 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 536728 kB' 'Mapped: 171740 kB' 'Shmem: 5482068 kB' 'KReclaimable: 198608 kB' 'Slab: 512656 kB' 'SReclaimable: 198608 kB' 'SUnreclaim: 314048 kB' 'KernelStack: 16560 kB' 'PageTables: 9264 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486768 kB' 'Committed_AS: 7440248 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201068 kB' 'VmallocChunk: 0 kB' 'Percpu: 52480 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 779684 kB' 'DirectMap2M: 10430464 kB' 'DirectMap1G: 90177536 kB' 00:07:06.174 
05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:06.174 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:06.174 [... repeated xtrace omitted: the same setup/common.sh@31-32 IFS=': ' / read -r var val _ / test / continue sequence repeats for every remaining /proc/meminfo key from MemFree through HardwareCorrupted ...] 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:06.175 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:07:06.175 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:07:06.175 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@97 -- # anon=0 00:07:06.175 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:07:06.175 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:07:06.175 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:07:06.175 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:07:06.175 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:07:06.175 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:07:06.175 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:07:06.175 
05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:07:06.175 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:07:06.175 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:07:06.175 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:06.175 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:06.175 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293484 kB' 'MemFree: 79026424 kB' 'MemAvailable: 82322292 kB' 'Buffers: 12176 kB' 'Cached: 9319880 kB' 'SwapCached: 0 kB' 'Active: 6403984 kB' 'Inactive: 3456348 kB' 'Active(anon): 6010348 kB' 'Inactive(anon): 0 kB' 'Active(file): 393636 kB' 'Inactive(file): 3456348 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 532196 kB' 'Mapped: 171516 kB' 'Shmem: 5482072 kB' 'KReclaimable: 198608 kB' 'Slab: 512648 kB' 'SReclaimable: 198608 kB' 'SUnreclaim: 314040 kB' 'KernelStack: 16512 kB' 'PageTables: 8352 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486768 kB' 'Committed_AS: 7434148 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200936 kB' 'VmallocChunk: 0 kB' 'Percpu: 52480 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 779684 kB' 'DirectMap2M: 10430464 kB' 'DirectMap1G: 90177536 kB' 00:07:06.175 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 
00:07:06.175 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:06.175 [... repeated xtrace omitted: the same setup/common.sh@31-32 IFS=': ' / read -r var val _ / test / continue sequence repeats for every remaining /proc/meminfo key from MemFree through Unaccepted while get_meminfo scans for HugePages_Surp ...] 05:35:20 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:06.176 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:06.176 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:06.176 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:06.176 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:06.176 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:06.176 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:06.176 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:06.176 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:06.176 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:06.176 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:06.176 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:06.176 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:06.176 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:06.176 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:06.176 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:07:06.176 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:07:06.176 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # surp=0 00:07:06.176 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:07:06.176 
05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:07:06.176 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:07:06.176 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:07:06.176 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:07:06.176 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:07:06.176 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:07:06.176 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:07:06.176 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:07:06.176 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:07:06.176 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:06.176 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:06.176 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293484 kB' 'MemFree: 79025344 kB' 'MemAvailable: 82321212 kB' 'Buffers: 12176 kB' 'Cached: 9319880 kB' 'SwapCached: 0 kB' 'Active: 6403420 kB' 'Inactive: 3456348 kB' 'Active(anon): 6009784 kB' 'Inactive(anon): 0 kB' 'Active(file): 393636 kB' 'Inactive(file): 3456348 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 531016 kB' 'Mapped: 171168 kB' 'Shmem: 5482072 kB' 'KReclaimable: 198608 kB' 'Slab: 512704 kB' 'SReclaimable: 198608 kB' 'SUnreclaim: 314096 kB' 'KernelStack: 16528 kB' 'PageTables: 9008 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486768 kB' 'Committed_AS: 7434168 kB' 
'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200936 kB' 'VmallocChunk: 0 kB' 'Percpu: 52480 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 779684 kB' 'DirectMap2M: 10430464 kB' 'DirectMap1G: 90177536 kB' 00:07:06.176 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:06.176 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:06.176 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:06.176 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:06.176 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:06.176 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:06.176 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:06.176 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:06.176 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:06.176 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:06.176 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:06.176 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:06.176 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:06.176 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:06.176 
05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:06.176 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:06.176 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:06.176 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:06.176 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:06.176 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:06.176 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:06.176 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:06.176 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:06.176 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:06.176 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:06.176 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:06.176 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:06.176 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:06.176 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:06.176 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:06.176 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:06.176 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:06.176 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 
00:07:06.176 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:06.176 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:06.176 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:06.176 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:06.176 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:06.176 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:06.176 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:06.176 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:06.176 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:06.176 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:06.176 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:06.176 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:06.176 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:06.176 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:06.176 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:06.176 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:06.176 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:06.176 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:06.176 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:06.176 
05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:06.176 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:06.176 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:06.176 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:06.176 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:06.176 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:06.176 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:06.176 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:06.176 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:06.176 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:06.176 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:06.176 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:06.176 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:06.176 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:06.176 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:06.176 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:06.176 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:06.176 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:06.176 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:06.176 05:35:20 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:06.176 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:06.176 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:06.176 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:06.176 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:06.176 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:06.176 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:06.176 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:06.176 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:06.176 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:06.176 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:06.176 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:06.176 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:06.176 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:06.176 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:06.176 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:06.176 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:06.176 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:06.176 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:06.176 05:35:20 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:06.176 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 
00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:06.177 05:35:20 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 
00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@32 -- # continue 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 
-- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- 
setup/hugepages.sh@100 -- # resv=0 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:07:06.177 nr_hugepages=1024 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:07:06.177 resv_hugepages=0 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:07:06.177 surplus_hugepages=0 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:07:06.177 anon_hugepages=0 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # 
printf '%s\n' 'MemTotal: 92293484 kB' 'MemFree: 79027156 kB' 'MemAvailable: 82323024 kB' 'Buffers: 12176 kB' 'Cached: 9319932 kB' 'SwapCached: 0 kB' 'Active: 6403072 kB' 'Inactive: 3456348 kB' 'Active(anon): 6009436 kB' 'Inactive(anon): 0 kB' 'Active(file): 393636 kB' 'Inactive(file): 3456348 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 530612 kB' 'Mapped: 171168 kB' 'Shmem: 5482124 kB' 'KReclaimable: 198608 kB' 'Slab: 512704 kB' 'SReclaimable: 198608 kB' 'SUnreclaim: 314096 kB' 'KernelStack: 16176 kB' 'PageTables: 7952 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486768 kB' 'Committed_AS: 7431592 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200856 kB' 'VmallocChunk: 0 kB' 'Percpu: 52480 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 779684 kB' 'DirectMap2M: 10430464 kB' 'DirectMap1G: 90177536 kB' 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == 
\H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r 
var val _ 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:06.177 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:06.178 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:06.178 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:06.178 05:35:20 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:06.178 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:06.178 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:06.178 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:06.178 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:06.178 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:06.178 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:06.178 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:06.178 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:06.178 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:06.178 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:06.178 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:06.178 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:06.178 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:06.178 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:06.178 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:06.178 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:06.178 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:06.178 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:06.178 05:35:20 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:06.178 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:06.178 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:06.178 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:06.178 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:06.178 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:06.178 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:06.178 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:06.178 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:06.178 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:06.178 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:06.178 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:06.178 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:06.178 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:06.178 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:06.178 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:06.178 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:06.178 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:06.178 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:06.178 05:35:20 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:06.178 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:06.178 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:06.178 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:06.178 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:06.178 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:06.178 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:06.178 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:06.178 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:06.178 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:06.178 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:06.178 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:06.178 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:06.178 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:06.178 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:06.178 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:06.178 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:06.178 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:06.178 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # 
continue 00:07:06.178 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:06.178 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:06.178 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:06.178 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:06.178 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:06.178 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:06.178 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:06.178 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:06.178 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:06.178 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:06.178 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:06.178 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:06.178 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:06.178 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:06.178 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:06.178 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:06.178 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:06.178 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:06.178 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == 
\H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:06.178 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:06.178 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:06.178 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:06.178 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:06.178 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:06.178 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:06.178 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:06.178 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:06.178 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:06.178 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:06.178 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:06.178 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:06.178 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:06.178 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:06.178 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:06.178 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:06.178 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:06.178 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:06.178 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # 
read -r var val _ 00:07:06.178 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:06.178 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:06.178 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:06.178 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:06.178 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:06.178 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:06.178 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:06.178 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:06.178 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:06.178 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:06.178 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:06.178 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:06.178 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:06.178 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:06.178 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:06.178 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:06.178 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:06.178 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:06.178 05:35:20 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:06.178 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:06.178 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:06.178 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:06.178 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:06.178 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:06.178 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:06.178 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:06.178 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:06.178 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:06.178 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:06.178 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:06.178 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:06.178 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:06.178 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:06.178 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:06.178 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:06.178 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:06.178 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l 
]] 00:07:06.178 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 1024 00:07:06.178 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:07:06.178 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:07:06.178 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@112 -- # get_nodes 00:07:06.178 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@27 -- # local node 00:07:06.178 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:07:06.178 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:07:06.178 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:07:06.178 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0 00:07:06.178 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@32 -- # no_nodes=2 00:07:06.178 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:07:06.178 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:07:06.178 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:07:06.178 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:07:06.178 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:07:06.178 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=0 00:07:06.178 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:07:06.178 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:07:06.178 
05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:07:06.178 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:07:06.178 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:07:06.178 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:07:06.179 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:07:06.179 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:06.179 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:06.179 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48069884 kB' 'MemFree: 36419236 kB' 'MemUsed: 11650648 kB' 'SwapCached: 0 kB' 'Active: 5470232 kB' 'Inactive: 3267584 kB' 'Active(anon): 5254232 kB' 'Inactive(anon): 0 kB' 'Active(file): 216000 kB' 'Inactive(file): 3267584 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 8373476 kB' 'Mapped: 144444 kB' 'AnonPages: 367500 kB' 'Shmem: 4889892 kB' 'KernelStack: 10216 kB' 'PageTables: 5172 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 142964 kB' 'Slab: 341096 kB' 'SReclaimable: 142964 kB' 'SUnreclaim: 198132 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:07:06.179 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:06.179 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:06.179 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 
00:07:06.179 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:06.179 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:06.179 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:06.179 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:06.179 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:06.179 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:06.179 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:06.179 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:06.179 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:06.179 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:06.179 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:06.179 05:35:20 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:06.179 05:35:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:06.179 05:35:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:06.179 05:35:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:06.179 05:35:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:06.179 05:35:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:06.179 05:35:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:06.179 05:35:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 
00:07:06.179 05:35:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:06.179 05:35:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:06.179 05:35:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:06.179 05:35:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:06.179 05:35:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:06.179 05:35:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:06.179 05:35:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:06.179 05:35:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:06.179 05:35:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:06.179 05:35:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:06.179 05:35:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:06.179 05:35:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:06.179 05:35:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:06.179 05:35:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:06.179 05:35:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:06.179 05:35:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:06.179 05:35:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:06.179 05:35:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:06.179 05:35:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:06.179 05:35:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:06.179 05:35:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:06.179 05:35:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:06.179 05:35:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:06.179 05:35:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:06.179 05:35:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:06.179 05:35:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:06.179 05:35:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:06.179 05:35:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:06.179 05:35:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:06.179 05:35:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:06.179 05:35:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:06.179 05:35:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:06.179 05:35:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:06.179 05:35:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:06.179 05:35:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:06.179 05:35:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:06.179 05:35:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:06.179 05:35:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:07:06.179 05:35:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:06.179 05:35:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:06.179 05:35:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:06.179 05:35:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:06.179 05:35:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:06.179 05:35:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:06.179 05:35:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:06.179 05:35:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:06.179 05:35:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:06.179 05:35:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:06.179 05:35:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:06.179 05:35:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:06.179 05:35:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:06.179 05:35:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:06.179 05:35:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:06.179 05:35:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:06.179 05:35:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:06.179 05:35:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:06.179 05:35:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 
00:07:06.179 05:35:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:06.179 05:35:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:06.179 05:35:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:06.179 05:35:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:06.179 05:35:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:06.179 05:35:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:06.179 05:35:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:06.179 05:35:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:06.179 05:35:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:06.179 05:35:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:06.179 05:35:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:06.179 05:35:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:06.179 05:35:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:06.179 05:35:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:06.179 05:35:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:06.179 05:35:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:06.179 05:35:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:06.179 05:35:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:06.179 05:35:21 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@32 -- # continue 00:07:06.179 05:35:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:06.179 05:35:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:06.179 05:35:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:06.179 05:35:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:06.179 05:35:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:06.180 05:35:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:06.180 05:35:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:06.180 05:35:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:06.180 05:35:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:06.180 05:35:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:06.180 05:35:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:06.180 05:35:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:06.180 05:35:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:06.180 05:35:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:06.180 05:35:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:06.180 05:35:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:06.180 05:35:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:06.180 05:35:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:06.180 05:35:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ 
ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:06.180 05:35:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:06.180 05:35:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:06.180 05:35:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:06.180 05:35:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:06.180 05:35:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:06.180 05:35:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:06.180 05:35:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:06.180 05:35:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:06.180 05:35:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:06.180 05:35:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:06.180 05:35:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:06.180 05:35:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:06.180 05:35:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:06.180 05:35:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:06.180 05:35:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:06.180 05:35:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:06.180 05:35:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:06.180 05:35:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:06.180 05:35:21 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@31 -- # read -r var val _
00:07:06.180 05:35:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:07:06.180 05:35:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue
00:07:06.180 05:35:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': '
00:07:06.180 05:35:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _
00:07:06.180 05:35:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:07:06.180 05:35:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue
00:07:06.180 05:35:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': '
00:07:06.180 05:35:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _
00:07:06.180 05:35:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:07:06.180 05:35:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0
00:07:06.180 05:35:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0
00:07:06.180 05:35:21 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:07:06.180 05:35:21 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:07:06.180 05:35:21 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:07:06.180 05:35:21 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:07:06.180 05:35:21 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024'
00:07:06.180 node0=1024 expecting 1024
00:07:06.180 05:35:21 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]]
00:07:06.180 05:35:21 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@202 -- # CLEAR_HUGE=no
00:07:06.180 05:35:21 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@202 -- # NRHUGE=512
00:07:06.180 05:35:21 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@202 -- # setup output
00:07:06.180 05:35:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@9 -- # [[ output == output ]]
00:07:06.180 05:35:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh
00:07:10.432 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5
00:07:10.432 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5
00:07:10.432 0000:00:04.7 (8086 2021): Already using the vfio-pci driver
00:07:10.432 0000:5e:00.0 (8086 0b60): Already using the vfio-pci driver
00:07:10.432 0000:00:04.6 (8086 2021): Already using the vfio-pci driver
00:07:10.432 0000:00:04.5 (8086 2021): Already using the vfio-pci driver
00:07:10.432 0000:00:04.4 (8086 2021): Already using the vfio-pci driver
00:07:10.432 0000:00:04.3 (8086 2021): Already using the vfio-pci driver
00:07:10.432 0000:00:04.2 (8086 2021): Already using the vfio-pci driver
00:07:10.432 0000:00:04.1 (8086 2021): Already using the vfio-pci driver
00:07:10.432 0000:00:04.0 (8086 2021): Already using the vfio-pci driver
00:07:10.432 0000:80:04.7 (8086 2021): Already using the vfio-pci driver
00:07:10.432 0000:80:04.6 (8086 2021): Already using the vfio-pci driver
00:07:10.432 0000:80:04.5 (8086 2021): Already using the vfio-pci driver
00:07:10.432 0000:80:04.4 (8086 2021): Already using the vfio-pci driver
00:07:10.432 0000:80:04.3 (8086 2021): Already using the vfio-pci driver
00:07:10.432 0000:80:04.2 (8086 2021): Already using the vfio-pci driver
00:07:10.432 0000:80:04.1 (8086 2021): Already using the vfio-pci driver
00:07:10.432 0000:80:04.0 (8086 2021): Already using the vfio-pci driver
00:07:10.432 INFO: Requested 512 hugepages but 1024 already allocated on node0
00:07:10.432 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@204 -- # verify_nr_hugepages
00:07:10.432 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@89 -- # local node
00:07:10.432 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@90 -- # local sorted_t
00:07:10.432 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@91 -- # local sorted_s
00:07:10.432 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@92 -- # local surp
00:07:10.432 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@93 -- # local resv
00:07:10.432 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@94 -- # local anon
00:07:10.432 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:07:10.432 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages
00:07:10.432 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=AnonHugePages
00:07:10.432 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=
00:07:10.432 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val
00:07:10.432 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem
00:07:10.432 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:07:10.432 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:07:10.432 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:07:10.433 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:07:10.433 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:07:10.433 05:35:24
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': '
00:07:10.433 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _
00:07:10.433 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293484 kB' 'MemFree: 79019256 kB' 'MemAvailable: 82315124 kB' 'Buffers: 12176 kB' 'Cached: 9320008 kB' 'SwapCached: 0 kB' 'Active: 6403744 kB' 'Inactive: 3456348 kB' 'Active(anon): 6010108 kB' 'Inactive(anon): 0 kB' 'Active(file): 393636 kB' 'Inactive(file): 3456348 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 531152 kB' 'Mapped: 171260 kB' 'Shmem: 5482200 kB' 'KReclaimable: 198608 kB' 'Slab: 512456 kB' 'SReclaimable: 198608 kB' 'SUnreclaim: 313848 kB' 'KernelStack: 16256 kB' 'PageTables: 7992 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486768 kB' 'Committed_AS: 7434628 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201080 kB' 'VmallocChunk: 0 kB' 'Percpu: 52480 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 779684 kB' 'DirectMap2M: 10430464 kB' 'DirectMap1G: 90177536 kB'
00:07:10.433 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:07:10.433 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue
00:07:10.433 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': '
00:07:10.433 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _
00:07:10.433 05:35:24
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:10.433 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.433 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.433 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.433 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:10.433 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.433 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.433 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.433 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:10.433 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.433 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.433 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.433 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:10.433 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.433 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.433 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.433 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:10.433 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.433 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.433 05:35:24 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.433 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:10.433 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.433 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.433 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.433 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:10.433 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.433 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.433 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.433 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:10.433 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.433 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.433 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.433 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:10.433 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.433 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.433 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.433 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:10.433 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.433 
05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.433 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.433 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:10.433 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.433 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.433 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.433 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:10.433 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.433 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.433 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.433 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:10.433 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.433 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.433 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.433 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:10.433 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.433 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.433 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.433 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 
00:07:10.433 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.433 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.433 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.433 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:10.433 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.433 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.433 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.433 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:10.433 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.433 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.433 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.433 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:10.433 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.433 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.433 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.433 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:10.434 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.434 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.434 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.434 05:35:24 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:10.434 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.434 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.434 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.434 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:10.434 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.434 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.434 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.434 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:10.434 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.434 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.434 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.434 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:10.434 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.434 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.434 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.434 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:10.434 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.434 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.434 05:35:24 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.434 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:10.434 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.434 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.434 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.434 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:10.434 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.434 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.434 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.434 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:10.434 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.434 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.434 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.434 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:10.434 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.434 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.434 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.434 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:10.434 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.434 
05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.434 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.434 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:10.434 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.434 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.434 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.434 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:10.434 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.434 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.434 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.434 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:10.434 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.434 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.434 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.434 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:10.434 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.434 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.434 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.434 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 
00:07:10.434 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.434 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.434 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.434 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:10.434 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.434 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.434 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.434 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:10.434 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.434 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.434 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.434 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:10.434 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.434 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.434 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.434 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:07:10.434 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.434 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.434 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.434 05:35:24 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:07:10.434 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue
00:07:10.434 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': '
00:07:10.434 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _
00:07:10.434 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:07:10.434 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0
00:07:10.434 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0
00:07:10.434 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@97 -- # anon=0
00:07:10.434 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp
00:07:10.434 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp
00:07:10.434 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=
00:07:10.434 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val
00:07:10.434 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem
00:07:10.434 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:07:10.434 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:07:10.434 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:07:10.434 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:07:10.434 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:07:10.434 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293484 kB' 'MemFree: 79019848 kB' 'MemAvailable: 82315716 kB' 'Buffers: 12176 kB' 'Cached: 9320008 kB' 'SwapCached: 0 kB' 'Active: 6404888 kB' 'Inactive: 3456348 kB' 'Active(anon): 6011252 kB' 'Inactive(anon): 0 kB' 'Active(file): 393636 kB' 'Inactive(file): 3456348 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 532288 kB' 'Mapped: 171260 kB' 'Shmem: 5482200 kB' 'KReclaimable: 198608 kB' 'Slab: 512456 kB' 'SReclaimable: 198608 kB' 'SUnreclaim: 313848 kB' 'KernelStack: 16496 kB' 'PageTables: 8960 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486768 kB' 'Committed_AS: 7433160 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201240 kB' 'VmallocChunk: 0 kB' 'Percpu: 52480 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 779684 kB' 'DirectMap2M: 10430464 kB' 'DirectMap1G: 90177536 kB'
00:07:10.434 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': '
00:07:10.435 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _
00:07:10.435 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:07:10.435 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue
00:07:10.435 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': '
00:07:10.435 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _
00:07:10.435 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:07:10.435 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.435 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.435 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.435 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:10.435 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.435 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.435 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.435 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:10.435 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.435 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.435 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.435 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:10.435 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.435 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.435 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.435 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:10.435 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.435 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.435 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.435 05:35:24 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:10.435 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.435 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.435 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.435 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:10.435 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.435 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.435 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.435 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:10.435 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.435 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.435 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.435 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:10.435 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.435 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.435 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.435 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:10.435 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.435 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.435 
05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.435 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:10.435 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.435 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.435 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.435 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:10.435 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.435 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.435 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.435 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:10.435 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.435 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.435 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.435 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:10.435 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.435 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.435 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.435 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:10.435 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 
00:07:10.435 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.435 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.435 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:10.435 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.435 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.435 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.435 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:10.435 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.435 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.435 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.435 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:10.435 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.435 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.435 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.435 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:10.435 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.435 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.435 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.435 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 
00:07:10.435 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.435 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.435 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.435 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:10.435 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.435 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.435 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.435 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:10.435 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.435 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.435 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.435 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:10.435 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.435 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.435 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.435 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:10.435 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.435 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.435 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.435 05:35:24 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:10.435 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.435 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.435 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.435 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:10.435 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.435 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.435 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.435 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:10.435 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.435 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.435 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.435 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:10.435 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.435 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.435 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.435 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:10.435 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.436 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.436 
05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.436 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:10.436 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.436 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.436 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.436 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:10.436 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.436 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.436 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.436 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:10.436 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.436 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.436 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.436 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:10.436 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.436 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.436 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.436 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:10.436 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # 
continue 00:07:10.436 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.436 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.436 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:10.436 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.436 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.436 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.436 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:10.436 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.436 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.436 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.436 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:10.436 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.436 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.436 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.436 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:10.436 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.436 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.436 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.436 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:10.436 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.436 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.436 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.436 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:10.436 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.436 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.436 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.436 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:10.436 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.436 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.436 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.436 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:10.436 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.436 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.436 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.436 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:10.436 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.436 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.436 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # 
read -r var val _ 00:07:10.436 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:10.436 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.436 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.436 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.436 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:10.436 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.436 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.436 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.436 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:10.436 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.436 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.436 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.436 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:10.436 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.436 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.436 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.436 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:10.436 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.436 05:35:24 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:07:10.436 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.436 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:10.436 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.436 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.436 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.436 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:10.436 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.436 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.436 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.436 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:10.436 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:07:10.436 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:07:10.436 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # surp=0 00:07:10.436 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:07:10.436 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:07:10.436 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:07:10.436 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:07:10.436 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:07:10.436 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- 
# mem_f=/proc/meminfo 00:07:10.436 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:07:10.436 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:07:10.436 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:07:10.436 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:07:10.437 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293484 kB' 'MemFree: 79019524 kB' 'MemAvailable: 82315392 kB' 'Buffers: 12176 kB' 'Cached: 9320008 kB' 'SwapCached: 0 kB' 'Active: 6403872 kB' 'Inactive: 3456348 kB' 'Active(anon): 6010236 kB' 'Inactive(anon): 0 kB' 'Active(file): 393636 kB' 'Inactive(file): 3456348 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 531200 kB' 'Mapped: 171228 kB' 'Shmem: 5482200 kB' 'KReclaimable: 198608 kB' 'Slab: 512656 kB' 'SReclaimable: 198608 kB' 'SUnreclaim: 314048 kB' 'KernelStack: 16304 kB' 'PageTables: 8632 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486768 kB' 'Committed_AS: 7433184 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201048 kB' 'VmallocChunk: 0 kB' 'Percpu: 52480 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 779684 kB' 'DirectMap2M: 10430464 kB' 'DirectMap1G: 90177536 kB' 00:07:10.437 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.437 05:35:24 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:07:10.437 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:10.437 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.437 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.437 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.437 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:10.437 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.437 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.437 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.437 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:10.437 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.437 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.437 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.437 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:10.437 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.437 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.437 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.437 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:10.437 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.437 05:35:24 setup.sh.hugepages.no_shrink_alloc 
-- setup/common.sh@31 -- # IFS=': ' 00:07:10.437 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.437 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:10.437 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.437 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.437 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.437 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:10.437 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.437 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.437 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.437 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:10.437 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.437 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.437 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.437 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:10.437 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.437 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.437 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.437 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:10.437 05:35:24 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.437 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.437 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.437 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:10.437 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.437 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.437 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.437 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:10.437 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.437 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.437 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.437 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:10.437 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.437 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.437 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.437 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:10.437 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.437 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.437 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.437 05:35:24 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:10.437 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.437 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.437 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.437 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:10.437 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.437 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.437 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.437 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:10.437 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.437 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.437 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.437 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:10.438 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.438 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.438 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.438 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:10.438 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.438 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.438 05:35:24 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.438 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:10.438 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.438 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.438 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.438 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:10.438 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.438 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.438 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.438 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:10.438 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.438 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.438 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.438 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:10.438 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.438 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.438 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.438 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:10.438 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.438 
05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.438 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.438 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:10.438 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.438 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.438 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.438 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:10.438 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.438 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.438 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.438 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:10.438 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.438 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.438 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.438 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:10.438 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.438 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.438 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.438 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 
00:07:10.438 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.438 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.438 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.438 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:10.438 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.438 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.438 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.438 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:10.438 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.438 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.438 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.438 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:10.438 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.438 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.438 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.438 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:10.438 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.438 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.438 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.438 05:35:24 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:10.438 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.438 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.438 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.438 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:10.438 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.438 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.438 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.438 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:10.438 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.438 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.438 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.438 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:10.438 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.438 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.438 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.438 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:10.438 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.438 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 
00:07:10.438 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.438 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:10.438 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.438 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.438 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.438 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:10.438 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.438 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.438 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.438 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:10.438 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.438 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.438 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.438 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:10.438 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.438 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.438 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.438 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:10.438 05:35:24 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@32 -- # continue 00:07:10.438 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.438 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.438 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:10.438 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.438 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.438 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.438 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:10.438 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.438 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.438 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.438 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:10.438 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.438 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.438 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.438 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:10.438 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.438 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.438 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.439 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ 
Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:10.439 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.439 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.439 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.439 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:10.439 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.439 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.439 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.439 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:10.439 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.439 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.439 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.439 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:07:10.439 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:07:10.439 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:07:10.439 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@100 -- # resv=0 00:07:10.439 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:07:10.439 nr_hugepages=1024 00:07:10.439 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:07:10.439 resv_hugepages=0 00:07:10.439 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 
00:07:10.439 surplus_hugepages=0 00:07:10.439 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:07:10.439 anon_hugepages=0 00:07:10.439 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:07:10.439 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:07:10.439 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:07:10.439 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:07:10.439 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:07:10.439 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:07:10.439 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:07:10.439 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:07:10.439 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:07:10.439 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:07:10.439 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:07:10.439 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:07:10.439 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.439 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.439 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293484 kB' 'MemFree: 79020948 kB' 'MemAvailable: 82316816 kB' 'Buffers: 12176 kB' 'Cached: 9320048 kB' 'SwapCached: 0 kB' 'Active: 6403664 kB' 'Inactive: 3456348 kB' 'Active(anon): 6010028 kB' 'Inactive(anon): 
0 kB' 'Active(file): 393636 kB' 'Inactive(file): 3456348 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 531432 kB' 'Mapped: 171176 kB' 'Shmem: 5482240 kB' 'KReclaimable: 198608 kB' 'Slab: 512640 kB' 'SReclaimable: 198608 kB' 'SUnreclaim: 314032 kB' 'KernelStack: 16400 kB' 'PageTables: 8256 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486768 kB' 'Committed_AS: 7433208 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201000 kB' 'VmallocChunk: 0 kB' 'Percpu: 52480 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 779684 kB' 'DirectMap2M: 10430464 kB' 'DirectMap1G: 90177536 kB' 00:07:10.439 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:10.439 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.439 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.439 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.439 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:10.439 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.439 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.439 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.439 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == 
\H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:10.439 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.439 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.439 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.439 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:10.439 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.439 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.439 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.439 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:10.439 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.439 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.439 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.439 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:10.439 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.439 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.439 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.439 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:10.439 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.439 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.439 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var 
val _ 00:07:10.439 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:10.439 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.439 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.439 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.439 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:10.439 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.439 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.439 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.439 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:10.439 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.439 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.439 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.439 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:10.439 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.439 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.439 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.439 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:10.439 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.439 05:35:24 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:07:10.439 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.439 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:10.439 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.439 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.439 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.440 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:10.440 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.440 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.440 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.440 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:10.440 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.440 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.440 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.440 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:10.440 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.440 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.440 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.440 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:10.440 05:35:24 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.440 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.440 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.440 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:10.440 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.440 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.440 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.440 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:10.440 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.440 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.440 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.440 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:10.440 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.440 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.440 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.440 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:10.440 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.440 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.440 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.440 05:35:24 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:10.440 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.440 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.440 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.440 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:10.440 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.440 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.440 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.440 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:10.440 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.440 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.440 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.440 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:10.440 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.440 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.440 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.440 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:10.440 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.440 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.440 
05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.440 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:10.440 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.440 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.440 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.440 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:10.440 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.440 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.440 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.440 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:10.440 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.440 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.440 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.440 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:10.440 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.440 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.440 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.440 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:10.440 05:35:24 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@32 -- # continue 00:07:10.440 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.440 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.440 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:10.440 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.440 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.440 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.440 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:10.440 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.440 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.440 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.440 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:10.440 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.440 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.440 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.440 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:10.440 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.440 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.440 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.440 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 
-- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:10.440 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.440 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.440 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.440 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:10.440 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.440 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.440 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.440 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:10.440 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.440 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.440 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.440 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:10.440 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.440 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.440 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.440 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:10.440 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.440 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.440 05:35:24 setup.sh.hugepages.no_shrink_alloc 
-- setup/common.sh@31 -- # read -r var val _ 00:07:10.440 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:10.440 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.440 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.440 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.440 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:10.440 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.440 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.440 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.440 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:10.440 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.441 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.441 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.441 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:10.441 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.441 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.441 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.441 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:10.441 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.441 
05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.441 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.441 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:10.441 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.441 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.441 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.441 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:10.441 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.441 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.441 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.441 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:10.441 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.441 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.441 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.441 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:07:10.441 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 1024 00:07:10.441 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:07:10.441 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:07:10.441 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@112 -- # get_nodes 
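The trace above shows `get_meminfo` walking `/proc/meminfo` key by key with `IFS=': '`, emitting `continue` for every non-matching line until it hits `HugePages_Total` and echoes `1024`. A minimal sketch of that pattern (an illustration inferred from the trace, not the actual `setup/common.sh`; `get_meminfo_sketch` and the optional file argument are made up for testability):

```shell
#!/usr/bin/env bash
# Sketch: scan a meminfo-style file line by line, splitting on ':' and
# spaces, and print the value for the requested key. Mirrors the
# repeated "[[ key == pattern ]] / continue" loop visible in the trace.
get_meminfo_sketch() {
    local get=$1 mem_f=${2:-/proc/meminfo}
    local var val _
    while IFS=': ' read -r var val _; do
        # Non-matching keys are skipped, like the 'continue' lines above.
        [[ $var == "$get" ]] || continue
        echo "$val"
        return 0
    done < "$mem_f"
    return 1
}

# Usage against a small sample instead of the real /proc/meminfo:
sample=$(mktemp)
printf '%s\n' 'HugePages_Total: 1024' 'HugePages_Free: 1024' \
              'HugePages_Rsvd: 0' > "$sample"
get_meminfo_sketch HugePages_Total "$sample"
rm -f "$sample"
```

The `read -r var val _` trailing `_` swallows the `kB` unit on sized fields, which is why the trace's `echo` steps print bare numbers for the `HugePages_*` counters.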
00:07:10.441 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@27 -- # local node 00:07:10.441 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:07:10.441 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:07:10.441 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:07:10.441 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0 00:07:10.441 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@32 -- # no_nodes=2 00:07:10.441 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:07:10.441 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:07:10.441 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:07:10.441 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:07:10.441 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:07:10.441 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=0 00:07:10.441 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:07:10.441 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:07:10.441 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:07:10.441 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:07:10.441 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:07:10.441 05:35:24 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@28 -- # mapfile -t mem 00:07:10.441 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:07:10.441 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.441 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.441 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48069884 kB' 'MemFree: 36411084 kB' 'MemUsed: 11658800 kB' 'SwapCached: 0 kB' 'Active: 5470168 kB' 'Inactive: 3267584 kB' 'Active(anon): 5254168 kB' 'Inactive(anon): 0 kB' 'Active(file): 216000 kB' 'Inactive(file): 3267584 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 8373484 kB' 'Mapped: 144452 kB' 'AnonPages: 367352 kB' 'Shmem: 4889900 kB' 'KernelStack: 10504 kB' 'PageTables: 5952 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 142964 kB' 'Slab: 341224 kB' 'SReclaimable: 142964 kB' 'SUnreclaim: 198260 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:07:10.441 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:10.441 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.441 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.441 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.441 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:10.441 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.441 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 
00:07:10.441 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.441 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:10.441 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.441 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.441 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.441 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:10.441 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.441 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.441 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.441 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:10.441 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.441 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.441 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.441 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:10.441 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.441 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.441 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.441 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:10.441 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # 
continue 00:07:10.441 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.441 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.441 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:10.441 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.441 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.441 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.441 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:10.441 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.441 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.441 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.441 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:10.441 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.441 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.441 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.441 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:10.441 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.441 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.441 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.441 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:10.441 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.441 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.441 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.441 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:10.441 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.441 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.441 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.441 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:10.441 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.441 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.441 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.441 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:10.441 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.441 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.441 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.442 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:10.442 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.442 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.442 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:07:10.442 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:10.442 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.442 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.442 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.442 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:10.442 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.442 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.442 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.442 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:10.442 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.442 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.442 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.442 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:10.442 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.442 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.442 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.442 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:10.442 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.442 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': 
' 00:07:10.442 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.442 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:10.442 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.442 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.442 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.442 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:10.442 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.442 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.442 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.442 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:10.442 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.442 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.442 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.442 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:10.442 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.442 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.442 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.442 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:10.442 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- 
# continue 00:07:10.442 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.442 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.442 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:10.442 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.442 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.442 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.442 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:10.442 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.442 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.442 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.442 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:10.442 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.442 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.442 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.442 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:10.442 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.442 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.442 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.442 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ 
ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:10.442 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.442 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.442 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.442 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:10.442 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.442 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.442 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.442 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:10.442 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.442 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.442 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.442 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:10.442 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.442 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.442 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.442 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:10.442 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.442 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.442 05:35:24 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:07:10.442 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:10.442 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:07:10.442 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:07:10.442 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:07:10.442 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:07:10.442 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:07:10.442 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:07:10.442 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:07:10.442 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:07:10.442 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:07:10.442 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:07:10.442 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024' 00:07:10.442 node0=1024 expecting 1024 00:07:10.442 05:35:24 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]] 00:07:10.442 00:07:10.442 real 0m7.748s 00:07:10.442 user 0m2.983s 00:07:10.442 sys 0m4.967s 00:07:10.442 05:35:24 setup.sh.hugepages.no_shrink_alloc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:10.442 05:35:24 setup.sh.hugepages.no_shrink_alloc -- common/autotest_common.sh@10 -- # set +x 00:07:10.442 ************************************ 00:07:10.443 END TEST no_shrink_alloc 00:07:10.443 ************************************ 00:07:10.443 05:35:24 
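The long run of `continue` iterations in the trace above is `setup/common.sh`'s `get_meminfo` scanning each `key: value` line of `/sys/devices/system/node/node0/meminfo` until the requested key (`HugePages_Surp`) matches; the backslash-escaped right-hand sides (`\H\u\g\e\P\a\g\e\s\_\S\u\r\p`) are just how bash xtrace prints a quoted `[[ ... == pattern ]]` operand. A minimal sketch of that loop, reading stdin instead of sysfs so it runs anywhere (sample values taken from the `printf` line in the trace):

```shell
# Sketch of the get_meminfo loop traced above: split each "key: value" line
# on ': ', skip (continue) until the requested key matches, print its value.
get_meminfo() {
    local get=$1 var val _
    while IFS=': ' read -r var val _; do
        [[ $var == "$get" ]] || continue   # the repeated "continue" in the trace
        echo "$val"
        return 0
    done
    return 1
}

get_meminfo HugePages_Surp <<'EOF'
MemTotal: 48069884 kB
HugePages_Total: 1024
HugePages_Free: 1024
HugePages_Surp: 0
EOF
```

In the real script the file is first slurped with `mapfile` and the `Node 0 ` prefix stripped from each line; the matching loop is the same.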
setup.sh.hugepages -- common/autotest_common.sh@1142 -- # return 0 00:07:10.443 05:35:24 setup.sh.hugepages -- setup/hugepages.sh@217 -- # clear_hp 00:07:10.443 05:35:24 setup.sh.hugepages -- setup/hugepages.sh@37 -- # local node hp 00:07:10.443 05:35:24 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:07:10.443 05:35:24 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:07:10.443 05:35:24 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:07:10.443 05:35:24 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:07:10.443 05:35:24 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:07:10.443 05:35:24 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:07:10.443 05:35:24 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:07:10.443 05:35:24 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:07:10.443 05:35:24 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:07:10.443 05:35:24 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:07:10.443 05:35:24 setup.sh.hugepages -- setup/hugepages.sh@45 -- # export CLEAR_HUGE=yes 00:07:10.443 05:35:24 setup.sh.hugepages -- setup/hugepages.sh@45 -- # CLEAR_HUGE=yes 00:07:10.443 00:07:10.443 real 0m29.448s 00:07:10.443 user 0m10.222s 00:07:10.443 sys 0m17.535s 00:07:10.443 05:35:24 setup.sh.hugepages -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:10.443 05:35:24 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:07:10.443 ************************************ 00:07:10.443 END TEST hugepages 00:07:10.443 ************************************ 00:07:10.443 05:35:24 setup.sh -- common/autotest_common.sh@1142 -- # return 0 00:07:10.443 05:35:24 
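The `clear_hp` loop traced above zeroes every hugepage-size counter on every NUMA node before the next test group starts. A sketch, parameterised on a sysfs root so it can be exercised against a throwaway directory instead of the real `/sys` (writing the real counters requires root; the parameter is an addition for testability, the real function hardcodes `/sys`):

```shell
# Sketch of setup/hugepages.sh clear_hp: for each node, write 0 into every
# hugepages-<size>/nr_hugepages counter, then flag the cleanup for setup.sh.
clear_hp() {
    local root=${1:-/sys/devices/system/node} node hp
    for node in "$root"/node*; do
        for hp in "$node"/hugepages/hugepages-*; do
            [[ -e $hp/nr_hugepages ]] && echo 0 > "$hp/nr_hugepages"
        done
    done
    export CLEAR_HUGE=yes   # tells later setup.sh invocations hugepages may be reset
}
```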
setup.sh -- setup/test-setup.sh@14 -- # run_test driver /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/driver.sh 00:07:10.443 05:35:24 setup.sh -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:07:10.443 05:35:24 setup.sh -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:10.443 05:35:24 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:07:10.443 ************************************ 00:07:10.443 START TEST driver 00:07:10.443 ************************************ 00:07:10.443 05:35:25 setup.sh.driver -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/driver.sh 00:07:10.443 * Looking for test storage... 00:07:10.443 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup 00:07:10.443 05:35:25 setup.sh.driver -- setup/driver.sh@68 -- # setup reset 00:07:10.443 05:35:25 setup.sh.driver -- setup/common.sh@9 -- # [[ reset == output ]] 00:07:10.443 05:35:25 setup.sh.driver -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:07:15.708 05:35:29 setup.sh.driver -- setup/driver.sh@69 -- # run_test guess_driver guess_driver 00:07:15.708 05:35:29 setup.sh.driver -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:07:15.709 05:35:29 setup.sh.driver -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:15.709 05:35:29 setup.sh.driver -- common/autotest_common.sh@10 -- # set +x 00:07:15.709 ************************************ 00:07:15.709 START TEST guess_driver 00:07:15.709 ************************************ 00:07:15.709 05:35:29 setup.sh.driver.guess_driver -- common/autotest_common.sh@1123 -- # guess_driver 00:07:15.709 05:35:29 setup.sh.driver.guess_driver -- setup/driver.sh@46 -- # local driver setup_driver marker 00:07:15.709 05:35:29 setup.sh.driver.guess_driver -- setup/driver.sh@47 -- # local fail=0 00:07:15.709 05:35:29 setup.sh.driver.guess_driver -- setup/driver.sh@49 -- # pick_driver 00:07:15.709 
05:35:29 setup.sh.driver.guess_driver -- setup/driver.sh@36 -- # vfio 00:07:15.709 05:35:29 setup.sh.driver.guess_driver -- setup/driver.sh@21 -- # local iommu_grups 00:07:15.709 05:35:29 setup.sh.driver.guess_driver -- setup/driver.sh@22 -- # local unsafe_vfio 00:07:15.709 05:35:29 setup.sh.driver.guess_driver -- setup/driver.sh@24 -- # [[ -e /sys/module/vfio/parameters/enable_unsafe_noiommu_mode ]] 00:07:15.709 05:35:29 setup.sh.driver.guess_driver -- setup/driver.sh@25 -- # unsafe_vfio=N 00:07:15.709 05:35:29 setup.sh.driver.guess_driver -- setup/driver.sh@27 -- # iommu_groups=(/sys/kernel/iommu_groups/*) 00:07:15.709 05:35:29 setup.sh.driver.guess_driver -- setup/driver.sh@29 -- # (( 216 > 0 )) 00:07:15.709 05:35:29 setup.sh.driver.guess_driver -- setup/driver.sh@30 -- # is_driver vfio_pci 00:07:15.709 05:35:29 setup.sh.driver.guess_driver -- setup/driver.sh@14 -- # mod vfio_pci 00:07:15.709 05:35:29 setup.sh.driver.guess_driver -- setup/driver.sh@12 -- # dep vfio_pci 00:07:15.709 05:35:29 setup.sh.driver.guess_driver -- setup/driver.sh@11 -- # modprobe --show-depends vfio_pci 00:07:15.709 05:35:30 setup.sh.driver.guess_driver -- setup/driver.sh@12 -- # [[ insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/virt/lib/irqbypass.ko.xz 00:07:15.709 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/iommu/iommufd/iommufd.ko.xz 00:07:15.709 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/vfio.ko.xz 00:07:15.709 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/iommu/iommufd/iommufd.ko.xz 00:07:15.709 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/vfio.ko.xz 00:07:15.709 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/vfio_iommu_type1.ko.xz 00:07:15.709 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/pci/vfio-pci-core.ko.xz 00:07:15.709 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/pci/vfio-pci.ko.xz == *\.\k\o* ]] 00:07:15.709 05:35:30 setup.sh.driver.guess_driver -- setup/driver.sh@30 -- # 
return 0 00:07:15.709 05:35:30 setup.sh.driver.guess_driver -- setup/driver.sh@37 -- # echo vfio-pci 00:07:15.709 05:35:30 setup.sh.driver.guess_driver -- setup/driver.sh@49 -- # driver=vfio-pci 00:07:15.709 05:35:30 setup.sh.driver.guess_driver -- setup/driver.sh@51 -- # [[ vfio-pci == \N\o\ \v\a\l\i\d\ \d\r\i\v\e\r\ \f\o\u\n\d ]] 00:07:15.709 05:35:30 setup.sh.driver.guess_driver -- setup/driver.sh@56 -- # echo 'Looking for driver=vfio-pci' 00:07:15.709 Looking for driver=vfio-pci 00:07:15.709 05:35:30 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:07:15.709 05:35:30 setup.sh.driver.guess_driver -- setup/driver.sh@45 -- # setup output config 00:07:15.709 05:35:30 setup.sh.driver.guess_driver -- setup/common.sh@9 -- # [[ output == output ]] 00:07:15.709 05:35:30 setup.sh.driver.guess_driver -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:07:18.991 05:35:33 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ not == \-\> ]] 00:07:18.991 05:35:33 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # continue 00:07:18.991 05:35:33 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:07:18.991 05:35:33 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ not == \-\> ]] 00:07:18.991 05:35:33 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # continue 00:07:18.991 05:35:33 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:07:18.991 05:35:33 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:07:18.991 05:35:33 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:07:18.991 05:35:33 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:07:18.991 05:35:33 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:07:18.991 05:35:33 
setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:07:18.991 05:35:33 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:07:18.991 05:35:33 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:07:18.991 05:35:33 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:07:18.991 05:35:33 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:07:18.991 05:35:33 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:07:18.991 05:35:33 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:07:18.991 05:35:33 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:07:18.991 05:35:33 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:07:18.991 05:35:33 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:07:18.991 05:35:33 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:07:18.991 05:35:33 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:07:18.991 05:35:33 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:07:18.991 05:35:33 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:07:18.991 05:35:33 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:07:18.991 05:35:33 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:07:18.991 05:35:33 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:07:18.991 05:35:33 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:07:18.991 05:35:33 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:07:18.991 05:35:33 
setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:07:18.991 05:35:33 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:07:18.991 05:35:33 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:07:18.991 05:35:33 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:07:18.991 05:35:33 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:07:18.991 05:35:33 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:07:18.991 05:35:33 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:07:18.991 05:35:33 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:07:18.992 05:35:33 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:07:18.992 05:35:33 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:07:18.992 05:35:33 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:07:18.992 05:35:33 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:07:18.992 05:35:33 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:07:18.992 05:35:33 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:07:18.992 05:35:33 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:07:18.992 05:35:33 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:07:18.992 05:35:33 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:07:18.992 05:35:33 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:07:18.992 05:35:33 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:07:18.992 05:35:33 
setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:07:18.992 05:35:33 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:07:18.992 05:35:33 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:07:18.992 05:35:33 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:07:18.992 05:35:33 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:07:18.992 05:35:33 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:07:21.525 05:35:35 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:07:21.525 05:35:35 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:07:21.525 05:35:35 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:07:21.525 05:35:35 setup.sh.driver.guess_driver -- setup/driver.sh@64 -- # (( fail == 0 )) 00:07:21.525 05:35:35 setup.sh.driver.guess_driver -- setup/driver.sh@65 -- # setup reset 00:07:21.525 05:35:35 setup.sh.driver.guess_driver -- setup/common.sh@9 -- # [[ reset == output ]] 00:07:21.525 05:35:35 setup.sh.driver.guess_driver -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:07:26.793 00:07:26.793 real 0m10.750s 00:07:26.793 user 0m2.679s 00:07:26.793 sys 0m5.064s 00:07:26.793 05:35:40 setup.sh.driver.guess_driver -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:26.793 05:35:40 setup.sh.driver.guess_driver -- common/autotest_common.sh@10 -- # set +x 00:07:26.793 ************************************ 00:07:26.793 END TEST guess_driver 00:07:26.793 ************************************ 00:07:26.793 05:35:40 setup.sh.driver -- common/autotest_common.sh@1142 -- # return 0 00:07:26.793 00:07:26.793 real 0m15.741s 00:07:26.793 user 0m4.064s 00:07:26.793 sys 0m7.853s 00:07:26.793 05:35:40 setup.sh.driver 
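The guess_driver test above settles on `vfio-pci` because `/sys/kernel/iommu_groups` is populated (216 groups) and `modprobe --show-depends vfio_pci` resolves to real `.ko` files. A sketch of that decision with the probed values passed in as parameters so it runs without touching the live system; the `uio_pci_generic` fallback is the usual SPDK choice when vfio is unavailable, and is an assumption here, not something exercised in this log:

```shell
# Sketch of the setup/driver.sh vfio check traced above: vfio-pci is usable
# only when IOMMU groups exist AND the module chain resolves to .ko files.
pick_driver() {
    local n_iommu_groups=$1 modprobe_out=$2
    if (( n_iommu_groups > 0 )) && [[ $modprobe_out == *.ko* ]]; then
        echo vfio-pci
    else
        echo uio_pci_generic   # assumed fallback; not shown in this trace
    fi
}
```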
-- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:26.793 05:35:40 setup.sh.driver -- common/autotest_common.sh@10 -- # set +x 00:07:26.793 ************************************ 00:07:26.793 END TEST driver 00:07:26.793 ************************************ 00:07:26.793 05:35:40 setup.sh -- common/autotest_common.sh@1142 -- # return 0 00:07:26.793 05:35:40 setup.sh -- setup/test-setup.sh@15 -- # run_test devices /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/devices.sh 00:07:26.793 05:35:40 setup.sh -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:07:26.793 05:35:40 setup.sh -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:26.793 05:35:40 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:07:26.793 ************************************ 00:07:26.793 START TEST devices 00:07:26.793 ************************************ 00:07:26.793 05:35:40 setup.sh.devices -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/devices.sh 00:07:26.793 * Looking for test storage... 
00:07:26.793 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup 00:07:26.793 05:35:40 setup.sh.devices -- setup/devices.sh@190 -- # trap cleanup EXIT 00:07:26.793 05:35:40 setup.sh.devices -- setup/devices.sh@192 -- # setup reset 00:07:26.793 05:35:40 setup.sh.devices -- setup/common.sh@9 -- # [[ reset == output ]] 00:07:26.793 05:35:40 setup.sh.devices -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:07:30.078 05:35:44 setup.sh.devices -- setup/devices.sh@194 -- # get_zoned_devs 00:07:30.078 05:35:44 setup.sh.devices -- common/autotest_common.sh@1669 -- # zoned_devs=() 00:07:30.078 05:35:44 setup.sh.devices -- common/autotest_common.sh@1669 -- # local -gA zoned_devs 00:07:30.078 05:35:44 setup.sh.devices -- common/autotest_common.sh@1670 -- # local nvme bdf 00:07:30.078 05:35:44 setup.sh.devices -- common/autotest_common.sh@1672 -- # for nvme in /sys/block/nvme* 00:07:30.078 05:35:44 setup.sh.devices -- common/autotest_common.sh@1673 -- # is_block_zoned nvme0n1 00:07:30.078 05:35:44 setup.sh.devices -- common/autotest_common.sh@1662 -- # local device=nvme0n1 00:07:30.078 05:35:44 setup.sh.devices -- common/autotest_common.sh@1664 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:07:30.078 05:35:44 setup.sh.devices -- common/autotest_common.sh@1665 -- # [[ none != none ]] 00:07:30.078 05:35:44 setup.sh.devices -- setup/devices.sh@196 -- # blocks=() 00:07:30.078 05:35:44 setup.sh.devices -- setup/devices.sh@196 -- # declare -a blocks 00:07:30.078 05:35:44 setup.sh.devices -- setup/devices.sh@197 -- # blocks_to_pci=() 00:07:30.078 05:35:44 setup.sh.devices -- setup/devices.sh@197 -- # declare -A blocks_to_pci 00:07:30.078 05:35:44 setup.sh.devices -- setup/devices.sh@198 -- # min_disk_size=3221225472 00:07:30.078 05:35:44 setup.sh.devices -- setup/devices.sh@200 -- # for block in "/sys/block/nvme"!(*c*) 00:07:30.078 05:35:44 setup.sh.devices -- setup/devices.sh@201 -- # ctrl=nvme0n1 
00:07:30.078 05:35:44 setup.sh.devices -- setup/devices.sh@201 -- # ctrl=nvme0 00:07:30.078 05:35:44 setup.sh.devices -- setup/devices.sh@202 -- # pci=0000:5e:00.0 00:07:30.078 05:35:44 setup.sh.devices -- setup/devices.sh@203 -- # [[ '' == *\0\0\0\0\:\5\e\:\0\0\.\0* ]] 00:07:30.078 05:35:44 setup.sh.devices -- setup/devices.sh@204 -- # block_in_use nvme0n1 00:07:30.078 05:35:44 setup.sh.devices -- scripts/common.sh@378 -- # local block=nvme0n1 pt 00:07:30.078 05:35:44 setup.sh.devices -- scripts/common.sh@387 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/spdk-gpt.py nvme0n1 00:07:30.078 No valid GPT data, bailing 00:07:30.078 05:35:44 setup.sh.devices -- scripts/common.sh@391 -- # blkid -s PTTYPE -o value /dev/nvme0n1 00:07:30.078 05:35:44 setup.sh.devices -- scripts/common.sh@391 -- # pt= 00:07:30.078 05:35:44 setup.sh.devices -- scripts/common.sh@392 -- # return 1 00:07:30.078 05:35:44 setup.sh.devices -- setup/devices.sh@204 -- # sec_size_to_bytes nvme0n1 00:07:30.078 05:35:44 setup.sh.devices -- setup/common.sh@76 -- # local dev=nvme0n1 00:07:30.078 05:35:44 setup.sh.devices -- setup/common.sh@78 -- # [[ -e /sys/block/nvme0n1 ]] 00:07:30.078 05:35:44 setup.sh.devices -- setup/common.sh@80 -- # echo 7681501126656 00:07:30.078 05:35:44 setup.sh.devices -- setup/devices.sh@204 -- # (( 7681501126656 >= min_disk_size )) 00:07:30.078 05:35:44 setup.sh.devices -- setup/devices.sh@205 -- # blocks+=("${block##*/}") 00:07:30.078 05:35:44 setup.sh.devices -- setup/devices.sh@206 -- # blocks_to_pci["${block##*/}"]=0000:5e:00.0 00:07:30.078 05:35:44 setup.sh.devices -- setup/devices.sh@209 -- # (( 1 > 0 )) 00:07:30.078 05:35:44 setup.sh.devices -- setup/devices.sh@211 -- # declare -r test_disk=nvme0n1 00:07:30.078 05:35:44 setup.sh.devices -- setup/devices.sh@213 -- # run_test nvme_mount nvme_mount 00:07:30.078 05:35:44 setup.sh.devices -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:07:30.078 05:35:44 setup.sh.devices -- 
common/autotest_common.sh@1105 -- # xtrace_disable 00:07:30.078 05:35:44 setup.sh.devices -- common/autotest_common.sh@10 -- # set +x 00:07:30.078 ************************************ 00:07:30.078 START TEST nvme_mount 00:07:30.078 ************************************ 00:07:30.078 05:35:44 setup.sh.devices.nvme_mount -- common/autotest_common.sh@1123 -- # nvme_mount 00:07:30.078 05:35:44 setup.sh.devices.nvme_mount -- setup/devices.sh@95 -- # nvme_disk=nvme0n1 00:07:30.078 05:35:44 setup.sh.devices.nvme_mount -- setup/devices.sh@96 -- # nvme_disk_p=nvme0n1p1 00:07:30.078 05:35:44 setup.sh.devices.nvme_mount -- setup/devices.sh@97 -- # nvme_mount=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:07:30.078 05:35:44 setup.sh.devices.nvme_mount -- setup/devices.sh@98 -- # nvme_dummy_test_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:07:30.078 05:35:44 setup.sh.devices.nvme_mount -- setup/devices.sh@101 -- # partition_drive nvme0n1 1 00:07:30.078 05:35:44 setup.sh.devices.nvme_mount -- setup/common.sh@39 -- # local disk=nvme0n1 00:07:30.078 05:35:44 setup.sh.devices.nvme_mount -- setup/common.sh@40 -- # local part_no=1 00:07:30.078 05:35:44 setup.sh.devices.nvme_mount -- setup/common.sh@41 -- # local size=1073741824 00:07:30.078 05:35:44 setup.sh.devices.nvme_mount -- setup/common.sh@43 -- # local part part_start=0 part_end=0 00:07:30.078 05:35:44 setup.sh.devices.nvme_mount -- setup/common.sh@44 -- # parts=() 00:07:30.078 05:35:44 setup.sh.devices.nvme_mount -- setup/common.sh@44 -- # local parts 00:07:30.078 05:35:44 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part = 1 )) 00:07:30.078 05:35:44 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:07:30.078 05:35:44 setup.sh.devices.nvme_mount -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:07:30.079 05:35:44 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part++ )) 00:07:30.079 05:35:44 
setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:07:30.079 05:35:44 setup.sh.devices.nvme_mount -- setup/common.sh@51 -- # (( size /= 512 )) 00:07:30.079 05:35:44 setup.sh.devices.nvme_mount -- setup/common.sh@56 -- # sgdisk /dev/nvme0n1 --zap-all 00:07:30.079 05:35:44 setup.sh.devices.nvme_mount -- setup/common.sh@53 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/sync_dev_uevents.sh block/partition nvme0n1p1 00:07:31.016 Creating new GPT entries in memory. 00:07:31.016 GPT data structures destroyed! You may now partition the disk using fdisk or 00:07:31.016 other utilities. 00:07:31.016 05:35:45 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part = 1 )) 00:07:31.016 05:35:45 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:07:31.016 05:35:45 setup.sh.devices.nvme_mount -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:07:31.016 05:35:45 setup.sh.devices.nvme_mount -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:07:31.016 05:35:45 setup.sh.devices.nvme_mount -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=1:2048:2099199 00:07:32.391 Creating new GPT entries in memory. 00:07:32.391 The operation has completed successfully. 
00:07:32.391 05:35:46 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part++ )) 00:07:32.391 05:35:46 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:07:32.391 05:35:46 setup.sh.devices.nvme_mount -- setup/common.sh@62 -- # wait 1071370 00:07:32.391 05:35:46 setup.sh.devices.nvme_mount -- setup/devices.sh@102 -- # mkfs /dev/nvme0n1p1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:07:32.391 05:35:46 setup.sh.devices.nvme_mount -- setup/common.sh@66 -- # local dev=/dev/nvme0n1p1 mount=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount size= 00:07:32.391 05:35:46 setup.sh.devices.nvme_mount -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:07:32.391 05:35:46 setup.sh.devices.nvme_mount -- setup/common.sh@70 -- # [[ -e /dev/nvme0n1p1 ]] 00:07:32.391 05:35:46 setup.sh.devices.nvme_mount -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/nvme0n1p1 00:07:32.391 05:35:46 setup.sh.devices.nvme_mount -- setup/common.sh@72 -- # mount /dev/nvme0n1p1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:07:32.391 05:35:47 setup.sh.devices.nvme_mount -- setup/devices.sh@105 -- # verify 0000:5e:00.0 nvme0n1:nvme0n1p1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:07:32.391 05:35:47 setup.sh.devices.nvme_mount -- setup/devices.sh@48 -- # local dev=0000:5e:00.0 00:07:32.391 05:35:47 setup.sh.devices.nvme_mount -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme0n1p1 00:07:32.391 05:35:47 setup.sh.devices.nvme_mount -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:07:32.391 05:35:47 setup.sh.devices.nvme_mount -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:07:32.391 
05:35:47 setup.sh.devices.nvme_mount -- setup/devices.sh@53 -- # local found=0 00:07:32.391 05:35:47 setup.sh.devices.nvme_mount -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:07:32.391 05:35:47 setup.sh.devices.nvme_mount -- setup/devices.sh@56 -- # : 00:07:32.391 05:35:47 setup.sh.devices.nvme_mount -- setup/devices.sh@59 -- # local pci status 00:07:32.391 05:35:47 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:07:32.391 05:35:47 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:5e:00.0 00:07:32.391 05:35:47 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # setup output config 00:07:32.391 05:35:47 setup.sh.devices.nvme_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:07:32.391 05:35:47 setup.sh.devices.nvme_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:07:35.679 05:35:50 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:5e:00.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:07:35.679 05:35:50 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ Active devices: mount@nvme0n1:nvme0n1p1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\0\n\1\p\1* ]] 00:07:35.679 05:35:50 setup.sh.devices.nvme_mount -- setup/devices.sh@63 -- # found=1 00:07:35.679 05:35:50 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:07:35.679 05:35:50 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:07:35.679 05:35:50 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:07:35.679 05:35:50 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:07:35.679 05:35:50 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:07:35.679 05:35:50 
setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:07:35.679 05:35:50 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:07:35.679 05:35:50 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:07:35.679 05:35:50 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:07:35.679 05:35:50 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:07:35.679 05:35:50 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:07:35.679 05:35:50 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:07:35.679 05:35:50 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:07:35.679 05:35:50 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:07:35.679 05:35:50 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:07:35.679 05:35:50 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:07:35.679 05:35:50 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:07:35.679 05:35:50 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:d7:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:07:35.679 05:35:50 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:07:35.679 05:35:50 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:d7:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:07:35.679 05:35:50 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:07:35.679 05:35:50 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:07:35.679 05:35:50 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r 
pci _ _ status 00:07:35.679 05:35:50 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:07:35.679 05:35:50 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:07:35.679 05:35:50 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:07:35.679 05:35:50 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:07:35.679 05:35:50 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:07:35.679 05:35:50 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:07:35.679 05:35:50 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:07:35.679 05:35:50 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:07:35.679 05:35:50 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:07:35.679 05:35:50 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:07:35.679 05:35:50 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:07:35.679 05:35:50 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:07:35.679 05:35:50 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:07:35.679 05:35:50 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:07:35.679 05:35:50 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:07:35.679 05:35:50 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:07:35.679 05:35:50 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:07:35.679 05:35:50 setup.sh.devices.nvme_mount 
-- setup/devices.sh@60 -- # read -r pci _ _ status 00:07:35.937 05:35:50 setup.sh.devices.nvme_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:07:35.937 05:35:50 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount ]] 00:07:35.937 05:35:50 setup.sh.devices.nvme_mount -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:07:35.937 05:35:50 setup.sh.devices.nvme_mount -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:07:35.937 05:35:50 setup.sh.devices.nvme_mount -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:07:35.937 05:35:50 setup.sh.devices.nvme_mount -- setup/devices.sh@110 -- # cleanup_nvme 00:07:35.937 05:35:50 setup.sh.devices.nvme_mount -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:07:35.937 05:35:50 setup.sh.devices.nvme_mount -- setup/devices.sh@21 -- # umount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:07:35.937 05:35:50 setup.sh.devices.nvme_mount -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:07:35.937 05:35:50 setup.sh.devices.nvme_mount -- setup/devices.sh@25 -- # wipefs --all /dev/nvme0n1p1 00:07:35.937 /dev/nvme0n1p1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:07:35.937 05:35:50 setup.sh.devices.nvme_mount -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:07:35.937 05:35:50 setup.sh.devices.nvme_mount -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:07:36.196 /dev/nvme0n1: 8 bytes were erased at offset 0x00000200 (gpt): 45 46 49 20 50 41 52 54 00:07:36.196 /dev/nvme0n1: 8 bytes were erased at offset 0x6fc7d255e00 (gpt): 45 46 49 20 50 41 52 54 00:07:36.196 /dev/nvme0n1: 2 bytes were erased at offset 0x000001fe (PMBR): 55 aa 00:07:36.196 
/dev/nvme0n1: calling ioctl to re-read partition table: Success 00:07:36.196 05:35:50 setup.sh.devices.nvme_mount -- setup/devices.sh@113 -- # mkfs /dev/nvme0n1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 1024M 00:07:36.196 05:35:50 setup.sh.devices.nvme_mount -- setup/common.sh@66 -- # local dev=/dev/nvme0n1 mount=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount size=1024M 00:07:36.196 05:35:50 setup.sh.devices.nvme_mount -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:07:36.196 05:35:51 setup.sh.devices.nvme_mount -- setup/common.sh@70 -- # [[ -e /dev/nvme0n1 ]] 00:07:36.196 05:35:51 setup.sh.devices.nvme_mount -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/nvme0n1 1024M 00:07:36.196 05:35:51 setup.sh.devices.nvme_mount -- setup/common.sh@72 -- # mount /dev/nvme0n1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:07:36.196 05:35:51 setup.sh.devices.nvme_mount -- setup/devices.sh@116 -- # verify 0000:5e:00.0 nvme0n1:nvme0n1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:07:36.196 05:35:51 setup.sh.devices.nvme_mount -- setup/devices.sh@48 -- # local dev=0000:5e:00.0 00:07:36.196 05:35:51 setup.sh.devices.nvme_mount -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme0n1 00:07:36.196 05:35:51 setup.sh.devices.nvme_mount -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:07:36.196 05:35:51 setup.sh.devices.nvme_mount -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:07:36.196 05:35:51 setup.sh.devices.nvme_mount -- setup/devices.sh@53 -- # local found=0 00:07:36.196 05:35:51 setup.sh.devices.nvme_mount -- setup/devices.sh@55 -- # [[ -n 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:07:36.196 05:35:51 setup.sh.devices.nvme_mount -- setup/devices.sh@56 -- # : 00:07:36.196 05:35:51 setup.sh.devices.nvme_mount -- setup/devices.sh@59 -- # local pci status 00:07:36.196 05:35:51 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:07:36.196 05:35:51 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:5e:00.0 00:07:36.196 05:35:51 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # setup output config 00:07:36.196 05:35:51 setup.sh.devices.nvme_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:07:36.196 05:35:51 setup.sh.devices.nvme_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:07:40.414 05:35:54 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:5e:00.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:07:40.414 05:35:54 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ Active devices: mount@nvme0n1:nvme0n1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\0\n\1* ]] 00:07:40.414 05:35:54 setup.sh.devices.nvme_mount -- setup/devices.sh@63 -- # found=1 00:07:40.414 05:35:54 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:07:40.414 05:35:54 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:07:40.414 05:35:54 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:07:40.414 05:35:54 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:07:40.414 05:35:54 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:07:40.414 05:35:54 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:07:40.414 05:35:54 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r 
pci _ _ status 00:07:40.414 05:35:54 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:07:40.414 05:35:54 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:07:40.414 05:35:54 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:07:40.414 05:35:54 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:07:40.414 05:35:54 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:07:40.414 05:35:54 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:07:40.414 05:35:54 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:07:40.414 05:35:54 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:07:40.414 05:35:54 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:07:40.414 05:35:54 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:07:40.414 05:35:54 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:d7:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:07:40.414 05:35:54 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:07:40.414 05:35:54 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:d7:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:07:40.414 05:35:54 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:07:40.414 05:35:54 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:07:40.414 05:35:54 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:07:40.414 05:35:54 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:07:40.414 05:35:54 setup.sh.devices.nvme_mount 
-- setup/devices.sh@60 -- # read -r pci _ _ status 00:07:40.414 05:35:54 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:07:40.414 05:35:54 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:07:40.414 05:35:54 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:07:40.414 05:35:54 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:07:40.414 05:35:54 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:07:40.414 05:35:54 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:07:40.414 05:35:54 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:07:40.414 05:35:54 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:07:40.414 05:35:54 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:07:40.414 05:35:54 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:07:40.414 05:35:54 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:07:40.414 05:35:54 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:07:40.414 05:35:54 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:07:40.414 05:35:54 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:07:40.414 05:35:54 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:07:40.414 05:35:54 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:07:40.414 05:35:54 setup.sh.devices.nvme_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:07:40.414 05:35:54 
setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount ]] 00:07:40.414 05:35:54 setup.sh.devices.nvme_mount -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:07:40.414 05:35:54 setup.sh.devices.nvme_mount -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:07:40.414 05:35:54 setup.sh.devices.nvme_mount -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:07:40.414 05:35:54 setup.sh.devices.nvme_mount -- setup/devices.sh@123 -- # umount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:07:40.414 05:35:54 setup.sh.devices.nvme_mount -- setup/devices.sh@125 -- # verify 0000:5e:00.0 data@nvme0n1 '' '' 00:07:40.414 05:35:54 setup.sh.devices.nvme_mount -- setup/devices.sh@48 -- # local dev=0000:5e:00.0 00:07:40.414 05:35:54 setup.sh.devices.nvme_mount -- setup/devices.sh@49 -- # local mounts=data@nvme0n1 00:07:40.414 05:35:54 setup.sh.devices.nvme_mount -- setup/devices.sh@50 -- # local mount_point= 00:07:40.414 05:35:54 setup.sh.devices.nvme_mount -- setup/devices.sh@51 -- # local test_file= 00:07:40.414 05:35:54 setup.sh.devices.nvme_mount -- setup/devices.sh@53 -- # local found=0 00:07:40.414 05:35:54 setup.sh.devices.nvme_mount -- setup/devices.sh@55 -- # [[ -n '' ]] 00:07:40.414 05:35:54 setup.sh.devices.nvme_mount -- setup/devices.sh@59 -- # local pci status 00:07:40.414 05:35:54 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:07:40.414 05:35:54 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:5e:00.0 00:07:40.414 05:35:54 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # setup output config 00:07:40.414 05:35:54 setup.sh.devices.nvme_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:07:40.414 05:35:54 
setup.sh.devices.nvme_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:07:43.698 05:35:58 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:5e:00.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:07:43.698 05:35:58 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ Active devices: data@nvme0n1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\d\a\t\a\@\n\v\m\e\0\n\1* ]] 00:07:43.698 05:35:58 setup.sh.devices.nvme_mount -- setup/devices.sh@63 -- # found=1 00:07:43.698 05:35:58 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:07:43.698 05:35:58 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:07:43.698 05:35:58 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:07:43.698 05:35:58 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:07:43.698 05:35:58 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:07:43.698 05:35:58 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:07:43.698 05:35:58 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:07:43.698 05:35:58 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:07:43.698 05:35:58 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:07:43.698 05:35:58 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:07:43.698 05:35:58 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:07:43.698 05:35:58 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:07:43.698 05:35:58 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 
00:07:43.698 05:35:58 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:07:43.698 05:35:58 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:07:43.698 05:35:58 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:07:43.698 05:35:58 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:07:43.698 05:35:58 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:d7:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:07:43.698 05:35:58 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:07:43.698 05:35:58 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:d7:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:07:43.698 05:35:58 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:07:43.698 05:35:58 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:07:43.698 05:35:58 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:07:43.698 05:35:58 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:07:43.698 05:35:58 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:07:43.698 05:35:58 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:07:43.698 05:35:58 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:07:43.698 05:35:58 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:07:43.698 05:35:58 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:07:43.698 05:35:58 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:07:43.698 05:35:58 setup.sh.devices.nvme_mount -- 
setup/devices.sh@60 -- # read -r pci _ _ status 00:07:43.698 05:35:58 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:07:43.698 05:35:58 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:07:43.698 05:35:58 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:07:43.698 05:35:58 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:07:43.698 05:35:58 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:07:43.698 05:35:58 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:07:43.698 05:35:58 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:07:43.698 05:35:58 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:07:43.698 05:35:58 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:07:43.698 05:35:58 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:07:43.699 05:35:58 setup.sh.devices.nvme_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:07:43.699 05:35:58 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # [[ -n '' ]] 00:07:43.699 05:35:58 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # return 0 00:07:43.699 05:35:58 setup.sh.devices.nvme_mount -- setup/devices.sh@128 -- # cleanup_nvme 00:07:43.699 05:35:58 setup.sh.devices.nvme_mount -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:07:43.699 05:35:58 setup.sh.devices.nvme_mount -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:07:43.699 05:35:58 setup.sh.devices.nvme_mount -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:07:43.699 05:35:58 setup.sh.devices.nvme_mount -- setup/devices.sh@28 -- # wipefs --all 
/dev/nvme0n1 00:07:43.699 /dev/nvme0n1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:07:43.699 00:07:43.699 real 0m13.681s 00:07:43.699 user 0m4.139s 00:07:43.699 sys 0m7.529s 00:07:43.699 05:35:58 setup.sh.devices.nvme_mount -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:43.699 05:35:58 setup.sh.devices.nvme_mount -- common/autotest_common.sh@10 -- # set +x 00:07:43.699 ************************************ 00:07:43.699 END TEST nvme_mount 00:07:43.699 ************************************ 00:07:43.957 05:35:58 setup.sh.devices -- common/autotest_common.sh@1142 -- # return 0 00:07:43.957 05:35:58 setup.sh.devices -- setup/devices.sh@214 -- # run_test dm_mount dm_mount 00:07:43.957 05:35:58 setup.sh.devices -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:07:43.957 05:35:58 setup.sh.devices -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:43.957 05:35:58 setup.sh.devices -- common/autotest_common.sh@10 -- # set +x 00:07:43.957 ************************************ 00:07:43.957 START TEST dm_mount 00:07:43.957 ************************************ 00:07:43.957 05:35:58 setup.sh.devices.dm_mount -- common/autotest_common.sh@1123 -- # dm_mount 00:07:43.957 05:35:58 setup.sh.devices.dm_mount -- setup/devices.sh@144 -- # pv=nvme0n1 00:07:43.957 05:35:58 setup.sh.devices.dm_mount -- setup/devices.sh@145 -- # pv0=nvme0n1p1 00:07:43.957 05:35:58 setup.sh.devices.dm_mount -- setup/devices.sh@146 -- # pv1=nvme0n1p2 00:07:43.957 05:35:58 setup.sh.devices.dm_mount -- setup/devices.sh@148 -- # partition_drive nvme0n1 00:07:43.957 05:35:58 setup.sh.devices.dm_mount -- setup/common.sh@39 -- # local disk=nvme0n1 00:07:43.957 05:35:58 setup.sh.devices.dm_mount -- setup/common.sh@40 -- # local part_no=2 00:07:43.958 05:35:58 setup.sh.devices.dm_mount -- setup/common.sh@41 -- # local size=1073741824 00:07:43.958 05:35:58 setup.sh.devices.dm_mount -- setup/common.sh@43 -- # local part part_start=0 part_end=0 00:07:43.958 05:35:58 
setup.sh.devices.dm_mount -- setup/common.sh@44 -- # parts=() 00:07:43.958 05:35:58 setup.sh.devices.dm_mount -- setup/common.sh@44 -- # local parts 00:07:43.958 05:35:58 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part = 1 )) 00:07:43.958 05:35:58 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:07:43.958 05:35:58 setup.sh.devices.dm_mount -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:07:43.958 05:35:58 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part++ )) 00:07:43.958 05:35:58 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:07:43.958 05:35:58 setup.sh.devices.dm_mount -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:07:43.958 05:35:58 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part++ )) 00:07:43.958 05:35:58 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:07:43.958 05:35:58 setup.sh.devices.dm_mount -- setup/common.sh@51 -- # (( size /= 512 )) 00:07:43.958 05:35:58 setup.sh.devices.dm_mount -- setup/common.sh@56 -- # sgdisk /dev/nvme0n1 --zap-all 00:07:43.958 05:35:58 setup.sh.devices.dm_mount -- setup/common.sh@53 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/sync_dev_uevents.sh block/partition nvme0n1p1 nvme0n1p2 00:07:44.893 Creating new GPT entries in memory. 00:07:44.893 GPT data structures destroyed! You may now partition the disk using fdisk or 00:07:44.893 other utilities. 00:07:44.893 05:35:59 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part = 1 )) 00:07:44.893 05:35:59 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:07:44.893 05:35:59 setup.sh.devices.dm_mount -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 
2048 : part_end + 1 )) 00:07:44.893 05:35:59 setup.sh.devices.dm_mount -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:07:44.893 05:35:59 setup.sh.devices.dm_mount -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=1:2048:2099199 00:07:45.828 Creating new GPT entries in memory. 00:07:45.828 The operation has completed successfully. 00:07:45.828 05:36:00 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part++ )) 00:07:45.828 05:36:00 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:07:45.828 05:36:00 setup.sh.devices.dm_mount -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:07:45.828 05:36:00 setup.sh.devices.dm_mount -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:07:45.828 05:36:00 setup.sh.devices.dm_mount -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=2:2099200:4196351 00:07:47.204 The operation has completed successfully. 00:07:47.204 05:36:01 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part++ )) 00:07:47.204 05:36:01 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:07:47.204 05:36:01 setup.sh.devices.dm_mount -- setup/common.sh@62 -- # wait 1075646 00:07:47.204 05:36:01 setup.sh.devices.dm_mount -- setup/devices.sh@150 -- # dm_name=nvme_dm_test 00:07:47.204 05:36:01 setup.sh.devices.dm_mount -- setup/devices.sh@151 -- # dm_mount=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:07:47.204 05:36:01 setup.sh.devices.dm_mount -- setup/devices.sh@152 -- # dm_dummy_test_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:07:47.204 05:36:01 setup.sh.devices.dm_mount -- setup/devices.sh@155 -- # dmsetup create nvme_dm_test 00:07:47.204 05:36:01 setup.sh.devices.dm_mount -- setup/devices.sh@160 -- # for t in {1..5} 00:07:47.204 05:36:01 setup.sh.devices.dm_mount -- setup/devices.sh@161 -- # [[ -e 
/dev/mapper/nvme_dm_test ]] 00:07:47.204 05:36:01 setup.sh.devices.dm_mount -- setup/devices.sh@161 -- # break 00:07:47.204 05:36:01 setup.sh.devices.dm_mount -- setup/devices.sh@164 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:07:47.204 05:36:01 setup.sh.devices.dm_mount -- setup/devices.sh@165 -- # readlink -f /dev/mapper/nvme_dm_test 00:07:47.204 05:36:01 setup.sh.devices.dm_mount -- setup/devices.sh@165 -- # dm=/dev/dm-0 00:07:47.204 05:36:01 setup.sh.devices.dm_mount -- setup/devices.sh@166 -- # dm=dm-0 00:07:47.204 05:36:01 setup.sh.devices.dm_mount -- setup/devices.sh@168 -- # [[ -e /sys/class/block/nvme0n1p1/holders/dm-0 ]] 00:07:47.204 05:36:01 setup.sh.devices.dm_mount -- setup/devices.sh@169 -- # [[ -e /sys/class/block/nvme0n1p2/holders/dm-0 ]] 00:07:47.204 05:36:01 setup.sh.devices.dm_mount -- setup/devices.sh@171 -- # mkfs /dev/mapper/nvme_dm_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:07:47.204 05:36:01 setup.sh.devices.dm_mount -- setup/common.sh@66 -- # local dev=/dev/mapper/nvme_dm_test mount=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount size= 00:07:47.204 05:36:01 setup.sh.devices.dm_mount -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:07:47.204 05:36:01 setup.sh.devices.dm_mount -- setup/common.sh@70 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:07:47.204 05:36:01 setup.sh.devices.dm_mount -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/mapper/nvme_dm_test 00:07:47.204 05:36:01 setup.sh.devices.dm_mount -- setup/common.sh@72 -- # mount /dev/mapper/nvme_dm_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:07:47.204 05:36:01 setup.sh.devices.dm_mount -- setup/devices.sh@174 -- # verify 0000:5e:00.0 nvme0n1:nvme_dm_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:07:47.204 05:36:01 setup.sh.devices.dm_mount -- 
setup/devices.sh@48 -- # local dev=0000:5e:00.0 00:07:47.204 05:36:01 setup.sh.devices.dm_mount -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme_dm_test 00:07:47.204 05:36:01 setup.sh.devices.dm_mount -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:07:47.204 05:36:01 setup.sh.devices.dm_mount -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:07:47.205 05:36:01 setup.sh.devices.dm_mount -- setup/devices.sh@53 -- # local found=0 00:07:47.205 05:36:01 setup.sh.devices.dm_mount -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm ]] 00:07:47.205 05:36:01 setup.sh.devices.dm_mount -- setup/devices.sh@56 -- # : 00:07:47.205 05:36:01 setup.sh.devices.dm_mount -- setup/devices.sh@59 -- # local pci status 00:07:47.205 05:36:01 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:07:47.205 05:36:01 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:5e:00.0 00:07:47.205 05:36:01 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # setup output config 00:07:47.205 05:36:01 setup.sh.devices.dm_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:07:47.205 05:36:01 setup.sh.devices.dm_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:07:50.487 05:36:04 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:5e:00.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:07:50.487 05:36:04 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ Active devices: holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0,mount@nvme0n1:nvme_dm_test, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\_\d\m\_\t\e\s\t* ]] 00:07:50.487 05:36:04 setup.sh.devices.dm_mount -- setup/devices.sh@63 -- # found=1 00:07:50.487 05:36:04 setup.sh.devices.dm_mount -- setup/devices.sh@60 
-- # read -r pci _ _ status 00:07:50.487 05:36:04 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:07:50.487 05:36:04 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:07:50.487 05:36:04 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:07:50.487 05:36:04 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:07:50.487 05:36:04 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:07:50.487 05:36:04 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:07:50.487 05:36:04 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:07:50.487 05:36:04 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:07:50.487 05:36:04 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:07:50.487 05:36:04 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:07:50.487 05:36:04 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:07:50.487 05:36:04 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:07:50.487 05:36:04 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:07:50.487 05:36:04 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:07:50.487 05:36:04 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:07:50.487 05:36:04 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:07:50.487 05:36:04 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:d7:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:07:50.487 05:36:04 setup.sh.devices.dm_mount -- setup/devices.sh@60 
-- # read -r pci _ _ status 00:07:50.487 05:36:04 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:d7:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:07:50.487 05:36:04 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:07:50.487 05:36:04 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:07:50.487 05:36:04 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:07:50.487 05:36:04 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:07:50.487 05:36:04 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:07:50.487 05:36:04 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:07:50.487 05:36:04 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:07:50.487 05:36:04 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:07:50.487 05:36:04 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:07:50.487 05:36:04 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:07:50.487 05:36:04 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:07:50.487 05:36:04 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:07:50.487 05:36:04 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:07:50.487 05:36:04 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:07:50.487 05:36:04 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:07:50.487 05:36:04 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:07:50.487 05:36:04 setup.sh.devices.dm_mount -- setup/devices.sh@60 
-- # read -r pci _ _ status 00:07:50.488 05:36:04 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:07:50.488 05:36:04 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:07:50.488 05:36:04 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:07:50.488 05:36:04 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:07:50.488 05:36:05 setup.sh.devices.dm_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:07:50.488 05:36:05 setup.sh.devices.dm_mount -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount ]] 00:07:50.488 05:36:05 setup.sh.devices.dm_mount -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:07:50.488 05:36:05 setup.sh.devices.dm_mount -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm ]] 00:07:50.488 05:36:05 setup.sh.devices.dm_mount -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:07:50.488 05:36:05 setup.sh.devices.dm_mount -- setup/devices.sh@182 -- # umount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:07:50.488 05:36:05 setup.sh.devices.dm_mount -- setup/devices.sh@184 -- # verify 0000:5e:00.0 holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0 '' '' 00:07:50.488 05:36:05 setup.sh.devices.dm_mount -- setup/devices.sh@48 -- # local dev=0000:5e:00.0 00:07:50.488 05:36:05 setup.sh.devices.dm_mount -- setup/devices.sh@49 -- # local mounts=holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0 00:07:50.488 05:36:05 setup.sh.devices.dm_mount -- setup/devices.sh@50 -- # local mount_point= 00:07:50.488 05:36:05 setup.sh.devices.dm_mount -- setup/devices.sh@51 -- # local test_file= 00:07:50.488 05:36:05 setup.sh.devices.dm_mount -- setup/devices.sh@53 -- # local 
found=0 00:07:50.488 05:36:05 setup.sh.devices.dm_mount -- setup/devices.sh@55 -- # [[ -n '' ]] 00:07:50.488 05:36:05 setup.sh.devices.dm_mount -- setup/devices.sh@59 -- # local pci status 00:07:50.488 05:36:05 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:07:50.488 05:36:05 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:5e:00.0 00:07:50.488 05:36:05 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # setup output config 00:07:50.488 05:36:05 setup.sh.devices.dm_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:07:50.488 05:36:05 setup.sh.devices.dm_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:07:53.770 05:36:08 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:5e:00.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:07:53.770 05:36:08 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ Active devices: holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\h\o\l\d\e\r\@\n\v\m\e\0\n\1\p\1\:\d\m\-\0\,\h\o\l\d\e\r\@\n\v\m\e\0\n\1\p\2\:\d\m\-\0* ]] 00:07:53.770 05:36:08 setup.sh.devices.dm_mount -- setup/devices.sh@63 -- # found=1 00:07:53.770 05:36:08 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:07:53.770 05:36:08 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:07:53.770 05:36:08 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:07:53.770 05:36:08 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:07:53.770 05:36:08 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:07:53.770 05:36:08 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:07:53.770 05:36:08 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 
00:07:53.770 05:36:08 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:07:53.770 05:36:08 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:07:53.770 05:36:08 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:07:53.770 05:36:08 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:07:53.771 05:36:08 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:07:53.771 05:36:08 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:07:53.771 05:36:08 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:07:53.771 05:36:08 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:07:53.771 05:36:08 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:07:53.771 05:36:08 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:07:53.771 05:36:08 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:d7:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:07:53.771 05:36:08 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:07:53.771 05:36:08 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:d7:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:07:53.771 05:36:08 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:07:53.771 05:36:08 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:07:53.771 05:36:08 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:07:53.771 05:36:08 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:07:53.771 05:36:08 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 
00:07:53.771 05:36:08 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:07:53.771 05:36:08 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:07:53.771 05:36:08 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:07:53.771 05:36:08 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:07:53.771 05:36:08 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:07:53.771 05:36:08 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:07:53.771 05:36:08 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:07:53.771 05:36:08 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:07:53.771 05:36:08 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:07:53.771 05:36:08 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:07:53.771 05:36:08 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:07:53.771 05:36:08 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:07:53.771 05:36:08 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:07:53.771 05:36:08 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:07:53.771 05:36:08 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:07:53.771 05:36:08 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:07:53.771 05:36:08 setup.sh.devices.dm_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:07:53.771 05:36:08 setup.sh.devices.dm_mount -- setup/devices.sh@68 -- # [[ -n '' ]] 00:07:53.771 05:36:08 
setup.sh.devices.dm_mount -- setup/devices.sh@68 -- # return 0 00:07:53.771 05:36:08 setup.sh.devices.dm_mount -- setup/devices.sh@187 -- # cleanup_dm 00:07:53.771 05:36:08 setup.sh.devices.dm_mount -- setup/devices.sh@33 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:07:53.771 05:36:08 setup.sh.devices.dm_mount -- setup/devices.sh@36 -- # [[ -L /dev/mapper/nvme_dm_test ]] 00:07:53.771 05:36:08 setup.sh.devices.dm_mount -- setup/devices.sh@37 -- # dmsetup remove --force nvme_dm_test 00:07:53.771 05:36:08 setup.sh.devices.dm_mount -- setup/devices.sh@39 -- # [[ -b /dev/nvme0n1p1 ]] 00:07:53.771 05:36:08 setup.sh.devices.dm_mount -- setup/devices.sh@40 -- # wipefs --all /dev/nvme0n1p1 00:07:53.771 /dev/nvme0n1p1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:07:53.771 05:36:08 setup.sh.devices.dm_mount -- setup/devices.sh@42 -- # [[ -b /dev/nvme0n1p2 ]] 00:07:53.771 05:36:08 setup.sh.devices.dm_mount -- setup/devices.sh@43 -- # wipefs --all /dev/nvme0n1p2 00:07:53.771 00:07:53.771 real 0m9.869s 00:07:53.771 user 0m2.201s 00:07:53.771 sys 0m4.687s 00:07:53.771 05:36:08 setup.sh.devices.dm_mount -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:53.771 05:36:08 setup.sh.devices.dm_mount -- common/autotest_common.sh@10 -- # set +x 00:07:53.771 ************************************ 00:07:53.771 END TEST dm_mount 00:07:53.771 ************************************ 00:07:53.771 05:36:08 setup.sh.devices -- common/autotest_common.sh@1142 -- # return 0 00:07:53.771 05:36:08 setup.sh.devices -- setup/devices.sh@1 -- # cleanup 00:07:53.771 05:36:08 setup.sh.devices -- setup/devices.sh@11 -- # cleanup_nvme 00:07:53.771 05:36:08 setup.sh.devices -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:07:53.771 05:36:08 setup.sh.devices -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:07:53.771 05:36:08 setup.sh.devices -- setup/devices.sh@25 -- # wipefs 
--all /dev/nvme0n1p1 00:07:53.771 05:36:08 setup.sh.devices -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:07:53.771 05:36:08 setup.sh.devices -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:07:54.029 /dev/nvme0n1: 8 bytes were erased at offset 0x00000200 (gpt): 45 46 49 20 50 41 52 54 00:07:54.029 /dev/nvme0n1: 8 bytes were erased at offset 0x6fc7d255e00 (gpt): 45 46 49 20 50 41 52 54 00:07:54.029 /dev/nvme0n1: 2 bytes were erased at offset 0x000001fe (PMBR): 55 aa 00:07:54.029 /dev/nvme0n1: calling ioctl to re-read partition table: Success 00:07:54.029 05:36:08 setup.sh.devices -- setup/devices.sh@12 -- # cleanup_dm 00:07:54.029 05:36:08 setup.sh.devices -- setup/devices.sh@33 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:07:54.029 05:36:08 setup.sh.devices -- setup/devices.sh@36 -- # [[ -L /dev/mapper/nvme_dm_test ]] 00:07:54.029 05:36:08 setup.sh.devices -- setup/devices.sh@39 -- # [[ -b /dev/nvme0n1p1 ]] 00:07:54.029 05:36:08 setup.sh.devices -- setup/devices.sh@42 -- # [[ -b /dev/nvme0n1p2 ]] 00:07:54.029 05:36:08 setup.sh.devices -- setup/devices.sh@14 -- # [[ -b /dev/nvme0n1 ]] 00:07:54.029 05:36:08 setup.sh.devices -- setup/devices.sh@15 -- # wipefs --all /dev/nvme0n1 00:07:54.029 00:07:54.029 real 0m28.014s 00:07:54.029 user 0m7.803s 00:07:54.029 sys 0m15.115s 00:07:54.029 05:36:08 setup.sh.devices -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:54.029 05:36:08 setup.sh.devices -- common/autotest_common.sh@10 -- # set +x 00:07:54.029 ************************************ 00:07:54.029 END TEST devices 00:07:54.029 ************************************ 00:07:54.029 05:36:08 setup.sh -- common/autotest_common.sh@1142 -- # return 0 00:07:54.029 00:07:54.030 real 1m40.489s 00:07:54.030 user 0m30.733s 00:07:54.030 sys 0m56.596s 00:07:54.030 05:36:08 setup.sh -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:54.030 05:36:08 setup.sh -- common/autotest_common.sh@10 -- # set +x 
00:07:54.030 ************************************ 00:07:54.030 END TEST setup.sh 00:07:54.030 ************************************ 00:07:54.287 05:36:08 -- common/autotest_common.sh@1142 -- # return 0 00:07:54.287 05:36:08 -- spdk/autotest.sh@128 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh status 00:07:57.573 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:07:57.573 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:07:57.573 Hugepages 00:07:57.573 node hugesize free / total 00:07:57.573 node0 1048576kB 0 / 0 00:07:57.573 node0 2048kB 1024 / 1024 00:07:57.573 node1 1048576kB 0 / 0 00:07:57.573 node1 2048kB 1024 / 1024 00:07:57.573 00:07:57.573 Type BDF Vendor Device NUMA Driver Device Block devices 00:07:57.573 I/OAT 0000:00:04.0 8086 2021 0 ioatdma - - 00:07:57.573 I/OAT 0000:00:04.1 8086 2021 0 ioatdma - - 00:07:57.573 I/OAT 0000:00:04.2 8086 2021 0 ioatdma - - 00:07:57.573 I/OAT 0000:00:04.3 8086 2021 0 ioatdma - - 00:07:57.573 I/OAT 0000:00:04.4 8086 2021 0 ioatdma - - 00:07:57.573 I/OAT 0000:00:04.5 8086 2021 0 ioatdma - - 00:07:57.573 I/OAT 0000:00:04.6 8086 2021 0 ioatdma - - 00:07:57.573 I/OAT 0000:00:04.7 8086 2021 0 ioatdma - - 00:07:57.832 NVMe 0000:5e:00.0 8086 0b60 0 nvme nvme0 nvme0n1 00:07:57.832 I/OAT 0000:80:04.0 8086 2021 1 ioatdma - - 00:07:57.832 I/OAT 0000:80:04.1 8086 2021 1 ioatdma - - 00:07:57.832 I/OAT 0000:80:04.2 8086 2021 1 ioatdma - - 00:07:57.832 I/OAT 0000:80:04.3 8086 2021 1 ioatdma - - 00:07:57.832 I/OAT 0000:80:04.4 8086 2021 1 ioatdma - - 00:07:57.832 I/OAT 0000:80:04.5 8086 2021 1 ioatdma - - 00:07:57.832 I/OAT 0000:80:04.6 8086 2021 1 ioatdma - - 00:07:57.832 I/OAT 0000:80:04.7 8086 2021 1 ioatdma - - 00:07:57.832 VMD 0000:85:05.5 8086 201d 1 vfio-pci - - 00:07:57.832 VMD 0000:d7:05.5 8086 201d 1 vfio-pci - - 00:07:57.832 05:36:12 -- spdk/autotest.sh@130 -- # uname -s 00:07:57.832 05:36:12 -- spdk/autotest.sh@130 -- # [[ Linux == Linux ]] 
00:07:57.832 05:36:12 -- spdk/autotest.sh@132 -- # nvme_namespace_revert 00:07:57.832 05:36:12 -- common/autotest_common.sh@1531 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:08:01.187 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:08:01.187 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:08:01.187 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:08:01.187 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:08:01.187 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:08:01.187 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:08:01.187 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:08:01.187 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:08:01.187 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:08:01.187 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:08:01.187 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:08:01.187 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:08:01.187 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:08:01.187 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:08:01.187 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:08:01.446 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:08:01.446 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:08:01.446 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:08:03.981 0000:5e:00.0 (8086 0b60): nvme -> vfio-pci 00:08:03.981 05:36:18 -- common/autotest_common.sh@1532 -- # sleep 1 00:08:04.918 05:36:19 -- common/autotest_common.sh@1533 -- # bdfs=() 00:08:04.918 05:36:19 -- common/autotest_common.sh@1533 -- # local bdfs 00:08:04.918 05:36:19 -- common/autotest_common.sh@1534 -- # bdfs=($(get_nvme_bdfs)) 00:08:04.918 05:36:19 -- common/autotest_common.sh@1534 -- # get_nvme_bdfs 00:08:04.918 05:36:19 -- common/autotest_common.sh@1513 -- # bdfs=() 00:08:04.918 05:36:19 -- common/autotest_common.sh@1513 -- # local bdfs 00:08:04.918 05:36:19 -- common/autotest_common.sh@1514 -- # bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r 
'.config[].params.traddr')) 00:08:04.918 05:36:19 -- common/autotest_common.sh@1514 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:08:04.918 05:36:19 -- common/autotest_common.sh@1514 -- # jq -r '.config[].params.traddr' 00:08:04.918 05:36:19 -- common/autotest_common.sh@1515 -- # (( 1 == 0 )) 00:08:04.918 05:36:19 -- common/autotest_common.sh@1519 -- # printf '%s\n' 0000:5e:00.0 00:08:04.918 05:36:19 -- common/autotest_common.sh@1536 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:08:08.209 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:08:08.209 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:08:08.209 Waiting for block devices as requested 00:08:08.209 0000:5e:00.0 (8086 0b60): vfio-pci -> nvme 00:08:08.467 0000:00:04.7 (8086 2021): vfio-pci -> ioatdma 00:08:08.467 0000:00:04.6 (8086 2021): vfio-pci -> ioatdma 00:08:08.467 0000:00:04.5 (8086 2021): vfio-pci -> ioatdma 00:08:08.727 0000:00:04.4 (8086 2021): vfio-pci -> ioatdma 00:08:08.727 0000:00:04.3 (8086 2021): vfio-pci -> ioatdma 00:08:08.727 0000:00:04.2 (8086 2021): vfio-pci -> ioatdma 00:08:08.986 0000:00:04.1 (8086 2021): vfio-pci -> ioatdma 00:08:08.986 0000:00:04.0 (8086 2021): vfio-pci -> ioatdma 00:08:08.986 0000:80:04.7 (8086 2021): vfio-pci -> ioatdma 00:08:09.244 0000:80:04.6 (8086 2021): vfio-pci -> ioatdma 00:08:09.244 0000:80:04.5 (8086 2021): vfio-pci -> ioatdma 00:08:09.244 0000:80:04.4 (8086 2021): vfio-pci -> ioatdma 00:08:09.502 0000:80:04.3 (8086 2021): vfio-pci -> ioatdma 00:08:09.502 0000:80:04.2 (8086 2021): vfio-pci -> ioatdma 00:08:09.502 0000:80:04.1 (8086 2021): vfio-pci -> ioatdma 00:08:09.761 0000:80:04.0 (8086 2021): vfio-pci -> ioatdma 00:08:09.761 05:36:24 -- common/autotest_common.sh@1538 -- # for bdf in "${bdfs[@]}" 00:08:09.761 05:36:24 -- common/autotest_common.sh@1539 -- # get_nvme_ctrlr_from_bdf 0000:5e:00.0 00:08:09.761 05:36:24 -- 
common/autotest_common.sh@1502 -- # readlink -f /sys/class/nvme/nvme0 00:08:09.761 05:36:24 -- common/autotest_common.sh@1502 -- # grep 0000:5e:00.0/nvme/nvme 00:08:09.761 05:36:24 -- common/autotest_common.sh@1502 -- # bdf_sysfs_path=/sys/devices/pci0000:5d/0000:5d:02.0/0000:5e:00.0/nvme/nvme0 00:08:09.761 05:36:24 -- common/autotest_common.sh@1503 -- # [[ -z /sys/devices/pci0000:5d/0000:5d:02.0/0000:5e:00.0/nvme/nvme0 ]] 00:08:09.761 05:36:24 -- common/autotest_common.sh@1507 -- # basename /sys/devices/pci0000:5d/0000:5d:02.0/0000:5e:00.0/nvme/nvme0 00:08:09.761 05:36:24 -- common/autotest_common.sh@1507 -- # printf '%s\n' nvme0 00:08:09.761 05:36:24 -- common/autotest_common.sh@1539 -- # nvme_ctrlr=/dev/nvme0 00:08:09.761 05:36:24 -- common/autotest_common.sh@1540 -- # [[ -z /dev/nvme0 ]] 00:08:09.761 05:36:24 -- common/autotest_common.sh@1545 -- # nvme id-ctrl /dev/nvme0 00:08:09.761 05:36:24 -- common/autotest_common.sh@1545 -- # grep oacs 00:08:09.761 05:36:24 -- common/autotest_common.sh@1545 -- # cut -d: -f2 00:08:09.761 05:36:24 -- common/autotest_common.sh@1545 -- # oacs=' 0x3f' 00:08:09.761 05:36:24 -- common/autotest_common.sh@1546 -- # oacs_ns_manage=8 00:08:09.761 05:36:24 -- common/autotest_common.sh@1548 -- # [[ 8 -ne 0 ]] 00:08:09.761 05:36:24 -- common/autotest_common.sh@1554 -- # nvme id-ctrl /dev/nvme0 00:08:09.761 05:36:24 -- common/autotest_common.sh@1554 -- # grep unvmcap 00:08:09.761 05:36:24 -- common/autotest_common.sh@1554 -- # cut -d: -f2 00:08:09.761 05:36:24 -- common/autotest_common.sh@1554 -- # unvmcap=' 0' 00:08:09.761 05:36:24 -- common/autotest_common.sh@1555 -- # [[ 0 -eq 0 ]] 00:08:09.761 05:36:24 -- common/autotest_common.sh@1557 -- # continue 00:08:09.761 05:36:24 -- spdk/autotest.sh@135 -- # timing_exit pre_cleanup 00:08:09.761 05:36:24 -- common/autotest_common.sh@728 -- # xtrace_disable 00:08:09.761 05:36:24 -- common/autotest_common.sh@10 -- # set +x 00:08:09.761 05:36:24 -- spdk/autotest.sh@138 -- # timing_enter afterboot 
00:08:09.761 05:36:24 -- common/autotest_common.sh@722 -- # xtrace_disable 00:08:09.761 05:36:24 -- common/autotest_common.sh@10 -- # set +x 00:08:09.761 05:36:24 -- spdk/autotest.sh@139 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:08:13.046 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:08:13.046 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:08:13.046 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:08:13.046 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:08:13.046 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:08:13.046 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:08:13.046 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:08:13.046 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:08:13.046 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:08:13.046 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:08:13.046 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:08:13.046 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:08:13.046 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:08:13.046 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:08:13.046 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:08:13.046 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:08:13.046 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:08:13.046 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:08:15.576 0000:5e:00.0 (8086 0b60): nvme -> vfio-pci 00:08:15.576 05:36:30 -- spdk/autotest.sh@140 -- # timing_exit afterboot 00:08:15.576 05:36:30 -- common/autotest_common.sh@728 -- # xtrace_disable 00:08:15.576 05:36:30 -- common/autotest_common.sh@10 -- # set +x 00:08:15.576 05:36:30 -- spdk/autotest.sh@144 -- # opal_revert_cleanup 00:08:15.576 05:36:30 -- common/autotest_common.sh@1591 -- # mapfile -t bdfs 00:08:15.576 05:36:30 -- common/autotest_common.sh@1591 -- # get_nvme_bdfs_by_id 0x0a54 00:08:15.576 05:36:30 -- common/autotest_common.sh@1577 -- # bdfs=() 00:08:15.576 05:36:30 -- 
common/autotest_common.sh@1577 -- # local bdfs 00:08:15.576 05:36:30 -- common/autotest_common.sh@1579 -- # get_nvme_bdfs 00:08:15.576 05:36:30 -- common/autotest_common.sh@1513 -- # bdfs=() 00:08:15.576 05:36:30 -- common/autotest_common.sh@1513 -- # local bdfs 00:08:15.576 05:36:30 -- common/autotest_common.sh@1514 -- # bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr')) 00:08:15.576 05:36:30 -- common/autotest_common.sh@1514 -- # jq -r '.config[].params.traddr' 00:08:15.576 05:36:30 -- common/autotest_common.sh@1514 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:08:15.835 05:36:30 -- common/autotest_common.sh@1515 -- # (( 1 == 0 )) 00:08:15.835 05:36:30 -- common/autotest_common.sh@1519 -- # printf '%s\n' 0000:5e:00.0 00:08:15.835 05:36:30 -- common/autotest_common.sh@1579 -- # for bdf in $(get_nvme_bdfs) 00:08:15.835 05:36:30 -- common/autotest_common.sh@1580 -- # cat /sys/bus/pci/devices/0000:5e:00.0/device 00:08:15.835 05:36:30 -- common/autotest_common.sh@1580 -- # device=0x0b60 00:08:15.835 05:36:30 -- common/autotest_common.sh@1581 -- # [[ 0x0b60 == \0\x\0\a\5\4 ]] 00:08:15.835 05:36:30 -- common/autotest_common.sh@1586 -- # printf '%s\n' 00:08:15.835 05:36:30 -- common/autotest_common.sh@1592 -- # [[ -z '' ]] 00:08:15.835 05:36:30 -- common/autotest_common.sh@1593 -- # return 0 00:08:15.835 05:36:30 -- spdk/autotest.sh@150 -- # '[' 0 -eq 1 ']' 00:08:15.835 05:36:30 -- spdk/autotest.sh@154 -- # '[' 1 -eq 1 ']' 00:08:15.835 05:36:30 -- spdk/autotest.sh@155 -- # [[ 1 -eq 1 ]] 00:08:15.835 05:36:30 -- spdk/autotest.sh@156 -- # [[ 0 -eq 1 ]] 00:08:15.835 05:36:30 -- spdk/autotest.sh@159 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/qat_setup.sh 00:08:16.431 Restarting all devices. 
00:08:19.716 lstat() error: No such file or directory 00:08:19.716 QAT Error: No GENERAL section found 00:08:19.716 Failed to configure qat_dev0 00:08:19.716 lstat() error: No such file or directory 00:08:19.716 QAT Error: No GENERAL section found 00:08:19.716 Failed to configure qat_dev1 00:08:19.716 lstat() error: No such file or directory 00:08:19.716 QAT Error: No GENERAL section found 00:08:19.716 Failed to configure qat_dev2 00:08:19.716 enable sriov 00:08:19.716 Checking status of all devices. 00:08:19.716 There is 3 QAT acceleration device(s) in the system: 00:08:19.716 qat_dev0 - type: c6xx, inst_id: 0, node_id: 0, bsf: 0000:3d:00.0, #accel: 5 #engines: 10 state: down 00:08:19.716 qat_dev1 - type: c6xx, inst_id: 1, node_id: 0, bsf: 0000:3f:00.0, #accel: 5 #engines: 10 state: down 00:08:19.716 qat_dev2 - type: c6xx, inst_id: 2, node_id: 1, bsf: 0000:da:00.0, #accel: 5 #engines: 10 state: down 00:08:20.653 0000:3d:00.0 set to 16 VFs 00:08:21.589 0000:3f:00.0 set to 16 VFs 00:08:22.215 0000:da:00.0 set to 16 VFs 00:08:23.591 Properly configured the qat device with driver uio_pci_generic. 
00:08:23.591 05:36:38 -- spdk/autotest.sh@162 -- # timing_enter lib 00:08:23.591 05:36:38 -- common/autotest_common.sh@722 -- # xtrace_disable 00:08:23.591 05:36:38 -- common/autotest_common.sh@10 -- # set +x 00:08:23.591 05:36:38 -- spdk/autotest.sh@164 -- # [[ 0 -eq 1 ]] 00:08:23.591 05:36:38 -- spdk/autotest.sh@168 -- # run_test env /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/env.sh 00:08:23.591 05:36:38 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:08:23.591 05:36:38 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:23.591 05:36:38 -- common/autotest_common.sh@10 -- # set +x 00:08:23.591 ************************************ 00:08:23.591 START TEST env 00:08:23.591 ************************************ 00:08:23.591 05:36:38 env -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/env.sh 00:08:23.850 * Looking for test storage... 00:08:23.850 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env 00:08:23.850 05:36:38 env -- env/env.sh@10 -- # run_test env_memory /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/memory/memory_ut 00:08:23.850 05:36:38 env -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:08:23.850 05:36:38 env -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:23.850 05:36:38 env -- common/autotest_common.sh@10 -- # set +x 00:08:23.850 ************************************ 00:08:23.850 START TEST env_memory 00:08:23.850 ************************************ 00:08:23.850 05:36:38 env.env_memory -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/memory/memory_ut 00:08:23.850 00:08:23.850 00:08:23.850 CUnit - A unit testing framework for C - Version 2.1-3 00:08:23.850 http://cunit.sourceforge.net/ 00:08:23.850 00:08:23.850 00:08:23.850 Suite: memory 00:08:23.850 Test: alloc and free memory map ...[2024-07-26 05:36:38.595585] 
/var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 283:spdk_mem_map_alloc: *ERROR*: Initial mem_map notify failed 00:08:23.850 passed 00:08:23.850 Test: mem map translation ...[2024-07-26 05:36:38.615556] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 590:spdk_mem_map_set_translation: *ERROR*: invalid spdk_mem_map_set_translation parameters, vaddr=2097152 len=1234 00:08:23.850 [2024-07-26 05:36:38.615575] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 590:spdk_mem_map_set_translation: *ERROR*: invalid spdk_mem_map_set_translation parameters, vaddr=1234 len=2097152 00:08:23.850 [2024-07-26 05:36:38.615613] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 584:spdk_mem_map_set_translation: *ERROR*: invalid usermode virtual address 281474976710656 00:08:23.850 [2024-07-26 05:36:38.615623] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 600:spdk_mem_map_set_translation: *ERROR*: could not get 0xffffffe00000 map 00:08:23.850 passed 00:08:23.850 Test: mem map registration ...[2024-07-26 05:36:38.653945] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 346:spdk_mem_register: *ERROR*: invalid spdk_mem_register parameters, vaddr=0x200000 len=1234 00:08:23.850 [2024-07-26 05:36:38.653962] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 346:spdk_mem_register: *ERROR*: invalid spdk_mem_register parameters, vaddr=0x4d2 len=2097152 00:08:23.850 passed 00:08:23.850 Test: mem map adjacent registrations ...passed 00:08:23.850 00:08:23.850 Run Summary: Type Total Ran Passed Failed Inactive 00:08:23.850 suites 1 1 n/a 0 0 00:08:23.850 tests 4 4 4 0 0 00:08:23.850 asserts 152 152 152 0 n/a 00:08:23.850 00:08:23.850 Elapsed time = 0.130 seconds 00:08:23.850 00:08:23.850 real 0m0.137s 00:08:23.850 user 0m0.130s 00:08:23.850 sys 0m0.007s 00:08:23.850 05:36:38 env.env_memory -- common/autotest_common.sh@1124 -- # xtrace_disable 
00:08:23.850 05:36:38 env.env_memory -- common/autotest_common.sh@10 -- # set +x 00:08:23.850 ************************************ 00:08:23.850 END TEST env_memory 00:08:23.850 ************************************ 00:08:23.850 05:36:38 env -- common/autotest_common.sh@1142 -- # return 0 00:08:23.850 05:36:38 env -- env/env.sh@11 -- # run_test env_vtophys /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/vtophys/vtophys 00:08:23.850 05:36:38 env -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:08:23.850 05:36:38 env -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:23.850 05:36:38 env -- common/autotest_common.sh@10 -- # set +x 00:08:24.110 ************************************ 00:08:24.110 START TEST env_vtophys 00:08:24.110 ************************************ 00:08:24.110 05:36:38 env.env_vtophys -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/vtophys/vtophys 00:08:24.110 EAL: lib.eal log level changed from notice to debug 00:08:24.110 EAL: Detected lcore 0 as core 0 on socket 0 00:08:24.110 EAL: Detected lcore 1 as core 1 on socket 0 00:08:24.110 EAL: Detected lcore 2 as core 2 on socket 0 00:08:24.110 EAL: Detected lcore 3 as core 3 on socket 0 00:08:24.110 EAL: Detected lcore 4 as core 4 on socket 0 00:08:24.110 EAL: Detected lcore 5 as core 8 on socket 0 00:08:24.110 EAL: Detected lcore 6 as core 9 on socket 0 00:08:24.110 EAL: Detected lcore 7 as core 10 on socket 0 00:08:24.110 EAL: Detected lcore 8 as core 11 on socket 0 00:08:24.110 EAL: Detected lcore 9 as core 16 on socket 0 00:08:24.110 EAL: Detected lcore 10 as core 17 on socket 0 00:08:24.110 EAL: Detected lcore 11 as core 18 on socket 0 00:08:24.110 EAL: Detected lcore 12 as core 19 on socket 0 00:08:24.110 EAL: Detected lcore 13 as core 20 on socket 0 00:08:24.110 EAL: Detected lcore 14 as core 24 on socket 0 00:08:24.110 EAL: Detected lcore 15 as core 25 on socket 0 00:08:24.110 EAL: Detected lcore 16 as core 26 on socket 0 
00:08:24.110 EAL: Detected lcore 17 as core 27 on socket 0 00:08:24.110 EAL: Detected lcore 18 as core 0 on socket 1 00:08:24.110 EAL: Detected lcore 19 as core 1 on socket 1 00:08:24.110 EAL: Detected lcore 20 as core 2 on socket 1 00:08:24.110 EAL: Detected lcore 21 as core 3 on socket 1 00:08:24.110 EAL: Detected lcore 22 as core 4 on socket 1 00:08:24.110 EAL: Detected lcore 23 as core 8 on socket 1 00:08:24.110 EAL: Detected lcore 24 as core 9 on socket 1 00:08:24.110 EAL: Detected lcore 25 as core 10 on socket 1 00:08:24.110 EAL: Detected lcore 26 as core 11 on socket 1 00:08:24.110 EAL: Detected lcore 27 as core 16 on socket 1 00:08:24.110 EAL: Detected lcore 28 as core 17 on socket 1 00:08:24.110 EAL: Detected lcore 29 as core 18 on socket 1 00:08:24.110 EAL: Detected lcore 30 as core 19 on socket 1 00:08:24.110 EAL: Detected lcore 31 as core 20 on socket 1 00:08:24.110 EAL: Detected lcore 32 as core 24 on socket 1 00:08:24.110 EAL: Detected lcore 33 as core 25 on socket 1 00:08:24.110 EAL: Detected lcore 34 as core 26 on socket 1 00:08:24.110 EAL: Detected lcore 35 as core 27 on socket 1 00:08:24.110 EAL: Detected lcore 36 as core 0 on socket 0 00:08:24.110 EAL: Detected lcore 37 as core 1 on socket 0 00:08:24.110 EAL: Detected lcore 38 as core 2 on socket 0 00:08:24.110 EAL: Detected lcore 39 as core 3 on socket 0 00:08:24.110 EAL: Detected lcore 40 as core 4 on socket 0 00:08:24.111 EAL: Detected lcore 41 as core 8 on socket 0 00:08:24.111 EAL: Detected lcore 42 as core 9 on socket 0 00:08:24.111 EAL: Detected lcore 43 as core 10 on socket 0 00:08:24.111 EAL: Detected lcore 44 as core 11 on socket 0 00:08:24.111 EAL: Detected lcore 45 as core 16 on socket 0 00:08:24.111 EAL: Detected lcore 46 as core 17 on socket 0 00:08:24.111 EAL: Detected lcore 47 as core 18 on socket 0 00:08:24.111 EAL: Detected lcore 48 as core 19 on socket 0 00:08:24.111 EAL: Detected lcore 49 as core 20 on socket 0 00:08:24.111 EAL: Detected lcore 50 as core 24 on socket 0 
00:08:24.111 EAL: Detected lcore 51 as core 25 on socket 0 00:08:24.111 EAL: Detected lcore 52 as core 26 on socket 0 00:08:24.111 EAL: Detected lcore 53 as core 27 on socket 0 00:08:24.111 EAL: Detected lcore 54 as core 0 on socket 1 00:08:24.111 EAL: Detected lcore 55 as core 1 on socket 1 00:08:24.111 EAL: Detected lcore 56 as core 2 on socket 1 00:08:24.111 EAL: Detected lcore 57 as core 3 on socket 1 00:08:24.111 EAL: Detected lcore 58 as core 4 on socket 1 00:08:24.111 EAL: Detected lcore 59 as core 8 on socket 1 00:08:24.111 EAL: Detected lcore 60 as core 9 on socket 1 00:08:24.111 EAL: Detected lcore 61 as core 10 on socket 1 00:08:24.111 EAL: Detected lcore 62 as core 11 on socket 1 00:08:24.111 EAL: Detected lcore 63 as core 16 on socket 1 00:08:24.111 EAL: Detected lcore 64 as core 17 on socket 1 00:08:24.111 EAL: Detected lcore 65 as core 18 on socket 1 00:08:24.111 EAL: Detected lcore 66 as core 19 on socket 1 00:08:24.111 EAL: Detected lcore 67 as core 20 on socket 1 00:08:24.111 EAL: Detected lcore 68 as core 24 on socket 1 00:08:24.111 EAL: Detected lcore 69 as core 25 on socket 1 00:08:24.111 EAL: Detected lcore 70 as core 26 on socket 1 00:08:24.111 EAL: Detected lcore 71 as core 27 on socket 1 00:08:24.111 EAL: Maximum logical cores by configuration: 128 00:08:24.111 EAL: Detected CPU lcores: 72 00:08:24.111 EAL: Detected NUMA nodes: 2 00:08:24.111 EAL: Checking presence of .so 'librte_eal.so.24.1' 00:08:24.111 EAL: Detected shared linkage of DPDK 00:08:24.111 EAL: No shared files mode enabled, IPC will be disabled 00:08:24.111 EAL: No shared files mode enabled, IPC is disabled 00:08:24.111 EAL: PCI driver qat for device 0000:3d:01.0 wants IOVA as 'PA' 00:08:24.111 EAL: PCI driver qat for device 0000:3d:01.1 wants IOVA as 'PA' 00:08:24.111 EAL: PCI driver qat for device 0000:3d:01.2 wants IOVA as 'PA' 00:08:24.111 EAL: PCI driver qat for device 0000:3d:01.3 wants IOVA as 'PA' 00:08:24.111 EAL: PCI driver qat for device 0000:3d:01.4 wants IOVA as 
'PA' 00:08:24.111 EAL: PCI driver qat for device 0000:3d:01.5 wants IOVA as 'PA' 00:08:24.111 EAL: PCI driver qat for device 0000:3d:01.6 wants IOVA as 'PA' 00:08:24.111 EAL: PCI driver qat for device 0000:3d:01.7 wants IOVA as 'PA' 00:08:24.111 EAL: PCI driver qat for device 0000:3d:02.0 wants IOVA as 'PA' 00:08:24.111 EAL: PCI driver qat for device 0000:3d:02.1 wants IOVA as 'PA' 00:08:24.111 EAL: PCI driver qat for device 0000:3d:02.2 wants IOVA as 'PA' 00:08:24.111 EAL: PCI driver qat for device 0000:3d:02.3 wants IOVA as 'PA' 00:08:24.111 EAL: PCI driver qat for device 0000:3d:02.4 wants IOVA as 'PA' 00:08:24.111 EAL: PCI driver qat for device 0000:3d:02.5 wants IOVA as 'PA' 00:08:24.111 EAL: PCI driver qat for device 0000:3d:02.6 wants IOVA as 'PA' 00:08:24.111 EAL: PCI driver qat for device 0000:3d:02.7 wants IOVA as 'PA' 00:08:24.111 EAL: PCI driver qat for device 0000:3f:01.0 wants IOVA as 'PA' 00:08:24.111 EAL: PCI driver qat for device 0000:3f:01.1 wants IOVA as 'PA' 00:08:24.111 EAL: PCI driver qat for device 0000:3f:01.2 wants IOVA as 'PA' 00:08:24.111 EAL: PCI driver qat for device 0000:3f:01.3 wants IOVA as 'PA' 00:08:24.111 EAL: PCI driver qat for device 0000:3f:01.4 wants IOVA as 'PA' 00:08:24.111 EAL: PCI driver qat for device 0000:3f:01.5 wants IOVA as 'PA' 00:08:24.111 EAL: PCI driver qat for device 0000:3f:01.6 wants IOVA as 'PA' 00:08:24.111 EAL: PCI driver qat for device 0000:3f:01.7 wants IOVA as 'PA' 00:08:24.111 EAL: PCI driver qat for device 0000:3f:02.0 wants IOVA as 'PA' 00:08:24.111 EAL: PCI driver qat for device 0000:3f:02.1 wants IOVA as 'PA' 00:08:24.111 EAL: PCI driver qat for device 0000:3f:02.2 wants IOVA as 'PA' 00:08:24.111 EAL: PCI driver qat for device 0000:3f:02.3 wants IOVA as 'PA' 00:08:24.111 EAL: PCI driver qat for device 0000:3f:02.4 wants IOVA as 'PA' 00:08:24.111 EAL: PCI driver qat for device 0000:3f:02.5 wants IOVA as 'PA' 00:08:24.111 EAL: PCI driver qat for device 0000:3f:02.6 wants IOVA as 'PA' 00:08:24.111 EAL: 
PCI driver qat for device 0000:3f:02.7 wants IOVA as 'PA' 00:08:24.111 EAL: PCI driver qat for device 0000:da:01.0 wants IOVA as 'PA' 00:08:24.111 EAL: PCI driver qat for device 0000:da:01.1 wants IOVA as 'PA' 00:08:24.111 EAL: PCI driver qat for device 0000:da:01.2 wants IOVA as 'PA' 00:08:24.111 EAL: PCI driver qat for device 0000:da:01.3 wants IOVA as 'PA' 00:08:24.111 EAL: PCI driver qat for device 0000:da:01.4 wants IOVA as 'PA' 00:08:24.111 EAL: PCI driver qat for device 0000:da:01.5 wants IOVA as 'PA' 00:08:24.111 EAL: PCI driver qat for device 0000:da:01.6 wants IOVA as 'PA' 00:08:24.111 EAL: PCI driver qat for device 0000:da:01.7 wants IOVA as 'PA' 00:08:24.111 EAL: PCI driver qat for device 0000:da:02.0 wants IOVA as 'PA' 00:08:24.111 EAL: PCI driver qat for device 0000:da:02.1 wants IOVA as 'PA' 00:08:24.111 EAL: PCI driver qat for device 0000:da:02.2 wants IOVA as 'PA' 00:08:24.111 EAL: PCI driver qat for device 0000:da:02.3 wants IOVA as 'PA' 00:08:24.111 EAL: PCI driver qat for device 0000:da:02.4 wants IOVA as 'PA' 00:08:24.111 EAL: PCI driver qat for device 0000:da:02.5 wants IOVA as 'PA' 00:08:24.111 EAL: PCI driver qat for device 0000:da:02.6 wants IOVA as 'PA' 00:08:24.111 EAL: PCI driver qat for device 0000:da:02.7 wants IOVA as 'PA' 00:08:24.111 EAL: Bus pci wants IOVA as 'PA' 00:08:24.111 EAL: Bus auxiliary wants IOVA as 'DC' 00:08:24.111 EAL: Bus vdev wants IOVA as 'DC' 00:08:24.111 EAL: Selected IOVA mode 'PA' 00:08:24.111 EAL: Probing VFIO support... 00:08:24.111 EAL: IOMMU type 1 (Type 1) is supported 00:08:24.111 EAL: IOMMU type 7 (sPAPR) is not supported 00:08:24.111 EAL: IOMMU type 8 (No-IOMMU) is not supported 00:08:24.111 EAL: VFIO support initialized 00:08:24.111 EAL: Ask a virtual area of 0x2e000 bytes 00:08:24.111 EAL: Virtual area found at 0x200000000000 (size = 0x2e000) 00:08:24.111 EAL: Setting up physically contiguous memory... 
00:08:24.111 EAL: Setting maximum number of open files to 524288 00:08:24.111 EAL: Detected memory type: socket_id:0 hugepage_sz:2097152 00:08:24.111 EAL: Detected memory type: socket_id:1 hugepage_sz:2097152 00:08:24.111 EAL: Creating 4 segment lists: n_segs:8192 socket_id:0 hugepage_sz:2097152 00:08:24.111 EAL: Ask a virtual area of 0x61000 bytes 00:08:24.111 EAL: Virtual area found at 0x20000002e000 (size = 0x61000) 00:08:24.111 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:08:24.111 EAL: Ask a virtual area of 0x400000000 bytes 00:08:24.111 EAL: Virtual area found at 0x200000200000 (size = 0x400000000) 00:08:24.111 EAL: VA reserved for memseg list at 0x200000200000, size 400000000 00:08:24.111 EAL: Ask a virtual area of 0x61000 bytes 00:08:24.111 EAL: Virtual area found at 0x200400200000 (size = 0x61000) 00:08:24.111 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:08:24.111 EAL: Ask a virtual area of 0x400000000 bytes 00:08:24.111 EAL: Virtual area found at 0x200400400000 (size = 0x400000000) 00:08:24.111 EAL: VA reserved for memseg list at 0x200400400000, size 400000000 00:08:24.111 EAL: Ask a virtual area of 0x61000 bytes 00:08:24.111 EAL: Virtual area found at 0x200800400000 (size = 0x61000) 00:08:24.111 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:08:24.111 EAL: Ask a virtual area of 0x400000000 bytes 00:08:24.111 EAL: Virtual area found at 0x200800600000 (size = 0x400000000) 00:08:24.111 EAL: VA reserved for memseg list at 0x200800600000, size 400000000 00:08:24.111 EAL: Ask a virtual area of 0x61000 bytes 00:08:24.111 EAL: Virtual area found at 0x200c00600000 (size = 0x61000) 00:08:24.111 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:08:24.111 EAL: Ask a virtual area of 0x400000000 bytes 00:08:24.111 EAL: Virtual area found at 0x200c00800000 (size = 0x400000000) 00:08:24.111 EAL: VA reserved for memseg list at 0x200c00800000, size 400000000 00:08:24.111 EAL: Creating 4 segment lists: n_segs:8192 
socket_id:1 hugepage_sz:2097152 00:08:24.111 EAL: Ask a virtual area of 0x61000 bytes 00:08:24.111 EAL: Virtual area found at 0x201000800000 (size = 0x61000) 00:08:24.111 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:08:24.111 EAL: Ask a virtual area of 0x400000000 bytes 00:08:24.111 EAL: Virtual area found at 0x201000a00000 (size = 0x400000000) 00:08:24.111 EAL: VA reserved for memseg list at 0x201000a00000, size 400000000 00:08:24.111 EAL: Ask a virtual area of 0x61000 bytes 00:08:24.112 EAL: Virtual area found at 0x201400a00000 (size = 0x61000) 00:08:24.112 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:08:24.112 EAL: Ask a virtual area of 0x400000000 bytes 00:08:24.112 EAL: Virtual area found at 0x201400c00000 (size = 0x400000000) 00:08:24.112 EAL: VA reserved for memseg list at 0x201400c00000, size 400000000 00:08:24.112 EAL: Ask a virtual area of 0x61000 bytes 00:08:24.112 EAL: Virtual area found at 0x201800c00000 (size = 0x61000) 00:08:24.112 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:08:24.112 EAL: Ask a virtual area of 0x400000000 bytes 00:08:24.112 EAL: Virtual area found at 0x201800e00000 (size = 0x400000000) 00:08:24.112 EAL: VA reserved for memseg list at 0x201800e00000, size 400000000 00:08:24.112 EAL: Ask a virtual area of 0x61000 bytes 00:08:24.112 EAL: Virtual area found at 0x201c00e00000 (size = 0x61000) 00:08:24.112 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:08:24.112 EAL: Ask a virtual area of 0x400000000 bytes 00:08:24.112 EAL: Virtual area found at 0x201c01000000 (size = 0x400000000) 00:08:24.112 EAL: VA reserved for memseg list at 0x201c01000000, size 400000000 00:08:24.112 EAL: Hugepages will be freed exactly as allocated. 
00:08:24.112 EAL: No shared files mode enabled, IPC is disabled 00:08:24.112 EAL: No shared files mode enabled, IPC is disabled 00:08:24.112 EAL: TSC frequency is ~2300000 KHz 00:08:24.112 EAL: Main lcore 0 is ready (tid=7f0746e20b00;cpuset=[0]) 00:08:24.112 EAL: Trying to obtain current memory policy. 00:08:24.112 EAL: Setting policy MPOL_PREFERRED for socket 0 00:08:24.112 EAL: Restoring previous memory policy: 0 00:08:24.112 EAL: request: mp_malloc_sync 00:08:24.112 EAL: No shared files mode enabled, IPC is disabled 00:08:24.112 EAL: Heap on socket 0 was expanded by 2MB 00:08:24.112 EAL: PCI device 0000:3d:01.0 on NUMA socket 0 00:08:24.112 EAL: probe driver: 8086:37c9 qat 00:08:24.112 EAL: PCI memory mapped at 0x202001000000 00:08:24.112 EAL: PCI memory mapped at 0x202001001000 00:08:24.112 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.0 (socket 0) 00:08:24.112 EAL: PCI device 0000:3d:01.1 on NUMA socket 0 00:08:24.112 EAL: probe driver: 8086:37c9 qat 00:08:24.112 EAL: PCI memory mapped at 0x202001002000 00:08:24.112 EAL: PCI memory mapped at 0x202001003000 00:08:24.112 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.1 (socket 0) 00:08:24.112 EAL: PCI device 0000:3d:01.2 on NUMA socket 0 00:08:24.112 EAL: probe driver: 8086:37c9 qat 00:08:24.112 EAL: PCI memory mapped at 0x202001004000 00:08:24.112 EAL: PCI memory mapped at 0x202001005000 00:08:24.112 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.2 (socket 0) 00:08:24.112 EAL: PCI device 0000:3d:01.3 on NUMA socket 0 00:08:24.112 EAL: probe driver: 8086:37c9 qat 00:08:24.112 EAL: PCI memory mapped at 0x202001006000 00:08:24.112 EAL: PCI memory mapped at 0x202001007000 00:08:24.112 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.3 (socket 0) 00:08:24.112 EAL: PCI device 0000:3d:01.4 on NUMA socket 0 00:08:24.112 EAL: probe driver: 8086:37c9 qat 00:08:24.112 EAL: PCI memory mapped at 0x202001008000 00:08:24.112 EAL: PCI memory mapped at 0x202001009000 00:08:24.112 EAL: 
Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.4 (socket 0) 00:08:24.112 EAL: PCI device 0000:3d:01.5 on NUMA socket 0 00:08:24.112 EAL: probe driver: 8086:37c9 qat 00:08:24.112 EAL: PCI memory mapped at 0x20200100a000 00:08:24.112 EAL: PCI memory mapped at 0x20200100b000 00:08:24.112 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.5 (socket 0) 00:08:24.112 EAL: PCI device 0000:3d:01.6 on NUMA socket 0 00:08:24.112 EAL: probe driver: 8086:37c9 qat 00:08:24.112 EAL: PCI memory mapped at 0x20200100c000 00:08:24.112 EAL: PCI memory mapped at 0x20200100d000 00:08:24.112 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.6 (socket 0) 00:08:24.112 EAL: PCI device 0000:3d:01.7 on NUMA socket 0 00:08:24.112 EAL: probe driver: 8086:37c9 qat 00:08:24.112 EAL: PCI memory mapped at 0x20200100e000 00:08:24.112 EAL: PCI memory mapped at 0x20200100f000 00:08:24.112 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.7 (socket 0) 00:08:24.112 EAL: PCI device 0000:3d:02.0 on NUMA socket 0 00:08:24.112 EAL: probe driver: 8086:37c9 qat 00:08:24.112 EAL: PCI memory mapped at 0x202001010000 00:08:24.112 EAL: PCI memory mapped at 0x202001011000 00:08:24.112 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.0 (socket 0) 00:08:24.112 EAL: PCI device 0000:3d:02.1 on NUMA socket 0 00:08:24.112 EAL: probe driver: 8086:37c9 qat 00:08:24.112 EAL: PCI memory mapped at 0x202001012000 00:08:24.112 EAL: PCI memory mapped at 0x202001013000 00:08:24.112 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.1 (socket 0) 00:08:24.112 EAL: PCI device 0000:3d:02.2 on NUMA socket 0 00:08:24.112 EAL: probe driver: 8086:37c9 qat 00:08:24.112 EAL: PCI memory mapped at 0x202001014000 00:08:24.112 EAL: PCI memory mapped at 0x202001015000 00:08:24.112 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.2 (socket 0) 00:08:24.112 EAL: PCI device 0000:3d:02.3 on NUMA socket 0 00:08:24.112 EAL: probe driver: 8086:37c9 qat 00:08:24.112 EAL: PCI memory mapped at 
0x202001016000 00:08:24.112 EAL: PCI memory mapped at 0x202001017000 00:08:24.112 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.3 (socket 0) 00:08:24.112 EAL: PCI device 0000:3d:02.4 on NUMA socket 0 00:08:24.112 EAL: probe driver: 8086:37c9 qat 00:08:24.112 EAL: PCI memory mapped at 0x202001018000 00:08:24.112 EAL: PCI memory mapped at 0x202001019000 00:08:24.112 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.4 (socket 0) 00:08:24.112 EAL: PCI device 0000:3d:02.5 on NUMA socket 0 00:08:24.112 EAL: probe driver: 8086:37c9 qat 00:08:24.112 EAL: PCI memory mapped at 0x20200101a000 00:08:24.112 EAL: PCI memory mapped at 0x20200101b000 00:08:24.112 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.5 (socket 0) 00:08:24.112 EAL: PCI device 0000:3d:02.6 on NUMA socket 0 00:08:24.112 EAL: probe driver: 8086:37c9 qat 00:08:24.112 EAL: PCI memory mapped at 0x20200101c000 00:08:24.112 EAL: PCI memory mapped at 0x20200101d000 00:08:24.112 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.6 (socket 0) 00:08:24.112 EAL: PCI device 0000:3d:02.7 on NUMA socket 0 00:08:24.112 EAL: probe driver: 8086:37c9 qat 00:08:24.112 EAL: PCI memory mapped at 0x20200101e000 00:08:24.112 EAL: PCI memory mapped at 0x20200101f000 00:08:24.112 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.7 (socket 0) 00:08:24.112 EAL: PCI device 0000:3f:01.0 on NUMA socket 0 00:08:24.112 EAL: probe driver: 8086:37c9 qat 00:08:24.112 EAL: PCI memory mapped at 0x202001020000 00:08:24.112 EAL: PCI memory mapped at 0x202001021000 00:08:24.112 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.0 (socket 0) 00:08:24.112 EAL: PCI device 0000:3f:01.1 on NUMA socket 0 00:08:24.112 EAL: probe driver: 8086:37c9 qat 00:08:24.112 EAL: PCI memory mapped at 0x202001022000 00:08:24.112 EAL: PCI memory mapped at 0x202001023000 00:08:24.112 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.1 (socket 0) 00:08:24.112 EAL: PCI device 0000:3f:01.2 on NUMA socket 0 
00:08:24.112 EAL: probe driver: 8086:37c9 qat 00:08:24.112 EAL: PCI memory mapped at 0x202001024000 00:08:24.112 EAL: PCI memory mapped at 0x202001025000 00:08:24.112 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.2 (socket 0) 00:08:24.112 EAL: PCI device 0000:3f:01.3 on NUMA socket 0 00:08:24.112 EAL: probe driver: 8086:37c9 qat 00:08:24.112 EAL: PCI memory mapped at 0x202001026000 00:08:24.112 EAL: PCI memory mapped at 0x202001027000 00:08:24.112 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.3 (socket 0) 00:08:24.112 EAL: PCI device 0000:3f:01.4 on NUMA socket 0 00:08:24.112 EAL: probe driver: 8086:37c9 qat 00:08:24.112 EAL: PCI memory mapped at 0x202001028000 00:08:24.112 EAL: PCI memory mapped at 0x202001029000 00:08:24.112 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.4 (socket 0) 00:08:24.112 EAL: PCI device 0000:3f:01.5 on NUMA socket 0 00:08:24.112 EAL: probe driver: 8086:37c9 qat 00:08:24.112 EAL: PCI memory mapped at 0x20200102a000 00:08:24.112 EAL: PCI memory mapped at 0x20200102b000 00:08:24.112 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.5 (socket 0) 00:08:24.112 EAL: PCI device 0000:3f:01.6 on NUMA socket 0 00:08:24.112 EAL: probe driver: 8086:37c9 qat 00:08:24.112 EAL: PCI memory mapped at 0x20200102c000 00:08:24.112 EAL: PCI memory mapped at 0x20200102d000 00:08:24.112 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.6 (socket 0) 00:08:24.112 EAL: PCI device 0000:3f:01.7 on NUMA socket 0 00:08:24.112 EAL: probe driver: 8086:37c9 qat 00:08:24.112 EAL: PCI memory mapped at 0x20200102e000 00:08:24.112 EAL: PCI memory mapped at 0x20200102f000 00:08:24.112 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.7 (socket 0) 00:08:24.112 EAL: PCI device 0000:3f:02.0 on NUMA socket 0 00:08:24.112 EAL: probe driver: 8086:37c9 qat 00:08:24.112 EAL: PCI memory mapped at 0x202001030000 00:08:24.112 EAL: PCI memory mapped at 0x202001031000 00:08:24.112 EAL: Probe PCI driver: qat (8086:37c9) device: 
0000:3f:02.0 (socket 0) 00:08:24.112 EAL: PCI device 0000:3f:02.1 on NUMA socket 0 00:08:24.112 EAL: probe driver: 8086:37c9 qat 00:08:24.112 EAL: PCI memory mapped at 0x202001032000 00:08:24.112 EAL: PCI memory mapped at 0x202001033000 00:08:24.112 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.1 (socket 0) 00:08:24.112 EAL: PCI device 0000:3f:02.2 on NUMA socket 0 00:08:24.112 EAL: probe driver: 8086:37c9 qat 00:08:24.113 EAL: PCI memory mapped at 0x202001034000 00:08:24.113 EAL: PCI memory mapped at 0x202001035000 00:08:24.113 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.2 (socket 0) 00:08:24.113 EAL: PCI device 0000:3f:02.3 on NUMA socket 0 00:08:24.113 EAL: probe driver: 8086:37c9 qat 00:08:24.113 EAL: PCI memory mapped at 0x202001036000 00:08:24.113 EAL: PCI memory mapped at 0x202001037000 00:08:24.113 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.3 (socket 0) 00:08:24.113 EAL: PCI device 0000:3f:02.4 on NUMA socket 0 00:08:24.113 EAL: probe driver: 8086:37c9 qat 00:08:24.113 EAL: PCI memory mapped at 0x202001038000 00:08:24.113 EAL: PCI memory mapped at 0x202001039000 00:08:24.113 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.4 (socket 0) 00:08:24.113 EAL: PCI device 0000:3f:02.5 on NUMA socket 0 00:08:24.113 EAL: probe driver: 8086:37c9 qat 00:08:24.113 EAL: PCI memory mapped at 0x20200103a000 00:08:24.113 EAL: PCI memory mapped at 0x20200103b000 00:08:24.113 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.5 (socket 0) 00:08:24.113 EAL: PCI device 0000:3f:02.6 on NUMA socket 0 00:08:24.113 EAL: probe driver: 8086:37c9 qat 00:08:24.113 EAL: PCI memory mapped at 0x20200103c000 00:08:24.113 EAL: PCI memory mapped at 0x20200103d000 00:08:24.113 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.6 (socket 0) 00:08:24.113 EAL: PCI device 0000:3f:02.7 on NUMA socket 0 00:08:24.113 EAL: probe driver: 8086:37c9 qat 00:08:24.113 EAL: PCI memory mapped at 0x20200103e000 00:08:24.113 EAL: PCI memory 
mapped at 0x20200103f000 00:08:24.113 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.7 (socket 0) 00:08:24.113 EAL: PCI device 0000:da:01.0 on NUMA socket 1 00:08:24.113 EAL: probe driver: 8086:37c9 qat 00:08:24.113 EAL: PCI memory mapped at 0x202001040000 00:08:24.113 EAL: PCI memory mapped at 0x202001041000 00:08:24.113 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.0 (socket 1) 00:08:24.113 EAL: Trying to obtain current memory policy. 00:08:24.113 EAL: Setting policy MPOL_PREFERRED for socket 1 00:08:24.113 EAL: Restoring previous memory policy: 4 00:08:24.113 EAL: request: mp_malloc_sync 00:08:24.113 EAL: No shared files mode enabled, IPC is disabled 00:08:24.113 EAL: Heap on socket 1 was expanded by 2MB 00:08:24.113 EAL: PCI device 0000:da:01.1 on NUMA socket 1 00:08:24.113 EAL: probe driver: 8086:37c9 qat 00:08:24.113 EAL: PCI memory mapped at 0x202001042000 00:08:24.113 EAL: PCI memory mapped at 0x202001043000 00:08:24.113 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.1 (socket 1) 00:08:24.113 EAL: PCI device 0000:da:01.2 on NUMA socket 1 00:08:24.113 EAL: probe driver: 8086:37c9 qat 00:08:24.113 EAL: PCI memory mapped at 0x202001044000 00:08:24.113 EAL: PCI memory mapped at 0x202001045000 00:08:24.113 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.2 (socket 1) 00:08:24.113 EAL: PCI device 0000:da:01.3 on NUMA socket 1 00:08:24.113 EAL: probe driver: 8086:37c9 qat 00:08:24.113 EAL: PCI memory mapped at 0x202001046000 00:08:24.113 EAL: PCI memory mapped at 0x202001047000 00:08:24.113 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.3 (socket 1) 00:08:24.113 EAL: PCI device 0000:da:01.4 on NUMA socket 1 00:08:24.113 EAL: probe driver: 8086:37c9 qat 00:08:24.113 EAL: PCI memory mapped at 0x202001048000 00:08:24.113 EAL: PCI memory mapped at 0x202001049000 00:08:24.113 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.4 (socket 1) 00:08:24.113 EAL: PCI device 0000:da:01.5 on NUMA socket 1 00:08:24.113 
EAL: probe driver: 8086:37c9 qat 00:08:24.113 EAL: PCI memory mapped at 0x20200104a000 00:08:24.113 EAL: PCI memory mapped at 0x20200104b000 00:08:24.113 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.5 (socket 1) 00:08:24.113 EAL: PCI device 0000:da:01.6 on NUMA socket 1 00:08:24.113 EAL: probe driver: 8086:37c9 qat 00:08:24.113 EAL: PCI memory mapped at 0x20200104c000 00:08:24.113 EAL: PCI memory mapped at 0x20200104d000 00:08:24.113 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.6 (socket 1) 00:08:24.113 EAL: PCI device 0000:da:01.7 on NUMA socket 1 00:08:24.113 EAL: probe driver: 8086:37c9 qat 00:08:24.113 EAL: PCI memory mapped at 0x20200104e000 00:08:24.113 EAL: PCI memory mapped at 0x20200104f000 00:08:24.113 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.7 (socket 1) 00:08:24.113 EAL: PCI device 0000:da:02.0 on NUMA socket 1 00:08:24.113 EAL: probe driver: 8086:37c9 qat 00:08:24.113 EAL: PCI memory mapped at 0x202001050000 00:08:24.113 EAL: PCI memory mapped at 0x202001051000 00:08:24.113 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.0 (socket 1) 00:08:24.113 EAL: PCI device 0000:da:02.1 on NUMA socket 1 00:08:24.113 EAL: probe driver: 8086:37c9 qat 00:08:24.113 EAL: PCI memory mapped at 0x202001052000 00:08:24.113 EAL: PCI memory mapped at 0x202001053000 00:08:24.113 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.1 (socket 1) 00:08:24.113 EAL: PCI device 0000:da:02.2 on NUMA socket 1 00:08:24.113 EAL: probe driver: 8086:37c9 qat 00:08:24.113 EAL: PCI memory mapped at 0x202001054000 00:08:24.113 EAL: PCI memory mapped at 0x202001055000 00:08:24.113 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.2 (socket 1) 00:08:24.113 EAL: PCI device 0000:da:02.3 on NUMA socket 1 00:08:24.113 EAL: probe driver: 8086:37c9 qat 00:08:24.113 EAL: PCI memory mapped at 0x202001056000 00:08:24.113 EAL: PCI memory mapped at 0x202001057000 00:08:24.113 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.3 
(socket 1) 00:08:24.113 EAL: PCI device 0000:da:02.4 on NUMA socket 1 00:08:24.113 EAL: probe driver: 8086:37c9 qat 00:08:24.113 EAL: PCI memory mapped at 0x202001058000 00:08:24.113 EAL: PCI memory mapped at 0x202001059000 00:08:24.113 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.4 (socket 1) 00:08:24.113 EAL: PCI device 0000:da:02.5 on NUMA socket 1 00:08:24.113 EAL: probe driver: 8086:37c9 qat 00:08:24.113 EAL: PCI memory mapped at 0x20200105a000 00:08:24.113 EAL: PCI memory mapped at 0x20200105b000 00:08:24.113 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.5 (socket 1) 00:08:24.113 EAL: PCI device 0000:da:02.6 on NUMA socket 1 00:08:24.113 EAL: probe driver: 8086:37c9 qat 00:08:24.113 EAL: PCI memory mapped at 0x20200105c000 00:08:24.113 EAL: PCI memory mapped at 0x20200105d000 00:08:24.113 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.6 (socket 1) 00:08:24.113 EAL: PCI device 0000:da:02.7 on NUMA socket 1 00:08:24.113 EAL: probe driver: 8086:37c9 qat 00:08:24.113 EAL: PCI memory mapped at 0x20200105e000 00:08:24.113 EAL: PCI memory mapped at 0x20200105f000 00:08:24.113 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.7 (socket 1) 00:08:24.113 EAL: No shared files mode enabled, IPC is disabled 00:08:24.113 EAL: No shared files mode enabled, IPC is disabled 00:08:24.113 EAL: No PCI address specified using 'addr=' in: bus=pci 00:08:24.113 EAL: Mem event callback 'spdk:(nil)' registered 00:08:24.113 00:08:24.113 00:08:24.113 CUnit - A unit testing framework for C - Version 2.1-3 00:08:24.113 http://cunit.sourceforge.net/ 00:08:24.113 00:08:24.113 00:08:24.113 Suite: components_suite 00:08:24.113 Test: vtophys_malloc_test ...passed 00:08:24.113 Test: vtophys_spdk_malloc_test ...EAL: Trying to obtain current memory policy. 
00:08:24.113 EAL: Setting policy MPOL_PREFERRED for socket 0 00:08:24.113 EAL: Restoring previous memory policy: 4 00:08:24.113 EAL: Calling mem event callback 'spdk:(nil)' 00:08:24.113 EAL: request: mp_malloc_sync 00:08:24.113 EAL: No shared files mode enabled, IPC is disabled 00:08:24.113 EAL: Heap on socket 0 was expanded by 4MB 00:08:24.113 EAL: Calling mem event callback 'spdk:(nil)' 00:08:24.113 EAL: request: mp_malloc_sync 00:08:24.113 EAL: No shared files mode enabled, IPC is disabled 00:08:24.113 EAL: Heap on socket 0 was shrunk by 4MB 00:08:24.113 EAL: Trying to obtain current memory policy. 00:08:24.113 EAL: Setting policy MPOL_PREFERRED for socket 0 00:08:24.113 EAL: Restoring previous memory policy: 4 00:08:24.113 EAL: Calling mem event callback 'spdk:(nil)' 00:08:24.113 EAL: request: mp_malloc_sync 00:08:24.113 EAL: No shared files mode enabled, IPC is disabled 00:08:24.113 EAL: Heap on socket 0 was expanded by 6MB 00:08:24.113 EAL: Calling mem event callback 'spdk:(nil)' 00:08:24.113 EAL: request: mp_malloc_sync 00:08:24.113 EAL: No shared files mode enabled, IPC is disabled 00:08:24.113 EAL: Heap on socket 0 was shrunk by 6MB 00:08:24.113 EAL: Trying to obtain current memory policy. 00:08:24.113 EAL: Setting policy MPOL_PREFERRED for socket 0 00:08:24.113 EAL: Restoring previous memory policy: 4 00:08:24.113 EAL: Calling mem event callback 'spdk:(nil)' 00:08:24.113 EAL: request: mp_malloc_sync 00:08:24.113 EAL: No shared files mode enabled, IPC is disabled 00:08:24.113 EAL: Heap on socket 0 was expanded by 10MB 00:08:24.114 EAL: Calling mem event callback 'spdk:(nil)' 00:08:24.114 EAL: request: mp_malloc_sync 00:08:24.114 EAL: No shared files mode enabled, IPC is disabled 00:08:24.114 EAL: Heap on socket 0 was shrunk by 10MB 00:08:24.114 EAL: Trying to obtain current memory policy. 
00:08:24.114 EAL: Setting policy MPOL_PREFERRED for socket 0 00:08:24.114 EAL: Restoring previous memory policy: 4 00:08:24.114 EAL: Calling mem event callback 'spdk:(nil)' 00:08:24.114 EAL: request: mp_malloc_sync 00:08:24.114 EAL: No shared files mode enabled, IPC is disabled 00:08:24.114 EAL: Heap on socket 0 was expanded by 18MB 00:08:24.114 EAL: Calling mem event callback 'spdk:(nil)' 00:08:24.114 EAL: request: mp_malloc_sync 00:08:24.114 EAL: No shared files mode enabled, IPC is disabled 00:08:24.114 EAL: Heap on socket 0 was shrunk by 18MB 00:08:24.114 EAL: Trying to obtain current memory policy. 00:08:24.114 EAL: Setting policy MPOL_PREFERRED for socket 0 00:08:24.114 EAL: Restoring previous memory policy: 4 00:08:24.114 EAL: Calling mem event callback 'spdk:(nil)' 00:08:24.114 EAL: request: mp_malloc_sync 00:08:24.114 EAL: No shared files mode enabled, IPC is disabled 00:08:24.114 EAL: Heap on socket 0 was expanded by 34MB 00:08:24.114 EAL: Calling mem event callback 'spdk:(nil)' 00:08:24.114 EAL: request: mp_malloc_sync 00:08:24.114 EAL: No shared files mode enabled, IPC is disabled 00:08:24.114 EAL: Heap on socket 0 was shrunk by 34MB 00:08:24.114 EAL: Trying to obtain current memory policy. 00:08:24.114 EAL: Setting policy MPOL_PREFERRED for socket 0 00:08:24.114 EAL: Restoring previous memory policy: 4 00:08:24.114 EAL: Calling mem event callback 'spdk:(nil)' 00:08:24.114 EAL: request: mp_malloc_sync 00:08:24.114 EAL: No shared files mode enabled, IPC is disabled 00:08:24.114 EAL: Heap on socket 0 was expanded by 66MB 00:08:24.114 EAL: Calling mem event callback 'spdk:(nil)' 00:08:24.114 EAL: request: mp_malloc_sync 00:08:24.114 EAL: No shared files mode enabled, IPC is disabled 00:08:24.114 EAL: Heap on socket 0 was shrunk by 66MB 00:08:24.114 EAL: Trying to obtain current memory policy. 
00:08:24.114 EAL: Setting policy MPOL_PREFERRED for socket 0 00:08:24.373 EAL: Restoring previous memory policy: 4 00:08:24.373 EAL: Calling mem event callback 'spdk:(nil)' 00:08:24.373 EAL: request: mp_malloc_sync 00:08:24.373 EAL: No shared files mode enabled, IPC is disabled 00:08:24.373 EAL: Heap on socket 0 was expanded by 130MB 00:08:24.373 EAL: Calling mem event callback 'spdk:(nil)' 00:08:24.373 EAL: request: mp_malloc_sync 00:08:24.373 EAL: No shared files mode enabled, IPC is disabled 00:08:24.373 EAL: Heap on socket 0 was shrunk by 130MB 00:08:24.373 EAL: Trying to obtain current memory policy. 00:08:24.373 EAL: Setting policy MPOL_PREFERRED for socket 0 00:08:24.373 EAL: Restoring previous memory policy: 4 00:08:24.373 EAL: Calling mem event callback 'spdk:(nil)' 00:08:24.373 EAL: request: mp_malloc_sync 00:08:24.373 EAL: No shared files mode enabled, IPC is disabled 00:08:24.373 EAL: Heap on socket 0 was expanded by 258MB 00:08:24.373 EAL: Calling mem event callback 'spdk:(nil)' 00:08:24.373 EAL: request: mp_malloc_sync 00:08:24.373 EAL: No shared files mode enabled, IPC is disabled 00:08:24.373 EAL: Heap on socket 0 was shrunk by 258MB 00:08:24.373 EAL: Trying to obtain current memory policy. 00:08:24.373 EAL: Setting policy MPOL_PREFERRED for socket 0 00:08:24.631 EAL: Restoring previous memory policy: 4 00:08:24.631 EAL: Calling mem event callback 'spdk:(nil)' 00:08:24.631 EAL: request: mp_malloc_sync 00:08:24.631 EAL: No shared files mode enabled, IPC is disabled 00:08:24.631 EAL: Heap on socket 0 was expanded by 514MB 00:08:24.631 EAL: Calling mem event callback 'spdk:(nil)' 00:08:24.890 EAL: request: mp_malloc_sync 00:08:24.890 EAL: No shared files mode enabled, IPC is disabled 00:08:24.890 EAL: Heap on socket 0 was shrunk by 514MB 00:08:24.890 EAL: Trying to obtain current memory policy. 
00:08:24.890 EAL: Setting policy MPOL_PREFERRED for socket 0 00:08:24.890 EAL: Restoring previous memory policy: 4 00:08:24.890 EAL: Calling mem event callback 'spdk:(nil)' 00:08:24.890 EAL: request: mp_malloc_sync 00:08:24.890 EAL: No shared files mode enabled, IPC is disabled 00:08:24.890 EAL: Heap on socket 0 was expanded by 1026MB 00:08:25.148 EAL: Calling mem event callback 'spdk:(nil)' 00:08:25.407 EAL: request: mp_malloc_sync 00:08:25.407 EAL: No shared files mode enabled, IPC is disabled 00:08:25.407 EAL: Heap on socket 0 was shrunk by 1026MB 00:08:25.407 passed 00:08:25.407 00:08:25.407 Run Summary: Type Total Ran Passed Failed Inactive 00:08:25.407 suites 1 1 n/a 0 0 00:08:25.407 tests 2 2 2 0 0 00:08:25.407 asserts 6303 6303 6303 0 n/a 00:08:25.407 00:08:25.407 Elapsed time = 1.197 seconds 00:08:25.407 EAL: No shared files mode enabled, IPC is disabled 00:08:25.407 EAL: No shared files mode enabled, IPC is disabled 00:08:25.407 EAL: No shared files mode enabled, IPC is disabled 00:08:25.407 00:08:25.407 real 0m1.394s 00:08:25.407 user 0m0.784s 00:08:25.407 sys 0m0.580s 00:08:25.407 05:36:40 env.env_vtophys -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:25.407 05:36:40 env.env_vtophys -- common/autotest_common.sh@10 -- # set +x 00:08:25.407 ************************************ 00:08:25.407 END TEST env_vtophys 00:08:25.407 ************************************ 00:08:25.407 05:36:40 env -- common/autotest_common.sh@1142 -- # return 0 00:08:25.407 05:36:40 env -- env/env.sh@12 -- # run_test env_pci /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/pci/pci_ut 00:08:25.407 05:36:40 env -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:08:25.407 05:36:40 env -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:25.407 05:36:40 env -- common/autotest_common.sh@10 -- # set +x 00:08:25.407 ************************************ 00:08:25.407 START TEST env_pci 00:08:25.407 ************************************ 00:08:25.407 05:36:40 
env.env_pci -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/pci/pci_ut 00:08:25.407 00:08:25.407 00:08:25.407 CUnit - A unit testing framework for C - Version 2.1-3 00:08:25.407 http://cunit.sourceforge.net/ 00:08:25.407 00:08:25.407 00:08:25.407 Suite: pci 00:08:25.407 Test: pci_hook ...[2024-07-26 05:36:40.275544] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/pci.c:1040:spdk_pci_device_claim: *ERROR*: Cannot create lock on device /var/tmp/spdk_pci_lock_10000:00:01.0, probably process 1086594 has claimed it 00:08:25.407 EAL: Cannot find device (10000:00:01.0) 00:08:25.407 EAL: Failed to attach device on primary process 00:08:25.407 passed 00:08:25.407 00:08:25.407 Run Summary: Type Total Ran Passed Failed Inactive 00:08:25.407 suites 1 1 n/a 0 0 00:08:25.407 tests 1 1 1 0 0 00:08:25.407 asserts 25 25 25 0 n/a 00:08:25.407 00:08:25.407 Elapsed time = 0.035 seconds 00:08:25.667 00:08:25.667 real 0m0.063s 00:08:25.668 user 0m0.019s 00:08:25.668 sys 0m0.044s 00:08:25.668 05:36:40 env.env_pci -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:25.668 05:36:40 env.env_pci -- common/autotest_common.sh@10 -- # set +x 00:08:25.668 ************************************ 00:08:25.668 END TEST env_pci 00:08:25.668 ************************************ 00:08:25.668 05:36:40 env -- common/autotest_common.sh@1142 -- # return 0 00:08:25.668 05:36:40 env -- env/env.sh@14 -- # argv='-c 0x1 ' 00:08:25.668 05:36:40 env -- env/env.sh@15 -- # uname 00:08:25.668 05:36:40 env -- env/env.sh@15 -- # '[' Linux = Linux ']' 00:08:25.668 05:36:40 env -- env/env.sh@22 -- # argv+=--base-virtaddr=0x200000000000 00:08:25.668 05:36:40 env -- env/env.sh@24 -- # run_test env_dpdk_post_init /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/env_dpdk_post_init/env_dpdk_post_init -c 0x1 --base-virtaddr=0x200000000000 00:08:25.668 05:36:40 env -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:08:25.668 05:36:40 env -- 
common/autotest_common.sh@1105 -- # xtrace_disable 00:08:25.668 05:36:40 env -- common/autotest_common.sh@10 -- # set +x 00:08:25.668 ************************************ 00:08:25.668 START TEST env_dpdk_post_init 00:08:25.668 ************************************ 00:08:25.668 05:36:40 env.env_dpdk_post_init -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/env_dpdk_post_init/env_dpdk_post_init -c 0x1 --base-virtaddr=0x200000000000 00:08:25.668 EAL: Detected CPU lcores: 72 00:08:25.668 EAL: Detected NUMA nodes: 2 00:08:25.668 EAL: Detected shared linkage of DPDK 00:08:25.668 EAL: Multi-process socket /var/run/dpdk/rte/mp_socket 00:08:25.668 EAL: Selected IOVA mode 'PA' 00:08:25.668 EAL: VFIO support initialized 00:08:25.668 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.0 (socket 0) 00:08:25.668 CRYPTODEV: Creating cryptodev 0000:3d:01.0_qat_asym 00:08:25.668 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.0_qat_asym,socket id: 0, max queue pairs: 0 00:08:25.668 CRYPTODEV: Creating cryptodev 0000:3d:01.0_qat_sym 00:08:25.668 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.0_qat_sym,socket id: 0, max queue pairs: 0 00:08:25.668 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.1 (socket 0) 00:08:25.668 CRYPTODEV: Creating cryptodev 0000:3d:01.1_qat_asym 00:08:25.668 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.1_qat_asym,socket id: 0, max queue pairs: 0 00:08:25.668 CRYPTODEV: Creating cryptodev 0000:3d:01.1_qat_sym 00:08:25.668 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.1_qat_sym,socket id: 0, max queue pairs: 0 00:08:25.668 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.2 (socket 0) 00:08:25.668 CRYPTODEV: Creating cryptodev 0000:3d:01.2_qat_asym 00:08:25.668 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.2_qat_asym,socket id: 0, max queue pairs: 0 00:08:25.668 CRYPTODEV: Creating cryptodev 0000:3d:01.2_qat_sym 00:08:25.668 CRYPTODEV: 
Initialisation parameters - name: 0000:3d:01.2_qat_sym,socket id: 0, max queue pairs: 0 00:08:25.668 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.3 (socket 0) 00:08:25.668 CRYPTODEV: Creating cryptodev 0000:3d:01.3_qat_asym 00:08:25.668 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.3_qat_asym,socket id: 0, max queue pairs: 0 00:08:25.668 CRYPTODEV: Creating cryptodev 0000:3d:01.3_qat_sym 00:08:25.668 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.3_qat_sym,socket id: 0, max queue pairs: 0 00:08:25.668 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.4 (socket 0) 00:08:25.668 CRYPTODEV: Creating cryptodev 0000:3d:01.4_qat_asym 00:08:25.668 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.4_qat_asym,socket id: 0, max queue pairs: 0 00:08:25.668 CRYPTODEV: Creating cryptodev 0000:3d:01.4_qat_sym 00:08:25.668 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.4_qat_sym,socket id: 0, max queue pairs: 0 00:08:25.668 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.5 (socket 0) 00:08:25.668 CRYPTODEV: Creating cryptodev 0000:3d:01.5_qat_asym 00:08:25.668 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.5_qat_asym,socket id: 0, max queue pairs: 0 00:08:25.668 CRYPTODEV: Creating cryptodev 0000:3d:01.5_qat_sym 00:08:25.668 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.5_qat_sym,socket id: 0, max queue pairs: 0 00:08:25.668 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.6 (socket 0) 00:08:25.668 CRYPTODEV: Creating cryptodev 0000:3d:01.6_qat_asym 00:08:25.668 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.6_qat_asym,socket id: 0, max queue pairs: 0 00:08:25.668 CRYPTODEV: Creating cryptodev 0000:3d:01.6_qat_sym 00:08:25.668 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.6_qat_sym,socket id: 0, max queue pairs: 0 00:08:25.668 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.7 (socket 0) 00:08:25.668 CRYPTODEV: Creating cryptodev 0000:3d:01.7_qat_asym 
00:08:25.668 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.7_qat_asym,socket id: 0, max queue pairs: 0 00:08:25.668 CRYPTODEV: Creating cryptodev 0000:3d:01.7_qat_sym 00:08:25.668 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.7_qat_sym,socket id: 0, max queue pairs: 0 00:08:25.668 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.0 (socket 0) 00:08:25.668 CRYPTODEV: Creating cryptodev 0000:3d:02.0_qat_asym 00:08:25.668 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.0_qat_asym,socket id: 0, max queue pairs: 0 00:08:25.668 CRYPTODEV: Creating cryptodev 0000:3d:02.0_qat_sym 00:08:25.668 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.0_qat_sym,socket id: 0, max queue pairs: 0 00:08:25.668 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.1 (socket 0) 00:08:25.668 CRYPTODEV: Creating cryptodev 0000:3d:02.1_qat_asym 00:08:25.668 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.1_qat_asym,socket id: 0, max queue pairs: 0 00:08:25.668 CRYPTODEV: Creating cryptodev 0000:3d:02.1_qat_sym 00:08:25.668 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.1_qat_sym,socket id: 0, max queue pairs: 0 00:08:25.668 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.2 (socket 0) 00:08:25.668 CRYPTODEV: Creating cryptodev 0000:3d:02.2_qat_asym 00:08:25.668 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.2_qat_asym,socket id: 0, max queue pairs: 0 00:08:25.668 CRYPTODEV: Creating cryptodev 0000:3d:02.2_qat_sym 00:08:25.668 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.2_qat_sym,socket id: 0, max queue pairs: 0 00:08:25.668 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.3 (socket 0) 00:08:25.668 CRYPTODEV: Creating cryptodev 0000:3d:02.3_qat_asym 00:08:25.668 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.3_qat_asym,socket id: 0, max queue pairs: 0 00:08:25.668 CRYPTODEV: Creating cryptodev 0000:3d:02.3_qat_sym 00:08:25.668 CRYPTODEV: Initialisation parameters - name: 
0000:3d:02.3_qat_sym,socket id: 0, max queue pairs: 0 00:08:25.668 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.4 (socket 0) 00:08:25.668 CRYPTODEV: Creating cryptodev 0000:3d:02.4_qat_asym 00:08:25.668 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.4_qat_asym,socket id: 0, max queue pairs: 0 00:08:25.668 CRYPTODEV: Creating cryptodev 0000:3d:02.4_qat_sym 00:08:25.668 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.4_qat_sym,socket id: 0, max queue pairs: 0 00:08:25.668 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.5 (socket 0) 00:08:25.668 CRYPTODEV: Creating cryptodev 0000:3d:02.5_qat_asym 00:08:25.668 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.5_qat_asym,socket id: 0, max queue pairs: 0 00:08:25.668 CRYPTODEV: Creating cryptodev 0000:3d:02.5_qat_sym 00:08:25.668 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.5_qat_sym,socket id: 0, max queue pairs: 0 00:08:25.668 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.6 (socket 0) 00:08:25.668 CRYPTODEV: Creating cryptodev 0000:3d:02.6_qat_asym 00:08:25.668 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.6_qat_asym,socket id: 0, max queue pairs: 0 00:08:25.668 CRYPTODEV: Creating cryptodev 0000:3d:02.6_qat_sym 00:08:25.668 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.6_qat_sym,socket id: 0, max queue pairs: 0 00:08:25.668 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.7 (socket 0) 00:08:25.668 CRYPTODEV: Creating cryptodev 0000:3d:02.7_qat_asym 00:08:25.668 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.7_qat_asym,socket id: 0, max queue pairs: 0 00:08:25.668 CRYPTODEV: Creating cryptodev 0000:3d:02.7_qat_sym 00:08:25.668 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.7_qat_sym,socket id: 0, max queue pairs: 0 00:08:25.668 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.0 (socket 0) 00:08:25.668 CRYPTODEV: Creating cryptodev 0000:3f:01.0_qat_asym 00:08:25.668 CRYPTODEV: Initialisation 
parameters - name: 0000:3f:01.0_qat_asym,socket id: 0, max queue pairs: 0 00:08:25.668 CRYPTODEV: Creating cryptodev 0000:3f:01.0_qat_sym 00:08:25.668 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.0_qat_sym,socket id: 0, max queue pairs: 0 00:08:25.668 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.1 (socket 0) 00:08:25.668 CRYPTODEV: Creating cryptodev 0000:3f:01.1_qat_asym 00:08:25.668 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.1_qat_asym,socket id: 0, max queue pairs: 0 00:08:25.668 CRYPTODEV: Creating cryptodev 0000:3f:01.1_qat_sym 00:08:25.668 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.1_qat_sym,socket id: 0, max queue pairs: 0 00:08:25.668 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.2 (socket 0) 00:08:25.668 CRYPTODEV: Creating cryptodev 0000:3f:01.2_qat_asym 00:08:25.668 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.2_qat_asym,socket id: 0, max queue pairs: 0 00:08:25.668 CRYPTODEV: Creating cryptodev 0000:3f:01.2_qat_sym 00:08:25.668 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.2_qat_sym,socket id: 0, max queue pairs: 0 00:08:25.668 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.3 (socket 0) 00:08:25.668 CRYPTODEV: Creating cryptodev 0000:3f:01.3_qat_asym 00:08:25.668 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.3_qat_asym,socket id: 0, max queue pairs: 0 00:08:25.668 CRYPTODEV: Creating cryptodev 0000:3f:01.3_qat_sym 00:08:25.668 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.3_qat_sym,socket id: 0, max queue pairs: 0 00:08:25.668 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.4 (socket 0) 00:08:25.668 CRYPTODEV: Creating cryptodev 0000:3f:01.4_qat_asym 00:08:25.668 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.4_qat_asym,socket id: 0, max queue pairs: 0 00:08:25.668 CRYPTODEV: Creating cryptodev 0000:3f:01.4_qat_sym 00:08:25.668 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.4_qat_sym,socket id: 0, max queue pairs: 
0 00:08:25.668 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.5 (socket 0) 00:08:25.668 CRYPTODEV: Creating cryptodev 0000:3f:01.5_qat_asym 00:08:25.668 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.5_qat_asym,socket id: 0, max queue pairs: 0 00:08:25.668 CRYPTODEV: Creating cryptodev 0000:3f:01.5_qat_sym 00:08:25.668 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.5_qat_sym,socket id: 0, max queue pairs: 0 00:08:25.668 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.6 (socket 0) 00:08:25.668 CRYPTODEV: Creating cryptodev 0000:3f:01.6_qat_asym 00:08:25.668 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.6_qat_asym,socket id: 0, max queue pairs: 0 00:08:25.668 CRYPTODEV: Creating cryptodev 0000:3f:01.6_qat_sym 00:08:25.668 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.6_qat_sym,socket id: 0, max queue pairs: 0 00:08:25.668 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.7 (socket 0) 00:08:25.668 CRYPTODEV: Creating cryptodev 0000:3f:01.7_qat_asym 00:08:25.669 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.7_qat_asym,socket id: 0, max queue pairs: 0 00:08:25.669 CRYPTODEV: Creating cryptodev 0000:3f:01.7_qat_sym 00:08:25.669 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.7_qat_sym,socket id: 0, max queue pairs: 0 00:08:25.669 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.0 (socket 0) 00:08:25.669 CRYPTODEV: Creating cryptodev 0000:3f:02.0_qat_asym 00:08:25.669 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.0_qat_asym,socket id: 0, max queue pairs: 0 00:08:25.669 CRYPTODEV: Creating cryptodev 0000:3f:02.0_qat_sym 00:08:25.669 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.0_qat_sym,socket id: 0, max queue pairs: 0 00:08:25.669 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.1 (socket 0) 00:08:25.669 CRYPTODEV: Creating cryptodev 0000:3f:02.1_qat_asym 00:08:25.669 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.1_qat_asym,socket id: 0, 
max queue pairs: 0 00:08:25.669 CRYPTODEV: Creating cryptodev 0000:3f:02.1_qat_sym 00:08:25.669 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.1_qat_sym,socket id: 0, max queue pairs: 0 00:08:25.669 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.2 (socket 0) 00:08:25.669 CRYPTODEV: Creating cryptodev 0000:3f:02.2_qat_asym 00:08:25.669 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.2_qat_asym,socket id: 0, max queue pairs: 0 00:08:25.669 CRYPTODEV: Creating cryptodev 0000:3f:02.2_qat_sym 00:08:25.669 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.2_qat_sym,socket id: 0, max queue pairs: 0 00:08:25.669 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.3 (socket 0) 00:08:25.669 CRYPTODEV: Creating cryptodev 0000:3f:02.3_qat_asym 00:08:25.669 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.3_qat_asym,socket id: 0, max queue pairs: 0 00:08:25.669 CRYPTODEV: Creating cryptodev 0000:3f:02.3_qat_sym 00:08:25.669 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.3_qat_sym,socket id: 0, max queue pairs: 0 00:08:25.669 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.4 (socket 0) 00:08:25.669 CRYPTODEV: Creating cryptodev 0000:3f:02.4_qat_asym 00:08:25.669 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.4_qat_asym,socket id: 0, max queue pairs: 0 00:08:25.669 CRYPTODEV: Creating cryptodev 0000:3f:02.4_qat_sym 00:08:25.669 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.4_qat_sym,socket id: 0, max queue pairs: 0 00:08:25.669 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.5 (socket 0) 00:08:25.669 CRYPTODEV: Creating cryptodev 0000:3f:02.5_qat_asym 00:08:25.669 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.5_qat_asym,socket id: 0, max queue pairs: 0 00:08:25.669 CRYPTODEV: Creating cryptodev 0000:3f:02.5_qat_sym 00:08:25.669 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.5_qat_sym,socket id: 0, max queue pairs: 0 00:08:25.669 EAL: Probe PCI driver: qat (8086:37c9) 
device: 0000:3f:02.6 (socket 0) 00:08:25.669 CRYPTODEV: Creating cryptodev 0000:3f:02.6_qat_asym 00:08:25.669 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.6_qat_asym,socket id: 0, max queue pairs: 0 00:08:25.669 CRYPTODEV: Creating cryptodev 0000:3f:02.6_qat_sym 00:08:25.669 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.6_qat_sym,socket id: 0, max queue pairs: 0 00:08:25.669 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.7 (socket 0) 00:08:25.669 CRYPTODEV: Creating cryptodev 0000:3f:02.7_qat_asym 00:08:25.669 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.7_qat_asym,socket id: 0, max queue pairs: 0 00:08:25.669 CRYPTODEV: Creating cryptodev 0000:3f:02.7_qat_sym 00:08:25.669 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.7_qat_sym,socket id: 0, max queue pairs: 0 00:08:25.669 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.0 (socket 1) 00:08:25.669 CRYPTODEV: Creating cryptodev 0000:da:01.0_qat_asym 00:08:25.669 CRYPTODEV: Initialisation parameters - name: 0000:da:01.0_qat_asym,socket id: 1, max queue pairs: 0 00:08:25.669 CRYPTODEV: Creating cryptodev 0000:da:01.0_qat_sym 00:08:25.669 CRYPTODEV: Initialisation parameters - name: 0000:da:01.0_qat_sym,socket id: 1, max queue pairs: 0 00:08:25.669 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.1 (socket 1) 00:08:25.669 CRYPTODEV: Creating cryptodev 0000:da:01.1_qat_asym 00:08:25.669 CRYPTODEV: Initialisation parameters - name: 0000:da:01.1_qat_asym,socket id: 1, max queue pairs: 0 00:08:25.669 CRYPTODEV: Creating cryptodev 0000:da:01.1_qat_sym 00:08:25.669 CRYPTODEV: Initialisation parameters - name: 0000:da:01.1_qat_sym,socket id: 1, max queue pairs: 0 00:08:25.669 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.2 (socket 1) 00:08:25.669 CRYPTODEV: Creating cryptodev 0000:da:01.2_qat_asym 00:08:25.669 CRYPTODEV: Initialisation parameters - name: 0000:da:01.2_qat_asym,socket id: 1, max queue pairs: 0 00:08:25.669 CRYPTODEV: Creating 
cryptodev 0000:da:01.2_qat_sym 00:08:25.669 CRYPTODEV: Initialisation parameters - name: 0000:da:01.2_qat_sym,socket id: 1, max queue pairs: 0 00:08:25.669 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.3 (socket 1) 00:08:25.669 CRYPTODEV: Creating cryptodev 0000:da:01.3_qat_asym 00:08:25.669 CRYPTODEV: Initialisation parameters - name: 0000:da:01.3_qat_asym,socket id: 1, max queue pairs: 0 00:08:25.669 CRYPTODEV: Creating cryptodev 0000:da:01.3_qat_sym 00:08:25.669 CRYPTODEV: Initialisation parameters - name: 0000:da:01.3_qat_sym,socket id: 1, max queue pairs: 0 00:08:25.669 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.4 (socket 1) 00:08:25.669 CRYPTODEV: Creating cryptodev 0000:da:01.4_qat_asym 00:08:25.669 CRYPTODEV: Initialisation parameters - name: 0000:da:01.4_qat_asym,socket id: 1, max queue pairs: 0 00:08:25.669 CRYPTODEV: Creating cryptodev 0000:da:01.4_qat_sym 00:08:25.669 CRYPTODEV: Initialisation parameters - name: 0000:da:01.4_qat_sym,socket id: 1, max queue pairs: 0 00:08:25.669 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.5 (socket 1) 00:08:25.669 CRYPTODEV: Creating cryptodev 0000:da:01.5_qat_asym 00:08:25.669 CRYPTODEV: Initialisation parameters - name: 0000:da:01.5_qat_asym,socket id: 1, max queue pairs: 0 00:08:25.669 CRYPTODEV: Creating cryptodev 0000:da:01.5_qat_sym 00:08:25.669 CRYPTODEV: Initialisation parameters - name: 0000:da:01.5_qat_sym,socket id: 1, max queue pairs: 0 00:08:25.669 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.6 (socket 1) 00:08:25.669 CRYPTODEV: Creating cryptodev 0000:da:01.6_qat_asym 00:08:25.669 CRYPTODEV: Initialisation parameters - name: 0000:da:01.6_qat_asym,socket id: 1, max queue pairs: 0 00:08:25.669 CRYPTODEV: Creating cryptodev 0000:da:01.6_qat_sym 00:08:25.669 CRYPTODEV: Initialisation parameters - name: 0000:da:01.6_qat_sym,socket id: 1, max queue pairs: 0 00:08:25.669 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.7 (socket 1) 00:08:25.669 
CRYPTODEV: Creating cryptodev 0000:da:01.7_qat_asym 00:08:25.669 CRYPTODEV: Initialisation parameters - name: 0000:da:01.7_qat_asym,socket id: 1, max queue pairs: 0 00:08:25.669 CRYPTODEV: Creating cryptodev 0000:da:01.7_qat_sym 00:08:25.669 CRYPTODEV: Initialisation parameters - name: 0000:da:01.7_qat_sym,socket id: 1, max queue pairs: 0 00:08:25.669 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.0 (socket 1) 00:08:25.669 CRYPTODEV: Creating cryptodev 0000:da:02.0_qat_asym 00:08:25.669 CRYPTODEV: Initialisation parameters - name: 0000:da:02.0_qat_asym,socket id: 1, max queue pairs: 0 00:08:25.669 CRYPTODEV: Creating cryptodev 0000:da:02.0_qat_sym 00:08:25.669 CRYPTODEV: Initialisation parameters - name: 0000:da:02.0_qat_sym,socket id: 1, max queue pairs: 0 00:08:25.669 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.1 (socket 1) 00:08:25.669 CRYPTODEV: Creating cryptodev 0000:da:02.1_qat_asym 00:08:25.669 CRYPTODEV: Initialisation parameters - name: 0000:da:02.1_qat_asym,socket id: 1, max queue pairs: 0 00:08:25.669 CRYPTODEV: Creating cryptodev 0000:da:02.1_qat_sym 00:08:25.669 CRYPTODEV: Initialisation parameters - name: 0000:da:02.1_qat_sym,socket id: 1, max queue pairs: 0 00:08:25.669 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.2 (socket 1) 00:08:25.669 CRYPTODEV: Creating cryptodev 0000:da:02.2_qat_asym 00:08:25.669 CRYPTODEV: Initialisation parameters - name: 0000:da:02.2_qat_asym,socket id: 1, max queue pairs: 0 00:08:25.669 CRYPTODEV: Creating cryptodev 0000:da:02.2_qat_sym 00:08:25.669 CRYPTODEV: Initialisation parameters - name: 0000:da:02.2_qat_sym,socket id: 1, max queue pairs: 0 00:08:25.669 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.3 (socket 1) 00:08:25.669 CRYPTODEV: Creating cryptodev 0000:da:02.3_qat_asym 00:08:25.669 CRYPTODEV: Initialisation parameters - name: 0000:da:02.3_qat_asym,socket id: 1, max queue pairs: 0 00:08:25.669 CRYPTODEV: Creating cryptodev 0000:da:02.3_qat_sym 00:08:25.669 
CRYPTODEV: Initialisation parameters - name: 0000:da:02.3_qat_sym,socket id: 1, max queue pairs: 0 00:08:25.669 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.4 (socket 1) 00:08:25.669 CRYPTODEV: Creating cryptodev 0000:da:02.4_qat_asym 00:08:25.669 CRYPTODEV: Initialisation parameters - name: 0000:da:02.4_qat_asym,socket id: 1, max queue pairs: 0 00:08:25.669 CRYPTODEV: Creating cryptodev 0000:da:02.4_qat_sym 00:08:25.669 CRYPTODEV: Initialisation parameters - name: 0000:da:02.4_qat_sym,socket id: 1, max queue pairs: 0 00:08:25.669 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.5 (socket 1) 00:08:25.669 CRYPTODEV: Creating cryptodev 0000:da:02.5_qat_asym 00:08:25.669 CRYPTODEV: Initialisation parameters - name: 0000:da:02.5_qat_asym,socket id: 1, max queue pairs: 0 00:08:25.669 CRYPTODEV: Creating cryptodev 0000:da:02.5_qat_sym 00:08:25.669 CRYPTODEV: Initialisation parameters - name: 0000:da:02.5_qat_sym,socket id: 1, max queue pairs: 0 00:08:25.669 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.6 (socket 1) 00:08:25.669 CRYPTODEV: Creating cryptodev 0000:da:02.6_qat_asym 00:08:25.669 CRYPTODEV: Initialisation parameters - name: 0000:da:02.6_qat_asym,socket id: 1, max queue pairs: 0 00:08:25.669 CRYPTODEV: Creating cryptodev 0000:da:02.6_qat_sym 00:08:25.669 CRYPTODEV: Initialisation parameters - name: 0000:da:02.6_qat_sym,socket id: 1, max queue pairs: 0 00:08:25.669 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.7 (socket 1) 00:08:25.669 CRYPTODEV: Creating cryptodev 0000:da:02.7_qat_asym 00:08:25.669 CRYPTODEV: Initialisation parameters - name: 0000:da:02.7_qat_asym,socket id: 1, max queue pairs: 0 00:08:25.669 CRYPTODEV: Creating cryptodev 0000:da:02.7_qat_sym 00:08:25.669 CRYPTODEV: Initialisation parameters - name: 0000:da:02.7_qat_sym,socket id: 1, max queue pairs: 0 00:08:25.669 TELEMETRY: No legacy callbacks, legacy socket not created 00:08:25.928 EAL: Using IOMMU type 1 (Type 1) 00:08:25.928 EAL: Ignore 
mapping IO port bar(1) 00:08:25.928 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.0 (socket 0) 00:08:25.928 EAL: Ignore mapping IO port bar(1) 00:08:25.928 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.1 (socket 0) 00:08:25.928 EAL: Ignore mapping IO port bar(1) 00:08:25.928 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.2 (socket 0) 00:08:25.928 EAL: Ignore mapping IO port bar(1) 00:08:25.928 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.3 (socket 0) 00:08:25.928 EAL: Ignore mapping IO port bar(1) 00:08:25.928 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.4 (socket 0) 00:08:25.928 EAL: Ignore mapping IO port bar(1) 00:08:25.928 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.5 (socket 0) 00:08:25.928 EAL: Ignore mapping IO port bar(1) 00:08:25.928 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.6 (socket 0) 00:08:25.928 EAL: Ignore mapping IO port bar(1) 00:08:25.928 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.7 (socket 0) 00:08:26.187 EAL: Probe PCI driver: spdk_nvme (8086:0b60) device: 0000:5e:00.0 (socket 0) 00:08:26.187 EAL: Ignore mapping IO port bar(1) 00:08:26.187 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.0 (socket 1) 00:08:26.187 EAL: Ignore mapping IO port bar(1) 00:08:26.187 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.1 (socket 1) 00:08:26.187 EAL: Ignore mapping IO port bar(1) 00:08:26.187 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.2 (socket 1) 00:08:26.187 EAL: Ignore mapping IO port bar(1) 00:08:26.187 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.3 (socket 1) 00:08:26.187 EAL: Ignore mapping IO port bar(1) 00:08:26.187 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.4 (socket 1) 00:08:26.187 EAL: Ignore mapping IO port bar(1) 00:08:26.187 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.5 (socket 1) 
00:08:26.187 EAL: Ignore mapping IO port bar(1) 00:08:26.187 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.6 (socket 1) 00:08:26.187 EAL: Ignore mapping IO port bar(1) 00:08:26.187 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.7 (socket 1) 00:08:26.187 EAL: Ignore mapping IO port bar(1) 00:08:26.187 EAL: Ignore mapping IO port bar(5) 00:08:26.187 EAL: Probe PCI driver: spdk_vmd (8086:201d) device: 0000:85:05.5 (socket 1) 00:08:26.187 EAL: Ignore mapping IO port bar(1) 00:08:26.187 EAL: Ignore mapping IO port bar(5) 00:08:26.187 EAL: Probe PCI driver: spdk_vmd (8086:201d) device: 0000:d7:05.5 (socket 1) 00:08:28.717 EAL: Releasing PCI mapped resource for 0000:5e:00.0 00:08:28.717 EAL: Calling pci_unmap_resource for 0000:5e:00.0 at 0x202001080000 00:08:28.977 Starting DPDK initialization... 00:08:28.977 Starting SPDK post initialization... 00:08:28.977 SPDK NVMe probe 00:08:28.977 Attaching to 0000:5e:00.0 00:08:28.977 Attached to 0000:5e:00.0 00:08:28.977 Cleaning up... 
00:08:28.977 00:08:28.977 real 0m3.259s 00:08:28.977 user 0m2.210s 00:08:28.977 sys 0m0.608s 00:08:28.977 05:36:43 env.env_dpdk_post_init -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:28.977 05:36:43 env.env_dpdk_post_init -- common/autotest_common.sh@10 -- # set +x 00:08:28.977 ************************************ 00:08:28.977 END TEST env_dpdk_post_init 00:08:28.977 ************************************ 00:08:28.977 05:36:43 env -- common/autotest_common.sh@1142 -- # return 0 00:08:28.977 05:36:43 env -- env/env.sh@26 -- # uname 00:08:28.977 05:36:43 env -- env/env.sh@26 -- # '[' Linux = Linux ']' 00:08:28.977 05:36:43 env -- env/env.sh@29 -- # run_test env_mem_callbacks /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/mem_callbacks/mem_callbacks 00:08:28.977 05:36:43 env -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:08:28.977 05:36:43 env -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:28.977 05:36:43 env -- common/autotest_common.sh@10 -- # set +x 00:08:28.977 ************************************ 00:08:28.977 START TEST env_mem_callbacks 00:08:28.977 ************************************ 00:08:28.977 05:36:43 env.env_mem_callbacks -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/mem_callbacks/mem_callbacks 00:08:28.977 EAL: Detected CPU lcores: 72 00:08:28.977 EAL: Detected NUMA nodes: 2 00:08:28.977 EAL: Detected shared linkage of DPDK 00:08:28.977 EAL: Multi-process socket /var/run/dpdk/rte/mp_socket 00:08:28.977 EAL: Selected IOVA mode 'PA' 00:08:28.977 EAL: VFIO support initialized 00:08:28.977 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.0 (socket 0) 00:08:28.977 CRYPTODEV: Creating cryptodev 0000:3d:01.0_qat_asym 00:08:28.977 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.0_qat_asym,socket id: 0, max queue pairs: 0 00:08:28.977 CRYPTODEV: Creating cryptodev 0000:3d:01.0_qat_sym 00:08:28.977 CRYPTODEV: Initialisation parameters - name: 
0000:3d:01.0_qat_sym,socket id: 0, max queue pairs: 0 00:08:28.977 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.1 (socket 0) 00:08:28.977 CRYPTODEV: Creating cryptodev 0000:3d:01.1_qat_asym 00:08:28.977 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.1_qat_asym,socket id: 0, max queue pairs: 0 00:08:28.977 CRYPTODEV: Creating cryptodev 0000:3d:01.1_qat_sym 00:08:28.977 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.1_qat_sym,socket id: 0, max queue pairs: 0 00:08:28.977 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.2 (socket 0) 00:08:28.977 CRYPTODEV: Creating cryptodev 0000:3d:01.2_qat_asym 00:08:28.977 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.2_qat_asym,socket id: 0, max queue pairs: 0 00:08:28.977 CRYPTODEV: Creating cryptodev 0000:3d:01.2_qat_sym 00:08:28.977 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.2_qat_sym,socket id: 0, max queue pairs: 0 00:08:28.977 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.3 (socket 0) 00:08:28.977 CRYPTODEV: Creating cryptodev 0000:3d:01.3_qat_asym 00:08:28.977 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.3_qat_asym,socket id: 0, max queue pairs: 0 00:08:28.977 CRYPTODEV: Creating cryptodev 0000:3d:01.3_qat_sym 00:08:28.977 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.3_qat_sym,socket id: 0, max queue pairs: 0 00:08:28.977 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.4 (socket 0) 00:08:28.977 CRYPTODEV: Creating cryptodev 0000:3d:01.4_qat_asym 00:08:28.977 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.4_qat_asym,socket id: 0, max queue pairs: 0 00:08:28.977 CRYPTODEV: Creating cryptodev 0000:3d:01.4_qat_sym 00:08:28.977 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.4_qat_sym,socket id: 0, max queue pairs: 0 00:08:28.977 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.5 (socket 0) 00:08:28.977 CRYPTODEV: Creating cryptodev 0000:3d:01.5_qat_asym 00:08:28.977 CRYPTODEV: Initialisation 
parameters - name: 0000:3d:01.5_qat_asym,socket id: 0, max queue pairs: 0 00:08:28.977 CRYPTODEV: Creating cryptodev 0000:3d:01.5_qat_sym 00:08:28.977 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.5_qat_sym,socket id: 0, max queue pairs: 0 00:08:28.977 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.6 (socket 0) 00:08:28.977 CRYPTODEV: Creating cryptodev 0000:3d:01.6_qat_asym 00:08:28.977 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.6_qat_asym,socket id: 0, max queue pairs: 0 00:08:28.977 CRYPTODEV: Creating cryptodev 0000:3d:01.6_qat_sym 00:08:28.977 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.6_qat_sym,socket id: 0, max queue pairs: 0 00:08:28.977 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.7 (socket 0) 00:08:28.977 CRYPTODEV: Creating cryptodev 0000:3d:01.7_qat_asym 00:08:28.977 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.7_qat_asym,socket id: 0, max queue pairs: 0 00:08:28.977 CRYPTODEV: Creating cryptodev 0000:3d:01.7_qat_sym 00:08:28.977 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.7_qat_sym,socket id: 0, max queue pairs: 0 00:08:28.977 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.0 (socket 0) 00:08:28.977 CRYPTODEV: Creating cryptodev 0000:3d:02.0_qat_asym 00:08:28.977 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.0_qat_asym,socket id: 0, max queue pairs: 0 00:08:28.977 CRYPTODEV: Creating cryptodev 0000:3d:02.0_qat_sym 00:08:28.977 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.0_qat_sym,socket id: 0, max queue pairs: 0 00:08:28.977 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.1 (socket 0) 00:08:28.977 CRYPTODEV: Creating cryptodev 0000:3d:02.1_qat_asym 00:08:28.977 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.1_qat_asym,socket id: 0, max queue pairs: 0 00:08:28.977 CRYPTODEV: Creating cryptodev 0000:3d:02.1_qat_sym 00:08:28.977 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.1_qat_sym,socket id: 0, max queue pairs: 
0 00:08:28.977 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.2 (socket 0) 00:08:28.977 CRYPTODEV: Creating cryptodev 0000:3d:02.2_qat_asym 00:08:28.977 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.2_qat_asym,socket id: 0, max queue pairs: 0 00:08:28.977 CRYPTODEV: Creating cryptodev 0000:3d:02.2_qat_sym 00:08:28.977 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.2_qat_sym,socket id: 0, max queue pairs: 0 00:08:28.977 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.3 (socket 0) 00:08:28.977 CRYPTODEV: Creating cryptodev 0000:3d:02.3_qat_asym 00:08:28.977 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.3_qat_asym,socket id: 0, max queue pairs: 0 00:08:28.977 CRYPTODEV: Creating cryptodev 0000:3d:02.3_qat_sym 00:08:28.977 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.3_qat_sym,socket id: 0, max queue pairs: 0 00:08:28.977 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.4 (socket 0) 00:08:28.977 CRYPTODEV: Creating cryptodev 0000:3d:02.4_qat_asym 00:08:28.977 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.4_qat_asym,socket id: 0, max queue pairs: 0 00:08:28.977 CRYPTODEV: Creating cryptodev 0000:3d:02.4_qat_sym 00:08:28.977 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.4_qat_sym,socket id: 0, max queue pairs: 0 00:08:28.977 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.5 (socket 0) 00:08:28.977 CRYPTODEV: Creating cryptodev 0000:3d:02.5_qat_asym 00:08:28.977 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.5_qat_asym,socket id: 0, max queue pairs: 0 00:08:28.977 CRYPTODEV: Creating cryptodev 0000:3d:02.5_qat_sym 00:08:28.977 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.5_qat_sym,socket id: 0, max queue pairs: 0 00:08:28.977 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.6 (socket 0) 00:08:28.977 CRYPTODEV: Creating cryptodev 0000:3d:02.6_qat_asym 00:08:28.978 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.6_qat_asym,socket id: 0, 
max queue pairs: 0 00:08:28.978 CRYPTODEV: Creating cryptodev 0000:3d:02.6_qat_sym 00:08:28.978 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.6_qat_sym,socket id: 0, max queue pairs: 0 00:08:28.978 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.7 (socket 0) 00:08:28.978 CRYPTODEV: Creating cryptodev 0000:3d:02.7_qat_asym 00:08:28.978 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.7_qat_asym,socket id: 0, max queue pairs: 0 00:08:28.978 CRYPTODEV: Creating cryptodev 0000:3d:02.7_qat_sym 00:08:28.978 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.7_qat_sym,socket id: 0, max queue pairs: 0 00:08:28.978 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.0 (socket 0) 00:08:28.978 CRYPTODEV: Creating cryptodev 0000:3f:01.0_qat_asym 00:08:28.978 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.0_qat_asym,socket id: 0, max queue pairs: 0 00:08:28.978 CRYPTODEV: Creating cryptodev 0000:3f:01.0_qat_sym 00:08:28.978 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.0_qat_sym,socket id: 0, max queue pairs: 0 00:08:28.978 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.1 (socket 0) 00:08:28.978 CRYPTODEV: Creating cryptodev 0000:3f:01.1_qat_asym 00:08:28.978 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.1_qat_asym,socket id: 0, max queue pairs: 0 00:08:28.978 CRYPTODEV: Creating cryptodev 0000:3f:01.1_qat_sym 00:08:28.978 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.1_qat_sym,socket id: 0, max queue pairs: 0 00:08:28.978 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.2 (socket 0) 00:08:28.978 CRYPTODEV: Creating cryptodev 0000:3f:01.2_qat_asym 00:08:28.978 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.2_qat_asym,socket id: 0, max queue pairs: 0 00:08:28.978 CRYPTODEV: Creating cryptodev 0000:3f:01.2_qat_sym 00:08:28.978 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.2_qat_sym,socket id: 0, max queue pairs: 0 00:08:28.978 EAL: Probe PCI driver: qat (8086:37c9) 
device: 0000:3f:01.3 (socket 0) 00:08:28.978 CRYPTODEV: Creating cryptodev 0000:3f:01.3_qat_asym 00:08:28.978 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.3_qat_asym,socket id: 0, max queue pairs: 0 00:08:28.978 CRYPTODEV: Creating cryptodev 0000:3f:01.3_qat_sym 00:08:28.978 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.3_qat_sym,socket id: 0, max queue pairs: 0 00:08:28.978 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.4 (socket 0) 00:08:28.978 CRYPTODEV: Creating cryptodev 0000:3f:01.4_qat_asym 00:08:28.978 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.4_qat_asym,socket id: 0, max queue pairs: 0 00:08:28.978 CRYPTODEV: Creating cryptodev 0000:3f:01.4_qat_sym 00:08:28.978 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.4_qat_sym,socket id: 0, max queue pairs: 0 00:08:28.978 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.5 (socket 0) 00:08:28.978 CRYPTODEV: Creating cryptodev 0000:3f:01.5_qat_asym 00:08:28.978 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.5_qat_asym,socket id: 0, max queue pairs: 0 00:08:28.978 CRYPTODEV: Creating cryptodev 0000:3f:01.5_qat_sym 00:08:28.978 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.5_qat_sym,socket id: 0, max queue pairs: 0 00:08:28.978 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.6 (socket 0) 00:08:28.978 CRYPTODEV: Creating cryptodev 0000:3f:01.6_qat_asym 00:08:28.978 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.6_qat_asym,socket id: 0, max queue pairs: 0 00:08:28.978 CRYPTODEV: Creating cryptodev 0000:3f:01.6_qat_sym 00:08:28.978 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.6_qat_sym,socket id: 0, max queue pairs: 0 00:08:28.978 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.7 (socket 0) 00:08:28.978 CRYPTODEV: Creating cryptodev 0000:3f:01.7_qat_asym 00:08:28.978 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.7_qat_asym,socket id: 0, max queue pairs: 0 00:08:28.978 CRYPTODEV: Creating 
cryptodev 0000:3f:01.7_qat_sym 00:08:28.978 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.7_qat_sym,socket id: 0, max queue pairs: 0 00:08:28.978 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.0 (socket 0) 00:08:28.978 CRYPTODEV: Creating cryptodev 0000:3f:02.0_qat_asym 00:08:28.978 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.0_qat_asym,socket id: 0, max queue pairs: 0 00:08:28.978 CRYPTODEV: Creating cryptodev 0000:3f:02.0_qat_sym 00:08:28.978 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.0_qat_sym,socket id: 0, max queue pairs: 0 00:08:28.978 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.1 (socket 0) 00:08:28.978 CRYPTODEV: Creating cryptodev 0000:3f:02.1_qat_asym 00:08:28.978 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.1_qat_asym,socket id: 0, max queue pairs: 0 00:08:28.978 CRYPTODEV: Creating cryptodev 0000:3f:02.1_qat_sym 00:08:28.978 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.1_qat_sym,socket id: 0, max queue pairs: 0 00:08:28.978 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.2 (socket 0) 00:08:28.978 CRYPTODEV: Creating cryptodev 0000:3f:02.2_qat_asym 00:08:28.978 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.2_qat_asym,socket id: 0, max queue pairs: 0 00:08:28.978 CRYPTODEV: Creating cryptodev 0000:3f:02.2_qat_sym 00:08:28.978 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.2_qat_sym,socket id: 0, max queue pairs: 0 00:08:28.978 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.3 (socket 0) 00:08:28.978 CRYPTODEV: Creating cryptodev 0000:3f:02.3_qat_asym 00:08:28.978 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.3_qat_asym,socket id: 0, max queue pairs: 0 00:08:28.978 CRYPTODEV: Creating cryptodev 0000:3f:02.3_qat_sym 00:08:28.978 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.3_qat_sym,socket id: 0, max queue pairs: 0 00:08:28.978 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.4 (socket 0) 00:08:28.978 
CRYPTODEV: Creating cryptodev 0000:3f:02.4_qat_asym 00:08:28.978 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.4_qat_asym,socket id: 0, max queue pairs: 0 00:08:28.978 CRYPTODEV: Creating cryptodev 0000:3f:02.4_qat_sym 00:08:28.978 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.4_qat_sym,socket id: 0, max queue pairs: 0 00:08:28.978 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.5 (socket 0) 00:08:28.978 CRYPTODEV: Creating cryptodev 0000:3f:02.5_qat_asym 00:08:28.978 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.5_qat_asym,socket id: 0, max queue pairs: 0 00:08:28.978 CRYPTODEV: Creating cryptodev 0000:3f:02.5_qat_sym 00:08:28.978 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.5_qat_sym,socket id: 0, max queue pairs: 0 00:08:28.978 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.6 (socket 0) 00:08:28.978 CRYPTODEV: Creating cryptodev 0000:3f:02.6_qat_asym 00:08:28.978 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.6_qat_asym,socket id: 0, max queue pairs: 0 00:08:28.978 CRYPTODEV: Creating cryptodev 0000:3f:02.6_qat_sym 00:08:28.978 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.6_qat_sym,socket id: 0, max queue pairs: 0 00:08:28.978 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.7 (socket 0) 00:08:28.978 CRYPTODEV: Creating cryptodev 0000:3f:02.7_qat_asym 00:08:28.978 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.7_qat_asym,socket id: 0, max queue pairs: 0 00:08:28.978 CRYPTODEV: Creating cryptodev 0000:3f:02.7_qat_sym 00:08:28.978 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.7_qat_sym,socket id: 0, max queue pairs: 0 00:08:28.978 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.0 (socket 1) 00:08:28.978 CRYPTODEV: Creating cryptodev 0000:da:01.0_qat_asym 00:08:28.978 CRYPTODEV: Initialisation parameters - name: 0000:da:01.0_qat_asym,socket id: 1, max queue pairs: 0 00:08:28.978 CRYPTODEV: Creating cryptodev 0000:da:01.0_qat_sym 00:08:28.978 
CRYPTODEV: Initialisation parameters - name: 0000:da:01.0_qat_sym,socket id: 1, max queue pairs: 0 00:08:28.978 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.1 (socket 1) 00:08:28.978 CRYPTODEV: Creating cryptodev 0000:da:01.1_qat_asym 00:08:28.978 CRYPTODEV: Initialisation parameters - name: 0000:da:01.1_qat_asym,socket id: 1, max queue pairs: 0 00:08:28.978 CRYPTODEV: Creating cryptodev 0000:da:01.1_qat_sym 00:08:28.978 CRYPTODEV: Initialisation parameters - name: 0000:da:01.1_qat_sym,socket id: 1, max queue pairs: 0 00:08:28.978 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.2 (socket 1) 00:08:28.978 CRYPTODEV: Creating cryptodev 0000:da:01.2_qat_asym 00:08:28.978 CRYPTODEV: Initialisation parameters - name: 0000:da:01.2_qat_asym,socket id: 1, max queue pairs: 0 00:08:28.978 CRYPTODEV: Creating cryptodev 0000:da:01.2_qat_sym 00:08:28.978 CRYPTODEV: Initialisation parameters - name: 0000:da:01.2_qat_sym,socket id: 1, max queue pairs: 0 00:08:28.978 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.3 (socket 1) 00:08:28.978 CRYPTODEV: Creating cryptodev 0000:da:01.3_qat_asym 00:08:28.978 CRYPTODEV: Initialisation parameters - name: 0000:da:01.3_qat_asym,socket id: 1, max queue pairs: 0 00:08:28.978 CRYPTODEV: Creating cryptodev 0000:da:01.3_qat_sym 00:08:28.978 CRYPTODEV: Initialisation parameters - name: 0000:da:01.3_qat_sym,socket id: 1, max queue pairs: 0 00:08:28.978 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.4 (socket 1) 00:08:28.978 CRYPTODEV: Creating cryptodev 0000:da:01.4_qat_asym 00:08:28.978 CRYPTODEV: Initialisation parameters - name: 0000:da:01.4_qat_asym,socket id: 1, max queue pairs: 0 00:08:28.978 CRYPTODEV: Creating cryptodev 0000:da:01.4_qat_sym 00:08:28.978 CRYPTODEV: Initialisation parameters - name: 0000:da:01.4_qat_sym,socket id: 1, max queue pairs: 0 00:08:28.978 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.5 (socket 1) 00:08:28.978 CRYPTODEV: Creating cryptodev 0000:da:01.5_qat_asym 
00:08:28.978 CRYPTODEV: Initialisation parameters - name: 0000:da:01.5_qat_asym,socket id: 1, max queue pairs: 0 00:08:28.978 CRYPTODEV: Creating cryptodev 0000:da:01.5_qat_sym 00:08:28.978 CRYPTODEV: Initialisation parameters - name: 0000:da:01.5_qat_sym,socket id: 1, max queue pairs: 0 00:08:28.978 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.6 (socket 1) 00:08:28.978 CRYPTODEV: Creating cryptodev 0000:da:01.6_qat_asym 00:08:28.978 CRYPTODEV: Initialisation parameters - name: 0000:da:01.6_qat_asym,socket id: 1, max queue pairs: 0 00:08:28.978 CRYPTODEV: Creating cryptodev 0000:da:01.6_qat_sym 00:08:28.978 CRYPTODEV: Initialisation parameters - name: 0000:da:01.6_qat_sym,socket id: 1, max queue pairs: 0 00:08:28.978 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.7 (socket 1) 00:08:28.978 CRYPTODEV: Creating cryptodev 0000:da:01.7_qat_asym 00:08:28.978 CRYPTODEV: Initialisation parameters - name: 0000:da:01.7_qat_asym,socket id: 1, max queue pairs: 0 00:08:28.978 CRYPTODEV: Creating cryptodev 0000:da:01.7_qat_sym 00:08:28.978 CRYPTODEV: Initialisation parameters - name: 0000:da:01.7_qat_sym,socket id: 1, max queue pairs: 0 00:08:28.979 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.0 (socket 1) 00:08:28.979 CRYPTODEV: Creating cryptodev 0000:da:02.0_qat_asym 00:08:28.979 CRYPTODEV: Initialisation parameters - name: 0000:da:02.0_qat_asym,socket id: 1, max queue pairs: 0 00:08:28.979 CRYPTODEV: Creating cryptodev 0000:da:02.0_qat_sym 00:08:28.979 CRYPTODEV: Initialisation parameters - name: 0000:da:02.0_qat_sym,socket id: 1, max queue pairs: 0 00:08:28.979 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.1 (socket 1) 00:08:28.979 CRYPTODEV: Creating cryptodev 0000:da:02.1_qat_asym 00:08:28.979 CRYPTODEV: Initialisation parameters - name: 0000:da:02.1_qat_asym,socket id: 1, max queue pairs: 0 00:08:28.979 CRYPTODEV: Creating cryptodev 0000:da:02.1_qat_sym 00:08:28.979 CRYPTODEV: Initialisation parameters - name: 
0000:da:02.1_qat_sym,socket id: 1, max queue pairs: 0 00:08:28.979 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.2 (socket 1) 00:08:28.979 CRYPTODEV: Creating cryptodev 0000:da:02.2_qat_asym 00:08:28.979 CRYPTODEV: Initialisation parameters - name: 0000:da:02.2_qat_asym,socket id: 1, max queue pairs: 0 00:08:28.979 CRYPTODEV: Creating cryptodev 0000:da:02.2_qat_sym 00:08:28.979 CRYPTODEV: Initialisation parameters - name: 0000:da:02.2_qat_sym,socket id: 1, max queue pairs: 0 00:08:28.979 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.3 (socket 1) 00:08:28.979 CRYPTODEV: Creating cryptodev 0000:da:02.3_qat_asym 00:08:28.979 CRYPTODEV: Initialisation parameters - name: 0000:da:02.3_qat_asym,socket id: 1, max queue pairs: 0 00:08:28.979 CRYPTODEV: Creating cryptodev 0000:da:02.3_qat_sym 00:08:28.979 CRYPTODEV: Initialisation parameters - name: 0000:da:02.3_qat_sym,socket id: 1, max queue pairs: 0 00:08:28.979 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.4 (socket 1) 00:08:28.979 CRYPTODEV: Creating cryptodev 0000:da:02.4_qat_asym 00:08:28.979 CRYPTODEV: Initialisation parameters - name: 0000:da:02.4_qat_asym,socket id: 1, max queue pairs: 0 00:08:28.979 CRYPTODEV: Creating cryptodev 0000:da:02.4_qat_sym 00:08:28.979 CRYPTODEV: Initialisation parameters - name: 0000:da:02.4_qat_sym,socket id: 1, max queue pairs: 0 00:08:28.979 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.5 (socket 1) 00:08:28.979 CRYPTODEV: Creating cryptodev 0000:da:02.5_qat_asym 00:08:28.979 CRYPTODEV: Initialisation parameters - name: 0000:da:02.5_qat_asym,socket id: 1, max queue pairs: 0 00:08:28.979 CRYPTODEV: Creating cryptodev 0000:da:02.5_qat_sym 00:08:28.979 CRYPTODEV: Initialisation parameters - name: 0000:da:02.5_qat_sym,socket id: 1, max queue pairs: 0 00:08:28.979 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.6 (socket 1) 00:08:28.979 CRYPTODEV: Creating cryptodev 0000:da:02.6_qat_asym 00:08:28.979 CRYPTODEV: Initialisation 
parameters - name: 0000:da:02.6_qat_asym,socket id: 1, max queue pairs: 0 00:08:28.979 CRYPTODEV: Creating cryptodev 0000:da:02.6_qat_sym 00:08:28.979 CRYPTODEV: Initialisation parameters - name: 0000:da:02.6_qat_sym,socket id: 1, max queue pairs: 0 00:08:28.979 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.7 (socket 1) 00:08:28.979 CRYPTODEV: Creating cryptodev 0000:da:02.7_qat_asym 00:08:28.979 CRYPTODEV: Initialisation parameters - name: 0000:da:02.7_qat_asym,socket id: 1, max queue pairs: 0 00:08:28.979 CRYPTODEV: Creating cryptodev 0000:da:02.7_qat_sym 00:08:28.979 CRYPTODEV: Initialisation parameters - name: 0000:da:02.7_qat_sym,socket id: 1, max queue pairs: 0 00:08:28.979 TELEMETRY: No legacy callbacks, legacy socket not created 00:08:28.979 00:08:28.979 00:08:28.979 CUnit - A unit testing framework for C - Version 2.1-3 00:08:28.979 http://cunit.sourceforge.net/ 00:08:28.979 00:08:28.979 00:08:28.979 Suite: memory 00:08:28.979 Test: test ... 00:08:28.979 register 0x200000200000 2097152 00:08:28.979 register 0x201000a00000 2097152 00:08:28.979 malloc 3145728 00:08:28.979 register 0x200000400000 4194304 00:08:28.979 buf 0x200000500000 len 3145728 PASSED 00:08:28.979 malloc 64 00:08:28.979 buf 0x2000004fff40 len 64 PASSED 00:08:28.979 malloc 4194304 00:08:28.979 register 0x200000800000 6291456 00:08:28.979 buf 0x200000a00000 len 4194304 PASSED 00:08:28.979 free 0x200000500000 3145728 00:08:28.979 free 0x2000004fff40 64 00:08:28.979 unregister 0x200000400000 4194304 PASSED 00:08:28.979 free 0x200000a00000 4194304 00:08:28.979 unregister 0x200000800000 6291456 PASSED 00:08:28.979 malloc 8388608 00:08:28.979 register 0x200000400000 10485760 00:08:28.979 buf 0x200000600000 len 8388608 PASSED 00:08:28.979 free 0x200000600000 8388608 00:08:28.979 unregister 0x200000400000 10485760 PASSED 00:08:28.979 passed 00:08:28.979 00:08:28.979 Run Summary: Type Total Ran Passed Failed Inactive 00:08:28.979 suites 1 1 n/a 0 0 00:08:28.979 tests 1 1 1 0 0 
00:08:28.979 asserts 16 16 16 0 n/a 00:08:28.979 00:08:28.979 Elapsed time = 0.007 seconds 00:08:28.979 00:08:28.979 real 0m0.111s 00:08:28.979 user 0m0.025s 00:08:28.979 sys 0m0.086s 00:08:28.979 05:36:43 env.env_mem_callbacks -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:28.979 05:36:43 env.env_mem_callbacks -- common/autotest_common.sh@10 -- # set +x 00:08:28.979 ************************************ 00:08:28.979 END TEST env_mem_callbacks 00:08:28.979 ************************************ 00:08:29.238 05:36:43 env -- common/autotest_common.sh@1142 -- # return 0 00:08:29.238 00:08:29.238 real 0m5.478s 00:08:29.238 user 0m3.354s 00:08:29.238 sys 0m1.690s 00:08:29.238 05:36:43 env -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:29.238 05:36:43 env -- common/autotest_common.sh@10 -- # set +x 00:08:29.238 ************************************ 00:08:29.238 END TEST env 00:08:29.238 ************************************ 00:08:29.238 05:36:43 -- common/autotest_common.sh@1142 -- # return 0 00:08:29.238 05:36:43 -- spdk/autotest.sh@169 -- # run_test rpc /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/rpc.sh 00:08:29.238 05:36:43 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:08:29.238 05:36:43 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:29.238 05:36:43 -- common/autotest_common.sh@10 -- # set +x 00:08:29.238 ************************************ 00:08:29.238 START TEST rpc 00:08:29.238 ************************************ 00:08:29.238 05:36:43 rpc -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/rpc.sh 00:08:29.238 * Looking for test storage... 
00:08:29.238 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc 00:08:29.238 05:36:44 rpc -- rpc/rpc.sh@65 -- # spdk_pid=1087241 00:08:29.238 05:36:44 rpc -- rpc/rpc.sh@66 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:08:29.238 05:36:44 rpc -- rpc/rpc.sh@64 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -e bdev 00:08:29.238 05:36:44 rpc -- rpc/rpc.sh@67 -- # waitforlisten 1087241 00:08:29.238 05:36:44 rpc -- common/autotest_common.sh@829 -- # '[' -z 1087241 ']' 00:08:29.238 05:36:44 rpc -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:08:29.238 05:36:44 rpc -- common/autotest_common.sh@834 -- # local max_retries=100 00:08:29.238 05:36:44 rpc -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:08:29.238 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:08:29.238 05:36:44 rpc -- common/autotest_common.sh@838 -- # xtrace_disable 00:08:29.238 05:36:44 rpc -- common/autotest_common.sh@10 -- # set +x 00:08:29.497 [2024-07-26 05:36:44.152729] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 00:08:29.497 [2024-07-26 05:36:44.152788] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1087241 ] 00:08:29.497 [2024-07-26 05:36:44.263088] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:29.497 [2024-07-26 05:36:44.360435] app.c: 603:app_setup_trace: *NOTICE*: Tracepoint Group Mask bdev specified. 00:08:29.497 [2024-07-26 05:36:44.360489] app.c: 604:app_setup_trace: *NOTICE*: Use 'spdk_trace -s spdk_tgt -p 1087241' to capture a snapshot of events at runtime. 
00:08:29.497 [2024-07-26 05:36:44.360503] app.c: 609:app_setup_trace: *NOTICE*: 'spdk_trace' without parameters will also work if this is the only 00:08:29.497 [2024-07-26 05:36:44.360516] app.c: 610:app_setup_trace: *NOTICE*: SPDK application currently running. 00:08:29.497 [2024-07-26 05:36:44.360526] app.c: 611:app_setup_trace: *NOTICE*: Or copy /dev/shm/spdk_tgt_trace.pid1087241 for offline analysis/debug. 00:08:29.497 [2024-07-26 05:36:44.360559] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:30.432 05:36:45 rpc -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:08:30.432 05:36:45 rpc -- common/autotest_common.sh@862 -- # return 0 00:08:30.432 05:36:45 rpc -- rpc/rpc.sh@69 -- # export PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc 00:08:30.432 05:36:45 rpc -- rpc/rpc.sh@69 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc 00:08:30.432 05:36:45 rpc -- rpc/rpc.sh@72 -- # rpc=rpc_cmd 00:08:30.432 05:36:45 rpc -- rpc/rpc.sh@73 -- # run_test rpc_integrity rpc_integrity 00:08:30.432 05:36:45 rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:08:30.432 05:36:45 rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:30.432 05:36:45 rpc -- common/autotest_common.sh@10 -- # set +x 00:08:30.432 ************************************ 00:08:30.432 START TEST rpc_integrity 00:08:30.432 ************************************ 00:08:30.432 05:36:45 rpc.rpc_integrity -- common/autotest_common.sh@1123 -- # rpc_integrity 
00:08:30.432 05:36:45 rpc.rpc_integrity -- rpc/rpc.sh@12 -- # rpc_cmd bdev_get_bdevs 00:08:30.432 05:36:45 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:30.432 05:36:45 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:08:30.432 05:36:45 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:30.432 05:36:45 rpc.rpc_integrity -- rpc/rpc.sh@12 -- # bdevs='[]' 00:08:30.432 05:36:45 rpc.rpc_integrity -- rpc/rpc.sh@13 -- # jq length 00:08:30.432 05:36:45 rpc.rpc_integrity -- rpc/rpc.sh@13 -- # '[' 0 == 0 ']' 00:08:30.432 05:36:45 rpc.rpc_integrity -- rpc/rpc.sh@15 -- # rpc_cmd bdev_malloc_create 8 512 00:08:30.432 05:36:45 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:30.432 05:36:45 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:08:30.432 05:36:45 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:30.432 05:36:45 rpc.rpc_integrity -- rpc/rpc.sh@15 -- # malloc=Malloc0 00:08:30.432 05:36:45 rpc.rpc_integrity -- rpc/rpc.sh@16 -- # rpc_cmd bdev_get_bdevs 00:08:30.432 05:36:45 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:30.432 05:36:45 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:08:30.432 05:36:45 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:30.432 05:36:45 rpc.rpc_integrity -- rpc/rpc.sh@16 -- # bdevs='[ 00:08:30.432 { 00:08:30.432 "name": "Malloc0", 00:08:30.432 "aliases": [ 00:08:30.432 "6a85145a-ffad-41b1-9ef9-46a5384fc354" 00:08:30.432 ], 00:08:30.432 "product_name": "Malloc disk", 00:08:30.432 "block_size": 512, 00:08:30.432 "num_blocks": 16384, 00:08:30.432 "uuid": "6a85145a-ffad-41b1-9ef9-46a5384fc354", 00:08:30.432 "assigned_rate_limits": { 00:08:30.432 "rw_ios_per_sec": 0, 00:08:30.432 "rw_mbytes_per_sec": 0, 00:08:30.432 "r_mbytes_per_sec": 0, 00:08:30.432 "w_mbytes_per_sec": 0 00:08:30.432 }, 00:08:30.432 "claimed": false, 00:08:30.432 
"zoned": false, 00:08:30.432 "supported_io_types": { 00:08:30.432 "read": true, 00:08:30.432 "write": true, 00:08:30.432 "unmap": true, 00:08:30.432 "flush": true, 00:08:30.432 "reset": true, 00:08:30.432 "nvme_admin": false, 00:08:30.432 "nvme_io": false, 00:08:30.432 "nvme_io_md": false, 00:08:30.432 "write_zeroes": true, 00:08:30.432 "zcopy": true, 00:08:30.432 "get_zone_info": false, 00:08:30.432 "zone_management": false, 00:08:30.432 "zone_append": false, 00:08:30.432 "compare": false, 00:08:30.432 "compare_and_write": false, 00:08:30.432 "abort": true, 00:08:30.432 "seek_hole": false, 00:08:30.432 "seek_data": false, 00:08:30.432 "copy": true, 00:08:30.432 "nvme_iov_md": false 00:08:30.432 }, 00:08:30.432 "memory_domains": [ 00:08:30.432 { 00:08:30.432 "dma_device_id": "system", 00:08:30.433 "dma_device_type": 1 00:08:30.433 }, 00:08:30.433 { 00:08:30.433 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:08:30.433 "dma_device_type": 2 00:08:30.433 } 00:08:30.433 ], 00:08:30.433 "driver_specific": {} 00:08:30.433 } 00:08:30.433 ]' 00:08:30.433 05:36:45 rpc.rpc_integrity -- rpc/rpc.sh@17 -- # jq length 00:08:30.433 05:36:45 rpc.rpc_integrity -- rpc/rpc.sh@17 -- # '[' 1 == 1 ']' 00:08:30.433 05:36:45 rpc.rpc_integrity -- rpc/rpc.sh@19 -- # rpc_cmd bdev_passthru_create -b Malloc0 -p Passthru0 00:08:30.433 05:36:45 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:30.433 05:36:45 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:08:30.433 [2024-07-26 05:36:45.267899] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc0 00:08:30.433 [2024-07-26 05:36:45.267943] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:08:30.433 [2024-07-26 05:36:45.267963] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2781eb0 00:08:30.433 [2024-07-26 05:36:45.267976] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:08:30.433 [2024-07-26 
05:36:45.269469] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:08:30.433 [2024-07-26 05:36:45.269500] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: Passthru0 00:08:30.433 Passthru0 00:08:30.433 05:36:45 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:30.433 05:36:45 rpc.rpc_integrity -- rpc/rpc.sh@20 -- # rpc_cmd bdev_get_bdevs 00:08:30.433 05:36:45 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:30.433 05:36:45 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:08:30.433 05:36:45 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:30.433 05:36:45 rpc.rpc_integrity -- rpc/rpc.sh@20 -- # bdevs='[ 00:08:30.433 { 00:08:30.433 "name": "Malloc0", 00:08:30.433 "aliases": [ 00:08:30.433 "6a85145a-ffad-41b1-9ef9-46a5384fc354" 00:08:30.433 ], 00:08:30.433 "product_name": "Malloc disk", 00:08:30.433 "block_size": 512, 00:08:30.433 "num_blocks": 16384, 00:08:30.433 "uuid": "6a85145a-ffad-41b1-9ef9-46a5384fc354", 00:08:30.433 "assigned_rate_limits": { 00:08:30.433 "rw_ios_per_sec": 0, 00:08:30.433 "rw_mbytes_per_sec": 0, 00:08:30.433 "r_mbytes_per_sec": 0, 00:08:30.433 "w_mbytes_per_sec": 0 00:08:30.433 }, 00:08:30.433 "claimed": true, 00:08:30.433 "claim_type": "exclusive_write", 00:08:30.433 "zoned": false, 00:08:30.433 "supported_io_types": { 00:08:30.433 "read": true, 00:08:30.433 "write": true, 00:08:30.433 "unmap": true, 00:08:30.433 "flush": true, 00:08:30.433 "reset": true, 00:08:30.433 "nvme_admin": false, 00:08:30.433 "nvme_io": false, 00:08:30.433 "nvme_io_md": false, 00:08:30.433 "write_zeroes": true, 00:08:30.433 "zcopy": true, 00:08:30.433 "get_zone_info": false, 00:08:30.433 "zone_management": false, 00:08:30.433 "zone_append": false, 00:08:30.433 "compare": false, 00:08:30.433 "compare_and_write": false, 00:08:30.433 "abort": true, 00:08:30.433 "seek_hole": false, 00:08:30.433 "seek_data": false, 
00:08:30.433 "copy": true, 00:08:30.433 "nvme_iov_md": false 00:08:30.433 }, 00:08:30.433 "memory_domains": [ 00:08:30.433 { 00:08:30.433 "dma_device_id": "system", 00:08:30.433 "dma_device_type": 1 00:08:30.433 }, 00:08:30.433 { 00:08:30.433 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:08:30.433 "dma_device_type": 2 00:08:30.433 } 00:08:30.433 ], 00:08:30.433 "driver_specific": {} 00:08:30.433 }, 00:08:30.433 { 00:08:30.433 "name": "Passthru0", 00:08:30.433 "aliases": [ 00:08:30.433 "26ee8dd9-b026-572f-9cea-274da1da7ef9" 00:08:30.433 ], 00:08:30.433 "product_name": "passthru", 00:08:30.433 "block_size": 512, 00:08:30.433 "num_blocks": 16384, 00:08:30.433 "uuid": "26ee8dd9-b026-572f-9cea-274da1da7ef9", 00:08:30.433 "assigned_rate_limits": { 00:08:30.433 "rw_ios_per_sec": 0, 00:08:30.433 "rw_mbytes_per_sec": 0, 00:08:30.433 "r_mbytes_per_sec": 0, 00:08:30.433 "w_mbytes_per_sec": 0 00:08:30.433 }, 00:08:30.433 "claimed": false, 00:08:30.433 "zoned": false, 00:08:30.433 "supported_io_types": { 00:08:30.433 "read": true, 00:08:30.433 "write": true, 00:08:30.433 "unmap": true, 00:08:30.433 "flush": true, 00:08:30.433 "reset": true, 00:08:30.433 "nvme_admin": false, 00:08:30.433 "nvme_io": false, 00:08:30.433 "nvme_io_md": false, 00:08:30.433 "write_zeroes": true, 00:08:30.433 "zcopy": true, 00:08:30.433 "get_zone_info": false, 00:08:30.433 "zone_management": false, 00:08:30.433 "zone_append": false, 00:08:30.433 "compare": false, 00:08:30.433 "compare_and_write": false, 00:08:30.433 "abort": true, 00:08:30.433 "seek_hole": false, 00:08:30.433 "seek_data": false, 00:08:30.433 "copy": true, 00:08:30.433 "nvme_iov_md": false 00:08:30.433 }, 00:08:30.433 "memory_domains": [ 00:08:30.433 { 00:08:30.433 "dma_device_id": "system", 00:08:30.433 "dma_device_type": 1 00:08:30.433 }, 00:08:30.433 { 00:08:30.433 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:08:30.433 "dma_device_type": 2 00:08:30.433 } 00:08:30.433 ], 00:08:30.433 "driver_specific": { 00:08:30.433 "passthru": { 
00:08:30.433 "name": "Passthru0", 00:08:30.433 "base_bdev_name": "Malloc0" 00:08:30.433 } 00:08:30.433 } 00:08:30.433 } 00:08:30.433 ]' 00:08:30.433 05:36:45 rpc.rpc_integrity -- rpc/rpc.sh@21 -- # jq length 00:08:30.692 05:36:45 rpc.rpc_integrity -- rpc/rpc.sh@21 -- # '[' 2 == 2 ']' 00:08:30.692 05:36:45 rpc.rpc_integrity -- rpc/rpc.sh@23 -- # rpc_cmd bdev_passthru_delete Passthru0 00:08:30.692 05:36:45 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:30.692 05:36:45 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:08:30.692 05:36:45 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:30.692 05:36:45 rpc.rpc_integrity -- rpc/rpc.sh@24 -- # rpc_cmd bdev_malloc_delete Malloc0 00:08:30.692 05:36:45 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:30.692 05:36:45 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:08:30.692 05:36:45 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:30.692 05:36:45 rpc.rpc_integrity -- rpc/rpc.sh@25 -- # rpc_cmd bdev_get_bdevs 00:08:30.692 05:36:45 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:30.692 05:36:45 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:08:30.692 05:36:45 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:30.692 05:36:45 rpc.rpc_integrity -- rpc/rpc.sh@25 -- # bdevs='[]' 00:08:30.692 05:36:45 rpc.rpc_integrity -- rpc/rpc.sh@26 -- # jq length 00:08:30.692 05:36:45 rpc.rpc_integrity -- rpc/rpc.sh@26 -- # '[' 0 == 0 ']' 00:08:30.692 00:08:30.692 real 0m0.302s 00:08:30.692 user 0m0.194s 00:08:30.692 sys 0m0.050s 00:08:30.692 05:36:45 rpc.rpc_integrity -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:30.692 05:36:45 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:08:30.692 ************************************ 00:08:30.692 END TEST rpc_integrity 00:08:30.692 
************************************ 00:08:30.692 05:36:45 rpc -- common/autotest_common.sh@1142 -- # return 0 00:08:30.692 05:36:45 rpc -- rpc/rpc.sh@74 -- # run_test rpc_plugins rpc_plugins 00:08:30.693 05:36:45 rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:08:30.693 05:36:45 rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:30.693 05:36:45 rpc -- common/autotest_common.sh@10 -- # set +x 00:08:30.693 ************************************ 00:08:30.693 START TEST rpc_plugins 00:08:30.693 ************************************ 00:08:30.693 05:36:45 rpc.rpc_plugins -- common/autotest_common.sh@1123 -- # rpc_plugins 00:08:30.693 05:36:45 rpc.rpc_plugins -- rpc/rpc.sh@30 -- # rpc_cmd --plugin rpc_plugin create_malloc 00:08:30.693 05:36:45 rpc.rpc_plugins -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:30.693 05:36:45 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:08:30.693 05:36:45 rpc.rpc_plugins -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:30.693 05:36:45 rpc.rpc_plugins -- rpc/rpc.sh@30 -- # malloc=Malloc1 00:08:30.693 05:36:45 rpc.rpc_plugins -- rpc/rpc.sh@31 -- # rpc_cmd bdev_get_bdevs 00:08:30.693 05:36:45 rpc.rpc_plugins -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:30.693 05:36:45 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:08:30.693 05:36:45 rpc.rpc_plugins -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:30.693 05:36:45 rpc.rpc_plugins -- rpc/rpc.sh@31 -- # bdevs='[ 00:08:30.693 { 00:08:30.693 "name": "Malloc1", 00:08:30.693 "aliases": [ 00:08:30.693 "ca359e21-7ec4-41e7-bfda-524816a5dc71" 00:08:30.693 ], 00:08:30.693 "product_name": "Malloc disk", 00:08:30.693 "block_size": 4096, 00:08:30.693 "num_blocks": 256, 00:08:30.693 "uuid": "ca359e21-7ec4-41e7-bfda-524816a5dc71", 00:08:30.693 "assigned_rate_limits": { 00:08:30.693 "rw_ios_per_sec": 0, 00:08:30.693 "rw_mbytes_per_sec": 0, 00:08:30.693 "r_mbytes_per_sec": 0, 00:08:30.693 "w_mbytes_per_sec": 0 
00:08:30.693 }, 00:08:30.693 "claimed": false, 00:08:30.693 "zoned": false, 00:08:30.693 "supported_io_types": { 00:08:30.693 "read": true, 00:08:30.693 "write": true, 00:08:30.693 "unmap": true, 00:08:30.693 "flush": true, 00:08:30.693 "reset": true, 00:08:30.693 "nvme_admin": false, 00:08:30.693 "nvme_io": false, 00:08:30.693 "nvme_io_md": false, 00:08:30.693 "write_zeroes": true, 00:08:30.693 "zcopy": true, 00:08:30.693 "get_zone_info": false, 00:08:30.693 "zone_management": false, 00:08:30.693 "zone_append": false, 00:08:30.693 "compare": false, 00:08:30.693 "compare_and_write": false, 00:08:30.693 "abort": true, 00:08:30.693 "seek_hole": false, 00:08:30.693 "seek_data": false, 00:08:30.693 "copy": true, 00:08:30.693 "nvme_iov_md": false 00:08:30.693 }, 00:08:30.693 "memory_domains": [ 00:08:30.693 { 00:08:30.693 "dma_device_id": "system", 00:08:30.693 "dma_device_type": 1 00:08:30.693 }, 00:08:30.693 { 00:08:30.693 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:08:30.693 "dma_device_type": 2 00:08:30.693 } 00:08:30.693 ], 00:08:30.693 "driver_specific": {} 00:08:30.693 } 00:08:30.693 ]' 00:08:30.693 05:36:45 rpc.rpc_plugins -- rpc/rpc.sh@32 -- # jq length 00:08:30.693 05:36:45 rpc.rpc_plugins -- rpc/rpc.sh@32 -- # '[' 1 == 1 ']' 00:08:30.693 05:36:45 rpc.rpc_plugins -- rpc/rpc.sh@34 -- # rpc_cmd --plugin rpc_plugin delete_malloc Malloc1 00:08:30.693 05:36:45 rpc.rpc_plugins -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:30.693 05:36:45 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:08:30.951 05:36:45 rpc.rpc_plugins -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:30.951 05:36:45 rpc.rpc_plugins -- rpc/rpc.sh@35 -- # rpc_cmd bdev_get_bdevs 00:08:30.951 05:36:45 rpc.rpc_plugins -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:30.951 05:36:45 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:08:30.951 05:36:45 rpc.rpc_plugins -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:30.951 05:36:45 
rpc.rpc_plugins -- rpc/rpc.sh@35 -- # bdevs='[]' 00:08:30.951 05:36:45 rpc.rpc_plugins -- rpc/rpc.sh@36 -- # jq length 00:08:30.951 05:36:45 rpc.rpc_plugins -- rpc/rpc.sh@36 -- # '[' 0 == 0 ']' 00:08:30.951 00:08:30.951 real 0m0.150s 00:08:30.951 user 0m0.092s 00:08:30.951 sys 0m0.030s 00:08:30.951 05:36:45 rpc.rpc_plugins -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:30.951 05:36:45 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:08:30.951 ************************************ 00:08:30.951 END TEST rpc_plugins 00:08:30.951 ************************************ 00:08:30.951 05:36:45 rpc -- common/autotest_common.sh@1142 -- # return 0 00:08:30.951 05:36:45 rpc -- rpc/rpc.sh@75 -- # run_test rpc_trace_cmd_test rpc_trace_cmd_test 00:08:30.952 05:36:45 rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:08:30.952 05:36:45 rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:30.952 05:36:45 rpc -- common/autotest_common.sh@10 -- # set +x 00:08:30.952 ************************************ 00:08:30.952 START TEST rpc_trace_cmd_test 00:08:30.952 ************************************ 00:08:30.952 05:36:45 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@1123 -- # rpc_trace_cmd_test 00:08:30.952 05:36:45 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@40 -- # local info 00:08:30.952 05:36:45 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@42 -- # rpc_cmd trace_get_info 00:08:30.952 05:36:45 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:30.952 05:36:45 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@10 -- # set +x 00:08:30.952 05:36:45 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:30.952 05:36:45 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@42 -- # info='{ 00:08:30.952 "tpoint_shm_path": "/dev/shm/spdk_tgt_trace.pid1087241", 00:08:30.952 "tpoint_group_mask": "0x8", 00:08:30.952 "iscsi_conn": { 00:08:30.952 "mask": "0x2", 00:08:30.952 "tpoint_mask": "0x0" 00:08:30.952 }, 00:08:30.952 
"scsi": { 00:08:30.952 "mask": "0x4", 00:08:30.952 "tpoint_mask": "0x0" 00:08:30.952 }, 00:08:30.952 "bdev": { 00:08:30.952 "mask": "0x8", 00:08:30.952 "tpoint_mask": "0xffffffffffffffff" 00:08:30.952 }, 00:08:30.952 "nvmf_rdma": { 00:08:30.952 "mask": "0x10", 00:08:30.952 "tpoint_mask": "0x0" 00:08:30.952 }, 00:08:30.952 "nvmf_tcp": { 00:08:30.952 "mask": "0x20", 00:08:30.952 "tpoint_mask": "0x0" 00:08:30.952 }, 00:08:30.952 "ftl": { 00:08:30.952 "mask": "0x40", 00:08:30.952 "tpoint_mask": "0x0" 00:08:30.952 }, 00:08:30.952 "blobfs": { 00:08:30.952 "mask": "0x80", 00:08:30.952 "tpoint_mask": "0x0" 00:08:30.952 }, 00:08:30.952 "dsa": { 00:08:30.952 "mask": "0x200", 00:08:30.952 "tpoint_mask": "0x0" 00:08:30.952 }, 00:08:30.952 "thread": { 00:08:30.952 "mask": "0x400", 00:08:30.952 "tpoint_mask": "0x0" 00:08:30.952 }, 00:08:30.952 "nvme_pcie": { 00:08:30.952 "mask": "0x800", 00:08:30.952 "tpoint_mask": "0x0" 00:08:30.952 }, 00:08:30.952 "iaa": { 00:08:30.952 "mask": "0x1000", 00:08:30.952 "tpoint_mask": "0x0" 00:08:30.952 }, 00:08:30.952 "nvme_tcp": { 00:08:30.952 "mask": "0x2000", 00:08:30.952 "tpoint_mask": "0x0" 00:08:30.952 }, 00:08:30.952 "bdev_nvme": { 00:08:30.952 "mask": "0x4000", 00:08:30.952 "tpoint_mask": "0x0" 00:08:30.952 }, 00:08:30.952 "sock": { 00:08:30.952 "mask": "0x8000", 00:08:30.952 "tpoint_mask": "0x0" 00:08:30.952 } 00:08:30.952 }' 00:08:30.952 05:36:45 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@43 -- # jq length 00:08:30.952 05:36:45 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@43 -- # '[' 16 -gt 2 ']' 00:08:30.952 05:36:45 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@44 -- # jq 'has("tpoint_group_mask")' 00:08:30.952 05:36:45 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@44 -- # '[' true = true ']' 00:08:30.952 05:36:45 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@45 -- # jq 'has("tpoint_shm_path")' 00:08:31.210 05:36:45 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@45 -- # '[' true = true ']' 00:08:31.210 05:36:45 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@46 -- # jq 'has("bdev")' 00:08:31.210 
05:36:45 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@46 -- # '[' true = true ']' 00:08:31.210 05:36:45 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@47 -- # jq -r .bdev.tpoint_mask 00:08:31.210 05:36:45 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@47 -- # '[' 0xffffffffffffffff '!=' 0x0 ']' 00:08:31.210 00:08:31.210 real 0m0.244s 00:08:31.210 user 0m0.196s 00:08:31.210 sys 0m0.039s 00:08:31.210 05:36:45 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:31.210 05:36:45 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@10 -- # set +x 00:08:31.210 ************************************ 00:08:31.210 END TEST rpc_trace_cmd_test 00:08:31.210 ************************************ 00:08:31.210 05:36:46 rpc -- common/autotest_common.sh@1142 -- # return 0 00:08:31.210 05:36:46 rpc -- rpc/rpc.sh@76 -- # [[ 0 -eq 1 ]] 00:08:31.210 05:36:46 rpc -- rpc/rpc.sh@80 -- # rpc=rpc_cmd 00:08:31.210 05:36:46 rpc -- rpc/rpc.sh@81 -- # run_test rpc_daemon_integrity rpc_integrity 00:08:31.210 05:36:46 rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:08:31.210 05:36:46 rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:31.210 05:36:46 rpc -- common/autotest_common.sh@10 -- # set +x 00:08:31.210 ************************************ 00:08:31.210 START TEST rpc_daemon_integrity 00:08:31.210 ************************************ 00:08:31.210 05:36:46 rpc.rpc_daemon_integrity -- common/autotest_common.sh@1123 -- # rpc_integrity 00:08:31.210 05:36:46 rpc.rpc_daemon_integrity -- rpc/rpc.sh@12 -- # rpc_cmd bdev_get_bdevs 00:08:31.210 05:36:46 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:31.210 05:36:46 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:08:31.210 05:36:46 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:31.210 05:36:46 rpc.rpc_daemon_integrity -- rpc/rpc.sh@12 -- # bdevs='[]' 00:08:31.210 05:36:46 rpc.rpc_daemon_integrity -- rpc/rpc.sh@13 -- # jq length 
00:08:31.469 05:36:46 rpc.rpc_daemon_integrity -- rpc/rpc.sh@13 -- # '[' 0 == 0 ']' 00:08:31.469 05:36:46 rpc.rpc_daemon_integrity -- rpc/rpc.sh@15 -- # rpc_cmd bdev_malloc_create 8 512 00:08:31.469 05:36:46 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:31.469 05:36:46 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:08:31.469 05:36:46 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:31.469 05:36:46 rpc.rpc_daemon_integrity -- rpc/rpc.sh@15 -- # malloc=Malloc2 00:08:31.469 05:36:46 rpc.rpc_daemon_integrity -- rpc/rpc.sh@16 -- # rpc_cmd bdev_get_bdevs 00:08:31.469 05:36:46 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:31.469 05:36:46 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:08:31.469 05:36:46 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:31.469 05:36:46 rpc.rpc_daemon_integrity -- rpc/rpc.sh@16 -- # bdevs='[ 00:08:31.469 { 00:08:31.469 "name": "Malloc2", 00:08:31.469 "aliases": [ 00:08:31.469 "6f74066c-4cd6-49e7-b314-50f2d9cb28f9" 00:08:31.469 ], 00:08:31.469 "product_name": "Malloc disk", 00:08:31.469 "block_size": 512, 00:08:31.469 "num_blocks": 16384, 00:08:31.469 "uuid": "6f74066c-4cd6-49e7-b314-50f2d9cb28f9", 00:08:31.469 "assigned_rate_limits": { 00:08:31.469 "rw_ios_per_sec": 0, 00:08:31.469 "rw_mbytes_per_sec": 0, 00:08:31.469 "r_mbytes_per_sec": 0, 00:08:31.469 "w_mbytes_per_sec": 0 00:08:31.469 }, 00:08:31.469 "claimed": false, 00:08:31.469 "zoned": false, 00:08:31.469 "supported_io_types": { 00:08:31.469 "read": true, 00:08:31.469 "write": true, 00:08:31.469 "unmap": true, 00:08:31.469 "flush": true, 00:08:31.469 "reset": true, 00:08:31.469 "nvme_admin": false, 00:08:31.469 "nvme_io": false, 00:08:31.469 "nvme_io_md": false, 00:08:31.469 "write_zeroes": true, 00:08:31.469 "zcopy": true, 00:08:31.469 "get_zone_info": false, 00:08:31.469 "zone_management": 
false, 00:08:31.469 "zone_append": false, 00:08:31.469 "compare": false, 00:08:31.469 "compare_and_write": false, 00:08:31.469 "abort": true, 00:08:31.469 "seek_hole": false, 00:08:31.469 "seek_data": false, 00:08:31.469 "copy": true, 00:08:31.469 "nvme_iov_md": false 00:08:31.469 }, 00:08:31.469 "memory_domains": [ 00:08:31.469 { 00:08:31.469 "dma_device_id": "system", 00:08:31.469 "dma_device_type": 1 00:08:31.469 }, 00:08:31.469 { 00:08:31.469 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:08:31.469 "dma_device_type": 2 00:08:31.469 } 00:08:31.469 ], 00:08:31.469 "driver_specific": {} 00:08:31.469 } 00:08:31.469 ]' 00:08:31.469 05:36:46 rpc.rpc_daemon_integrity -- rpc/rpc.sh@17 -- # jq length 00:08:31.469 05:36:46 rpc.rpc_daemon_integrity -- rpc/rpc.sh@17 -- # '[' 1 == 1 ']' 00:08:31.469 05:36:46 rpc.rpc_daemon_integrity -- rpc/rpc.sh@19 -- # rpc_cmd bdev_passthru_create -b Malloc2 -p Passthru0 00:08:31.469 05:36:46 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:31.469 05:36:46 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:08:31.469 [2024-07-26 05:36:46.210576] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc2 00:08:31.469 [2024-07-26 05:36:46.210615] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:08:31.469 [2024-07-26 05:36:46.210634] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2782b20 00:08:31.469 [2024-07-26 05:36:46.210653] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:08:31.469 [2024-07-26 05:36:46.212010] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:08:31.469 [2024-07-26 05:36:46.212038] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: Passthru0 00:08:31.469 Passthru0 00:08:31.469 05:36:46 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:31.469 05:36:46 rpc.rpc_daemon_integrity -- 
rpc/rpc.sh@20 -- # rpc_cmd bdev_get_bdevs 00:08:31.469 05:36:46 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:31.469 05:36:46 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:08:31.469 05:36:46 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:31.469 05:36:46 rpc.rpc_daemon_integrity -- rpc/rpc.sh@20 -- # bdevs='[ 00:08:31.469 { 00:08:31.469 "name": "Malloc2", 00:08:31.469 "aliases": [ 00:08:31.469 "6f74066c-4cd6-49e7-b314-50f2d9cb28f9" 00:08:31.469 ], 00:08:31.469 "product_name": "Malloc disk", 00:08:31.469 "block_size": 512, 00:08:31.469 "num_blocks": 16384, 00:08:31.469 "uuid": "6f74066c-4cd6-49e7-b314-50f2d9cb28f9", 00:08:31.469 "assigned_rate_limits": { 00:08:31.469 "rw_ios_per_sec": 0, 00:08:31.469 "rw_mbytes_per_sec": 0, 00:08:31.469 "r_mbytes_per_sec": 0, 00:08:31.469 "w_mbytes_per_sec": 0 00:08:31.469 }, 00:08:31.469 "claimed": true, 00:08:31.469 "claim_type": "exclusive_write", 00:08:31.469 "zoned": false, 00:08:31.469 "supported_io_types": { 00:08:31.469 "read": true, 00:08:31.469 "write": true, 00:08:31.469 "unmap": true, 00:08:31.469 "flush": true, 00:08:31.469 "reset": true, 00:08:31.469 "nvme_admin": false, 00:08:31.469 "nvme_io": false, 00:08:31.469 "nvme_io_md": false, 00:08:31.469 "write_zeroes": true, 00:08:31.469 "zcopy": true, 00:08:31.469 "get_zone_info": false, 00:08:31.469 "zone_management": false, 00:08:31.469 "zone_append": false, 00:08:31.469 "compare": false, 00:08:31.469 "compare_and_write": false, 00:08:31.469 "abort": true, 00:08:31.469 "seek_hole": false, 00:08:31.469 "seek_data": false, 00:08:31.469 "copy": true, 00:08:31.469 "nvme_iov_md": false 00:08:31.469 }, 00:08:31.469 "memory_domains": [ 00:08:31.469 { 00:08:31.469 "dma_device_id": "system", 00:08:31.469 "dma_device_type": 1 00:08:31.469 }, 00:08:31.469 { 00:08:31.469 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:08:31.470 "dma_device_type": 2 00:08:31.470 } 00:08:31.470 ], 
00:08:31.470 "driver_specific": {} 00:08:31.470 }, 00:08:31.470 { 00:08:31.470 "name": "Passthru0", 00:08:31.470 "aliases": [ 00:08:31.470 "fee09101-991c-5eb7-af81-8a3a01986b7b" 00:08:31.470 ], 00:08:31.470 "product_name": "passthru", 00:08:31.470 "block_size": 512, 00:08:31.470 "num_blocks": 16384, 00:08:31.470 "uuid": "fee09101-991c-5eb7-af81-8a3a01986b7b", 00:08:31.470 "assigned_rate_limits": { 00:08:31.470 "rw_ios_per_sec": 0, 00:08:31.470 "rw_mbytes_per_sec": 0, 00:08:31.470 "r_mbytes_per_sec": 0, 00:08:31.470 "w_mbytes_per_sec": 0 00:08:31.470 }, 00:08:31.470 "claimed": false, 00:08:31.470 "zoned": false, 00:08:31.470 "supported_io_types": { 00:08:31.470 "read": true, 00:08:31.470 "write": true, 00:08:31.470 "unmap": true, 00:08:31.470 "flush": true, 00:08:31.470 "reset": true, 00:08:31.470 "nvme_admin": false, 00:08:31.470 "nvme_io": false, 00:08:31.470 "nvme_io_md": false, 00:08:31.470 "write_zeroes": true, 00:08:31.470 "zcopy": true, 00:08:31.470 "get_zone_info": false, 00:08:31.470 "zone_management": false, 00:08:31.470 "zone_append": false, 00:08:31.470 "compare": false, 00:08:31.470 "compare_and_write": false, 00:08:31.470 "abort": true, 00:08:31.470 "seek_hole": false, 00:08:31.470 "seek_data": false, 00:08:31.470 "copy": true, 00:08:31.470 "nvme_iov_md": false 00:08:31.470 }, 00:08:31.470 "memory_domains": [ 00:08:31.470 { 00:08:31.470 "dma_device_id": "system", 00:08:31.470 "dma_device_type": 1 00:08:31.470 }, 00:08:31.470 { 00:08:31.470 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:08:31.470 "dma_device_type": 2 00:08:31.470 } 00:08:31.470 ], 00:08:31.470 "driver_specific": { 00:08:31.470 "passthru": { 00:08:31.470 "name": "Passthru0", 00:08:31.470 "base_bdev_name": "Malloc2" 00:08:31.470 } 00:08:31.470 } 00:08:31.470 } 00:08:31.470 ]' 00:08:31.470 05:36:46 rpc.rpc_daemon_integrity -- rpc/rpc.sh@21 -- # jq length 00:08:31.470 05:36:46 rpc.rpc_daemon_integrity -- rpc/rpc.sh@21 -- # '[' 2 == 2 ']' 00:08:31.470 05:36:46 rpc.rpc_daemon_integrity -- 
rpc/rpc.sh@23 -- # rpc_cmd bdev_passthru_delete Passthru0 00:08:31.470 05:36:46 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:31.470 05:36:46 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:08:31.470 05:36:46 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:31.470 05:36:46 rpc.rpc_daemon_integrity -- rpc/rpc.sh@24 -- # rpc_cmd bdev_malloc_delete Malloc2 00:08:31.470 05:36:46 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:31.470 05:36:46 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:08:31.470 05:36:46 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:31.470 05:36:46 rpc.rpc_daemon_integrity -- rpc/rpc.sh@25 -- # rpc_cmd bdev_get_bdevs 00:08:31.470 05:36:46 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:31.470 05:36:46 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:08:31.470 05:36:46 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:31.470 05:36:46 rpc.rpc_daemon_integrity -- rpc/rpc.sh@25 -- # bdevs='[]' 00:08:31.470 05:36:46 rpc.rpc_daemon_integrity -- rpc/rpc.sh@26 -- # jq length 00:08:31.470 05:36:46 rpc.rpc_daemon_integrity -- rpc/rpc.sh@26 -- # '[' 0 == 0 ']' 00:08:31.470 00:08:31.470 real 0m0.288s 00:08:31.470 user 0m0.189s 00:08:31.470 sys 0m0.051s 00:08:31.470 05:36:46 rpc.rpc_daemon_integrity -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:31.470 05:36:46 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:08:31.470 ************************************ 00:08:31.470 END TEST rpc_daemon_integrity 00:08:31.470 ************************************ 00:08:31.729 05:36:46 rpc -- common/autotest_common.sh@1142 -- # return 0 00:08:31.729 05:36:46 rpc -- rpc/rpc.sh@83 -- # trap - SIGINT SIGTERM EXIT 00:08:31.729 05:36:46 rpc -- rpc/rpc.sh@84 -- # 
killprocess 1087241 00:08:31.729 05:36:46 rpc -- common/autotest_common.sh@948 -- # '[' -z 1087241 ']' 00:08:31.729 05:36:46 rpc -- common/autotest_common.sh@952 -- # kill -0 1087241 00:08:31.729 05:36:46 rpc -- common/autotest_common.sh@953 -- # uname 00:08:31.729 05:36:46 rpc -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:08:31.729 05:36:46 rpc -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1087241 00:08:31.729 05:36:46 rpc -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:08:31.729 05:36:46 rpc -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:08:31.729 05:36:46 rpc -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1087241' 00:08:31.729 killing process with pid 1087241 00:08:31.729 05:36:46 rpc -- common/autotest_common.sh@967 -- # kill 1087241 00:08:31.729 05:36:46 rpc -- common/autotest_common.sh@972 -- # wait 1087241 00:08:31.987 00:08:31.987 real 0m2.860s 00:08:31.987 user 0m3.639s 00:08:31.987 sys 0m0.936s 00:08:31.987 05:36:46 rpc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:31.987 05:36:46 rpc -- common/autotest_common.sh@10 -- # set +x 00:08:31.987 ************************************ 00:08:31.987 END TEST rpc 00:08:31.987 ************************************ 00:08:31.987 05:36:46 -- common/autotest_common.sh@1142 -- # return 0 00:08:31.987 05:36:46 -- spdk/autotest.sh@170 -- # run_test skip_rpc /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/skip_rpc.sh 00:08:31.987 05:36:46 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:08:31.987 05:36:46 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:31.987 05:36:46 -- common/autotest_common.sh@10 -- # set +x 00:08:32.246 ************************************ 00:08:32.246 START TEST skip_rpc 00:08:32.246 ************************************ 00:08:32.246 05:36:46 skip_rpc -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/skip_rpc.sh 00:08:32.246 * 
Looking for test storage... 00:08:32.246 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc 00:08:32.246 05:36:47 skip_rpc -- rpc/skip_rpc.sh@11 -- # CONFIG_PATH=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/config.json 00:08:32.246 05:36:47 skip_rpc -- rpc/skip_rpc.sh@12 -- # LOG_PATH=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/log.txt 00:08:32.246 05:36:47 skip_rpc -- rpc/skip_rpc.sh@73 -- # run_test skip_rpc test_skip_rpc 00:08:32.246 05:36:47 skip_rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:08:32.246 05:36:47 skip_rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:32.246 05:36:47 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:08:32.246 ************************************ 00:08:32.246 START TEST skip_rpc 00:08:32.246 ************************************ 00:08:32.246 05:36:47 skip_rpc.skip_rpc -- common/autotest_common.sh@1123 -- # test_skip_rpc 00:08:32.246 05:36:47 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@16 -- # local spdk_pid=1087771 00:08:32.246 05:36:47 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@18 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:08:32.246 05:36:47 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@19 -- # sleep 5 00:08:32.246 05:36:47 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 00:08:32.505 [2024-07-26 05:36:47.153793] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 
00:08:32.505 [2024-07-26 05:36:47.153864] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1087771 ] 00:08:32.505 [2024-07-26 05:36:47.285124] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:32.505 [2024-07-26 05:36:47.382239] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:37.776 05:36:52 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@21 -- # NOT rpc_cmd spdk_get_version 00:08:37.776 05:36:52 skip_rpc.skip_rpc -- common/autotest_common.sh@648 -- # local es=0 00:08:37.776 05:36:52 skip_rpc.skip_rpc -- common/autotest_common.sh@650 -- # valid_exec_arg rpc_cmd spdk_get_version 00:08:37.776 05:36:52 skip_rpc.skip_rpc -- common/autotest_common.sh@636 -- # local arg=rpc_cmd 00:08:37.776 05:36:52 skip_rpc.skip_rpc -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:08:37.776 05:36:52 skip_rpc.skip_rpc -- common/autotest_common.sh@640 -- # type -t rpc_cmd 00:08:37.776 05:36:52 skip_rpc.skip_rpc -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:08:37.776 05:36:52 skip_rpc.skip_rpc -- common/autotest_common.sh@651 -- # rpc_cmd spdk_get_version 00:08:37.776 05:36:52 skip_rpc.skip_rpc -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:37.776 05:36:52 skip_rpc.skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:08:37.776 05:36:52 skip_rpc.skip_rpc -- common/autotest_common.sh@587 -- # [[ 1 == 0 ]] 00:08:37.776 05:36:52 skip_rpc.skip_rpc -- common/autotest_common.sh@651 -- # es=1 00:08:37.776 05:36:52 skip_rpc.skip_rpc -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:08:37.776 05:36:52 skip_rpc.skip_rpc -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:08:37.776 05:36:52 skip_rpc.skip_rpc -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:08:37.776 05:36:52 skip_rpc.skip_rpc -- 
rpc/skip_rpc.sh@22 -- # trap - SIGINT SIGTERM EXIT 00:08:37.776 05:36:52 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@23 -- # killprocess 1087771 00:08:37.776 05:36:52 skip_rpc.skip_rpc -- common/autotest_common.sh@948 -- # '[' -z 1087771 ']' 00:08:37.776 05:36:52 skip_rpc.skip_rpc -- common/autotest_common.sh@952 -- # kill -0 1087771 00:08:37.776 05:36:52 skip_rpc.skip_rpc -- common/autotest_common.sh@953 -- # uname 00:08:37.776 05:36:52 skip_rpc.skip_rpc -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:08:37.776 05:36:52 skip_rpc.skip_rpc -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1087771 00:08:37.776 05:36:52 skip_rpc.skip_rpc -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:08:37.776 05:36:52 skip_rpc.skip_rpc -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:08:37.776 05:36:52 skip_rpc.skip_rpc -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1087771' 00:08:37.776 killing process with pid 1087771 00:08:37.776 05:36:52 skip_rpc.skip_rpc -- common/autotest_common.sh@967 -- # kill 1087771 00:08:37.776 05:36:52 skip_rpc.skip_rpc -- common/autotest_common.sh@972 -- # wait 1087771 00:08:37.776 00:08:37.776 real 0m5.417s 00:08:37.776 user 0m5.078s 00:08:37.776 sys 0m0.357s 00:08:37.776 05:36:52 skip_rpc.skip_rpc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:37.776 05:36:52 skip_rpc.skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:08:37.776 ************************************ 00:08:37.776 END TEST skip_rpc 00:08:37.776 ************************************ 00:08:37.776 05:36:52 skip_rpc -- common/autotest_common.sh@1142 -- # return 0 00:08:37.776 05:36:52 skip_rpc -- rpc/skip_rpc.sh@74 -- # run_test skip_rpc_with_json test_skip_rpc_with_json 00:08:37.776 05:36:52 skip_rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:08:37.776 05:36:52 skip_rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:37.776 05:36:52 skip_rpc -- common/autotest_common.sh@10 
-- # set +x 00:08:37.776 ************************************ 00:08:37.776 START TEST skip_rpc_with_json 00:08:37.776 ************************************ 00:08:37.776 05:36:52 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@1123 -- # test_skip_rpc_with_json 00:08:37.776 05:36:52 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@44 -- # gen_json_config 00:08:37.776 05:36:52 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@28 -- # local spdk_pid=1088501 00:08:37.776 05:36:52 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@30 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:08:37.776 05:36:52 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@31 -- # waitforlisten 1088501 00:08:37.776 05:36:52 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@829 -- # '[' -z 1088501 ']' 00:08:37.776 05:36:52 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:08:37.776 05:36:52 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@834 -- # local max_retries=100 00:08:37.776 05:36:52 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:08:37.776 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:08:37.776 05:36:52 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@838 -- # xtrace_disable 00:08:37.776 05:36:52 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:08:37.776 05:36:52 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@27 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:08:37.776 [2024-07-26 05:36:52.646189] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 
00:08:37.776 [2024-07-26 05:36:52.646256] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1088501 ] 00:08:38.035 [2024-07-26 05:36:52.773976] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:38.035 [2024-07-26 05:36:52.881687] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:38.992 05:36:53 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:08:38.992 05:36:53 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@862 -- # return 0 00:08:38.992 05:36:53 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@34 -- # rpc_cmd nvmf_get_transports --trtype tcp 00:08:38.992 05:36:53 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:38.992 05:36:53 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:08:38.992 [2024-07-26 05:36:53.563997] nvmf_rpc.c:2569:rpc_nvmf_get_transports: *ERROR*: transport 'tcp' does not exist 00:08:38.992 request: 00:08:38.992 { 00:08:38.992 "trtype": "tcp", 00:08:38.992 "method": "nvmf_get_transports", 00:08:38.992 "req_id": 1 00:08:38.992 } 00:08:38.992 Got JSON-RPC error response 00:08:38.992 response: 00:08:38.992 { 00:08:38.992 "code": -19, 00:08:38.992 "message": "No such device" 00:08:38.992 } 00:08:38.992 05:36:53 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@587 -- # [[ 1 == 0 ]] 00:08:38.992 05:36:53 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@34 -- # rpc_cmd nvmf_create_transport -t tcp 00:08:38.992 05:36:53 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:38.992 05:36:53 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:08:38.992 [2024-07-26 05:36:53.572131] tcp.c: 677:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:38.992 05:36:53 
skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:38.992 05:36:53 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@36 -- # rpc_cmd save_config 00:08:38.992 05:36:53 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:38.992 05:36:53 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:08:38.992 05:36:53 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:38.993 05:36:53 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@37 -- # cat /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/config.json 00:08:38.993 { 00:08:38.993 "subsystems": [ 00:08:38.993 { 00:08:38.993 "subsystem": "keyring", 00:08:38.993 "config": [] 00:08:38.993 }, 00:08:38.993 { 00:08:38.993 "subsystem": "iobuf", 00:08:38.993 "config": [ 00:08:38.993 { 00:08:38.993 "method": "iobuf_set_options", 00:08:38.993 "params": { 00:08:38.993 "small_pool_count": 8192, 00:08:38.993 "large_pool_count": 1024, 00:08:38.993 "small_bufsize": 8192, 00:08:38.993 "large_bufsize": 135168 00:08:38.993 } 00:08:38.993 } 00:08:38.993 ] 00:08:38.993 }, 00:08:38.993 { 00:08:38.993 "subsystem": "sock", 00:08:38.993 "config": [ 00:08:38.993 { 00:08:38.993 "method": "sock_set_default_impl", 00:08:38.993 "params": { 00:08:38.993 "impl_name": "posix" 00:08:38.993 } 00:08:38.993 }, 00:08:38.993 { 00:08:38.993 "method": "sock_impl_set_options", 00:08:38.993 "params": { 00:08:38.993 "impl_name": "ssl", 00:08:38.993 "recv_buf_size": 4096, 00:08:38.993 "send_buf_size": 4096, 00:08:38.993 "enable_recv_pipe": true, 00:08:38.993 "enable_quickack": false, 00:08:38.993 "enable_placement_id": 0, 00:08:38.993 "enable_zerocopy_send_server": true, 00:08:38.993 "enable_zerocopy_send_client": false, 00:08:38.993 "zerocopy_threshold": 0, 00:08:38.993 "tls_version": 0, 00:08:38.993 "enable_ktls": false 00:08:38.993 } 00:08:38.993 }, 00:08:38.993 { 00:08:38.993 "method": "sock_impl_set_options", 00:08:38.993 "params": { 
00:08:38.993 "impl_name": "posix", 00:08:38.993 "recv_buf_size": 2097152, 00:08:38.993 "send_buf_size": 2097152, 00:08:38.993 "enable_recv_pipe": true, 00:08:38.993 "enable_quickack": false, 00:08:38.993 "enable_placement_id": 0, 00:08:38.993 "enable_zerocopy_send_server": true, 00:08:38.993 "enable_zerocopy_send_client": false, 00:08:38.993 "zerocopy_threshold": 0, 00:08:38.993 "tls_version": 0, 00:08:38.993 "enable_ktls": false 00:08:38.993 } 00:08:38.993 } 00:08:38.993 ] 00:08:38.993 }, 00:08:38.993 { 00:08:38.993 "subsystem": "vmd", 00:08:38.993 "config": [] 00:08:38.993 }, 00:08:38.993 { 00:08:38.993 "subsystem": "accel", 00:08:38.993 "config": [ 00:08:38.993 { 00:08:38.993 "method": "accel_set_options", 00:08:38.993 "params": { 00:08:38.993 "small_cache_size": 128, 00:08:38.993 "large_cache_size": 16, 00:08:38.993 "task_count": 2048, 00:08:38.993 "sequence_count": 2048, 00:08:38.993 "buf_count": 2048 00:08:38.993 } 00:08:38.993 } 00:08:38.993 ] 00:08:38.993 }, 00:08:38.993 { 00:08:38.993 "subsystem": "bdev", 00:08:38.993 "config": [ 00:08:38.993 { 00:08:38.993 "method": "bdev_set_options", 00:08:38.993 "params": { 00:08:38.993 "bdev_io_pool_size": 65535, 00:08:38.993 "bdev_io_cache_size": 256, 00:08:38.993 "bdev_auto_examine": true, 00:08:38.993 "iobuf_small_cache_size": 128, 00:08:38.993 "iobuf_large_cache_size": 16 00:08:38.993 } 00:08:38.993 }, 00:08:38.993 { 00:08:38.993 "method": "bdev_raid_set_options", 00:08:38.993 "params": { 00:08:38.993 "process_window_size_kb": 1024, 00:08:38.993 "process_max_bandwidth_mb_sec": 0 00:08:38.993 } 00:08:38.993 }, 00:08:38.993 { 00:08:38.993 "method": "bdev_iscsi_set_options", 00:08:38.993 "params": { 00:08:38.993 "timeout_sec": 30 00:08:38.993 } 00:08:38.993 }, 00:08:38.993 { 00:08:38.993 "method": "bdev_nvme_set_options", 00:08:38.993 "params": { 00:08:38.993 "action_on_timeout": "none", 00:08:38.993 "timeout_us": 0, 00:08:38.993 "timeout_admin_us": 0, 00:08:38.993 "keep_alive_timeout_ms": 10000, 00:08:38.993 
"arbitration_burst": 0, 00:08:38.993 "low_priority_weight": 0, 00:08:38.993 "medium_priority_weight": 0, 00:08:38.993 "high_priority_weight": 0, 00:08:38.993 "nvme_adminq_poll_period_us": 10000, 00:08:38.993 "nvme_ioq_poll_period_us": 0, 00:08:38.993 "io_queue_requests": 0, 00:08:38.993 "delay_cmd_submit": true, 00:08:38.993 "transport_retry_count": 4, 00:08:38.993 "bdev_retry_count": 3, 00:08:38.993 "transport_ack_timeout": 0, 00:08:38.993 "ctrlr_loss_timeout_sec": 0, 00:08:38.993 "reconnect_delay_sec": 0, 00:08:38.993 "fast_io_fail_timeout_sec": 0, 00:08:38.993 "disable_auto_failback": false, 00:08:38.993 "generate_uuids": false, 00:08:38.993 "transport_tos": 0, 00:08:38.993 "nvme_error_stat": false, 00:08:38.993 "rdma_srq_size": 0, 00:08:38.993 "io_path_stat": false, 00:08:38.993 "allow_accel_sequence": false, 00:08:38.993 "rdma_max_cq_size": 0, 00:08:38.993 "rdma_cm_event_timeout_ms": 0, 00:08:38.993 "dhchap_digests": [ 00:08:38.993 "sha256", 00:08:38.993 "sha384", 00:08:38.993 "sha512" 00:08:38.993 ], 00:08:38.993 "dhchap_dhgroups": [ 00:08:38.993 "null", 00:08:38.993 "ffdhe2048", 00:08:38.993 "ffdhe3072", 00:08:38.993 "ffdhe4096", 00:08:38.993 "ffdhe6144", 00:08:38.993 "ffdhe8192" 00:08:38.993 ] 00:08:38.993 } 00:08:38.993 }, 00:08:38.993 { 00:08:38.993 "method": "bdev_nvme_set_hotplug", 00:08:38.993 "params": { 00:08:38.993 "period_us": 100000, 00:08:38.993 "enable": false 00:08:38.993 } 00:08:38.993 }, 00:08:38.993 { 00:08:38.993 "method": "bdev_wait_for_examine" 00:08:38.993 } 00:08:38.993 ] 00:08:38.993 }, 00:08:38.993 { 00:08:38.993 "subsystem": "scsi", 00:08:38.993 "config": null 00:08:38.993 }, 00:08:38.993 { 00:08:38.993 "subsystem": "scheduler", 00:08:38.994 "config": [ 00:08:38.994 { 00:08:38.994 "method": "framework_set_scheduler", 00:08:38.994 "params": { 00:08:38.994 "name": "static" 00:08:38.994 } 00:08:38.994 } 00:08:38.994 ] 00:08:38.994 }, 00:08:38.994 { 00:08:38.994 "subsystem": "vhost_scsi", 00:08:38.994 "config": [] 00:08:38.994 }, 
00:08:38.994 { 00:08:38.994 "subsystem": "vhost_blk", 00:08:38.994 "config": [] 00:08:38.994 }, 00:08:38.994 { 00:08:38.994 "subsystem": "ublk", 00:08:38.994 "config": [] 00:08:38.994 }, 00:08:38.994 { 00:08:38.994 "subsystem": "nbd", 00:08:38.994 "config": [] 00:08:38.994 }, 00:08:38.994 { 00:08:38.994 "subsystem": "nvmf", 00:08:38.994 "config": [ 00:08:38.994 { 00:08:38.994 "method": "nvmf_set_config", 00:08:38.994 "params": { 00:08:38.994 "discovery_filter": "match_any", 00:08:38.994 "admin_cmd_passthru": { 00:08:38.994 "identify_ctrlr": false 00:08:38.994 } 00:08:38.994 } 00:08:38.994 }, 00:08:38.994 { 00:08:38.994 "method": "nvmf_set_max_subsystems", 00:08:38.994 "params": { 00:08:38.994 "max_subsystems": 1024 00:08:38.994 } 00:08:38.994 }, 00:08:38.994 { 00:08:38.994 "method": "nvmf_set_crdt", 00:08:38.994 "params": { 00:08:38.994 "crdt1": 0, 00:08:38.994 "crdt2": 0, 00:08:38.994 "crdt3": 0 00:08:38.994 } 00:08:38.994 }, 00:08:38.994 { 00:08:38.994 "method": "nvmf_create_transport", 00:08:38.994 "params": { 00:08:38.994 "trtype": "TCP", 00:08:38.994 "max_queue_depth": 128, 00:08:38.994 "max_io_qpairs_per_ctrlr": 127, 00:08:38.994 "in_capsule_data_size": 4096, 00:08:38.994 "max_io_size": 131072, 00:08:38.994 "io_unit_size": 131072, 00:08:38.994 "max_aq_depth": 128, 00:08:38.994 "num_shared_buffers": 511, 00:08:38.994 "buf_cache_size": 4294967295, 00:08:38.994 "dif_insert_or_strip": false, 00:08:38.994 "zcopy": false, 00:08:38.994 "c2h_success": true, 00:08:38.994 "sock_priority": 0, 00:08:38.994 "abort_timeout_sec": 1, 00:08:38.994 "ack_timeout": 0, 00:08:38.994 "data_wr_pool_size": 0 00:08:38.994 } 00:08:38.994 } 00:08:38.994 ] 00:08:38.994 }, 00:08:38.994 { 00:08:38.994 "subsystem": "iscsi", 00:08:38.994 "config": [ 00:08:38.994 { 00:08:38.994 "method": "iscsi_set_options", 00:08:38.994 "params": { 00:08:38.994 "node_base": "iqn.2016-06.io.spdk", 00:08:38.994 "max_sessions": 128, 00:08:38.994 "max_connections_per_session": 2, 00:08:38.994 "max_queue_depth": 
64, 00:08:38.994 "default_time2wait": 2, 00:08:38.994 "default_time2retain": 20, 00:08:38.994 "first_burst_length": 8192, 00:08:38.994 "immediate_data": true, 00:08:38.994 "allow_duplicated_isid": false, 00:08:38.994 "error_recovery_level": 0, 00:08:38.994 "nop_timeout": 60, 00:08:38.994 "nop_in_interval": 30, 00:08:38.994 "disable_chap": false, 00:08:38.994 "require_chap": false, 00:08:38.994 "mutual_chap": false, 00:08:38.994 "chap_group": 0, 00:08:38.994 "max_large_datain_per_connection": 64, 00:08:38.994 "max_r2t_per_connection": 4, 00:08:38.994 "pdu_pool_size": 36864, 00:08:38.994 "immediate_data_pool_size": 16384, 00:08:38.994 "data_out_pool_size": 2048 00:08:38.994 } 00:08:38.994 } 00:08:38.994 ] 00:08:38.994 } 00:08:38.994 ] 00:08:38.994 } 00:08:38.994 05:36:53 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@39 -- # trap - SIGINT SIGTERM EXIT 00:08:38.994 05:36:53 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@40 -- # killprocess 1088501 00:08:38.994 05:36:53 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@948 -- # '[' -z 1088501 ']' 00:08:38.994 05:36:53 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@952 -- # kill -0 1088501 00:08:38.994 05:36:53 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@953 -- # uname 00:08:38.994 05:36:53 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:08:38.994 05:36:53 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1088501 00:08:38.994 05:36:53 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:08:38.994 05:36:53 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:08:38.994 05:36:53 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1088501' 00:08:38.994 killing process with pid 1088501 00:08:38.994 05:36:53 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@967 -- # kill 1088501 
00:08:38.994 05:36:53 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@972 -- # wait 1088501 00:08:39.562 05:36:54 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@47 -- # local spdk_pid=1088689 00:08:39.562 05:36:54 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@48 -- # sleep 5 00:08:39.562 05:36:54 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@46 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/config.json 00:08:44.894 05:36:59 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@50 -- # killprocess 1088689 00:08:44.894 05:36:59 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@948 -- # '[' -z 1088689 ']' 00:08:44.894 05:36:59 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@952 -- # kill -0 1088689 00:08:44.894 05:36:59 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@953 -- # uname 00:08:44.894 05:36:59 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:08:44.894 05:36:59 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1088689 00:08:44.894 05:36:59 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:08:44.894 05:36:59 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:08:44.894 05:36:59 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1088689' 00:08:44.894 killing process with pid 1088689 00:08:44.894 05:36:59 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@967 -- # kill 1088689 00:08:44.894 05:36:59 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@972 -- # wait 1088689 00:08:44.894 05:36:59 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@51 -- # grep -q 'TCP Transport Init' /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/log.txt 00:08:44.894 05:36:59 skip_rpc.skip_rpc_with_json -- 
rpc/skip_rpc.sh@52 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/log.txt 00:08:44.894 00:08:44.894 real 0m7.031s 00:08:44.894 user 0m6.733s 00:08:44.894 sys 0m0.833s 00:08:44.894 05:36:59 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:44.894 05:36:59 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:08:44.894 ************************************ 00:08:44.894 END TEST skip_rpc_with_json 00:08:44.894 ************************************ 00:08:44.894 05:36:59 skip_rpc -- common/autotest_common.sh@1142 -- # return 0 00:08:44.894 05:36:59 skip_rpc -- rpc/skip_rpc.sh@75 -- # run_test skip_rpc_with_delay test_skip_rpc_with_delay 00:08:44.894 05:36:59 skip_rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:08:44.894 05:36:59 skip_rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:44.894 05:36:59 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:08:44.894 ************************************ 00:08:44.894 START TEST skip_rpc_with_delay 00:08:44.894 ************************************ 00:08:44.894 05:36:59 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@1123 -- # test_skip_rpc_with_delay 00:08:44.894 05:36:59 skip_rpc.skip_rpc_with_delay -- rpc/skip_rpc.sh@57 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc 00:08:44.894 05:36:59 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@648 -- # local es=0 00:08:44.894 05:36:59 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc 00:08:44.894 05:36:59 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:08:44.894 05:36:59 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 
00:08:44.894 05:36:59 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:08:44.894 05:36:59 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:08:44.894 05:36:59 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:08:44.894 05:36:59 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:08:44.894 05:36:59 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:08:44.895 05:36:59 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt ]] 00:08:44.895 05:36:59 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc 00:08:44.895 [2024-07-26 05:36:59.769741] app.c: 832:spdk_app_start: *ERROR*: Cannot use '--wait-for-rpc' if no RPC server is going to be started. 
00:08:44.895 [2024-07-26 05:36:59.769840] app.c: 711:unclaim_cpu_cores: *ERROR*: Failed to unlink lock fd for core 0, errno: 2 00:08:44.895 05:36:59 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@651 -- # es=1 00:08:44.895 05:36:59 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:08:44.895 05:36:59 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:08:44.895 05:36:59 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:08:44.895 00:08:44.895 real 0m0.097s 00:08:44.895 user 0m0.058s 00:08:44.895 sys 0m0.039s 00:08:44.895 05:36:59 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:44.895 05:36:59 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@10 -- # set +x 00:08:44.895 ************************************ 00:08:44.895 END TEST skip_rpc_with_delay 00:08:44.895 ************************************ 00:08:45.153 05:36:59 skip_rpc -- common/autotest_common.sh@1142 -- # return 0 00:08:45.153 05:36:59 skip_rpc -- rpc/skip_rpc.sh@77 -- # uname 00:08:45.153 05:36:59 skip_rpc -- rpc/skip_rpc.sh@77 -- # '[' Linux '!=' FreeBSD ']' 00:08:45.153 05:36:59 skip_rpc -- rpc/skip_rpc.sh@78 -- # run_test exit_on_failed_rpc_init test_exit_on_failed_rpc_init 00:08:45.153 05:36:59 skip_rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:08:45.153 05:36:59 skip_rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:45.153 05:36:59 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:08:45.153 ************************************ 00:08:45.153 START TEST exit_on_failed_rpc_init 00:08:45.153 ************************************ 00:08:45.153 05:36:59 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@1123 -- # test_exit_on_failed_rpc_init 00:08:45.153 05:36:59 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@62 -- # local spdk_pid=1089449 00:08:45.153 05:36:59 skip_rpc.exit_on_failed_rpc_init -- 
rpc/skip_rpc.sh@63 -- # waitforlisten 1089449 00:08:45.153 05:36:59 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@61 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:08:45.153 05:36:59 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@829 -- # '[' -z 1089449 ']' 00:08:45.153 05:36:59 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:08:45.153 05:36:59 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@834 -- # local max_retries=100 00:08:45.153 05:36:59 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:08:45.153 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:08:45.153 05:36:59 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@838 -- # xtrace_disable 00:08:45.153 05:36:59 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@10 -- # set +x 00:08:45.153 [2024-07-26 05:36:59.951254] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 
00:08:45.153 [2024-07-26 05:36:59.951330] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1089449 ] 00:08:45.411 [2024-07-26 05:37:00.083343] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:45.411 [2024-07-26 05:37:00.182832] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:45.978 05:37:00 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:08:45.978 05:37:00 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@862 -- # return 0 00:08:45.978 05:37:00 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@65 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:08:45.978 05:37:00 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@67 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x2 00:08:45.978 05:37:00 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@648 -- # local es=0 00:08:45.978 05:37:00 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x2 00:08:45.978 05:37:00 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:08:45.978 05:37:00 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:08:45.978 05:37:00 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:08:45.978 05:37:00 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:08:45.978 05:37:00 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@642 -- # type -P 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:08:45.978 05:37:00 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:08:45.978 05:37:00 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:08:45.978 05:37:00 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt ]] 00:08:45.978 05:37:00 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x2 00:08:46.237 [2024-07-26 05:37:00.891205] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 00:08:46.237 [2024-07-26 05:37:00.891276] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1089628 ] 00:08:46.237 [2024-07-26 05:37:01.009928] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:46.237 [2024-07-26 05:37:01.108160] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:08:46.237 [2024-07-26 05:37:01.108248] rpc.c: 180:_spdk_rpc_listen: *ERROR*: RPC Unix domain socket path /var/tmp/spdk.sock in use. Specify another. 
00:08:46.237 [2024-07-26 05:37:01.108265] rpc.c: 166:spdk_rpc_initialize: *ERROR*: Unable to start RPC service at /var/tmp/spdk.sock 00:08:46.237 [2024-07-26 05:37:01.108278] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:08:46.496 05:37:01 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@651 -- # es=234 00:08:46.496 05:37:01 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:08:46.496 05:37:01 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@660 -- # es=106 00:08:46.496 05:37:01 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@661 -- # case "$es" in 00:08:46.496 05:37:01 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@668 -- # es=1 00:08:46.496 05:37:01 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:08:46.496 05:37:01 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@69 -- # trap - SIGINT SIGTERM EXIT 00:08:46.496 05:37:01 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@70 -- # killprocess 1089449 00:08:46.496 05:37:01 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@948 -- # '[' -z 1089449 ']' 00:08:46.496 05:37:01 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@952 -- # kill -0 1089449 00:08:46.496 05:37:01 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@953 -- # uname 00:08:46.496 05:37:01 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:08:46.496 05:37:01 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1089449 00:08:46.496 05:37:01 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:08:46.496 05:37:01 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:08:46.496 05:37:01 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1089449' 
00:08:46.496 killing process with pid 1089449 00:08:46.496 05:37:01 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@967 -- # kill 1089449 00:08:46.496 05:37:01 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@972 -- # wait 1089449 00:08:46.755 00:08:46.755 real 0m1.777s 00:08:46.755 user 0m2.001s 00:08:46.755 sys 0m0.613s 00:08:46.755 05:37:01 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:46.755 05:37:01 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@10 -- # set +x 00:08:46.755 ************************************ 00:08:46.755 END TEST exit_on_failed_rpc_init 00:08:46.755 ************************************ 00:08:47.014 05:37:01 skip_rpc -- common/autotest_common.sh@1142 -- # return 0 00:08:47.014 05:37:01 skip_rpc -- rpc/skip_rpc.sh@81 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/config.json 00:08:47.014 00:08:47.014 real 0m14.777s 00:08:47.014 user 0m14.038s 00:08:47.014 sys 0m2.165s 00:08:47.014 05:37:01 skip_rpc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:47.014 05:37:01 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:08:47.014 ************************************ 00:08:47.014 END TEST skip_rpc 00:08:47.014 ************************************ 00:08:47.014 05:37:01 -- common/autotest_common.sh@1142 -- # return 0 00:08:47.014 05:37:01 -- spdk/autotest.sh@171 -- # run_test rpc_client /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_client/rpc_client.sh 00:08:47.014 05:37:01 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:08:47.014 05:37:01 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:47.014 05:37:01 -- common/autotest_common.sh@10 -- # set +x 00:08:47.014 ************************************ 00:08:47.014 START TEST rpc_client 00:08:47.014 ************************************ 00:08:47.014 05:37:01 rpc_client -- common/autotest_common.sh@1123 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_client/rpc_client.sh 00:08:47.014 * Looking for test storage... 00:08:47.014 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_client 00:08:47.014 05:37:01 rpc_client -- rpc_client/rpc_client.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_client/rpc_client_test 00:08:47.273 OK 00:08:47.273 05:37:01 rpc_client -- rpc_client/rpc_client.sh@12 -- # trap - SIGINT SIGTERM EXIT 00:08:47.273 00:08:47.273 real 0m0.147s 00:08:47.273 user 0m0.058s 00:08:47.273 sys 0m0.099s 00:08:47.273 05:37:01 rpc_client -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:47.273 05:37:01 rpc_client -- common/autotest_common.sh@10 -- # set +x 00:08:47.273 ************************************ 00:08:47.273 END TEST rpc_client 00:08:47.273 ************************************ 00:08:47.273 05:37:01 -- common/autotest_common.sh@1142 -- # return 0 00:08:47.273 05:37:01 -- spdk/autotest.sh@172 -- # run_test json_config /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_config.sh 00:08:47.273 05:37:01 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:08:47.273 05:37:01 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:47.273 05:37:01 -- common/autotest_common.sh@10 -- # set +x 00:08:47.273 ************************************ 00:08:47.273 START TEST json_config 00:08:47.273 ************************************ 00:08:47.273 05:37:02 json_config -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_config.sh 00:08:47.273 05:37:02 json_config -- json_config/json_config.sh@8 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:08:47.273 05:37:02 json_config -- nvmf/common.sh@7 -- # uname -s 00:08:47.273 05:37:02 json_config -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:08:47.273 05:37:02 json_config -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:08:47.273 05:37:02 
json_config -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:08:47.273 05:37:02 json_config -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:08:47.273 05:37:02 json_config -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:08:47.273 05:37:02 json_config -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:08:47.273 05:37:02 json_config -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:08:47.273 05:37:02 json_config -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:08:47.273 05:37:02 json_config -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:08:47.273 05:37:02 json_config -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:08:47.273 05:37:02 json_config -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809e2efd-7f71-e711-906e-0017a4403562 00:08:47.273 05:37:02 json_config -- nvmf/common.sh@18 -- # NVME_HOSTID=809e2efd-7f71-e711-906e-0017a4403562 00:08:47.273 05:37:02 json_config -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:08:47.273 05:37:02 json_config -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:08:47.273 05:37:02 json_config -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:08:47.273 05:37:02 json_config -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:08:47.273 05:37:02 json_config -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:08:47.273 05:37:02 json_config -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:08:47.273 05:37:02 json_config -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:08:47.273 05:37:02 json_config -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:08:47.273 05:37:02 json_config -- paths/export.sh@2 -- # 
PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:47.274 05:37:02 json_config -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:47.274 05:37:02 json_config -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:47.274 05:37:02 json_config -- paths/export.sh@5 -- # export PATH 00:08:47.274 05:37:02 json_config -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:47.274 05:37:02 json_config -- nvmf/common.sh@47 -- # : 0 00:08:47.274 05:37:02 json_config -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:08:47.274 
05:37:02 json_config -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:08:47.274 05:37:02 json_config -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:08:47.274 05:37:02 json_config -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:08:47.274 05:37:02 json_config -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:08:47.274 05:37:02 json_config -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:08:47.274 05:37:02 json_config -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:08:47.274 05:37:02 json_config -- nvmf/common.sh@51 -- # have_pci_nics=0 00:08:47.274 05:37:02 json_config -- json_config/json_config.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/common.sh 00:08:47.274 05:37:02 json_config -- json_config/json_config.sh@11 -- # [[ 0 -eq 1 ]] 00:08:47.274 05:37:02 json_config -- json_config/json_config.sh@15 -- # [[ 0 -ne 1 ]] 00:08:47.274 05:37:02 json_config -- json_config/json_config.sh@15 -- # [[ 0 -eq 1 ]] 00:08:47.274 05:37:02 json_config -- json_config/json_config.sh@26 -- # (( SPDK_TEST_BLOCKDEV + SPDK_TEST_ISCSI + SPDK_TEST_NVMF + SPDK_TEST_VHOST + SPDK_TEST_VHOST_INIT + SPDK_TEST_RBD == 0 )) 00:08:47.274 05:37:02 json_config -- json_config/json_config.sh@31 -- # app_pid=(['target']='' ['initiator']='') 00:08:47.274 05:37:02 json_config -- json_config/json_config.sh@31 -- # declare -A app_pid 00:08:47.274 05:37:02 json_config -- json_config/json_config.sh@32 -- # app_socket=(['target']='/var/tmp/spdk_tgt.sock' ['initiator']='/var/tmp/spdk_initiator.sock') 00:08:47.274 05:37:02 json_config -- json_config/json_config.sh@32 -- # declare -A app_socket 00:08:47.274 05:37:02 json_config -- json_config/json_config.sh@33 -- # app_params=(['target']='-m 0x1 -s 1024' ['initiator']='-m 0x2 -g -u -s 1024') 00:08:47.274 05:37:02 json_config -- json_config/json_config.sh@33 -- # declare -A app_params 00:08:47.274 05:37:02 json_config -- json_config/json_config.sh@34 -- # 
configs_path=(['target']='/var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json' ['initiator']='/var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_initiator_config.json') 00:08:47.274 05:37:02 json_config -- json_config/json_config.sh@34 -- # declare -A configs_path 00:08:47.274 05:37:02 json_config -- json_config/json_config.sh@40 -- # last_event_id=0 00:08:47.274 05:37:02 json_config -- json_config/json_config.sh@359 -- # trap 'on_error_exit "${FUNCNAME}" "${LINENO}"' ERR 00:08:47.274 05:37:02 json_config -- json_config/json_config.sh@360 -- # echo 'INFO: JSON configuration test init' 00:08:47.274 INFO: JSON configuration test init 00:08:47.274 05:37:02 json_config -- json_config/json_config.sh@361 -- # json_config_test_init 00:08:47.274 05:37:02 json_config -- json_config/json_config.sh@266 -- # timing_enter json_config_test_init 00:08:47.274 05:37:02 json_config -- common/autotest_common.sh@722 -- # xtrace_disable 00:08:47.274 05:37:02 json_config -- common/autotest_common.sh@10 -- # set +x 00:08:47.274 05:37:02 json_config -- json_config/json_config.sh@267 -- # timing_enter json_config_setup_target 00:08:47.274 05:37:02 json_config -- common/autotest_common.sh@722 -- # xtrace_disable 00:08:47.274 05:37:02 json_config -- common/autotest_common.sh@10 -- # set +x 00:08:47.274 05:37:02 json_config -- json_config/json_config.sh@269 -- # json_config_test_start_app target --wait-for-rpc 00:08:47.274 05:37:02 json_config -- json_config/common.sh@9 -- # local app=target 00:08:47.274 05:37:02 json_config -- json_config/common.sh@10 -- # shift 00:08:47.274 05:37:02 json_config -- json_config/common.sh@12 -- # [[ -n 22 ]] 00:08:47.274 05:37:02 json_config -- json_config/common.sh@13 -- # [[ -z '' ]] 00:08:47.274 05:37:02 json_config -- json_config/common.sh@15 -- # local app_extra_params= 00:08:47.274 05:37:02 json_config -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:08:47.274 05:37:02 json_config -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 
00:08:47.274 05:37:02 json_config -- json_config/common.sh@22 -- # app_pid["$app"]=1089916 00:08:47.274 05:37:02 json_config -- json_config/common.sh@24 -- # echo 'Waiting for target to run...' 00:08:47.274 Waiting for target to run... 00:08:47.274 05:37:02 json_config -- json_config/common.sh@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock --wait-for-rpc 00:08:47.274 05:37:02 json_config -- json_config/common.sh@25 -- # waitforlisten 1089916 /var/tmp/spdk_tgt.sock 00:08:47.274 05:37:02 json_config -- common/autotest_common.sh@829 -- # '[' -z 1089916 ']' 00:08:47.274 05:37:02 json_config -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk_tgt.sock 00:08:47.274 05:37:02 json_config -- common/autotest_common.sh@834 -- # local max_retries=100 00:08:47.274 05:37:02 json_config -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock...' 00:08:47.274 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock... 00:08:47.274 05:37:02 json_config -- common/autotest_common.sh@838 -- # xtrace_disable 00:08:47.274 05:37:02 json_config -- common/autotest_common.sh@10 -- # set +x 00:08:47.274 [2024-07-26 05:37:02.176738] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 
00:08:47.274 [2024-07-26 05:37:02.176809] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 -m 1024 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1089916 ] 00:08:48.209 [2024-07-26 05:37:02.795845] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:48.209 [2024-07-26 05:37:02.898676] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:48.209 05:37:03 json_config -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:08:48.209 05:37:03 json_config -- common/autotest_common.sh@862 -- # return 0 00:08:48.209 05:37:03 json_config -- json_config/common.sh@26 -- # echo '' 00:08:48.209 00:08:48.209 05:37:03 json_config -- json_config/json_config.sh@273 -- # create_accel_config 00:08:48.209 05:37:03 json_config -- json_config/json_config.sh@97 -- # timing_enter create_accel_config 00:08:48.209 05:37:03 json_config -- common/autotest_common.sh@722 -- # xtrace_disable 00:08:48.209 05:37:03 json_config -- common/autotest_common.sh@10 -- # set +x 00:08:48.209 05:37:03 json_config -- json_config/json_config.sh@99 -- # [[ 1 -eq 1 ]] 00:08:48.209 05:37:03 json_config -- json_config/json_config.sh@100 -- # tgt_rpc dpdk_cryptodev_scan_accel_module 00:08:48.209 05:37:03 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock dpdk_cryptodev_scan_accel_module 00:08:48.467 05:37:03 json_config -- json_config/json_config.sh@101 -- # tgt_rpc accel_assign_opc -o encrypt -m dpdk_cryptodev 00:08:48.467 05:37:03 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock accel_assign_opc -o encrypt -m dpdk_cryptodev 00:08:48.725 [2024-07-26 05:37:03.588922] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module 
dpdk_cryptodev 00:08:48.725 05:37:03 json_config -- json_config/json_config.sh@102 -- # tgt_rpc accel_assign_opc -o decrypt -m dpdk_cryptodev 00:08:48.725 05:37:03 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock accel_assign_opc -o decrypt -m dpdk_cryptodev 00:08:48.983 [2024-07-26 05:37:03.825527] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:08:48.983 05:37:03 json_config -- json_config/json_config.sh@105 -- # timing_exit create_accel_config 00:08:48.983 05:37:03 json_config -- common/autotest_common.sh@728 -- # xtrace_disable 00:08:48.983 05:37:03 json_config -- common/autotest_common.sh@10 -- # set +x 00:08:48.983 05:37:03 json_config -- json_config/json_config.sh@277 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --json-with-subsystems 00:08:49.240 05:37:03 json_config -- json_config/json_config.sh@278 -- # tgt_rpc load_config 00:08:49.240 05:37:03 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock load_config 00:08:49.240 [2024-07-26 05:37:04.139026] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:08:52.525 05:37:06 json_config -- json_config/json_config.sh@280 -- # tgt_check_notification_types 00:08:52.525 05:37:06 json_config -- json_config/json_config.sh@43 -- # timing_enter tgt_check_notification_types 00:08:52.525 05:37:06 json_config -- common/autotest_common.sh@722 -- # xtrace_disable 00:08:52.525 05:37:06 json_config -- common/autotest_common.sh@10 -- # set +x 00:08:52.525 05:37:06 json_config -- json_config/json_config.sh@45 -- # local ret=0 00:08:52.525 05:37:06 json_config -- json_config/json_config.sh@46 -- # enabled_types=('bdev_register' 'bdev_unregister') 00:08:52.525 05:37:06 json_config -- json_config/json_config.sh@46 -- # local 
enabled_types 00:08:52.525 05:37:06 json_config -- json_config/json_config.sh@48 -- # tgt_rpc notify_get_types 00:08:52.525 05:37:06 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock notify_get_types 00:08:52.525 05:37:06 json_config -- json_config/json_config.sh@48 -- # jq -r '.[]' 00:08:52.525 05:37:06 json_config -- json_config/json_config.sh@48 -- # get_types=('bdev_register' 'bdev_unregister') 00:08:52.525 05:37:06 json_config -- json_config/json_config.sh@48 -- # local get_types 00:08:52.525 05:37:06 json_config -- json_config/json_config.sh@50 -- # local type_diff 00:08:52.525 05:37:06 json_config -- json_config/json_config.sh@51 -- # echo bdev_register bdev_unregister bdev_register bdev_unregister 00:08:52.525 05:37:06 json_config -- json_config/json_config.sh@51 -- # tr ' ' '\n' 00:08:52.525 05:37:06 json_config -- json_config/json_config.sh@51 -- # uniq -u 00:08:52.525 05:37:06 json_config -- json_config/json_config.sh@51 -- # sort 00:08:52.525 05:37:06 json_config -- json_config/json_config.sh@51 -- # type_diff= 00:08:52.525 05:37:06 json_config -- json_config/json_config.sh@53 -- # [[ -n '' ]] 00:08:52.525 05:37:06 json_config -- json_config/json_config.sh@58 -- # timing_exit tgt_check_notification_types 00:08:52.525 05:37:06 json_config -- common/autotest_common.sh@728 -- # xtrace_disable 00:08:52.525 05:37:06 json_config -- common/autotest_common.sh@10 -- # set +x 00:08:52.525 05:37:06 json_config -- json_config/json_config.sh@59 -- # return 0 00:08:52.525 05:37:06 json_config -- json_config/json_config.sh@282 -- # [[ 1 -eq 1 ]] 00:08:52.525 05:37:06 json_config -- json_config/json_config.sh@283 -- # create_bdev_subsystem_config 00:08:52.525 05:37:06 json_config -- json_config/json_config.sh@109 -- # timing_enter create_bdev_subsystem_config 00:08:52.525 05:37:06 json_config -- common/autotest_common.sh@722 -- # xtrace_disable 00:08:52.525 05:37:06 json_config -- 
common/autotest_common.sh@10 -- # set +x 00:08:52.525 05:37:06 json_config -- json_config/json_config.sh@111 -- # expected_notifications=() 00:08:52.525 05:37:06 json_config -- json_config/json_config.sh@111 -- # local expected_notifications 00:08:52.525 05:37:06 json_config -- json_config/json_config.sh@115 -- # expected_notifications+=($(get_notifications)) 00:08:52.525 05:37:06 json_config -- json_config/json_config.sh@115 -- # get_notifications 00:08:52.525 05:37:06 json_config -- json_config/json_config.sh@63 -- # local ev_type ev_ctx event_id 00:08:52.525 05:37:06 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:08:52.525 05:37:06 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:08:52.525 05:37:06 json_config -- json_config/json_config.sh@62 -- # tgt_rpc notify_get_notifications -i 0 00:08:52.525 05:37:06 json_config -- json_config/json_config.sh@62 -- # jq -r '.[] | "\(.type):\(.ctx):\(.id)"' 00:08:52.525 05:37:06 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock notify_get_notifications -i 0 00:08:52.525 05:37:07 json_config -- json_config/json_config.sh@66 -- # echo bdev_register:Nvme0n1 00:08:52.525 05:37:07 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:08:52.525 05:37:07 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:08:52.525 05:37:07 json_config -- json_config/json_config.sh@117 -- # [[ 1 -eq 1 ]] 00:08:52.525 05:37:07 json_config -- json_config/json_config.sh@118 -- # local lvol_store_base_bdev=Nvme0n1 00:08:52.525 05:37:07 json_config -- json_config/json_config.sh@120 -- # tgt_rpc bdev_split_create Nvme0n1 2 00:08:52.525 05:37:07 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_split_create Nvme0n1 2 00:08:52.784 Nvme0n1p0 Nvme0n1p1 00:08:52.784 05:37:07 json_config -- 
json_config/json_config.sh@121 -- # tgt_rpc bdev_split_create Malloc0 3 00:08:52.784 05:37:07 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_split_create Malloc0 3 00:08:53.042 [2024-07-26 05:37:07.708440] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:08:53.042 [2024-07-26 05:37:07.708500] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:08:53.042 00:08:53.042 05:37:07 json_config -- json_config/json_config.sh@122 -- # tgt_rpc bdev_malloc_create 8 4096 --name Malloc3 00:08:53.042 05:37:07 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 8 4096 --name Malloc3 00:08:53.300 Malloc3 00:08:53.300 05:37:07 json_config -- json_config/json_config.sh@123 -- # tgt_rpc bdev_passthru_create -b Malloc3 -p PTBdevFromMalloc3 00:08:53.300 05:37:07 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_passthru_create -b Malloc3 -p PTBdevFromMalloc3 00:08:53.300 [2024-07-26 05:37:08.201833] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:08:53.300 [2024-07-26 05:37:08.201887] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:08:53.300 [2024-07-26 05:37:08.201910] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xcaa2a0 00:08:53.300 [2024-07-26 05:37:08.201923] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:08:53.300 [2024-07-26 05:37:08.203806] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:08:53.300 [2024-07-26 05:37:08.203844] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: PTBdevFromMalloc3 00:08:53.300 PTBdevFromMalloc3 00:08:53.559 05:37:08 
json_config -- json_config/json_config.sh@125 -- # tgt_rpc bdev_null_create Null0 32 512 00:08:53.559 05:37:08 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_null_create Null0 32 512 00:08:53.559 Null0 00:08:53.817 05:37:08 json_config -- json_config/json_config.sh@127 -- # tgt_rpc bdev_malloc_create 32 512 --name Malloc0 00:08:53.817 05:37:08 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 32 512 --name Malloc0 00:08:53.817 Malloc0 00:08:53.817 05:37:08 json_config -- json_config/json_config.sh@128 -- # tgt_rpc bdev_malloc_create 16 4096 --name Malloc1 00:08:53.817 05:37:08 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 16 4096 --name Malloc1 00:08:54.076 Malloc1 00:08:54.076 05:37:08 json_config -- json_config/json_config.sh@141 -- # expected_notifications+=(bdev_register:${lvol_store_base_bdev}p1 bdev_register:${lvol_store_base_bdev}p0 bdev_register:Malloc3 bdev_register:PTBdevFromMalloc3 bdev_register:Null0 bdev_register:Malloc0 bdev_register:Malloc0p2 bdev_register:Malloc0p1 bdev_register:Malloc0p0 bdev_register:Malloc1) 00:08:54.076 05:37:08 json_config -- json_config/json_config.sh@144 -- # dd if=/dev/zero of=/sample_aio bs=1024 count=102400 00:08:54.642 102400+0 records in 00:08:54.642 102400+0 records out 00:08:54.642 104857600 bytes (105 MB, 100 MiB) copied, 0.306411 s, 342 MB/s 00:08:54.642 05:37:09 json_config -- json_config/json_config.sh@145 -- # tgt_rpc bdev_aio_create /sample_aio aio_disk 1024 00:08:54.642 05:37:09 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_aio_create /sample_aio aio_disk 1024 00:08:54.642 aio_disk 00:08:54.900 05:37:09 json_config -- 
json_config/json_config.sh@146 -- # expected_notifications+=(bdev_register:aio_disk) 00:08:54.901 05:37:09 json_config -- json_config/json_config.sh@151 -- # tgt_rpc bdev_lvol_create_lvstore -c 1048576 Nvme0n1p0 lvs_test 00:08:54.901 05:37:09 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_create_lvstore -c 1048576 Nvme0n1p0 lvs_test 00:09:00.167 d358c332-6a2e-49a7-a6a7-3ea8d0bebe4b 00:09:00.167 05:37:14 json_config -- json_config/json_config.sh@158 -- # expected_notifications+=("bdev_register:$(tgt_rpc bdev_lvol_create -l lvs_test lvol0 32)" "bdev_register:$(tgt_rpc bdev_lvol_create -l lvs_test -t lvol1 32)" "bdev_register:$(tgt_rpc bdev_lvol_snapshot lvs_test/lvol0 snapshot0)" "bdev_register:$(tgt_rpc bdev_lvol_clone lvs_test/snapshot0 clone0)") 00:09:00.167 05:37:14 json_config -- json_config/json_config.sh@158 -- # tgt_rpc bdev_lvol_create -l lvs_test lvol0 32 00:09:00.167 05:37:14 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_create -l lvs_test lvol0 32 00:09:00.167 05:37:14 json_config -- json_config/json_config.sh@158 -- # tgt_rpc bdev_lvol_create -l lvs_test -t lvol1 32 00:09:00.167 05:37:14 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_create -l lvs_test -t lvol1 32 00:09:00.167 05:37:14 json_config -- json_config/json_config.sh@158 -- # tgt_rpc bdev_lvol_snapshot lvs_test/lvol0 snapshot0 00:09:00.167 05:37:14 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_snapshot lvs_test/lvol0 snapshot0 00:09:00.167 05:37:14 json_config -- json_config/json_config.sh@158 -- # tgt_rpc bdev_lvol_clone lvs_test/snapshot0 clone0 00:09:00.167 05:37:14 json_config -- json_config/common.sh@57 -- 
# /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_clone lvs_test/snapshot0 clone0 00:09:00.426 05:37:15 json_config -- json_config/json_config.sh@161 -- # [[ 1 -eq 1 ]] 00:09:00.426 05:37:15 json_config -- json_config/json_config.sh@162 -- # tgt_rpc bdev_malloc_create 8 1024 --name MallocForCryptoBdev 00:09:00.426 05:37:15 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 8 1024 --name MallocForCryptoBdev 00:09:00.684 MallocForCryptoBdev 00:09:00.684 05:37:15 json_config -- json_config/json_config.sh@163 -- # lspci -d:37c8 00:09:00.684 05:37:15 json_config -- json_config/json_config.sh@163 -- # wc -l 00:09:00.684 05:37:15 json_config -- json_config/json_config.sh@163 -- # [[ 3 -eq 0 ]] 00:09:00.684 05:37:15 json_config -- json_config/json_config.sh@166 -- # local crypto_driver=crypto_qat 00:09:00.684 05:37:15 json_config -- json_config/json_config.sh@169 -- # tgt_rpc bdev_crypto_create MallocForCryptoBdev CryptoMallocBdev -p crypto_qat -k 01234567891234560123456789123456 00:09:00.684 05:37:15 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_crypto_create MallocForCryptoBdev CryptoMallocBdev -p crypto_qat -k 01234567891234560123456789123456 00:09:00.943 [2024-07-26 05:37:15.693911] vbdev_crypto_rpc.c: 136:rpc_bdev_crypto_create: *WARNING*: "crypto_pmd" parameters is obsolete and ignored 00:09:00.943 CryptoMallocBdev 00:09:00.943 05:37:15 json_config -- json_config/json_config.sh@173 -- # expected_notifications+=(bdev_register:MallocForCryptoBdev bdev_register:CryptoMallocBdev) 00:09:00.943 05:37:15 json_config -- json_config/json_config.sh@176 -- # [[ 0 -eq 1 ]] 00:09:00.943 05:37:15 json_config -- json_config/json_config.sh@182 -- # tgt_check_notifications bdev_register:Nvme0n1 bdev_register:Nvme0n1p1 bdev_register:Nvme0n1p0 
bdev_register:Malloc3 bdev_register:PTBdevFromMalloc3 bdev_register:Null0 bdev_register:Malloc0 bdev_register:Malloc0p2 bdev_register:Malloc0p1 bdev_register:Malloc0p0 bdev_register:Malloc1 bdev_register:aio_disk bdev_register:9a7b1e3f-b5fc-4452-9f4f-13d1cc550cfc bdev_register:a52da00a-63b3-46ea-ac11-74c038581454 bdev_register:a0c7966e-7e20-4e8c-85a6-0a1428880212 bdev_register:007c8af6-6a5a-4290-afb7-cf5375cb1fc7 bdev_register:MallocForCryptoBdev bdev_register:CryptoMallocBdev 00:09:00.943 05:37:15 json_config -- json_config/json_config.sh@71 -- # local events_to_check 00:09:00.943 05:37:15 json_config -- json_config/json_config.sh@72 -- # local recorded_events 00:09:00.943 05:37:15 json_config -- json_config/json_config.sh@75 -- # events_to_check=($(printf '%s\n' "$@" | sort)) 00:09:00.943 05:37:15 json_config -- json_config/json_config.sh@75 -- # printf '%s\n' bdev_register:Nvme0n1 bdev_register:Nvme0n1p1 bdev_register:Nvme0n1p0 bdev_register:Malloc3 bdev_register:PTBdevFromMalloc3 bdev_register:Null0 bdev_register:Malloc0 bdev_register:Malloc0p2 bdev_register:Malloc0p1 bdev_register:Malloc0p0 bdev_register:Malloc1 bdev_register:aio_disk bdev_register:9a7b1e3f-b5fc-4452-9f4f-13d1cc550cfc bdev_register:a52da00a-63b3-46ea-ac11-74c038581454 bdev_register:a0c7966e-7e20-4e8c-85a6-0a1428880212 bdev_register:007c8af6-6a5a-4290-afb7-cf5375cb1fc7 bdev_register:MallocForCryptoBdev bdev_register:CryptoMallocBdev 00:09:00.943 05:37:15 json_config -- json_config/json_config.sh@75 -- # sort 00:09:00.943 05:37:15 json_config -- json_config/json_config.sh@76 -- # recorded_events=($(get_notifications | sort)) 00:09:00.943 05:37:15 json_config -- json_config/json_config.sh@76 -- # get_notifications 00:09:00.943 05:37:15 json_config -- json_config/json_config.sh@76 -- # sort 00:09:00.943 05:37:15 json_config -- json_config/json_config.sh@63 -- # local ev_type ev_ctx event_id 00:09:00.943 05:37:15 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:09:00.943 05:37:15 
json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:09:00.943 05:37:15 json_config -- json_config/json_config.sh@62 -- # tgt_rpc notify_get_notifications -i 0 00:09:00.943 05:37:15 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock notify_get_notifications -i 0 00:09:00.943 05:37:15 json_config -- json_config/json_config.sh@62 -- # jq -r '.[] | "\(.type):\(.ctx):\(.id)"' 00:09:01.242 05:37:15 json_config -- json_config/json_config.sh@66 -- # echo bdev_register:Nvme0n1 00:09:01.242 05:37:15 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:09:01.242 05:37:15 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:09:01.242 05:37:15 json_config -- json_config/json_config.sh@66 -- # echo bdev_register:Nvme0n1p1 00:09:01.242 05:37:15 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:09:01.242 05:37:15 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:09:01.242 05:37:15 json_config -- json_config/json_config.sh@66 -- # echo bdev_register:Nvme0n1p0 00:09:01.242 05:37:15 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:09:01.242 05:37:15 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:09:01.242 05:37:15 json_config -- json_config/json_config.sh@66 -- # echo bdev_register:Malloc3 00:09:01.242 05:37:15 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:09:01.242 05:37:15 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:09:01.242 05:37:15 json_config -- json_config/json_config.sh@66 -- # echo bdev_register:PTBdevFromMalloc3 00:09:01.242 05:37:15 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:09:01.242 05:37:15 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:09:01.242 05:37:15 json_config -- json_config/json_config.sh@66 -- # echo 
bdev_register:Null0 00:09:01.242 05:37:15 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:09:01.242 05:37:15 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:09:01.243 05:37:15 json_config -- json_config/json_config.sh@66 -- # echo bdev_register:Malloc0 00:09:01.243 05:37:15 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:09:01.243 05:37:15 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:09:01.243 05:37:15 json_config -- json_config/json_config.sh@66 -- # echo bdev_register:Malloc0p2 00:09:01.243 05:37:15 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:09:01.243 05:37:15 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:09:01.243 05:37:15 json_config -- json_config/json_config.sh@66 -- # echo bdev_register:Malloc0p1 00:09:01.243 05:37:15 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:09:01.243 05:37:15 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:09:01.243 05:37:15 json_config -- json_config/json_config.sh@66 -- # echo bdev_register:Malloc0p0 00:09:01.243 05:37:15 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:09:01.243 05:37:15 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:09:01.243 05:37:15 json_config -- json_config/json_config.sh@66 -- # echo bdev_register:Malloc1 00:09:01.243 05:37:15 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:09:01.243 05:37:15 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:09:01.243 05:37:15 json_config -- json_config/json_config.sh@66 -- # echo bdev_register:aio_disk 00:09:01.243 05:37:15 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:09:01.243 05:37:15 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:09:01.243 05:37:15 json_config -- json_config/json_config.sh@66 -- # echo 
bdev_register:9a7b1e3f-b5fc-4452-9f4f-13d1cc550cfc 00:09:01.243 05:37:15 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:09:01.243 05:37:15 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:09:01.243 05:37:15 json_config -- json_config/json_config.sh@66 -- # echo bdev_register:a52da00a-63b3-46ea-ac11-74c038581454 00:09:01.243 05:37:15 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:09:01.243 05:37:15 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:09:01.243 05:37:15 json_config -- json_config/json_config.sh@66 -- # echo bdev_register:a0c7966e-7e20-4e8c-85a6-0a1428880212 00:09:01.243 05:37:15 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:09:01.243 05:37:15 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:09:01.243 05:37:15 json_config -- json_config/json_config.sh@66 -- # echo bdev_register:007c8af6-6a5a-4290-afb7-cf5375cb1fc7 00:09:01.243 05:37:15 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:09:01.243 05:37:15 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:09:01.243 05:37:15 json_config -- json_config/json_config.sh@66 -- # echo bdev_register:MallocForCryptoBdev 00:09:01.243 05:37:15 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:09:01.243 05:37:15 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:09:01.243 05:37:15 json_config -- json_config/json_config.sh@66 -- # echo bdev_register:CryptoMallocBdev 00:09:01.243 05:37:15 json_config -- json_config/json_config.sh@65 -- # IFS=: 00:09:01.243 05:37:15 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id 00:09:01.243 05:37:15 json_config -- json_config/json_config.sh@78 -- # [[ bdev_register:007c8af6-6a5a-4290-afb7-cf5375cb1fc7 bdev_register:9a7b1e3f-b5fc-4452-9f4f-13d1cc550cfc bdev_register:a0c7966e-7e20-4e8c-85a6-0a1428880212 
bdev_register:a52da00a-63b3-46ea-ac11-74c038581454 bdev_register:aio_disk bdev_register:CryptoMallocBdev bdev_register:Malloc0 bdev_register:Malloc0p0 bdev_register:Malloc0p1 bdev_register:Malloc0p2 bdev_register:Malloc1 bdev_register:Malloc3 bdev_register:MallocForCryptoBdev bdev_register:Null0 bdev_register:Nvme0n1 bdev_register:Nvme0n1p0 bdev_register:Nvme0n1p1 bdev_register:PTBdevFromMalloc3 != \b\d\e\v\_\r\e\g\i\s\t\e\r\:\0\0\7\c\8\a\f\6\-\6\a\5\a\-\4\2\9\0\-\a\f\b\7\-\c\f\5\3\7\5\c\b\1\f\c\7\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\9\a\7\b\1\e\3\f\-\b\5\f\c\-\4\4\5\2\-\9\f\4\f\-\1\3\d\1\c\c\5\5\0\c\f\c\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\a\0\c\7\9\6\6\e\-\7\e\2\0\-\4\e\8\c\-\8\5\a\6\-\0\a\1\4\2\8\8\8\0\2\1\2\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\a\5\2\d\a\0\0\a\-\6\3\b\3\-\4\6\e\a\-\a\c\1\1\-\7\4\c\0\3\8\5\8\1\4\5\4\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\a\i\o\_\d\i\s\k\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\C\r\y\p\t\o\M\a\l\l\o\c\B\d\e\v\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\0\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\0\p\0\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\0\p\1\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\0\p\2\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\1\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\3\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\F\o\r\C\r\y\p\t\o\B\d\e\v\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\N\u\l\l\0\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\N\v\m\e\0\n\1\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\N\v\m\e\0\n\1\p\0\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\N\v\m\e\0\n\1\p\1\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\P\T\B\d\e\v\F\r\o\m\M\a\l\l\o\c\3 ]] 00:09:01.243 05:37:15 json_config -- json_config/json_config.sh@90 -- # cat 00:09:01.243 05:37:15 json_config -- json_config/json_config.sh@90 -- # printf ' %s\n' bdev_register:007c8af6-6a5a-4290-afb7-cf5375cb1fc7 bdev_register:9a7b1e3f-b5fc-4452-9f4f-13d1cc550cfc bdev_register:a0c7966e-7e20-4e8c-85a6-0a1428880212 bdev_register:a52da00a-63b3-46ea-ac11-74c038581454 bdev_register:aio_disk bdev_register:CryptoMallocBdev bdev_register:Malloc0 bdev_register:Malloc0p0 bdev_register:Malloc0p1 
bdev_register:Malloc0p2 bdev_register:Malloc1 bdev_register:Malloc3 bdev_register:MallocForCryptoBdev bdev_register:Null0 bdev_register:Nvme0n1 bdev_register:Nvme0n1p0 bdev_register:Nvme0n1p1 bdev_register:PTBdevFromMalloc3 00:09:01.243 Expected events matched: 00:09:01.243 bdev_register:007c8af6-6a5a-4290-afb7-cf5375cb1fc7 00:09:01.243 bdev_register:9a7b1e3f-b5fc-4452-9f4f-13d1cc550cfc 00:09:01.243 bdev_register:a0c7966e-7e20-4e8c-85a6-0a1428880212 00:09:01.243 bdev_register:a52da00a-63b3-46ea-ac11-74c038581454 00:09:01.243 bdev_register:aio_disk 00:09:01.243 bdev_register:CryptoMallocBdev 00:09:01.243 bdev_register:Malloc0 00:09:01.243 bdev_register:Malloc0p0 00:09:01.243 bdev_register:Malloc0p1 00:09:01.243 bdev_register:Malloc0p2 00:09:01.243 bdev_register:Malloc1 00:09:01.243 bdev_register:Malloc3 00:09:01.243 bdev_register:MallocForCryptoBdev 00:09:01.243 bdev_register:Null0 00:09:01.243 bdev_register:Nvme0n1 00:09:01.243 bdev_register:Nvme0n1p0 00:09:01.243 bdev_register:Nvme0n1p1 00:09:01.243 bdev_register:PTBdevFromMalloc3 00:09:01.243 05:37:15 json_config -- json_config/json_config.sh@184 -- # timing_exit create_bdev_subsystem_config 00:09:01.243 05:37:15 json_config -- common/autotest_common.sh@728 -- # xtrace_disable 00:09:01.243 05:37:15 json_config -- common/autotest_common.sh@10 -- # set +x 00:09:01.243 05:37:15 json_config -- json_config/json_config.sh@286 -- # [[ 0 -eq 1 ]] 00:09:01.243 05:37:15 json_config -- json_config/json_config.sh@290 -- # [[ 0 -eq 1 ]] 00:09:01.243 05:37:15 json_config -- json_config/json_config.sh@294 -- # [[ 0 -eq 1 ]] 00:09:01.243 05:37:15 json_config -- json_config/json_config.sh@297 -- # timing_exit json_config_setup_target 00:09:01.243 05:37:15 json_config -- common/autotest_common.sh@728 -- # xtrace_disable 00:09:01.243 05:37:15 json_config -- common/autotest_common.sh@10 -- # set +x 00:09:01.243 05:37:16 json_config -- json_config/json_config.sh@299 -- # [[ 0 -eq 1 ]] 00:09:01.243 05:37:16 json_config -- 
json_config/json_config.sh@304 -- # tgt_rpc bdev_malloc_create 8 512 --name MallocBdevForConfigChangeCheck 00:09:01.243 05:37:16 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 8 512 --name MallocBdevForConfigChangeCheck 00:09:01.501 MallocBdevForConfigChangeCheck 00:09:01.501 05:37:16 json_config -- json_config/json_config.sh@306 -- # timing_exit json_config_test_init 00:09:01.501 05:37:16 json_config -- common/autotest_common.sh@728 -- # xtrace_disable 00:09:01.501 05:37:16 json_config -- common/autotest_common.sh@10 -- # set +x 00:09:01.501 05:37:16 json_config -- json_config/json_config.sh@363 -- # tgt_rpc save_config 00:09:01.501 05:37:16 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock save_config 00:09:01.759 05:37:16 json_config -- json_config/json_config.sh@365 -- # echo 'INFO: shutting down applications...' 00:09:01.759 INFO: shutting down applications... 
00:09:01.759 05:37:16 json_config -- json_config/json_config.sh@366 -- # [[ 0 -eq 1 ]] 00:09:01.759 05:37:16 json_config -- json_config/json_config.sh@372 -- # json_config_clear target 00:09:01.759 05:37:16 json_config -- json_config/json_config.sh@336 -- # [[ -n 22 ]] 00:09:01.759 05:37:16 json_config -- json_config/json_config.sh@337 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/clear_config.py -s /var/tmp/spdk_tgt.sock clear_config 00:09:02.017 [2024-07-26 05:37:16.797333] vbdev_lvol.c: 150:vbdev_lvs_hotremove_cb: *NOTICE*: bdev Nvme0n1p0 being removed: closing lvstore lvs_test 00:09:05.301 Calling clear_iscsi_subsystem 00:09:05.301 Calling clear_nvmf_subsystem 00:09:05.301 Calling clear_nbd_subsystem 00:09:05.301 Calling clear_ublk_subsystem 00:09:05.301 Calling clear_vhost_blk_subsystem 00:09:05.301 Calling clear_vhost_scsi_subsystem 00:09:05.301 Calling clear_bdev_subsystem 00:09:05.301 05:37:19 json_config -- json_config/json_config.sh@341 -- # local config_filter=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py 00:09:05.301 05:37:19 json_config -- json_config/json_config.sh@347 -- # count=100 00:09:05.301 05:37:19 json_config -- json_config/json_config.sh@348 -- # '[' 100 -gt 0 ']' 00:09:05.301 05:37:19 json_config -- json_config/json_config.sh@349 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock save_config 00:09:05.301 05:37:19 json_config -- json_config/json_config.sh@349 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method check_empty 00:09:05.301 05:37:19 json_config -- json_config/json_config.sh@349 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method delete_global_parameters 00:09:05.301 05:37:19 json_config -- json_config/json_config.sh@349 -- # break 00:09:05.301 05:37:19 json_config -- json_config/json_config.sh@354 -- # '[' 100 -eq 0 ']' 00:09:05.301 05:37:19 
json_config -- json_config/json_config.sh@373 -- # json_config_test_shutdown_app target 00:09:05.301 05:37:19 json_config -- json_config/common.sh@31 -- # local app=target 00:09:05.301 05:37:19 json_config -- json_config/common.sh@34 -- # [[ -n 22 ]] 00:09:05.301 05:37:19 json_config -- json_config/common.sh@35 -- # [[ -n 1089916 ]] 00:09:05.301 05:37:19 json_config -- json_config/common.sh@38 -- # kill -SIGINT 1089916 00:09:05.301 05:37:19 json_config -- json_config/common.sh@40 -- # (( i = 0 )) 00:09:05.301 05:37:19 json_config -- json_config/common.sh@40 -- # (( i < 30 )) 00:09:05.301 05:37:19 json_config -- json_config/common.sh@41 -- # kill -0 1089916 00:09:05.301 05:37:19 json_config -- json_config/common.sh@45 -- # sleep 0.5 00:09:05.560 05:37:20 json_config -- json_config/common.sh@40 -- # (( i++ )) 00:09:05.560 05:37:20 json_config -- json_config/common.sh@40 -- # (( i < 30 )) 00:09:05.560 05:37:20 json_config -- json_config/common.sh@41 -- # kill -0 1089916 00:09:05.560 05:37:20 json_config -- json_config/common.sh@42 -- # app_pid["$app"]= 00:09:05.560 05:37:20 json_config -- json_config/common.sh@43 -- # break 00:09:05.560 05:37:20 json_config -- json_config/common.sh@48 -- # [[ -n '' ]] 00:09:05.560 05:37:20 json_config -- json_config/common.sh@53 -- # echo 'SPDK target shutdown done' 00:09:05.560 SPDK target shutdown done 00:09:05.560 05:37:20 json_config -- json_config/json_config.sh@375 -- # echo 'INFO: relaunching applications...' 00:09:05.560 INFO: relaunching applications... 
00:09:05.560 05:37:20 json_config -- json_config/json_config.sh@376 -- # json_config_test_start_app target --json /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:09:05.560 05:37:20 json_config -- json_config/common.sh@9 -- # local app=target 00:09:05.560 05:37:20 json_config -- json_config/common.sh@10 -- # shift 00:09:05.560 05:37:20 json_config -- json_config/common.sh@12 -- # [[ -n 22 ]] 00:09:05.560 05:37:20 json_config -- json_config/common.sh@13 -- # [[ -z '' ]] 00:09:05.560 05:37:20 json_config -- json_config/common.sh@15 -- # local app_extra_params= 00:09:05.560 05:37:20 json_config -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:09:05.560 05:37:20 json_config -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:09:05.560 05:37:20 json_config -- json_config/common.sh@22 -- # app_pid["$app"]=1092472 00:09:05.560 05:37:20 json_config -- json_config/common.sh@24 -- # echo 'Waiting for target to run...' 00:09:05.560 Waiting for target to run... 00:09:05.560 05:37:20 json_config -- json_config/common.sh@25 -- # waitforlisten 1092472 /var/tmp/spdk_tgt.sock 00:09:05.560 05:37:20 json_config -- json_config/common.sh@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock --json /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:09:05.560 05:37:20 json_config -- common/autotest_common.sh@829 -- # '[' -z 1092472 ']' 00:09:05.560 05:37:20 json_config -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk_tgt.sock 00:09:05.560 05:37:20 json_config -- common/autotest_common.sh@834 -- # local max_retries=100 00:09:05.560 05:37:20 json_config -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock...' 00:09:05.560 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock... 
00:09:05.560 05:37:20 json_config -- common/autotest_common.sh@838 -- # xtrace_disable 00:09:05.560 05:37:20 json_config -- common/autotest_common.sh@10 -- # set +x 00:09:05.560 [2024-07-26 05:37:20.437924] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 00:09:05.560 [2024-07-26 05:37:20.438008] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 -m 1024 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1092472 ] 00:09:06.127 [2024-07-26 05:37:21.028607] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:06.386 [2024-07-26 05:37:21.134827] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:06.386 [2024-07-26 05:37:21.189048] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:09:06.386 [2024-07-26 05:37:21.197084] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:09:06.386 [2024-07-26 05:37:21.205102] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:09:06.386 [2024-07-26 05:37:21.286332] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:09:08.918 [2024-07-26 05:37:23.501669] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:09:08.918 [2024-07-26 05:37:23.501738] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:09:08.918 [2024-07-26 05:37:23.501754] vbdev_passthru.c: 736:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:09:08.918 [2024-07-26 05:37:23.509683] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Nvme0n1 00:09:08.918 [2024-07-26 05:37:23.509708] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to 
find bdev with name: Nvme0n1 00:09:08.918 [2024-07-26 05:37:23.517695] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:09:08.918 [2024-07-26 05:37:23.517719] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:09:08.918 [2024-07-26 05:37:23.525729] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "CryptoMallocBdev_AES_CBC" 00:09:08.918 [2024-07-26 05:37:23.525758] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: MallocForCryptoBdev 00:09:08.918 [2024-07-26 05:37:23.525771] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:09:09.176 [2024-07-26 05:37:23.902430] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:09:09.176 [2024-07-26 05:37:23.902479] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:09:09.176 [2024-07-26 05:37:23.902497] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2614090 00:09:09.176 [2024-07-26 05:37:23.902509] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:09:09.176 [2024-07-26 05:37:23.902809] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:09:09.176 [2024-07-26 05:37:23.902828] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: PTBdevFromMalloc3 00:09:09.176 05:37:24 json_config -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:09:09.176 05:37:24 json_config -- common/autotest_common.sh@862 -- # return 0 00:09:09.176 05:37:24 json_config -- json_config/common.sh@26 -- # echo '' 00:09:09.176 00:09:09.176 05:37:24 json_config -- json_config/json_config.sh@377 -- # [[ 0 -eq 1 ]] 00:09:09.176 05:37:24 json_config -- json_config/json_config.sh@381 -- # echo 'INFO: Checking if target configuration is the same...' 00:09:09.176 INFO: Checking if target configuration is the same... 
00:09:09.176 05:37:24 json_config -- json_config/json_config.sh@382 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_diff.sh /dev/fd/62 /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:09:09.176 05:37:24 json_config -- json_config/json_config.sh@382 -- # tgt_rpc save_config 00:09:09.176 05:37:24 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock save_config 00:09:09.176 + '[' 2 -ne 2 ']' 00:09:09.176 +++ dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_diff.sh 00:09:09.176 ++ readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/../.. 00:09:09.176 + rootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:09:09.176 +++ basename /dev/fd/62 00:09:09.176 ++ mktemp /tmp/62.XXX 00:09:09.176 + tmp_file_1=/tmp/62.02r 00:09:09.176 +++ basename /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:09:09.176 ++ mktemp /tmp/spdk_tgt_config.json.XXX 00:09:09.176 + tmp_file_2=/tmp/spdk_tgt_config.json.SOJ 00:09:09.176 + ret=0 00:09:09.176 + /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method sort 00:09:09.753 + /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method sort 00:09:09.753 + diff -u /tmp/62.02r /tmp/spdk_tgt_config.json.SOJ 00:09:09.753 + echo 'INFO: JSON config files are the same' 00:09:09.753 INFO: JSON config files are the same 00:09:09.753 + rm /tmp/62.02r /tmp/spdk_tgt_config.json.SOJ 00:09:09.753 + exit 0 00:09:09.753 05:37:24 json_config -- json_config/json_config.sh@383 -- # [[ 0 -eq 1 ]] 00:09:09.753 05:37:24 json_config -- json_config/json_config.sh@388 -- # echo 'INFO: changing configuration and checking if this can be detected...' 00:09:09.753 INFO: changing configuration and checking if this can be detected... 
00:09:09.753 05:37:24 json_config -- json_config/json_config.sh@390 -- # tgt_rpc bdev_malloc_delete MallocBdevForConfigChangeCheck 00:09:09.753 05:37:24 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_delete MallocBdevForConfigChangeCheck 00:09:10.018 05:37:24 json_config -- json_config/json_config.sh@391 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_diff.sh /dev/fd/62 /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:09:10.018 05:37:24 json_config -- json_config/json_config.sh@391 -- # tgt_rpc save_config 00:09:10.018 05:37:24 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock save_config 00:09:10.018 + '[' 2 -ne 2 ']' 00:09:10.018 +++ dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_diff.sh 00:09:10.018 ++ readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/../.. 
00:09:10.018 + rootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:09:10.018 +++ basename /dev/fd/62 00:09:10.018 ++ mktemp /tmp/62.XXX 00:09:10.018 + tmp_file_1=/tmp/62.N3M 00:09:10.018 +++ basename /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:09:10.018 ++ mktemp /tmp/spdk_tgt_config.json.XXX 00:09:10.018 + tmp_file_2=/tmp/spdk_tgt_config.json.lNV 00:09:10.018 + ret=0 00:09:10.018 + /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method sort 00:09:10.275 + /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method sort 00:09:10.275 + diff -u /tmp/62.N3M /tmp/spdk_tgt_config.json.lNV 00:09:10.275 + ret=1 00:09:10.275 + echo '=== Start of file: /tmp/62.N3M ===' 00:09:10.275 + cat /tmp/62.N3M 00:09:10.275 + echo '=== End of file: /tmp/62.N3M ===' 00:09:10.275 + echo '' 00:09:10.275 + echo '=== Start of file: /tmp/spdk_tgt_config.json.lNV ===' 00:09:10.275 + cat /tmp/spdk_tgt_config.json.lNV 00:09:10.275 + echo '=== End of file: /tmp/spdk_tgt_config.json.lNV ===' 00:09:10.275 + echo '' 00:09:10.275 + rm /tmp/62.N3M /tmp/spdk_tgt_config.json.lNV 00:09:10.275 + exit 1 00:09:10.275 05:37:25 json_config -- json_config/json_config.sh@395 -- # echo 'INFO: configuration change detected.' 00:09:10.275 INFO: configuration change detected. 
00:09:10.275 05:37:25 json_config -- json_config/json_config.sh@398 -- # json_config_test_fini 00:09:10.275 05:37:25 json_config -- json_config/json_config.sh@310 -- # timing_enter json_config_test_fini 00:09:10.275 05:37:25 json_config -- common/autotest_common.sh@722 -- # xtrace_disable 00:09:10.275 05:37:25 json_config -- common/autotest_common.sh@10 -- # set +x 00:09:10.275 05:37:25 json_config -- json_config/json_config.sh@311 -- # local ret=0 00:09:10.275 05:37:25 json_config -- json_config/json_config.sh@313 -- # [[ -n '' ]] 00:09:10.275 05:37:25 json_config -- json_config/json_config.sh@321 -- # [[ -n 1092472 ]] 00:09:10.275 05:37:25 json_config -- json_config/json_config.sh@324 -- # cleanup_bdev_subsystem_config 00:09:10.275 05:37:25 json_config -- json_config/json_config.sh@188 -- # timing_enter cleanup_bdev_subsystem_config 00:09:10.275 05:37:25 json_config -- common/autotest_common.sh@722 -- # xtrace_disable 00:09:10.275 05:37:25 json_config -- common/autotest_common.sh@10 -- # set +x 00:09:10.275 05:37:25 json_config -- json_config/json_config.sh@190 -- # [[ 1 -eq 1 ]] 00:09:10.275 05:37:25 json_config -- json_config/json_config.sh@191 -- # tgt_rpc bdev_lvol_delete lvs_test/clone0 00:09:10.275 05:37:25 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_delete lvs_test/clone0 00:09:10.533 05:37:25 json_config -- json_config/json_config.sh@192 -- # tgt_rpc bdev_lvol_delete lvs_test/lvol0 00:09:10.533 05:37:25 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_delete lvs_test/lvol0 00:09:10.792 05:37:25 json_config -- json_config/json_config.sh@193 -- # tgt_rpc bdev_lvol_delete lvs_test/snapshot0 00:09:10.792 05:37:25 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_delete 
lvs_test/snapshot0 00:09:11.050 05:37:25 json_config -- json_config/json_config.sh@194 -- # tgt_rpc bdev_lvol_delete_lvstore -l lvs_test 00:09:11.050 05:37:25 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_delete_lvstore -l lvs_test 00:09:11.309 05:37:26 json_config -- json_config/json_config.sh@197 -- # uname -s 00:09:11.309 05:37:26 json_config -- json_config/json_config.sh@197 -- # [[ Linux = Linux ]] 00:09:11.309 05:37:26 json_config -- json_config/json_config.sh@198 -- # rm -f /sample_aio 00:09:11.309 05:37:26 json_config -- json_config/json_config.sh@201 -- # [[ 0 -eq 1 ]] 00:09:11.309 05:37:26 json_config -- json_config/json_config.sh@205 -- # timing_exit cleanup_bdev_subsystem_config 00:09:11.309 05:37:26 json_config -- common/autotest_common.sh@728 -- # xtrace_disable 00:09:11.309 05:37:26 json_config -- common/autotest_common.sh@10 -- # set +x 00:09:11.309 05:37:26 json_config -- json_config/json_config.sh@327 -- # killprocess 1092472 00:09:11.309 05:37:26 json_config -- common/autotest_common.sh@948 -- # '[' -z 1092472 ']' 00:09:11.309 05:37:26 json_config -- common/autotest_common.sh@952 -- # kill -0 1092472 00:09:11.309 05:37:26 json_config -- common/autotest_common.sh@953 -- # uname 00:09:11.309 05:37:26 json_config -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:09:11.309 05:37:26 json_config -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1092472 00:09:11.309 05:37:26 json_config -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:09:11.309 05:37:26 json_config -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:09:11.309 05:37:26 json_config -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1092472' 00:09:11.309 killing process with pid 1092472 00:09:11.309 05:37:26 json_config -- common/autotest_common.sh@967 -- # kill 1092472 00:09:11.309 05:37:26 json_config -- 
common/autotest_common.sh@972 -- # wait 1092472 00:09:14.594 05:37:29 json_config -- json_config/json_config.sh@330 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_initiator_config.json /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:09:14.594 05:37:29 json_config -- json_config/json_config.sh@331 -- # timing_exit json_config_test_fini 00:09:14.594 05:37:29 json_config -- common/autotest_common.sh@728 -- # xtrace_disable 00:09:14.594 05:37:29 json_config -- common/autotest_common.sh@10 -- # set +x 00:09:14.594 05:37:29 json_config -- json_config/json_config.sh@332 -- # return 0 00:09:14.594 05:37:29 json_config -- json_config/json_config.sh@400 -- # echo 'INFO: Success' 00:09:14.594 INFO: Success 00:09:14.594 00:09:14.594 real 0m27.202s 00:09:14.594 user 0m32.721s 00:09:14.594 sys 0m4.228s 00:09:14.594 05:37:29 json_config -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:14.594 05:37:29 json_config -- common/autotest_common.sh@10 -- # set +x 00:09:14.594 ************************************ 00:09:14.594 END TEST json_config 00:09:14.594 ************************************ 00:09:14.594 05:37:29 -- common/autotest_common.sh@1142 -- # return 0 00:09:14.594 05:37:29 -- spdk/autotest.sh@173 -- # run_test json_config_extra_key /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_config_extra_key.sh 00:09:14.594 05:37:29 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:09:14.594 05:37:29 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:14.594 05:37:29 -- common/autotest_common.sh@10 -- # set +x 00:09:14.594 ************************************ 00:09:14.594 START TEST json_config_extra_key 00:09:14.594 ************************************ 00:09:14.594 05:37:29 json_config_extra_key -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_config_extra_key.sh 00:09:14.594 05:37:29 json_config_extra_key -- 
json_config/json_config_extra_key.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:09:14.594 05:37:29 json_config_extra_key -- nvmf/common.sh@7 -- # uname -s 00:09:14.594 05:37:29 json_config_extra_key -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:09:14.594 05:37:29 json_config_extra_key -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:09:14.594 05:37:29 json_config_extra_key -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:09:14.594 05:37:29 json_config_extra_key -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:09:14.594 05:37:29 json_config_extra_key -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:09:14.594 05:37:29 json_config_extra_key -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:09:14.594 05:37:29 json_config_extra_key -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:09:14.594 05:37:29 json_config_extra_key -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:09:14.594 05:37:29 json_config_extra_key -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:09:14.594 05:37:29 json_config_extra_key -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:09:14.594 05:37:29 json_config_extra_key -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809e2efd-7f71-e711-906e-0017a4403562 00:09:14.594 05:37:29 json_config_extra_key -- nvmf/common.sh@18 -- # NVME_HOSTID=809e2efd-7f71-e711-906e-0017a4403562 00:09:14.594 05:37:29 json_config_extra_key -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:09:14.594 05:37:29 json_config_extra_key -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:09:14.594 05:37:29 json_config_extra_key -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:09:14.594 05:37:29 json_config_extra_key -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:09:14.594 05:37:29 json_config_extra_key -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:09:14.594 05:37:29 
json_config_extra_key -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:09:14.594 05:37:29 json_config_extra_key -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:09:14.594 05:37:29 json_config_extra_key -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:09:14.594 05:37:29 json_config_extra_key -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:09:14.594 05:37:29 json_config_extra_key -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:09:14.594 05:37:29 json_config_extra_key -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:09:14.594 05:37:29 json_config_extra_key -- paths/export.sh@5 -- # export PATH 00:09:14.594 05:37:29 json_config_extra_key -- paths/export.sh@6 -- # echo 
/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:09:14.594 05:37:29 json_config_extra_key -- nvmf/common.sh@47 -- # : 0 00:09:14.594 05:37:29 json_config_extra_key -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:09:14.594 05:37:29 json_config_extra_key -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:09:14.594 05:37:29 json_config_extra_key -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:09:14.594 05:37:29 json_config_extra_key -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:09:14.594 05:37:29 json_config_extra_key -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:09:14.594 05:37:29 json_config_extra_key -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:09:14.594 05:37:29 json_config_extra_key -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:09:14.594 05:37:29 json_config_extra_key -- nvmf/common.sh@51 -- # have_pci_nics=0 00:09:14.594 05:37:29 json_config_extra_key -- json_config/json_config_extra_key.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/common.sh 00:09:14.594 05:37:29 json_config_extra_key -- json_config/json_config_extra_key.sh@17 -- # app_pid=(['target']='') 00:09:14.594 05:37:29 json_config_extra_key -- json_config/json_config_extra_key.sh@17 -- # declare -A app_pid 00:09:14.594 05:37:29 json_config_extra_key -- json_config/json_config_extra_key.sh@18 -- # app_socket=(['target']='/var/tmp/spdk_tgt.sock') 00:09:14.594 05:37:29 json_config_extra_key -- json_config/json_config_extra_key.sh@18 -- # declare -A app_socket 00:09:14.594 05:37:29 json_config_extra_key -- json_config/json_config_extra_key.sh@19 -- # app_params=(['target']='-m 0x1 -s 
1024') 00:09:14.594 05:37:29 json_config_extra_key -- json_config/json_config_extra_key.sh@19 -- # declare -A app_params 00:09:14.594 05:37:29 json_config_extra_key -- json_config/json_config_extra_key.sh@20 -- # configs_path=(['target']='/var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/extra_key.json') 00:09:14.594 05:37:29 json_config_extra_key -- json_config/json_config_extra_key.sh@20 -- # declare -A configs_path 00:09:14.594 05:37:29 json_config_extra_key -- json_config/json_config_extra_key.sh@22 -- # trap 'on_error_exit "${FUNCNAME}" "${LINENO}"' ERR 00:09:14.594 05:37:29 json_config_extra_key -- json_config/json_config_extra_key.sh@24 -- # echo 'INFO: launching applications...' 00:09:14.594 INFO: launching applications... 00:09:14.594 05:37:29 json_config_extra_key -- json_config/json_config_extra_key.sh@25 -- # json_config_test_start_app target --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/extra_key.json 00:09:14.595 05:37:29 json_config_extra_key -- json_config/common.sh@9 -- # local app=target 00:09:14.595 05:37:29 json_config_extra_key -- json_config/common.sh@10 -- # shift 00:09:14.595 05:37:29 json_config_extra_key -- json_config/common.sh@12 -- # [[ -n 22 ]] 00:09:14.595 05:37:29 json_config_extra_key -- json_config/common.sh@13 -- # [[ -z '' ]] 00:09:14.595 05:37:29 json_config_extra_key -- json_config/common.sh@15 -- # local app_extra_params= 00:09:14.595 05:37:29 json_config_extra_key -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:09:14.595 05:37:29 json_config_extra_key -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:09:14.595 05:37:29 json_config_extra_key -- json_config/common.sh@22 -- # app_pid["$app"]=1093705 00:09:14.595 05:37:29 json_config_extra_key -- json_config/common.sh@24 -- # echo 'Waiting for target to run...' 00:09:14.595 Waiting for target to run... 
00:09:14.595 05:37:29 json_config_extra_key -- json_config/common.sh@25 -- # waitforlisten 1093705 /var/tmp/spdk_tgt.sock 00:09:14.595 05:37:29 json_config_extra_key -- common/autotest_common.sh@829 -- # '[' -z 1093705 ']' 00:09:14.595 05:37:29 json_config_extra_key -- json_config/common.sh@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/extra_key.json 00:09:14.595 05:37:29 json_config_extra_key -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk_tgt.sock 00:09:14.595 05:37:29 json_config_extra_key -- common/autotest_common.sh@834 -- # local max_retries=100 00:09:14.595 05:37:29 json_config_extra_key -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock...' 00:09:14.595 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock... 00:09:14.595 05:37:29 json_config_extra_key -- common/autotest_common.sh@838 -- # xtrace_disable 00:09:14.595 05:37:29 json_config_extra_key -- common/autotest_common.sh@10 -- # set +x 00:09:14.595 [2024-07-26 05:37:29.484828] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 
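The `waitforlisten` step above blocks until the freshly launched `spdk_tgt` is accepting RPCs on its UNIX-domain socket. A minimal sketch of that wait loop follows; the function name, retry budget, and poll interval are illustrative assumptions, not the exact values in `autotest_common.sh` (the real helper also verifies the socket answers an RPC, not merely that it exists):

```shell
# Hypothetical sketch: poll until a UNIX-domain socket appears at the
# given path, the way waitforlisten gates RPC traffic on spdk_tgt startup.
wait_for_socket() {
    sock=$1
    retries=${2:-100}                  # assumed default, mirrors max_retries=100
    while [ "$retries" -gt 0 ]; do
        [ -S "$sock" ] && return 0     # -S: path exists and is a socket
        sleep 0.1
        retries=$((retries - 1))
    done
    return 1                           # timed out; caller aborts the test
}
```

A caller would typically follow this with a probe RPC (e.g. `rpc.py -s "$sock" rpc_get_methods`) before proceeding.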
00:09:14.595 [2024-07-26 05:37:29.484910] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 -m 1024 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1093705 ] 00:09:15.162 [2024-07-26 05:37:30.059363] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:15.419 [2024-07-26 05:37:30.153493] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:15.677 05:37:30 json_config_extra_key -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:09:15.677 05:37:30 json_config_extra_key -- common/autotest_common.sh@862 -- # return 0 00:09:15.677 05:37:30 json_config_extra_key -- json_config/common.sh@26 -- # echo '' 00:09:15.677 00:09:15.677 05:37:30 json_config_extra_key -- json_config/json_config_extra_key.sh@27 -- # echo 'INFO: shutting down applications...' 00:09:15.677 INFO: shutting down applications... 00:09:15.677 05:37:30 json_config_extra_key -- json_config/json_config_extra_key.sh@28 -- # json_config_test_shutdown_app target 00:09:15.677 05:37:30 json_config_extra_key -- json_config/common.sh@31 -- # local app=target 00:09:15.677 05:37:30 json_config_extra_key -- json_config/common.sh@34 -- # [[ -n 22 ]] 00:09:15.677 05:37:30 json_config_extra_key -- json_config/common.sh@35 -- # [[ -n 1093705 ]] 00:09:15.677 05:37:30 json_config_extra_key -- json_config/common.sh@38 -- # kill -SIGINT 1093705 00:09:15.677 05:37:30 json_config_extra_key -- json_config/common.sh@40 -- # (( i = 0 )) 00:09:15.677 05:37:30 json_config_extra_key -- json_config/common.sh@40 -- # (( i < 30 )) 00:09:15.677 05:37:30 json_config_extra_key -- json_config/common.sh@41 -- # kill -0 1093705 00:09:15.677 05:37:30 json_config_extra_key -- json_config/common.sh@45 -- # sleep 0.5 00:09:16.243 05:37:30 json_config_extra_key -- json_config/common.sh@40 -- # (( i++ )) 00:09:16.243 05:37:30 json_config_extra_key -- 
json_config/common.sh@40 -- # (( i < 30 )) 00:09:16.243 05:37:30 json_config_extra_key -- json_config/common.sh@41 -- # kill -0 1093705 00:09:16.243 05:37:30 json_config_extra_key -- json_config/common.sh@42 -- # app_pid["$app"]= 00:09:16.243 05:37:30 json_config_extra_key -- json_config/common.sh@43 -- # break 00:09:16.243 05:37:30 json_config_extra_key -- json_config/common.sh@48 -- # [[ -n '' ]] 00:09:16.243 05:37:30 json_config_extra_key -- json_config/common.sh@53 -- # echo 'SPDK target shutdown done' 00:09:16.243 SPDK target shutdown done 00:09:16.243 05:37:30 json_config_extra_key -- json_config/json_config_extra_key.sh@30 -- # echo Success 00:09:16.243 Success 00:09:16.243 00:09:16.243 real 0m1.549s 00:09:16.243 user 0m0.947s 00:09:16.243 sys 0m0.731s 00:09:16.243 05:37:30 json_config_extra_key -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:16.243 05:37:30 json_config_extra_key -- common/autotest_common.sh@10 -- # set +x 00:09:16.243 ************************************ 00:09:16.243 END TEST json_config_extra_key 00:09:16.243 ************************************ 00:09:16.243 05:37:30 -- common/autotest_common.sh@1142 -- # return 0 00:09:16.243 05:37:30 -- spdk/autotest.sh@174 -- # run_test alias_rpc /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/alias_rpc/alias_rpc.sh 00:09:16.243 05:37:30 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:09:16.243 05:37:30 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:16.243 05:37:30 -- common/autotest_common.sh@10 -- # set +x 00:09:16.243 ************************************ 00:09:16.243 START TEST alias_rpc 00:09:16.243 ************************************ 00:09:16.244 05:37:30 alias_rpc -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/alias_rpc/alias_rpc.sh 00:09:16.244 * Looking for test storage... 
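The shutdown sequence in the lines above sends `SIGINT` to the target, then polls `kill -0` up to 30 times with a 0.5 s sleep between checks. A hedged sketch of that loop (the wrapper name is invented; the retry count and interval mirror what the log shows in `json_config/common.sh`):

```shell
# Graceful-shutdown poll, as seen in the json_config_extra_key teardown:
# request shutdown with SIGINT, then wait for the pid to disappear.
shutdown_app() {
    pid=$1
    kill -INT "$pid" 2>/dev/null || true   # ask the SPDK app to exit cleanly
    i=0
    while [ "$i" -lt 30 ]; do
        # kill -0 sends no signal; it only tests whether the pid still exists
        kill -0 "$pid" 2>/dev/null || return 0
        sleep 0.5
        i=$((i + 1))
    done
    return 1   # still alive after ~15 s; caller would escalate (e.g. kill -9)
}
```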
00:09:16.244 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/alias_rpc 00:09:16.244 05:37:31 alias_rpc -- alias_rpc/alias_rpc.sh@10 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:09:16.244 05:37:31 alias_rpc -- alias_rpc/alias_rpc.sh@13 -- # spdk_tgt_pid=1093994 00:09:16.244 05:37:31 alias_rpc -- alias_rpc/alias_rpc.sh@14 -- # waitforlisten 1093994 00:09:16.244 05:37:31 alias_rpc -- common/autotest_common.sh@829 -- # '[' -z 1093994 ']' 00:09:16.244 05:37:31 alias_rpc -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:09:16.244 05:37:31 alias_rpc -- common/autotest_common.sh@834 -- # local max_retries=100 00:09:16.244 05:37:31 alias_rpc -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:09:16.244 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:09:16.244 05:37:31 alias_rpc -- common/autotest_common.sh@838 -- # xtrace_disable 00:09:16.244 05:37:31 alias_rpc -- alias_rpc/alias_rpc.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:09:16.244 05:37:31 alias_rpc -- common/autotest_common.sh@10 -- # set +x 00:09:16.244 [2024-07-26 05:37:31.078890] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 
00:09:16.244 [2024-07-26 05:37:31.078972] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1093994 ] 00:09:16.501 [2024-07-26 05:37:31.206365] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:16.501 [2024-07-26 05:37:31.309712] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:17.437 05:37:31 alias_rpc -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:09:17.437 05:37:31 alias_rpc -- common/autotest_common.sh@862 -- # return 0 00:09:17.437 05:37:31 alias_rpc -- alias_rpc/alias_rpc.sh@17 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_config -i 00:09:17.694 05:37:32 alias_rpc -- alias_rpc/alias_rpc.sh@19 -- # killprocess 1093994 00:09:17.694 05:37:32 alias_rpc -- common/autotest_common.sh@948 -- # '[' -z 1093994 ']' 00:09:17.694 05:37:32 alias_rpc -- common/autotest_common.sh@952 -- # kill -0 1093994 00:09:17.694 05:37:32 alias_rpc -- common/autotest_common.sh@953 -- # uname 00:09:17.694 05:37:32 alias_rpc -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:09:17.695 05:37:32 alias_rpc -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1093994 00:09:17.695 05:37:32 alias_rpc -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:09:17.695 05:37:32 alias_rpc -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:09:17.695 05:37:32 alias_rpc -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1093994' 00:09:17.695 killing process with pid 1093994 00:09:17.695 05:37:32 alias_rpc -- common/autotest_common.sh@967 -- # kill 1093994 00:09:17.695 05:37:32 alias_rpc -- common/autotest_common.sh@972 -- # wait 1093994 00:09:18.261 00:09:18.261 real 0m2.055s 00:09:18.261 user 0m2.503s 00:09:18.261 sys 0m0.564s 00:09:18.261 05:37:32 alias_rpc -- 
common/autotest_common.sh@1124 -- # xtrace_disable 00:09:18.261 05:37:32 alias_rpc -- common/autotest_common.sh@10 -- # set +x 00:09:18.261 ************************************ 00:09:18.261 END TEST alias_rpc 00:09:18.261 ************************************ 00:09:18.261 05:37:33 -- common/autotest_common.sh@1142 -- # return 0 00:09:18.261 05:37:33 -- spdk/autotest.sh@176 -- # [[ 0 -eq 0 ]] 00:09:18.261 05:37:33 -- spdk/autotest.sh@177 -- # run_test spdkcli_tcp /var/jenkins/workspace/crypto-phy-autotest/spdk/test/spdkcli/tcp.sh 00:09:18.261 05:37:33 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:09:18.261 05:37:33 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:18.261 05:37:33 -- common/autotest_common.sh@10 -- # set +x 00:09:18.261 ************************************ 00:09:18.261 START TEST spdkcli_tcp 00:09:18.261 ************************************ 00:09:18.261 05:37:33 spdkcli_tcp -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/spdkcli/tcp.sh 00:09:18.261 * Looking for test storage... 
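The `killprocess` sequence that recurs throughout this log (here at the end of the alias_rpc test) checks that the pid argument is non-empty, that the process is still alive, and that its command name is not `sudo` before signalling it. A loose sketch under those observed steps; the exact messages and error handling in `autotest_common.sh` are assumptions:

```shell
# Recurring killprocess pattern: guard, identify, signal, reap.
killprocess() {
    pid=$1
    [ -n "$pid" ] || return 1                   # the '[' -z ... ']' guard
    kill -0 "$pid" 2>/dev/null || return 1      # must still be running
    name=$(ps --no-headers -o comm= "$pid")     # e.g. reactor_0 for spdk_tgt
    [ "$name" != "sudo" ] || return 1           # never signal a sudo wrapper
    echo "killing process with pid $pid"
    kill "$pid" 2>/dev/null || true
    wait "$pid" 2>/dev/null || true             # reap so the pid is released
}
```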
00:09:18.261 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/spdkcli 00:09:18.261 05:37:33 spdkcli_tcp -- spdkcli/tcp.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/spdkcli/common.sh 00:09:18.261 05:37:33 spdkcli_tcp -- spdkcli/common.sh@6 -- # spdkcli_job=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/spdkcli/spdkcli_job.py 00:09:18.261 05:37:33 spdkcli_tcp -- spdkcli/common.sh@7 -- # spdk_clear_config_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/clear_config.py 00:09:18.261 05:37:33 spdkcli_tcp -- spdkcli/tcp.sh@18 -- # IP_ADDRESS=127.0.0.1 00:09:18.261 05:37:33 spdkcli_tcp -- spdkcli/tcp.sh@19 -- # PORT=9998 00:09:18.261 05:37:33 spdkcli_tcp -- spdkcli/tcp.sh@21 -- # trap 'err_cleanup; exit 1' SIGINT SIGTERM EXIT 00:09:18.261 05:37:33 spdkcli_tcp -- spdkcli/tcp.sh@23 -- # timing_enter run_spdk_tgt_tcp 00:09:18.261 05:37:33 spdkcli_tcp -- common/autotest_common.sh@722 -- # xtrace_disable 00:09:18.261 05:37:33 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:09:18.261 05:37:33 spdkcli_tcp -- spdkcli/tcp.sh@25 -- # spdk_tgt_pid=1094337 00:09:18.261 05:37:33 spdkcli_tcp -- spdkcli/tcp.sh@27 -- # waitforlisten 1094337 00:09:18.261 05:37:33 spdkcli_tcp -- spdkcli/tcp.sh@24 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x3 -p 0 00:09:18.261 05:37:33 spdkcli_tcp -- common/autotest_common.sh@829 -- # '[' -z 1094337 ']' 00:09:18.261 05:37:33 spdkcli_tcp -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:09:18.261 05:37:33 spdkcli_tcp -- common/autotest_common.sh@834 -- # local max_retries=100 00:09:18.261 05:37:33 spdkcli_tcp -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:09:18.261 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
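The spdkcli_tcp run that follows drives `rpc.py` over TCP (`-s 127.0.0.1 -p 9998`) by bridging the target's UNIX RPC socket with `socat TCP-LISTEN:9998 UNIX-CONNECT:/var/tmp/spdk.sock`. A hedged sketch of that bridge; the wrapper function and the `reuseaddr,fork` options are additions for illustration, not taken from `tcp.sh`:

```shell
# Expose an SPDK UNIX-domain RPC socket on a local TCP port via socat,
# as the spdkcli_tcp test does before calling rpc.py with -s/-p.
bridge_rpc_socket() {
    port=${1:-9998}
    sock=${2:-/var/tmp/spdk.sock}
    socat "TCP-LISTEN:${port},reuseaddr,fork" "UNIX-CONNECT:${sock}" &
    echo $!    # bridge pid; the test kills this when it tears down
}
```

The test records the background pid (`socat_pid` in the log) so teardown can stop the bridge independently of the SPDK target.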
00:09:18.261 05:37:33 spdkcli_tcp -- common/autotest_common.sh@838 -- # xtrace_disable 00:09:18.261 05:37:33 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:09:18.519 [2024-07-26 05:37:33.229353] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 00:09:18.519 [2024-07-26 05:37:33.229432] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1094337 ] 00:09:18.519 [2024-07-26 05:37:33.362054] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:09:18.777 [2024-07-26 05:37:33.470186] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:09:18.777 [2024-07-26 05:37:33.470192] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:19.390 05:37:34 spdkcli_tcp -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:09:19.390 05:37:34 spdkcli_tcp -- common/autotest_common.sh@862 -- # return 0 00:09:19.390 05:37:34 spdkcli_tcp -- spdkcli/tcp.sh@31 -- # socat_pid=1094513 00:09:19.390 05:37:34 spdkcli_tcp -- spdkcli/tcp.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -r 100 -t 2 -s 127.0.0.1 -p 9998 rpc_get_methods 00:09:19.390 05:37:34 spdkcli_tcp -- spdkcli/tcp.sh@30 -- # socat TCP-LISTEN:9998 UNIX-CONNECT:/var/tmp/spdk.sock 00:09:19.390 [ 00:09:19.390 "bdev_malloc_delete", 00:09:19.390 "bdev_malloc_create", 00:09:19.390 "bdev_null_resize", 00:09:19.390 "bdev_null_delete", 00:09:19.390 "bdev_null_create", 00:09:19.390 "bdev_nvme_cuse_unregister", 00:09:19.390 "bdev_nvme_cuse_register", 00:09:19.390 "bdev_opal_new_user", 00:09:19.390 "bdev_opal_set_lock_state", 00:09:19.390 "bdev_opal_delete", 00:09:19.390 "bdev_opal_get_info", 00:09:19.390 "bdev_opal_create", 00:09:19.390 "bdev_nvme_opal_revert", 00:09:19.390 "bdev_nvme_opal_init", 00:09:19.390 "bdev_nvme_send_cmd", 00:09:19.390 
"bdev_nvme_get_path_iostat", 00:09:19.390 "bdev_nvme_get_mdns_discovery_info", 00:09:19.390 "bdev_nvme_stop_mdns_discovery", 00:09:19.390 "bdev_nvme_start_mdns_discovery", 00:09:19.390 "bdev_nvme_set_multipath_policy", 00:09:19.390 "bdev_nvme_set_preferred_path", 00:09:19.390 "bdev_nvme_get_io_paths", 00:09:19.390 "bdev_nvme_remove_error_injection", 00:09:19.390 "bdev_nvme_add_error_injection", 00:09:19.390 "bdev_nvme_get_discovery_info", 00:09:19.390 "bdev_nvme_stop_discovery", 00:09:19.390 "bdev_nvme_start_discovery", 00:09:19.390 "bdev_nvme_get_controller_health_info", 00:09:19.390 "bdev_nvme_disable_controller", 00:09:19.390 "bdev_nvme_enable_controller", 00:09:19.390 "bdev_nvme_reset_controller", 00:09:19.390 "bdev_nvme_get_transport_statistics", 00:09:19.390 "bdev_nvme_apply_firmware", 00:09:19.390 "bdev_nvme_detach_controller", 00:09:19.390 "bdev_nvme_get_controllers", 00:09:19.390 "bdev_nvme_attach_controller", 00:09:19.390 "bdev_nvme_set_hotplug", 00:09:19.390 "bdev_nvme_set_options", 00:09:19.390 "bdev_passthru_delete", 00:09:19.390 "bdev_passthru_create", 00:09:19.390 "bdev_lvol_set_parent_bdev", 00:09:19.390 "bdev_lvol_set_parent", 00:09:19.390 "bdev_lvol_check_shallow_copy", 00:09:19.390 "bdev_lvol_start_shallow_copy", 00:09:19.390 "bdev_lvol_grow_lvstore", 00:09:19.390 "bdev_lvol_get_lvols", 00:09:19.390 "bdev_lvol_get_lvstores", 00:09:19.390 "bdev_lvol_delete", 00:09:19.390 "bdev_lvol_set_read_only", 00:09:19.390 "bdev_lvol_resize", 00:09:19.390 "bdev_lvol_decouple_parent", 00:09:19.390 "bdev_lvol_inflate", 00:09:19.390 "bdev_lvol_rename", 00:09:19.391 "bdev_lvol_clone_bdev", 00:09:19.391 "bdev_lvol_clone", 00:09:19.391 "bdev_lvol_snapshot", 00:09:19.391 "bdev_lvol_create", 00:09:19.391 "bdev_lvol_delete_lvstore", 00:09:19.391 "bdev_lvol_rename_lvstore", 00:09:19.391 "bdev_lvol_create_lvstore", 00:09:19.391 "bdev_raid_set_options", 00:09:19.391 "bdev_raid_remove_base_bdev", 00:09:19.391 "bdev_raid_add_base_bdev", 00:09:19.391 "bdev_raid_delete", 
00:09:19.391 "bdev_raid_create", 00:09:19.391 "bdev_raid_get_bdevs", 00:09:19.391 "bdev_error_inject_error", 00:09:19.391 "bdev_error_delete", 00:09:19.391 "bdev_error_create", 00:09:19.391 "bdev_split_delete", 00:09:19.391 "bdev_split_create", 00:09:19.391 "bdev_delay_delete", 00:09:19.391 "bdev_delay_create", 00:09:19.391 "bdev_delay_update_latency", 00:09:19.391 "bdev_zone_block_delete", 00:09:19.391 "bdev_zone_block_create", 00:09:19.391 "blobfs_create", 00:09:19.391 "blobfs_detect", 00:09:19.391 "blobfs_set_cache_size", 00:09:19.391 "bdev_crypto_delete", 00:09:19.391 "bdev_crypto_create", 00:09:19.391 "bdev_compress_delete", 00:09:19.391 "bdev_compress_create", 00:09:19.391 "bdev_compress_get_orphans", 00:09:19.391 "bdev_aio_delete", 00:09:19.391 "bdev_aio_rescan", 00:09:19.391 "bdev_aio_create", 00:09:19.391 "bdev_ftl_set_property", 00:09:19.391 "bdev_ftl_get_properties", 00:09:19.391 "bdev_ftl_get_stats", 00:09:19.391 "bdev_ftl_unmap", 00:09:19.391 "bdev_ftl_unload", 00:09:19.391 "bdev_ftl_delete", 00:09:19.391 "bdev_ftl_load", 00:09:19.391 "bdev_ftl_create", 00:09:19.391 "bdev_virtio_attach_controller", 00:09:19.391 "bdev_virtio_scsi_get_devices", 00:09:19.391 "bdev_virtio_detach_controller", 00:09:19.391 "bdev_virtio_blk_set_hotplug", 00:09:19.391 "bdev_iscsi_delete", 00:09:19.391 "bdev_iscsi_create", 00:09:19.391 "bdev_iscsi_set_options", 00:09:19.391 "accel_error_inject_error", 00:09:19.391 "ioat_scan_accel_module", 00:09:19.391 "dsa_scan_accel_module", 00:09:19.391 "iaa_scan_accel_module", 00:09:19.391 "dpdk_cryptodev_get_driver", 00:09:19.391 "dpdk_cryptodev_set_driver", 00:09:19.391 "dpdk_cryptodev_scan_accel_module", 00:09:19.391 "compressdev_scan_accel_module", 00:09:19.391 "keyring_file_remove_key", 00:09:19.391 "keyring_file_add_key", 00:09:19.391 "keyring_linux_set_options", 00:09:19.391 "iscsi_get_histogram", 00:09:19.391 "iscsi_enable_histogram", 00:09:19.391 "iscsi_set_options", 00:09:19.391 "iscsi_get_auth_groups", 00:09:19.391 
"iscsi_auth_group_remove_secret", 00:09:19.391 "iscsi_auth_group_add_secret", 00:09:19.391 "iscsi_delete_auth_group", 00:09:19.391 "iscsi_create_auth_group", 00:09:19.391 "iscsi_set_discovery_auth", 00:09:19.391 "iscsi_get_options", 00:09:19.391 "iscsi_target_node_request_logout", 00:09:19.391 "iscsi_target_node_set_redirect", 00:09:19.391 "iscsi_target_node_set_auth", 00:09:19.391 "iscsi_target_node_add_lun", 00:09:19.391 "iscsi_get_stats", 00:09:19.391 "iscsi_get_connections", 00:09:19.391 "iscsi_portal_group_set_auth", 00:09:19.391 "iscsi_start_portal_group", 00:09:19.391 "iscsi_delete_portal_group", 00:09:19.391 "iscsi_create_portal_group", 00:09:19.391 "iscsi_get_portal_groups", 00:09:19.391 "iscsi_delete_target_node", 00:09:19.391 "iscsi_target_node_remove_pg_ig_maps", 00:09:19.391 "iscsi_target_node_add_pg_ig_maps", 00:09:19.391 "iscsi_create_target_node", 00:09:19.391 "iscsi_get_target_nodes", 00:09:19.391 "iscsi_delete_initiator_group", 00:09:19.391 "iscsi_initiator_group_remove_initiators", 00:09:19.391 "iscsi_initiator_group_add_initiators", 00:09:19.391 "iscsi_create_initiator_group", 00:09:19.391 "iscsi_get_initiator_groups", 00:09:19.391 "nvmf_set_crdt", 00:09:19.391 "nvmf_set_config", 00:09:19.391 "nvmf_set_max_subsystems", 00:09:19.391 "nvmf_stop_mdns_prr", 00:09:19.391 "nvmf_publish_mdns_prr", 00:09:19.391 "nvmf_subsystem_get_listeners", 00:09:19.391 "nvmf_subsystem_get_qpairs", 00:09:19.391 "nvmf_subsystem_get_controllers", 00:09:19.391 "nvmf_get_stats", 00:09:19.391 "nvmf_get_transports", 00:09:19.391 "nvmf_create_transport", 00:09:19.391 "nvmf_get_targets", 00:09:19.391 "nvmf_delete_target", 00:09:19.391 "nvmf_create_target", 00:09:19.391 "nvmf_subsystem_allow_any_host", 00:09:19.391 "nvmf_subsystem_remove_host", 00:09:19.391 "nvmf_subsystem_add_host", 00:09:19.391 "nvmf_ns_remove_host", 00:09:19.391 "nvmf_ns_add_host", 00:09:19.391 "nvmf_subsystem_remove_ns", 00:09:19.391 "nvmf_subsystem_add_ns", 00:09:19.391 
"nvmf_subsystem_listener_set_ana_state", 00:09:19.391 "nvmf_discovery_get_referrals", 00:09:19.391 "nvmf_discovery_remove_referral", 00:09:19.391 "nvmf_discovery_add_referral", 00:09:19.391 "nvmf_subsystem_remove_listener", 00:09:19.391 "nvmf_subsystem_add_listener", 00:09:19.391 "nvmf_delete_subsystem", 00:09:19.391 "nvmf_create_subsystem", 00:09:19.391 "nvmf_get_subsystems", 00:09:19.391 "env_dpdk_get_mem_stats", 00:09:19.391 "nbd_get_disks", 00:09:19.391 "nbd_stop_disk", 00:09:19.391 "nbd_start_disk", 00:09:19.391 "ublk_recover_disk", 00:09:19.391 "ublk_get_disks", 00:09:19.391 "ublk_stop_disk", 00:09:19.391 "ublk_start_disk", 00:09:19.391 "ublk_destroy_target", 00:09:19.391 "ublk_create_target", 00:09:19.391 "virtio_blk_create_transport", 00:09:19.391 "virtio_blk_get_transports", 00:09:19.391 "vhost_controller_set_coalescing", 00:09:19.391 "vhost_get_controllers", 00:09:19.391 "vhost_delete_controller", 00:09:19.391 "vhost_create_blk_controller", 00:09:19.391 "vhost_scsi_controller_remove_target", 00:09:19.391 "vhost_scsi_controller_add_target", 00:09:19.391 "vhost_start_scsi_controller", 00:09:19.391 "vhost_create_scsi_controller", 00:09:19.391 "thread_set_cpumask", 00:09:19.391 "framework_get_governor", 00:09:19.391 "framework_get_scheduler", 00:09:19.391 "framework_set_scheduler", 00:09:19.391 "framework_get_reactors", 00:09:19.391 "thread_get_io_channels", 00:09:19.391 "thread_get_pollers", 00:09:19.391 "thread_get_stats", 00:09:19.391 "framework_monitor_context_switch", 00:09:19.391 "spdk_kill_instance", 00:09:19.391 "log_enable_timestamps", 00:09:19.391 "log_get_flags", 00:09:19.391 "log_clear_flag", 00:09:19.391 "log_set_flag", 00:09:19.391 "log_get_level", 00:09:19.391 "log_set_level", 00:09:19.391 "log_get_print_level", 00:09:19.391 "log_set_print_level", 00:09:19.391 "framework_enable_cpumask_locks", 00:09:19.391 "framework_disable_cpumask_locks", 00:09:19.391 "framework_wait_init", 00:09:19.391 "framework_start_init", 00:09:19.391 "scsi_get_devices", 
00:09:19.391 "bdev_get_histogram", 00:09:19.391 "bdev_enable_histogram", 00:09:19.391 "bdev_set_qos_limit", 00:09:19.391 "bdev_set_qd_sampling_period", 00:09:19.391 "bdev_get_bdevs", 00:09:19.391 "bdev_reset_iostat", 00:09:19.391 "bdev_get_iostat", 00:09:19.391 "bdev_examine", 00:09:19.391 "bdev_wait_for_examine", 00:09:19.391 "bdev_set_options", 00:09:19.391 "notify_get_notifications", 00:09:19.391 "notify_get_types", 00:09:19.391 "accel_get_stats", 00:09:19.391 "accel_set_options", 00:09:19.391 "accel_set_driver", 00:09:19.391 "accel_crypto_key_destroy", 00:09:19.391 "accel_crypto_keys_get", 00:09:19.391 "accel_crypto_key_create", 00:09:19.391 "accel_assign_opc", 00:09:19.391 "accel_get_module_info", 00:09:19.391 "accel_get_opc_assignments", 00:09:19.391 "vmd_rescan", 00:09:19.391 "vmd_remove_device", 00:09:19.391 "vmd_enable", 00:09:19.391 "sock_get_default_impl", 00:09:19.391 "sock_set_default_impl", 00:09:19.391 "sock_impl_set_options", 00:09:19.392 "sock_impl_get_options", 00:09:19.392 "iobuf_get_stats", 00:09:19.392 "iobuf_set_options", 00:09:19.392 "framework_get_pci_devices", 00:09:19.392 "framework_get_config", 00:09:19.392 "framework_get_subsystems", 00:09:19.392 "trace_get_info", 00:09:19.392 "trace_get_tpoint_group_mask", 00:09:19.392 "trace_disable_tpoint_group", 00:09:19.392 "trace_enable_tpoint_group", 00:09:19.392 "trace_clear_tpoint_mask", 00:09:19.392 "trace_set_tpoint_mask", 00:09:19.392 "keyring_get_keys", 00:09:19.392 "spdk_get_version", 00:09:19.392 "rpc_get_methods" 00:09:19.392 ] 00:09:19.654 05:37:34 spdkcli_tcp -- spdkcli/tcp.sh@35 -- # timing_exit run_spdk_tgt_tcp 00:09:19.654 05:37:34 spdkcli_tcp -- common/autotest_common.sh@728 -- # xtrace_disable 00:09:19.654 05:37:34 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:09:19.654 05:37:34 spdkcli_tcp -- spdkcli/tcp.sh@37 -- # trap - SIGINT SIGTERM EXIT 00:09:19.654 05:37:34 spdkcli_tcp -- spdkcli/tcp.sh@38 -- # killprocess 1094337 00:09:19.654 05:37:34 spdkcli_tcp -- 
common/autotest_common.sh@948 -- # '[' -z 1094337 ']' 00:09:19.654 05:37:34 spdkcli_tcp -- common/autotest_common.sh@952 -- # kill -0 1094337 00:09:19.654 05:37:34 spdkcli_tcp -- common/autotest_common.sh@953 -- # uname 00:09:19.654 05:37:34 spdkcli_tcp -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:09:19.654 05:37:34 spdkcli_tcp -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1094337 00:09:19.654 05:37:34 spdkcli_tcp -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:09:19.654 05:37:34 spdkcli_tcp -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:09:19.654 05:37:34 spdkcli_tcp -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1094337' 00:09:19.654 killing process with pid 1094337 00:09:19.654 05:37:34 spdkcli_tcp -- common/autotest_common.sh@967 -- # kill 1094337 00:09:19.654 05:37:34 spdkcli_tcp -- common/autotest_common.sh@972 -- # wait 1094337 00:09:19.914 00:09:19.914 real 0m1.692s 00:09:19.914 user 0m2.944s 00:09:19.914 sys 0m0.579s 00:09:19.914 05:37:34 spdkcli_tcp -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:19.914 05:37:34 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:09:19.914 ************************************ 00:09:19.914 END TEST spdkcli_tcp 00:09:19.914 ************************************ 00:09:19.914 05:37:34 -- common/autotest_common.sh@1142 -- # return 0 00:09:19.914 05:37:34 -- spdk/autotest.sh@180 -- # run_test dpdk_mem_utility /var/jenkins/workspace/crypto-phy-autotest/spdk/test/dpdk_memory_utility/test_dpdk_mem_info.sh 00:09:19.914 05:37:34 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:09:19.914 05:37:34 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:19.914 05:37:34 -- common/autotest_common.sh@10 -- # set +x 00:09:20.173 ************************************ 00:09:20.173 START TEST dpdk_mem_utility 00:09:20.173 ************************************ 00:09:20.173 05:37:34 dpdk_mem_utility -- 
common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/dpdk_memory_utility/test_dpdk_mem_info.sh 00:09:20.173 * Looking for test storage... 00:09:20.173 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/dpdk_memory_utility 00:09:20.173 05:37:34 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@10 -- # MEM_SCRIPT=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/dpdk_mem_info.py 00:09:20.173 05:37:34 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@13 -- # spdkpid=1094597 00:09:20.173 05:37:34 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@15 -- # waitforlisten 1094597 00:09:20.173 05:37:34 dpdk_mem_utility -- common/autotest_common.sh@829 -- # '[' -z 1094597 ']' 00:09:20.173 05:37:34 dpdk_mem_utility -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:09:20.173 05:37:34 dpdk_mem_utility -- common/autotest_common.sh@834 -- # local max_retries=100 00:09:20.173 05:37:34 dpdk_mem_utility -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:09:20.173 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:09:20.173 05:37:34 dpdk_mem_utility -- common/autotest_common.sh@838 -- # xtrace_disable 00:09:20.173 05:37:34 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:09:20.173 05:37:34 dpdk_mem_utility -- common/autotest_common.sh@10 -- # set +x 00:09:20.173 [2024-07-26 05:37:35.004693] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 
00:09:20.173 [2024-07-26 05:37:35.004765] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1094597 ] 00:09:20.432 [2024-07-26 05:37:35.133214] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:20.432 [2024-07-26 05:37:35.241028] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:20.999 05:37:35 dpdk_mem_utility -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:09:20.999 05:37:35 dpdk_mem_utility -- common/autotest_common.sh@862 -- # return 0 00:09:20.999 05:37:35 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@17 -- # trap 'killprocess $spdkpid' SIGINT SIGTERM EXIT 00:09:20.999 05:37:35 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@19 -- # rpc_cmd env_dpdk_get_mem_stats 00:09:20.999 05:37:35 dpdk_mem_utility -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:20.999 05:37:35 dpdk_mem_utility -- common/autotest_common.sh@10 -- # set +x 00:09:20.999 { 00:09:20.999 "filename": "/tmp/spdk_mem_dump.txt" 00:09:20.999 } 00:09:20.999 05:37:35 dpdk_mem_utility -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:20.999 05:37:35 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/dpdk_mem_info.py 00:09:21.263 DPDK memory size 816.000000 MiB in 2 heap(s) 00:09:21.263 2 heaps totaling size 816.000000 MiB 00:09:21.263 size: 814.000000 MiB heap id: 0 00:09:21.263 size: 2.000000 MiB heap id: 1 00:09:21.263 end heaps---------- 00:09:21.263 8 mempools totaling size 598.116089 MiB 00:09:21.263 size: 212.674988 MiB name: PDU_immediate_data_Pool 00:09:21.263 size: 158.602051 MiB name: PDU_data_out_Pool 00:09:21.263 size: 84.521057 MiB name: bdev_io_1094597 00:09:21.263 size: 51.011292 MiB name: evtpool_1094597 00:09:21.263 size: 
50.003479 MiB name: msgpool_1094597 00:09:21.263 size: 21.763794 MiB name: PDU_Pool 00:09:21.263 size: 19.513306 MiB name: SCSI_TASK_Pool 00:09:21.263 size: 0.026123 MiB name: Session_Pool 00:09:21.263 end mempools------- 00:09:21.263 201 memzones totaling size 4.176453 MiB 00:09:21.263 size: 1.000366 MiB name: RG_ring_0_1094597 00:09:21.263 size: 1.000366 MiB name: RG_ring_1_1094597 00:09:21.263 size: 1.000366 MiB name: RG_ring_4_1094597 00:09:21.263 size: 1.000366 MiB name: RG_ring_5_1094597 00:09:21.263 size: 0.125366 MiB name: RG_ring_2_1094597 00:09:21.263 size: 0.015991 MiB name: RG_ring_3_1094597 00:09:21.263 size: 0.001160 MiB name: QAT_SYM_CAPA_GEN_1 00:09:21.263 size: 0.000305 MiB name: 0000:3d:01.0_qat 00:09:21.263 size: 0.000305 MiB name: 0000:3d:01.1_qat 00:09:21.263 size: 0.000305 MiB name: 0000:3d:01.2_qat 00:09:21.263 size: 0.000305 MiB name: 0000:3d:01.3_qat 00:09:21.263 size: 0.000305 MiB name: 0000:3d:01.4_qat 00:09:21.263 size: 0.000305 MiB name: 0000:3d:01.5_qat 00:09:21.263 size: 0.000305 MiB name: 0000:3d:01.6_qat 00:09:21.263 size: 0.000305 MiB name: 0000:3d:01.7_qat 00:09:21.263 size: 0.000305 MiB name: 0000:3d:02.0_qat 00:09:21.263 size: 0.000305 MiB name: 0000:3d:02.1_qat 00:09:21.263 size: 0.000305 MiB name: 0000:3d:02.2_qat 00:09:21.263 size: 0.000305 MiB name: 0000:3d:02.3_qat 00:09:21.263 size: 0.000305 MiB name: 0000:3d:02.4_qat 00:09:21.263 size: 0.000305 MiB name: 0000:3d:02.5_qat 00:09:21.263 size: 0.000305 MiB name: 0000:3d:02.6_qat 00:09:21.263 size: 0.000305 MiB name: 0000:3d:02.7_qat 00:09:21.263 size: 0.000305 MiB name: 0000:3f:01.0_qat 00:09:21.263 size: 0.000305 MiB name: 0000:3f:01.1_qat 00:09:21.263 size: 0.000305 MiB name: 0000:3f:01.2_qat 00:09:21.263 size: 0.000305 MiB name: 0000:3f:01.3_qat 00:09:21.263 size: 0.000305 MiB name: 0000:3f:01.4_qat 00:09:21.263 size: 0.000305 MiB name: 0000:3f:01.5_qat 00:09:21.263 size: 0.000305 MiB name: 0000:3f:01.6_qat 00:09:21.263 size: 0.000305 MiB name: 0000:3f:01.7_qat 
00:09:21.263 size: 0.000305 MiB name: 0000:3f:02.0_qat 00:09:21.263 size: 0.000305 MiB name: 0000:3f:02.1_qat 00:09:21.263 size: 0.000305 MiB name: 0000:3f:02.2_qat 00:09:21.263 size: 0.000305 MiB name: 0000:3f:02.3_qat 00:09:21.263 size: 0.000305 MiB name: 0000:3f:02.4_qat 00:09:21.263 size: 0.000305 MiB name: 0000:3f:02.5_qat 00:09:21.263 size: 0.000305 MiB name: 0000:3f:02.6_qat 00:09:21.263 size: 0.000305 MiB name: 0000:3f:02.7_qat 00:09:21.263 size: 0.000305 MiB name: 0000:da:01.0_qat 00:09:21.263 size: 0.000305 MiB name: 0000:da:01.1_qat 00:09:21.263 size: 0.000305 MiB name: 0000:da:01.2_qat 00:09:21.263 size: 0.000305 MiB name: 0000:da:01.3_qat 00:09:21.263 size: 0.000305 MiB name: 0000:da:01.4_qat 00:09:21.263 size: 0.000305 MiB name: 0000:da:01.5_qat 00:09:21.263 size: 0.000305 MiB name: 0000:da:01.6_qat 00:09:21.263 size: 0.000305 MiB name: 0000:da:01.7_qat 00:09:21.263 size: 0.000305 MiB name: 0000:da:02.0_qat 00:09:21.263 size: 0.000305 MiB name: 0000:da:02.1_qat 00:09:21.263 size: 0.000305 MiB name: 0000:da:02.2_qat 00:09:21.263 size: 0.000305 MiB name: 0000:da:02.3_qat 00:09:21.263 size: 0.000305 MiB name: 0000:da:02.4_qat 00:09:21.263 size: 0.000305 MiB name: 0000:da:02.5_qat 00:09:21.263 size: 0.000305 MiB name: 0000:da:02.6_qat 00:09:21.263 size: 0.000305 MiB name: 0000:da:02.7_qat 00:09:21.263 size: 0.000183 MiB name: QAT_ASYM_CAPA_GEN_1 00:09:21.263 size: 0.000122 MiB name: rte_cryptodev_data_0 00:09:21.263 size: 0.000122 MiB name: rte_cryptodev_data_1 00:09:21.263 size: 0.000122 MiB name: rte_compressdev_data_0 00:09:21.263 size: 0.000122 MiB name: rte_cryptodev_data_2 00:09:21.263 size: 0.000122 MiB name: rte_cryptodev_data_3 00:09:21.263 size: 0.000122 MiB name: rte_compressdev_data_1 00:09:21.263 size: 0.000122 MiB name: rte_cryptodev_data_4 00:09:21.263 size: 0.000122 MiB name: rte_cryptodev_data_5 00:09:21.263 size: 0.000122 MiB name: rte_compressdev_data_2 00:09:21.264 size: 0.000122 MiB name: rte_cryptodev_data_6 00:09:21.264 size: 
0.000122 MiB name: rte_cryptodev_data_7 00:09:21.264 size: 0.000122 MiB name: rte_compressdev_data_3 00:09:21.264 size: 0.000122 MiB name: rte_cryptodev_data_8 00:09:21.264 size: 0.000122 MiB name: rte_cryptodev_data_9 00:09:21.264 size: 0.000122 MiB name: rte_compressdev_data_4 00:09:21.264 size: 0.000122 MiB name: rte_cryptodev_data_10 00:09:21.264 size: 0.000122 MiB name: rte_cryptodev_data_11 00:09:21.264 size: 0.000122 MiB name: rte_compressdev_data_5 00:09:21.264 size: 0.000122 MiB name: rte_cryptodev_data_12 00:09:21.264 size: 0.000122 MiB name: rte_cryptodev_data_13 00:09:21.264 size: 0.000122 MiB name: rte_compressdev_data_6 00:09:21.264 size: 0.000122 MiB name: rte_cryptodev_data_14 00:09:21.264 size: 0.000122 MiB name: rte_cryptodev_data_15 00:09:21.264 size: 0.000122 MiB name: rte_compressdev_data_7 00:09:21.264 size: 0.000122 MiB name: rte_cryptodev_data_16 00:09:21.264 size: 0.000122 MiB name: rte_cryptodev_data_17 00:09:21.264 size: 0.000122 MiB name: rte_compressdev_data_8 00:09:21.264 size: 0.000122 MiB name: rte_cryptodev_data_18 00:09:21.264 size: 0.000122 MiB name: rte_cryptodev_data_19 00:09:21.264 size: 0.000122 MiB name: rte_compressdev_data_9 00:09:21.264 size: 0.000122 MiB name: rte_cryptodev_data_20 00:09:21.264 size: 0.000122 MiB name: rte_cryptodev_data_21 00:09:21.264 size: 0.000122 MiB name: rte_compressdev_data_10 00:09:21.264 size: 0.000122 MiB name: rte_cryptodev_data_22 00:09:21.264 size: 0.000122 MiB name: rte_cryptodev_data_23 00:09:21.264 size: 0.000122 MiB name: rte_compressdev_data_11 00:09:21.264 size: 0.000122 MiB name: rte_cryptodev_data_24 00:09:21.264 size: 0.000122 MiB name: rte_cryptodev_data_25 00:09:21.264 size: 0.000122 MiB name: rte_compressdev_data_12 00:09:21.264 size: 0.000122 MiB name: rte_cryptodev_data_26 00:09:21.264 size: 0.000122 MiB name: rte_cryptodev_data_27 00:09:21.264 size: 0.000122 MiB name: rte_compressdev_data_13 00:09:21.264 size: 0.000122 MiB name: rte_cryptodev_data_28 00:09:21.264 size: 
0.000122 MiB name: rte_cryptodev_data_29 00:09:21.264 size: 0.000122 MiB name: rte_compressdev_data_14 00:09:21.264 size: 0.000122 MiB name: rte_cryptodev_data_30 00:09:21.264 size: 0.000122 MiB name: rte_cryptodev_data_31 00:09:21.264 size: 0.000122 MiB name: rte_compressdev_data_15 00:09:21.264 size: 0.000122 MiB name: rte_cryptodev_data_32 00:09:21.264 size: 0.000122 MiB name: rte_cryptodev_data_33 00:09:21.264 size: 0.000122 MiB name: rte_compressdev_data_16 00:09:21.264 size: 0.000122 MiB name: rte_cryptodev_data_34 00:09:21.264 size: 0.000122 MiB name: rte_cryptodev_data_35 00:09:21.264 size: 0.000122 MiB name: rte_compressdev_data_17 00:09:21.264 size: 0.000122 MiB name: rte_cryptodev_data_36 00:09:21.264 size: 0.000122 MiB name: rte_cryptodev_data_37 00:09:21.264 size: 0.000122 MiB name: rte_compressdev_data_18 00:09:21.264 size: 0.000122 MiB name: rte_cryptodev_data_38 00:09:21.264 size: 0.000122 MiB name: rte_cryptodev_data_39 00:09:21.264 size: 0.000122 MiB name: rte_compressdev_data_19 00:09:21.264 size: 0.000122 MiB name: rte_cryptodev_data_40 00:09:21.264 size: 0.000122 MiB name: rte_cryptodev_data_41 00:09:21.264 size: 0.000122 MiB name: rte_compressdev_data_20 00:09:21.264 size: 0.000122 MiB name: rte_cryptodev_data_42 00:09:21.264 size: 0.000122 MiB name: rte_cryptodev_data_43 00:09:21.264 size: 0.000122 MiB name: rte_compressdev_data_21 00:09:21.264 size: 0.000122 MiB name: rte_cryptodev_data_44 00:09:21.264 size: 0.000122 MiB name: rte_cryptodev_data_45 00:09:21.264 size: 0.000122 MiB name: rte_compressdev_data_22 00:09:21.264 size: 0.000122 MiB name: rte_cryptodev_data_46 00:09:21.264 size: 0.000122 MiB name: rte_cryptodev_data_47 00:09:21.264 size: 0.000122 MiB name: rte_compressdev_data_23 00:09:21.264 size: 0.000122 MiB name: rte_cryptodev_data_48 00:09:21.264 size: 0.000122 MiB name: rte_cryptodev_data_49 00:09:21.264 size: 0.000122 MiB name: rte_compressdev_data_24 00:09:21.264 size: 0.000122 MiB name: rte_cryptodev_data_50 00:09:21.264 
size: 0.000122 MiB name: rte_cryptodev_data_51 00:09:21.264 size: 0.000122 MiB name: rte_compressdev_data_25 00:09:21.264 size: 0.000122 MiB name: rte_cryptodev_data_52 00:09:21.264 size: 0.000122 MiB name: rte_cryptodev_data_53 00:09:21.264 size: 0.000122 MiB name: rte_compressdev_data_26 00:09:21.264 size: 0.000122 MiB name: rte_cryptodev_data_54 00:09:21.264 size: 0.000122 MiB name: rte_cryptodev_data_55 00:09:21.264 size: 0.000122 MiB name: rte_compressdev_data_27 00:09:21.264 size: 0.000122 MiB name: rte_cryptodev_data_56 00:09:21.264 size: 0.000122 MiB name: rte_cryptodev_data_57 00:09:21.264 size: 0.000122 MiB name: rte_compressdev_data_28 00:09:21.264 size: 0.000122 MiB name: rte_cryptodev_data_58 00:09:21.264 size: 0.000122 MiB name: rte_cryptodev_data_59 00:09:21.264 size: 0.000122 MiB name: rte_compressdev_data_29 00:09:21.264 size: 0.000122 MiB name: rte_cryptodev_data_60 00:09:21.264 size: 0.000122 MiB name: rte_cryptodev_data_61 00:09:21.264 size: 0.000122 MiB name: rte_compressdev_data_30 00:09:21.264 size: 0.000122 MiB name: rte_cryptodev_data_62 00:09:21.264 size: 0.000122 MiB name: rte_cryptodev_data_63 00:09:21.264 size: 0.000122 MiB name: rte_compressdev_data_31 00:09:21.264 size: 0.000122 MiB name: rte_cryptodev_data_64 00:09:21.264 size: 0.000122 MiB name: rte_cryptodev_data_65 00:09:21.264 size: 0.000122 MiB name: rte_compressdev_data_32 00:09:21.264 size: 0.000122 MiB name: rte_cryptodev_data_66 00:09:21.264 size: 0.000122 MiB name: rte_cryptodev_data_67 00:09:21.264 size: 0.000122 MiB name: rte_compressdev_data_33 00:09:21.264 size: 0.000122 MiB name: rte_cryptodev_data_68 00:09:21.264 size: 0.000122 MiB name: rte_cryptodev_data_69 00:09:21.264 size: 0.000122 MiB name: rte_compressdev_data_34 00:09:21.264 size: 0.000122 MiB name: rte_cryptodev_data_70 00:09:21.264 size: 0.000122 MiB name: rte_cryptodev_data_71 00:09:21.264 size: 0.000122 MiB name: rte_compressdev_data_35 00:09:21.264 size: 0.000122 MiB name: rte_cryptodev_data_72 
00:09:21.264 size: 0.000122 MiB name: rte_cryptodev_data_73 00:09:21.264 size: 0.000122 MiB name: rte_compressdev_data_36 00:09:21.264 size: 0.000122 MiB name: rte_cryptodev_data_74 00:09:21.264 size: 0.000122 MiB name: rte_cryptodev_data_75 00:09:21.264 size: 0.000122 MiB name: rte_compressdev_data_37 00:09:21.264 size: 0.000122 MiB name: rte_cryptodev_data_76 00:09:21.264 size: 0.000122 MiB name: rte_cryptodev_data_77 00:09:21.264 size: 0.000122 MiB name: rte_compressdev_data_38 00:09:21.264 size: 0.000122 MiB name: rte_cryptodev_data_78 00:09:21.264 size: 0.000122 MiB name: rte_cryptodev_data_79 00:09:21.264 size: 0.000122 MiB name: rte_compressdev_data_39 00:09:21.264 size: 0.000122 MiB name: rte_cryptodev_data_80 00:09:21.264 size: 0.000122 MiB name: rte_cryptodev_data_81 00:09:21.264 size: 0.000122 MiB name: rte_compressdev_data_40 00:09:21.264 size: 0.000122 MiB name: rte_cryptodev_data_82 00:09:21.264 size: 0.000122 MiB name: rte_cryptodev_data_83 00:09:21.264 size: 0.000122 MiB name: rte_compressdev_data_41 00:09:21.264 size: 0.000122 MiB name: rte_cryptodev_data_84 00:09:21.264 size: 0.000122 MiB name: rte_cryptodev_data_85 00:09:21.264 size: 0.000122 MiB name: rte_compressdev_data_42 00:09:21.264 size: 0.000122 MiB name: rte_cryptodev_data_86 00:09:21.264 size: 0.000122 MiB name: rte_cryptodev_data_87 00:09:21.264 size: 0.000122 MiB name: rte_compressdev_data_43 00:09:21.264 size: 0.000122 MiB name: rte_cryptodev_data_88 00:09:21.264 size: 0.000122 MiB name: rte_cryptodev_data_89 00:09:21.264 size: 0.000122 MiB name: rte_compressdev_data_44 00:09:21.264 size: 0.000122 MiB name: rte_cryptodev_data_90 00:09:21.264 size: 0.000122 MiB name: rte_cryptodev_data_91 00:09:21.264 size: 0.000122 MiB name: rte_compressdev_data_45 00:09:21.264 size: 0.000122 MiB name: rte_cryptodev_data_92 00:09:21.264 size: 0.000122 MiB name: rte_cryptodev_data_93 00:09:21.264 size: 0.000122 MiB name: rte_compressdev_data_46 00:09:21.264 size: 0.000122 MiB name: 
rte_cryptodev_data_94 00:09:21.264 size: 0.000122 MiB name: rte_cryptodev_data_95 00:09:21.264 size: 0.000122 MiB name: rte_compressdev_data_47 00:09:21.264 size: 0.000061 MiB name: QAT_COMP_CAPA_GEN_1 00:09:21.264 end memzones------- 00:09:21.264 05:37:36 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@23 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/dpdk_mem_info.py -m 0 00:09:21.264 heap id: 0 total size: 814.000000 MiB number of busy elements: 555 number of free elements: 14 00:09:21.264 list of free elements. size: 11.808228 MiB 00:09:21.264 element at address: 0x200000400000 with size: 1.999512 MiB 00:09:21.264 element at address: 0x200018e00000 with size: 0.999878 MiB 00:09:21.264 element at address: 0x200019000000 with size: 0.999878 MiB 00:09:21.265 element at address: 0x200003e00000 with size: 0.996460 MiB 00:09:21.265 element at address: 0x200031c00000 with size: 0.994446 MiB 00:09:21.265 element at address: 0x200013800000 with size: 0.978882 MiB 00:09:21.265 element at address: 0x200007000000 with size: 0.959839 MiB 00:09:21.265 element at address: 0x200019200000 with size: 0.937256 MiB 00:09:21.265 element at address: 0x20001aa00000 with size: 0.580322 MiB 00:09:21.265 element at address: 0x200003a00000 with size: 0.498535 MiB 00:09:21.265 element at address: 0x20000b200000 with size: 0.491272 MiB 00:09:21.265 element at address: 0x200000800000 with size: 0.486694 MiB 00:09:21.265 element at address: 0x200019400000 with size: 0.485840 MiB 00:09:21.265 element at address: 0x200027e00000 with size: 0.399414 MiB 00:09:21.265 list of standard malloc elements. 
size: 199.883484 MiB 00:09:21.265 element at address: 0x20000b3fff80 with size: 132.000122 MiB 00:09:21.265 element at address: 0x2000071fff80 with size: 64.000122 MiB 00:09:21.265 element at address: 0x200018efff80 with size: 1.000122 MiB 00:09:21.265 element at address: 0x2000190fff80 with size: 1.000122 MiB 00:09:21.265 element at address: 0x2000192fff80 with size: 1.000122 MiB 00:09:21.265 element at address: 0x2000003d9f00 with size: 0.140747 MiB 00:09:21.265 element at address: 0x2000192eff00 with size: 0.062622 MiB 00:09:21.265 element at address: 0x2000003fdf80 with size: 0.007935 MiB 00:09:21.265 element at address: 0x200000330b40 with size: 0.004395 MiB 00:09:21.265 element at address: 0x2000003340c0 with size: 0.004395 MiB 00:09:21.265 element at address: 0x200000337640 with size: 0.004395 MiB 00:09:21.265 element at address: 0x20000033abc0 with size: 0.004395 MiB 00:09:21.265 element at address: 0x20000033e140 with size: 0.004395 MiB 00:09:21.265 element at address: 0x2000003416c0 with size: 0.004395 MiB 00:09:21.265 element at address: 0x200000344c40 with size: 0.004395 MiB 00:09:21.265 element at address: 0x2000003481c0 with size: 0.004395 MiB 00:09:21.265 element at address: 0x20000034b740 with size: 0.004395 MiB 00:09:21.265 element at address: 0x20000034ecc0 with size: 0.004395 MiB 00:09:21.265 element at address: 0x200000352240 with size: 0.004395 MiB 00:09:21.265 element at address: 0x2000003557c0 with size: 0.004395 MiB 00:09:21.265 element at address: 0x200000358d40 with size: 0.004395 MiB 00:09:21.265 element at address: 0x20000035c2c0 with size: 0.004395 MiB 00:09:21.265 element at address: 0x20000035f840 with size: 0.004395 MiB 00:09:21.265 element at address: 0x200000362dc0 with size: 0.004395 MiB 00:09:21.265 element at address: 0x200000366880 with size: 0.004395 MiB 00:09:21.265 element at address: 0x20000036a340 with size: 0.004395 MiB 00:09:21.265 element at address: 0x20000036de00 with size: 0.004395 MiB 00:09:21.265 element at 
address: 0x2000003718c0 with size: 0.004395 MiB 00:09:21.265 element at address: 0x200000375380 with size: 0.004395 MiB 00:09:21.265 element at address: 0x200000378e40 with size: 0.004395 MiB 00:09:21.265 element at address: 0x20000037c900 with size: 0.004395 MiB 00:09:21.265 element at address: 0x2000003803c0 with size: 0.004395 MiB 00:09:21.265 element at address: 0x200000383e80 with size: 0.004395 MiB 00:09:21.265 element at address: 0x200000387940 with size: 0.004395 MiB 00:09:21.265 element at address: 0x20000038b400 with size: 0.004395 MiB 00:09:21.265 element at address: 0x20000038eec0 with size: 0.004395 MiB 00:09:21.265 element at address: 0x200000392980 with size: 0.004395 MiB 00:09:21.265 element at address: 0x200000396440 with size: 0.004395 MiB 00:09:21.265 element at address: 0x200000399f00 with size: 0.004395 MiB 00:09:21.265 element at address: 0x20000039d9c0 with size: 0.004395 MiB 00:09:21.265 element at address: 0x2000003a1480 with size: 0.004395 MiB 00:09:21.265 element at address: 0x2000003a4f40 with size: 0.004395 MiB 00:09:21.265 element at address: 0x2000003a8a00 with size: 0.004395 MiB 00:09:21.265 element at address: 0x2000003ac4c0 with size: 0.004395 MiB 00:09:21.265 element at address: 0x2000003aff80 with size: 0.004395 MiB 00:09:21.265 element at address: 0x2000003b3a40 with size: 0.004395 MiB 00:09:21.265 element at address: 0x2000003b7500 with size: 0.004395 MiB 00:09:21.265 element at address: 0x2000003bafc0 with size: 0.004395 MiB 00:09:21.265 element at address: 0x2000003bea80 with size: 0.004395 MiB 00:09:21.265 element at address: 0x2000003c2540 with size: 0.004395 MiB 00:09:21.265 element at address: 0x2000003c6000 with size: 0.004395 MiB 00:09:21.265 element at address: 0x2000003c9ac0 with size: 0.004395 MiB 00:09:21.265 element at address: 0x2000003cd580 with size: 0.004395 MiB 00:09:21.265 element at address: 0x2000003d1040 with size: 0.004395 MiB 00:09:21.265 element at address: 0x2000003d4b00 with size: 0.004395 MiB 
00:09:21.265 element at address: 0x2000003d8d00 with size: 0.004395 MiB 00:09:21.265 element at address: 0x20000032ea40 with size: 0.004028 MiB 00:09:21.265 element at address: 0x20000032fac0 with size: 0.004028 MiB 00:09:21.265 element at address: 0x200000331fc0 with size: 0.004028 MiB 00:09:21.265 element at address: 0x200000333040 with size: 0.004028 MiB 00:09:21.265 element at address: 0x200000335540 with size: 0.004028 MiB 00:09:21.265 element at address: 0x2000003365c0 with size: 0.004028 MiB 00:09:21.265 element at address: 0x200000338ac0 with size: 0.004028 MiB 00:09:21.265 element at address: 0x200000339b40 with size: 0.004028 MiB 00:09:21.265 element at address: 0x20000033c040 with size: 0.004028 MiB 00:09:21.265 element at address: 0x20000033d0c0 with size: 0.004028 MiB 00:09:21.265 element at address: 0x20000033f5c0 with size: 0.004028 MiB 00:09:21.265 element at address: 0x200000340640 with size: 0.004028 MiB 00:09:21.265 element at address: 0x200000342b40 with size: 0.004028 MiB 00:09:21.265 element at address: 0x200000343bc0 with size: 0.004028 MiB 00:09:21.265 element at address: 0x2000003460c0 with size: 0.004028 MiB 00:09:21.265 element at address: 0x200000347140 with size: 0.004028 MiB 00:09:21.265 element at address: 0x200000349640 with size: 0.004028 MiB 00:09:21.265 element at address: 0x20000034a6c0 with size: 0.004028 MiB 00:09:21.265 element at address: 0x20000034cbc0 with size: 0.004028 MiB 00:09:21.265 element at address: 0x20000034dc40 with size: 0.004028 MiB 00:09:21.265 element at address: 0x200000350140 with size: 0.004028 MiB 00:09:21.265 element at address: 0x2000003511c0 with size: 0.004028 MiB 00:09:21.265 element at address: 0x2000003536c0 with size: 0.004028 MiB 00:09:21.265 element at address: 0x200000354740 with size: 0.004028 MiB 00:09:21.265 element at address: 0x200000356c40 with size: 0.004028 MiB 00:09:21.265 element at address: 0x200000357cc0 with size: 0.004028 MiB 00:09:21.265 element at address: 0x20000035a1c0 with 
size: 0.004028 MiB 00:09:21.265 element at address: 0x20000035b240 with size: 0.004028 MiB 00:09:21.266 element at address: 0x20000035d740 with size: 0.004028 MiB 00:09:21.266 element at address: 0x20000035e7c0 with size: 0.004028 MiB 00:09:21.266 element at address: 0x200000360cc0 with size: 0.004028 MiB 00:09:21.266 element at address: 0x200000361d40 with size: 0.004028 MiB 00:09:21.266 element at address: 0x200000364780 with size: 0.004028 MiB 00:09:21.266 element at address: 0x200000365800 with size: 0.004028 MiB 00:09:21.266 element at address: 0x200000368240 with size: 0.004028 MiB 00:09:21.266 element at address: 0x2000003692c0 with size: 0.004028 MiB 00:09:21.266 element at address: 0x20000036bd00 with size: 0.004028 MiB 00:09:21.266 element at address: 0x20000036cd80 with size: 0.004028 MiB 00:09:21.266 element at address: 0x20000036f7c0 with size: 0.004028 MiB 00:09:21.266 element at address: 0x200000370840 with size: 0.004028 MiB 00:09:21.266 element at address: 0x200000373280 with size: 0.004028 MiB 00:09:21.266 element at address: 0x200000374300 with size: 0.004028 MiB 00:09:21.266 element at address: 0x200000376d40 with size: 0.004028 MiB 00:09:21.266 element at address: 0x200000377dc0 with size: 0.004028 MiB 00:09:21.266 element at address: 0x20000037a800 with size: 0.004028 MiB 00:09:21.266 element at address: 0x20000037b880 with size: 0.004028 MiB 00:09:21.266 element at address: 0x20000037e2c0 with size: 0.004028 MiB 00:09:21.266 element at address: 0x20000037f340 with size: 0.004028 MiB 00:09:21.266 element at address: 0x200000381d80 with size: 0.004028 MiB 00:09:21.266 element at address: 0x200000382e00 with size: 0.004028 MiB 00:09:21.266 element at address: 0x200000385840 with size: 0.004028 MiB 00:09:21.266 element at address: 0x2000003868c0 with size: 0.004028 MiB 00:09:21.266 element at address: 0x200000389300 with size: 0.004028 MiB 00:09:21.266 element at address: 0x20000038a380 with size: 0.004028 MiB 00:09:21.266 element at address: 
0x20000038cdc0 with size: 0.004028 MiB 00:09:21.266 element at address: 0x20000038de40 with size: 0.004028 MiB 00:09:21.266 element at address: 0x200000390880 with size: 0.004028 MiB 00:09:21.266 element at address: 0x200000391900 with size: 0.004028 MiB 00:09:21.266 element at address: 0x200000394340 with size: 0.004028 MiB 00:09:21.266 element at address: 0x2000003953c0 with size: 0.004028 MiB 00:09:21.266 element at address: 0x200000397e00 with size: 0.004028 MiB 00:09:21.266 element at address: 0x200000398e80 with size: 0.004028 MiB 00:09:21.266 element at address: 0x20000039b8c0 with size: 0.004028 MiB 00:09:21.266 element at address: 0x20000039c940 with size: 0.004028 MiB 00:09:21.266 element at address: 0x20000039f380 with size: 0.004028 MiB 00:09:21.266 element at address: 0x2000003a0400 with size: 0.004028 MiB 00:09:21.266 element at address: 0x2000003a2e40 with size: 0.004028 MiB 00:09:21.266 element at address: 0x2000003a3ec0 with size: 0.004028 MiB 00:09:21.266 element at address: 0x2000003a6900 with size: 0.004028 MiB 00:09:21.266 element at address: 0x2000003a7980 with size: 0.004028 MiB 00:09:21.266 element at address: 0x2000003aa3c0 with size: 0.004028 MiB 00:09:21.266 element at address: 0x2000003ab440 with size: 0.004028 MiB 00:09:21.266 element at address: 0x2000003ade80 with size: 0.004028 MiB 00:09:21.266 element at address: 0x2000003aef00 with size: 0.004028 MiB 00:09:21.266 element at address: 0x2000003b1940 with size: 0.004028 MiB 00:09:21.266 element at address: 0x2000003b29c0 with size: 0.004028 MiB 00:09:21.266 element at address: 0x2000003b5400 with size: 0.004028 MiB 00:09:21.266 element at address: 0x2000003b6480 with size: 0.004028 MiB 00:09:21.266 element at address: 0x2000003b8ec0 with size: 0.004028 MiB 00:09:21.266 element at address: 0x2000003b9f40 with size: 0.004028 MiB 00:09:21.266 element at address: 0x2000003bc980 with size: 0.004028 MiB 00:09:21.266 element at address: 0x2000003bda00 with size: 0.004028 MiB 00:09:21.266 
element at address: 0x2000003c0440 with size: 0.004028 MiB 00:09:21.266 element at address: 0x2000003c14c0 with size: 0.004028 MiB 00:09:21.266 element at address: 0x2000003c3f00 with size: 0.004028 MiB 00:09:21.266 element at address: 0x2000003c4f80 with size: 0.004028 MiB 00:09:21.266 element at address: 0x2000003c79c0 with size: 0.004028 MiB 00:09:21.266 element at address: 0x2000003c8a40 with size: 0.004028 MiB 00:09:21.266 element at address: 0x2000003cb480 with size: 0.004028 MiB 00:09:21.266 element at address: 0x2000003cc500 with size: 0.004028 MiB 00:09:21.266 element at address: 0x2000003cef40 with size: 0.004028 MiB 00:09:21.266 element at address: 0x2000003cffc0 with size: 0.004028 MiB 00:09:21.266 element at address: 0x2000003d2a00 with size: 0.004028 MiB 00:09:21.266 element at address: 0x2000003d3a80 with size: 0.004028 MiB 00:09:21.266 element at address: 0x2000003d6c00 with size: 0.004028 MiB 00:09:21.266 element at address: 0x2000003d7c80 with size: 0.004028 MiB 00:09:21.266 element at address: 0x200000204d40 with size: 0.000305 MiB 00:09:21.266 element at address: 0x200000200000 with size: 0.000183 MiB 00:09:21.266 element at address: 0x2000002000c0 with size: 0.000183 MiB 00:09:21.266 element at address: 0x200000200180 with size: 0.000183 MiB 00:09:21.266 element at address: 0x200000200240 with size: 0.000183 MiB 00:09:21.266 element at address: 0x200000200300 with size: 0.000183 MiB 00:09:21.266 element at address: 0x2000002003c0 with size: 0.000183 MiB 00:09:21.266 element at address: 0x200000200480 with size: 0.000183 MiB 00:09:21.266 element at address: 0x200000200540 with size: 0.000183 MiB 00:09:21.266 element at address: 0x200000200600 with size: 0.000183 MiB 00:09:21.266 element at address: 0x2000002006c0 with size: 0.000183 MiB 00:09:21.266 element at address: 0x200000200780 with size: 0.000183 MiB 00:09:21.266 element at address: 0x200000200840 with size: 0.000183 MiB 00:09:21.266 element at address: 0x200000200900 with size: 0.000183 
00:09:21.266 [malloc heap element dump condensed: several hundred entries of the form "element at address: 0x... with size: 0.000183 MiB", covering addresses 0x2000002009c0 through 0x200027e6ff00]
00:09:21.271 list of memzone associated elements. size: 602.308289 MiB
00:09:21.271 element at address: 0x20001aa95500 with size: 211.416748 MiB
00:09:21.271 associated memzone info: size: 211.416626 MiB name: MP_PDU_immediate_data_Pool_0
00:09:21.271 element at address: 0x200027e6ffc0 with size: 157.562561 MiB
00:09:21.271 associated memzone info: size: 157.562439 MiB name: MP_PDU_data_out_Pool_0
00:09:21.271 element at address: 0x2000139fab80 with size: 84.020630 MiB
00:09:21.271 associated memzone info: size: 84.020508 MiB name: MP_bdev_io_1094597_0
00:09:21.271 element at address: 0x2000009ff380 with size: 48.003052 MiB
00:09:21.271 associated memzone info: size: 48.002930 MiB name: MP_evtpool_1094597_0
00:09:21.271 element at address: 0x200003fff380 with size: 48.003052 MiB
00:09:21.271 associated memzone info: size: 48.002930 MiB name: MP_msgpool_1094597_0
00:09:21.271 element at address: 0x2000195be940 with size: 20.255554 MiB
00:09:21.271 associated memzone info: size: 20.255432 MiB name: MP_PDU_Pool_0
00:09:21.271 element at address: 0x200031dfeb40 with size: 18.005066 MiB
00:09:21.271 associated memzone info: size: 18.004944 MiB name: MP_SCSI_TASK_Pool_0
00:09:21.271 element at address: 0x2000005ffe00 with size: 2.000488 MiB
00:09:21.271 associated memzone info: size: 2.000366 MiB name: RG_MP_evtpool_1094597
00:09:21.271 element at address: 0x200003bffe00 with size: 2.000488 MiB
00:09:21.271 associated memzone info: size: 2.000366 MiB name: RG_MP_msgpool_1094597
00:09:21.271 element at address: 0x20000022c5c0 with size: 1.008118 MiB
00:09:21.271 associated memzone info: size: 1.007996 MiB name: MP_evtpool_1094597
00:09:21.271 element at address: 0x20000b2fde40 with size: 1.008118 MiB
00:09:21.271 associated memzone info: size: 1.007996 MiB name: MP_PDU_Pool
00:09:21.271 element at address: 0x2000194bc800 with size: 1.008118 MiB
00:09:21.271 associated memzone info: size: 1.007996 MiB name: MP_PDU_immediate_data_Pool
00:09:21.271 element at address: 0x2000070fde40 with size: 1.008118 MiB
00:09:21.271 associated memzone info: size: 1.007996 MiB name: MP_PDU_data_out_Pool
00:09:21.271 element at address: 0x2000008fd240 with size: 1.008118 MiB
00:09:21.271 associated memzone info: size: 1.007996 MiB name: MP_SCSI_TASK_Pool
00:09:21.271 element at address: 0x200003eff180 with size: 1.000488 MiB
00:09:21.271 associated memzone info: size: 1.000366 MiB name: RG_ring_0_1094597
00:09:21.271 element at address: 0x200003affc00 with size: 1.000488 MiB
00:09:21.271 associated memzone info: size: 1.000366 MiB name: RG_ring_1_1094597
00:09:21.271 element at address: 0x2000138fa980 with size: 1.000488 MiB
00:09:21.271 associated memzone info: size: 1.000366 MiB name: RG_ring_4_1094597
00:09:21.271 element at address: 0x200031cfe940 with size: 1.000488 MiB
00:09:21.271 associated memzone info: size: 1.000366 MiB name: RG_ring_5_1094597
00:09:21.271 element at address: 0x200003a7fa00 with size: 0.500488 MiB
00:09:21.271 associated memzone info: size: 0.500366 MiB name: RG_MP_bdev_io_1094597
00:09:21.271 element at address: 0x20000b27dc40 with size: 0.500488 MiB
00:09:21.271 associated memzone info: size: 0.500366 MiB name: RG_MP_PDU_Pool
00:09:21.271 element at address: 0x20000087cf80 with size: 0.500488 MiB
00:09:21.271 associated memzone info: size: 0.500366 MiB name: RG_MP_SCSI_TASK_Pool
00:09:21.271 element at address: 0x20001947c600 with size: 0.250488 MiB
00:09:21.271 associated memzone info: size: 0.250366 MiB name: RG_MP_PDU_immediate_data_Pool
00:09:21.271 element at address: 0x20000020a840 with size: 0.125488 MiB
00:09:21.271 associated memzone info: size: 0.125366 MiB name: RG_ring_2_1094597
00:09:21.271 element at address: 0x2000070f5b80 with size: 0.031738 MiB
00:09:21.271 associated memzone info: size: 0.031616 MiB name: RG_MP_PDU_data_out_Pool
00:09:21.271 element at address: 0x200027e66580 with size: 0.023743 MiB
00:09:21.271 associated memzone info: size: 0.023621 MiB name: MP_Session_Pool_0
00:09:21.271 element at address: 0x200000206580 with size: 0.016113 MiB
00:09:21.271 associated memzone info: size: 0.015991 MiB name: RG_ring_3_1094597
00:09:21.271 element at address: 0x200027e6c6c0 with size: 0.002441 MiB
00:09:21.271 associated memzone info: size: 0.002319 MiB name: RG_MP_Session_Pool
00:09:21.271 element at address: 0x2000003d5f80 with size: 0.001282 MiB
00:09:21.271 associated memzone info: size: 0.001160 MiB name: QAT_SYM_CAPA_GEN_1
00:09:21.271 [48 QAT device memzone entries condensed: one "element at address: 0x... with size: 0.000427 MiB" / "associated memzone info: size: 0.000305 MiB" pair per device function, names 0000:3d:01.0_qat through 0000:3d:02.7_qat, 0000:3f:01.0_qat through 0000:3f:02.7_qat, and 0000:da:01.0_qat through 0000:da:02.7_qat]
00:09:21.272 element at address: 0x2000003d6740 with size: 0.000305 MiB
00:09:21.272 associated memzone info: size: 0.000183 MiB name: QAT_ASYM_CAPA_GEN_1
00:09:21.272 element
at address: 0x20000022b7c0 with size: 0.000305 MiB 00:09:21.272 associated memzone info: size: 0.000183 MiB name: MP_msgpool_1094597 00:09:21.272 element at address: 0x200000206380 with size: 0.000305 MiB 00:09:21.272 associated memzone info: size: 0.000183 MiB name: MP_bdev_io_1094597 00:09:21.272 element at address: 0x200027e6d180 with size: 0.000305 MiB 00:09:21.272 associated memzone info: size: 0.000183 MiB name: MP_Session_Pool 00:09:21.272 element at address: 0x2000003d6940 with size: 0.000244 MiB 00:09:21.272 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_0 00:09:21.272 element at address: 0x2000003d6640 with size: 0.000244 MiB 00:09:21.272 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_1 00:09:21.272 element at address: 0x2000003d5e80 with size: 0.000244 MiB 00:09:21.272 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_0 00:09:21.272 element at address: 0x2000003d2740 with size: 0.000244 MiB 00:09:21.272 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_2 00:09:21.272 element at address: 0x2000003d2580 with size: 0.000244 MiB 00:09:21.272 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_3 00:09:21.272 element at address: 0x2000003d2300 with size: 0.000244 MiB 00:09:21.272 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_1 00:09:21.272 element at address: 0x2000003cec80 with size: 0.000244 MiB 00:09:21.272 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_4 00:09:21.272 element at address: 0x2000003ceac0 with size: 0.000244 MiB 00:09:21.272 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_5 00:09:21.272 element at address: 0x2000003ce840 with size: 0.000244 MiB 00:09:21.272 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_2 00:09:21.272 element at address: 0x2000003cb1c0 with size: 0.000244 MiB 00:09:21.272 associated memzone info: size: 0.000122 MiB name: 
rte_cryptodev_data_6 00:09:21.272 element at address: 0x2000003cb000 with size: 0.000244 MiB 00:09:21.272 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_7 00:09:21.272 element at address: 0x2000003cad80 with size: 0.000244 MiB 00:09:21.272 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_3 00:09:21.272 element at address: 0x2000003c7700 with size: 0.000244 MiB 00:09:21.272 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_8 00:09:21.272 element at address: 0x2000003c7540 with size: 0.000244 MiB 00:09:21.272 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_9 00:09:21.272 element at address: 0x2000003c72c0 with size: 0.000244 MiB 00:09:21.272 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_4 00:09:21.272 element at address: 0x2000003c3c40 with size: 0.000244 MiB 00:09:21.272 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_10 00:09:21.272 element at address: 0x2000003c3a80 with size: 0.000244 MiB 00:09:21.272 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_11 00:09:21.272 element at address: 0x2000003c3800 with size: 0.000244 MiB 00:09:21.272 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_5 00:09:21.272 element at address: 0x2000003c0180 with size: 0.000244 MiB 00:09:21.272 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_12 00:09:21.272 element at address: 0x2000003bffc0 with size: 0.000244 MiB 00:09:21.272 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_13 00:09:21.272 element at address: 0x2000003bfd40 with size: 0.000244 MiB 00:09:21.272 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_6 00:09:21.272 element at address: 0x2000003bc6c0 with size: 0.000244 MiB 00:09:21.272 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_14 00:09:21.272 element at address: 0x2000003bc500 with size: 0.000244 MiB 00:09:21.272 
associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_15 00:09:21.272 element at address: 0x2000003bc280 with size: 0.000244 MiB 00:09:21.273 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_7 00:09:21.273 element at address: 0x2000003b8c00 with size: 0.000244 MiB 00:09:21.273 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_16 00:09:21.273 element at address: 0x2000003b8a40 with size: 0.000244 MiB 00:09:21.273 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_17 00:09:21.273 element at address: 0x2000003b87c0 with size: 0.000244 MiB 00:09:21.273 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_8 00:09:21.273 element at address: 0x2000003b5140 with size: 0.000244 MiB 00:09:21.273 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_18 00:09:21.273 element at address: 0x2000003b4f80 with size: 0.000244 MiB 00:09:21.273 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_19 00:09:21.273 element at address: 0x2000003b4d00 with size: 0.000244 MiB 00:09:21.273 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_9 00:09:21.273 element at address: 0x2000003b1680 with size: 0.000244 MiB 00:09:21.273 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_20 00:09:21.273 element at address: 0x2000003b14c0 with size: 0.000244 MiB 00:09:21.273 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_21 00:09:21.273 element at address: 0x2000003b1240 with size: 0.000244 MiB 00:09:21.273 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_10 00:09:21.273 element at address: 0x2000003adbc0 with size: 0.000244 MiB 00:09:21.273 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_22 00:09:21.273 element at address: 0x2000003ada00 with size: 0.000244 MiB 00:09:21.273 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_23 00:09:21.273 element at address: 
0x2000003ad780 with size: 0.000244 MiB 00:09:21.273 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_11 00:09:21.273 element at address: 0x2000003aa100 with size: 0.000244 MiB 00:09:21.273 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_24 00:09:21.273 element at address: 0x2000003a9f40 with size: 0.000244 MiB 00:09:21.273 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_25 00:09:21.273 element at address: 0x2000003a9cc0 with size: 0.000244 MiB 00:09:21.273 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_12 00:09:21.273 element at address: 0x2000003a6640 with size: 0.000244 MiB 00:09:21.273 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_26 00:09:21.273 element at address: 0x2000003a6480 with size: 0.000244 MiB 00:09:21.273 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_27 00:09:21.273 element at address: 0x2000003a6200 with size: 0.000244 MiB 00:09:21.273 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_13 00:09:21.273 element at address: 0x2000003a2b80 with size: 0.000244 MiB 00:09:21.273 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_28 00:09:21.273 element at address: 0x2000003a29c0 with size: 0.000244 MiB 00:09:21.273 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_29 00:09:21.273 element at address: 0x2000003a2740 with size: 0.000244 MiB 00:09:21.273 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_14 00:09:21.273 element at address: 0x20000039f0c0 with size: 0.000244 MiB 00:09:21.273 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_30 00:09:21.273 element at address: 0x20000039ef00 with size: 0.000244 MiB 00:09:21.273 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_31 00:09:21.273 element at address: 0x20000039ec80 with size: 0.000244 MiB 00:09:21.273 associated memzone info: size: 0.000122 MiB name: 
rte_compressdev_data_15 00:09:21.273 element at address: 0x20000039b600 with size: 0.000244 MiB 00:09:21.273 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_32 00:09:21.273 element at address: 0x20000039b440 with size: 0.000244 MiB 00:09:21.273 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_33 00:09:21.273 element at address: 0x20000039b1c0 with size: 0.000244 MiB 00:09:21.273 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_16 00:09:21.273 element at address: 0x200000397b40 with size: 0.000244 MiB 00:09:21.273 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_34 00:09:21.273 element at address: 0x200000397980 with size: 0.000244 MiB 00:09:21.273 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_35 00:09:21.273 element at address: 0x200000397700 with size: 0.000244 MiB 00:09:21.273 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_17 00:09:21.273 element at address: 0x200000394080 with size: 0.000244 MiB 00:09:21.273 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_36 00:09:21.273 element at address: 0x200000393ec0 with size: 0.000244 MiB 00:09:21.273 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_37 00:09:21.273 element at address: 0x200000393c40 with size: 0.000244 MiB 00:09:21.273 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_18 00:09:21.273 element at address: 0x2000003905c0 with size: 0.000244 MiB 00:09:21.273 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_38 00:09:21.273 element at address: 0x200000390400 with size: 0.000244 MiB 00:09:21.273 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_39 00:09:21.273 element at address: 0x200000390180 with size: 0.000244 MiB 00:09:21.273 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_19 00:09:21.273 element at address: 0x20000038cb00 with size: 0.000244 MiB 
00:09:21.273 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_40 00:09:21.273 element at address: 0x20000038c940 with size: 0.000244 MiB 00:09:21.273 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_41 00:09:21.273 element at address: 0x20000038c6c0 with size: 0.000244 MiB 00:09:21.273 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_20 00:09:21.273 element at address: 0x200000389040 with size: 0.000244 MiB 00:09:21.273 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_42 00:09:21.273 element at address: 0x200000388e80 with size: 0.000244 MiB 00:09:21.273 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_43 00:09:21.273 element at address: 0x200000388c00 with size: 0.000244 MiB 00:09:21.273 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_21 00:09:21.273 element at address: 0x200000385580 with size: 0.000244 MiB 00:09:21.273 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_44 00:09:21.273 element at address: 0x2000003853c0 with size: 0.000244 MiB 00:09:21.273 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_45 00:09:21.273 element at address: 0x200000385140 with size: 0.000244 MiB 00:09:21.273 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_22 00:09:21.273 element at address: 0x200000381ac0 with size: 0.000244 MiB 00:09:21.273 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_46 00:09:21.273 element at address: 0x200000381900 with size: 0.000244 MiB 00:09:21.273 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_47 00:09:21.273 element at address: 0x200000381680 with size: 0.000244 MiB 00:09:21.273 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_23 00:09:21.273 element at address: 0x20000037e000 with size: 0.000244 MiB 00:09:21.273 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_48 00:09:21.273 element 
at address: 0x20000037de40 with size: 0.000244 MiB 00:09:21.273 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_49 00:09:21.273 element at address: 0x20000037dbc0 with size: 0.000244 MiB 00:09:21.273 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_24 00:09:21.273 element at address: 0x20000037a540 with size: 0.000244 MiB 00:09:21.273 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_50 00:09:21.273 element at address: 0x20000037a380 with size: 0.000244 MiB 00:09:21.273 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_51 00:09:21.273 element at address: 0x20000037a100 with size: 0.000244 MiB 00:09:21.273 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_25 00:09:21.273 element at address: 0x200000376a80 with size: 0.000244 MiB 00:09:21.274 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_52 00:09:21.274 element at address: 0x2000003768c0 with size: 0.000244 MiB 00:09:21.274 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_53 00:09:21.274 element at address: 0x200000376640 with size: 0.000244 MiB 00:09:21.274 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_26 00:09:21.274 element at address: 0x200000372fc0 with size: 0.000244 MiB 00:09:21.274 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_54 00:09:21.274 element at address: 0x200000372e00 with size: 0.000244 MiB 00:09:21.274 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_55 00:09:21.274 element at address: 0x200000372b80 with size: 0.000244 MiB 00:09:21.274 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_27 00:09:21.274 element at address: 0x20000036f500 with size: 0.000244 MiB 00:09:21.274 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_56 00:09:21.274 element at address: 0x20000036f340 with size: 0.000244 MiB 00:09:21.274 associated memzone info: size: 0.000122 MiB 
name: rte_cryptodev_data_57 00:09:21.274 element at address: 0x20000036f0c0 with size: 0.000244 MiB 00:09:21.274 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_28 00:09:21.274 element at address: 0x20000036ba40 with size: 0.000244 MiB 00:09:21.274 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_58 00:09:21.274 element at address: 0x20000036b880 with size: 0.000244 MiB 00:09:21.274 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_59 00:09:21.274 element at address: 0x20000036b600 with size: 0.000244 MiB 00:09:21.274 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_29 00:09:21.274 element at address: 0x200000367f80 with size: 0.000244 MiB 00:09:21.274 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_60 00:09:21.274 element at address: 0x200000367dc0 with size: 0.000244 MiB 00:09:21.274 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_61 00:09:21.274 element at address: 0x200000367b40 with size: 0.000244 MiB 00:09:21.274 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_30 00:09:21.274 element at address: 0x2000003644c0 with size: 0.000244 MiB 00:09:21.274 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_62 00:09:21.274 element at address: 0x200000364300 with size: 0.000244 MiB 00:09:21.274 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_63 00:09:21.274 element at address: 0x200000364080 with size: 0.000244 MiB 00:09:21.274 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_31 00:09:21.274 element at address: 0x2000003d5d00 with size: 0.000183 MiB 00:09:21.274 associated memzone info: size: 0.000061 MiB name: QAT_COMP_CAPA_GEN_1 00:09:21.274 05:37:36 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@25 -- # trap - SIGINT SIGTERM EXIT 00:09:21.274 05:37:36 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@26 -- # killprocess 1094597 
00:09:21.274 05:37:36 dpdk_mem_utility -- common/autotest_common.sh@948 -- # '[' -z 1094597 ']' 00:09:21.274 05:37:36 dpdk_mem_utility -- common/autotest_common.sh@952 -- # kill -0 1094597 00:09:21.274 05:37:36 dpdk_mem_utility -- common/autotest_common.sh@953 -- # uname 00:09:21.274 05:37:36 dpdk_mem_utility -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:09:21.274 05:37:36 dpdk_mem_utility -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1094597 00:09:21.533 05:37:36 dpdk_mem_utility -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:09:21.533 05:37:36 dpdk_mem_utility -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:09:21.533 05:37:36 dpdk_mem_utility -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1094597' 00:09:21.533 killing process with pid 1094597 00:09:21.533 05:37:36 dpdk_mem_utility -- common/autotest_common.sh@967 -- # kill 1094597 00:09:21.533 05:37:36 dpdk_mem_utility -- common/autotest_common.sh@972 -- # wait 1094597 00:09:21.792 00:09:21.792 real 0m1.742s 00:09:21.792 user 0m1.919s 00:09:21.792 sys 0m0.529s 00:09:21.792 05:37:36 dpdk_mem_utility -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:21.792 05:37:36 dpdk_mem_utility -- common/autotest_common.sh@10 -- # set +x 00:09:21.792 ************************************ 00:09:21.792 END TEST dpdk_mem_utility 00:09:21.792 ************************************ 00:09:21.792 05:37:36 -- common/autotest_common.sh@1142 -- # return 0 00:09:21.792 05:37:36 -- spdk/autotest.sh@181 -- # run_test event /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/event.sh 00:09:21.792 05:37:36 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:09:21.792 05:37:36 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:21.792 05:37:36 -- common/autotest_common.sh@10 -- # set +x 00:09:21.792 ************************************ 00:09:21.792 START TEST event 00:09:21.792 ************************************ 00:09:21.792 
05:37:36 event -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/event.sh 00:09:22.050 * Looking for test storage... 00:09:22.050 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event 00:09:22.050 05:37:36 event -- event/event.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:09:22.050 05:37:36 event -- bdev/nbd_common.sh@6 -- # set -e 00:09:22.050 05:37:36 event -- event/event.sh@45 -- # run_test event_perf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/event_perf/event_perf -m 0xF -t 1 00:09:22.050 05:37:36 event -- common/autotest_common.sh@1099 -- # '[' 6 -le 1 ']' 00:09:22.050 05:37:36 event -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:22.050 05:37:36 event -- common/autotest_common.sh@10 -- # set +x 00:09:22.050 ************************************ 00:09:22.050 START TEST event_perf 00:09:22.050 ************************************ 00:09:22.050 05:37:36 event.event_perf -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/event_perf/event_perf -m 0xF -t 1 00:09:22.050 Running I/O for 1 seconds...[2024-07-26 05:37:36.817955] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 
00:09:22.050 [2024-07-26 05:37:36.818023] [ DPDK EAL parameters: event_perf --no-shconf -c 0xF --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1094984 ] 00:09:22.050 [2024-07-26 05:37:36.948239] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 4 00:09:22.309 [2024-07-26 05:37:37.053826] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:09:22.309 [2024-07-26 05:37:37.053913] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:09:22.309 [2024-07-26 05:37:37.053989] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 3 00:09:22.309 [2024-07-26 05:37:37.053992] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:23.243 Running I/O for 1 seconds... 00:09:23.243 lcore 0: 175724 00:09:23.243 lcore 1: 175723 00:09:23.243 lcore 2: 175722 00:09:23.243 lcore 3: 175724 00:09:23.243 done. 
00:09:23.502 00:09:23.502 real 0m1.362s 00:09:23.502 user 0m4.204s 00:09:23.502 sys 0m0.151s 00:09:23.502 05:37:38 event.event_perf -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:23.502 05:37:38 event.event_perf -- common/autotest_common.sh@10 -- # set +x 00:09:23.502 ************************************ 00:09:23.502 END TEST event_perf 00:09:23.502 ************************************ 00:09:23.502 05:37:38 event -- common/autotest_common.sh@1142 -- # return 0 00:09:23.502 05:37:38 event -- event/event.sh@46 -- # run_test event_reactor /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/reactor/reactor -t 1 00:09:23.502 05:37:38 event -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:09:23.502 05:37:38 event -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:23.502 05:37:38 event -- common/autotest_common.sh@10 -- # set +x 00:09:23.502 ************************************ 00:09:23.502 START TEST event_reactor 00:09:23.502 ************************************ 00:09:23.502 05:37:38 event.event_reactor -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/reactor/reactor -t 1 00:09:23.502 [2024-07-26 05:37:38.268972] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 
00:09:23.502 [2024-07-26 05:37:38.269038] [ DPDK EAL parameters: reactor --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1095187 ] 00:09:23.502 [2024-07-26 05:37:38.400003] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:23.761 [2024-07-26 05:37:38.501364] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:24.698 test_start 00:09:24.698 oneshot 00:09:24.698 tick 100 00:09:24.698 tick 100 00:09:24.698 tick 250 00:09:24.698 tick 100 00:09:24.698 tick 100 00:09:24.698 tick 250 00:09:24.698 tick 100 00:09:24.698 tick 500 00:09:24.698 tick 100 00:09:24.698 tick 100 00:09:24.698 tick 250 00:09:24.698 tick 100 00:09:24.698 tick 100 00:09:24.698 test_end 00:09:24.698 00:09:24.698 real 0m1.352s 00:09:24.698 user 0m1.212s 00:09:24.698 sys 0m0.133s 00:09:24.698 05:37:39 event.event_reactor -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:24.698 05:37:39 event.event_reactor -- common/autotest_common.sh@10 -- # set +x 00:09:24.698 ************************************ 00:09:24.698 END TEST event_reactor 00:09:24.698 ************************************ 00:09:24.957 05:37:39 event -- common/autotest_common.sh@1142 -- # return 0 00:09:24.957 05:37:39 event -- event/event.sh@47 -- # run_test event_reactor_perf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/reactor_perf/reactor_perf -t 1 00:09:24.957 05:37:39 event -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:09:24.957 05:37:39 event -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:24.957 05:37:39 event -- common/autotest_common.sh@10 -- # set +x 00:09:24.957 ************************************ 00:09:24.957 START TEST event_reactor_perf 00:09:24.957 ************************************ 00:09:24.957 05:37:39 event.event_reactor_perf -- common/autotest_common.sh@1123 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/reactor_perf/reactor_perf -t 1 00:09:24.957 [2024-07-26 05:37:39.705232] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 00:09:24.957 [2024-07-26 05:37:39.705289] [ DPDK EAL parameters: reactor_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1095386 ] 00:09:24.957 [2024-07-26 05:37:39.831885] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:25.216 [2024-07-26 05:37:39.933479] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:26.151 test_start 00:09:26.151 test_end 00:09:26.151 Performance: 325946 events per second 00:09:26.151 00:09:26.151 real 0m1.344s 00:09:26.151 user 0m1.199s 00:09:26.151 sys 0m0.139s 00:09:26.151 05:37:41 event.event_reactor_perf -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:26.151 05:37:41 event.event_reactor_perf -- common/autotest_common.sh@10 -- # set +x 00:09:26.151 ************************************ 00:09:26.151 END TEST event_reactor_perf 00:09:26.151 ************************************ 00:09:26.410 05:37:41 event -- common/autotest_common.sh@1142 -- # return 0 00:09:26.410 05:37:41 event -- event/event.sh@49 -- # uname -s 00:09:26.410 05:37:41 event -- event/event.sh@49 -- # '[' Linux = Linux ']' 00:09:26.410 05:37:41 event -- event/event.sh@50 -- # run_test event_scheduler /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/scheduler/scheduler.sh 00:09:26.410 05:37:41 event -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:09:26.410 05:37:41 event -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:26.410 05:37:41 event -- common/autotest_common.sh@10 -- # set +x 00:09:26.410 ************************************ 00:09:26.410 START TEST event_scheduler 00:09:26.410 ************************************ 
00:09:26.410 05:37:41 event.event_scheduler -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/scheduler/scheduler.sh 00:09:26.410 * Looking for test storage... 00:09:26.410 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/scheduler 00:09:26.410 05:37:41 event.event_scheduler -- scheduler/scheduler.sh@29 -- # rpc=rpc_cmd 00:09:26.410 05:37:41 event.event_scheduler -- scheduler/scheduler.sh@35 -- # scheduler_pid=1095613 00:09:26.410 05:37:41 event.event_scheduler -- scheduler/scheduler.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/scheduler/scheduler -m 0xF -p 0x2 --wait-for-rpc -f 00:09:26.410 05:37:41 event.event_scheduler -- scheduler/scheduler.sh@36 -- # trap 'killprocess $scheduler_pid; exit 1' SIGINT SIGTERM EXIT 00:09:26.410 05:37:41 event.event_scheduler -- scheduler/scheduler.sh@37 -- # waitforlisten 1095613 00:09:26.410 05:37:41 event.event_scheduler -- common/autotest_common.sh@829 -- # '[' -z 1095613 ']' 00:09:26.410 05:37:41 event.event_scheduler -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:09:26.410 05:37:41 event.event_scheduler -- common/autotest_common.sh@834 -- # local max_retries=100 00:09:26.410 05:37:41 event.event_scheduler -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:09:26.410 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:09:26.410 05:37:41 event.event_scheduler -- common/autotest_common.sh@838 -- # xtrace_disable 00:09:26.410 05:37:41 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:09:26.410 [2024-07-26 05:37:41.256129] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 
00:09:26.410 [2024-07-26 05:37:41.256195] [ DPDK EAL parameters: scheduler --no-shconf -c 0xF --main-lcore=2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1095613 ] 00:09:26.669 [2024-07-26 05:37:41.360511] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 4 00:09:26.669 [2024-07-26 05:37:41.452520] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:26.669 [2024-07-26 05:37:41.452542] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:09:26.669 [2024-07-26 05:37:41.452618] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 3 00:09:26.669 [2024-07-26 05:37:41.452620] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:09:27.606 05:37:42 event.event_scheduler -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:09:27.606 05:37:42 event.event_scheduler -- common/autotest_common.sh@862 -- # return 0 00:09:27.606 05:37:42 event.event_scheduler -- scheduler/scheduler.sh@39 -- # rpc_cmd framework_set_scheduler dynamic 00:09:27.606 05:37:42 event.event_scheduler -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:27.606 05:37:42 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:09:27.606 [2024-07-26 05:37:42.211459] dpdk_governor.c: 173:_init: *ERROR*: App core mask contains some but not all of a set of SMT siblings 00:09:27.606 [2024-07-26 05:37:42.211479] scheduler_dynamic.c: 270:init: *NOTICE*: Unable to initialize dpdk governor 00:09:27.606 [2024-07-26 05:37:42.211490] scheduler_dynamic.c: 416:set_opts: *NOTICE*: Setting scheduler load limit to 20 00:09:27.606 [2024-07-26 05:37:42.211498] scheduler_dynamic.c: 418:set_opts: *NOTICE*: Setting scheduler core limit to 80 00:09:27.606 [2024-07-26 05:37:42.211505] scheduler_dynamic.c: 420:set_opts: *NOTICE*: Setting scheduler core busy to 95 00:09:27.606 05:37:42 
event.event_scheduler -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:27.606 05:37:42 event.event_scheduler -- scheduler/scheduler.sh@40 -- # rpc_cmd framework_start_init 00:09:27.606 05:37:42 event.event_scheduler -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:27.606 05:37:42 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:09:27.606 [2024-07-26 05:37:42.293784] scheduler.c: 382:test_start: *NOTICE*: Scheduler test application started. 00:09:27.606 05:37:42 event.event_scheduler -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:27.606 05:37:42 event.event_scheduler -- scheduler/scheduler.sh@43 -- # run_test scheduler_create_thread scheduler_create_thread 00:09:27.606 05:37:42 event.event_scheduler -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:09:27.606 05:37:42 event.event_scheduler -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:27.606 05:37:42 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:09:27.606 ************************************ 00:09:27.606 START TEST scheduler_create_thread 00:09:27.606 ************************************ 00:09:27.606 05:37:42 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@1123 -- # scheduler_create_thread 00:09:27.606 05:37:42 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@12 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x1 -a 100 00:09:27.607 05:37:42 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:27.607 05:37:42 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:09:27.607 2 00:09:27.607 05:37:42 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:27.607 05:37:42 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@13 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n 
active_pinned -m 0x2 -a 100 00:09:27.607 05:37:42 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:27.607 05:37:42 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:09:27.607 3 00:09:27.607 05:37:42 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:27.607 05:37:42 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@14 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x4 -a 100 00:09:27.607 05:37:42 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:27.607 05:37:42 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:09:27.607 4 00:09:27.607 05:37:42 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:27.607 05:37:42 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@15 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x8 -a 100 00:09:27.607 05:37:42 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:27.607 05:37:42 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:09:27.607 5 00:09:27.607 05:37:42 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:27.607 05:37:42 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@16 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x1 -a 0 00:09:27.607 05:37:42 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:27.607 05:37:42 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:09:27.607 6 00:09:27.607 05:37:42 
event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:27.607 05:37:42 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@17 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x2 -a 0 00:09:27.607 05:37:42 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:27.607 05:37:42 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:09:27.607 7 00:09:27.607 05:37:42 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:27.607 05:37:42 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@18 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x4 -a 0 00:09:27.607 05:37:42 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:27.607 05:37:42 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:09:27.607 8 00:09:27.607 05:37:42 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:27.607 05:37:42 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@19 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x8 -a 0 00:09:27.607 05:37:42 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:27.607 05:37:42 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:09:27.607 9 00:09:27.607 05:37:42 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:27.607 05:37:42 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@21 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n one_third_active -a 30 00:09:27.607 05:37:42 
event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:27.607 05:37:42 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:09:27.607 10 00:09:27.607 05:37:42 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:27.607 05:37:42 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@22 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n half_active -a 0 00:09:27.607 05:37:42 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:27.607 05:37:42 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:09:27.607 05:37:42 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:27.607 05:37:42 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@22 -- # thread_id=11 00:09:27.607 05:37:42 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@23 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_set_active 11 50 00:09:27.607 05:37:42 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:27.607 05:37:42 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:09:27.607 05:37:42 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:27.607 05:37:42 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@25 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n deleted -a 100 00:09:27.607 05:37:42 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:27.607 05:37:42 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:09:28.983 05:37:43 event.event_scheduler.scheduler_create_thread -- 
common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:28.983 05:37:43 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@25 -- # thread_id=12 00:09:28.983 05:37:43 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@26 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_delete 12 00:09:29.241 05:37:43 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:29.241 05:37:43 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:09:30.178 05:37:44 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:30.178 00:09:30.178 real 0m2.620s 00:09:30.178 user 0m0.025s 00:09:30.178 sys 0m0.005s 00:09:30.178 05:37:44 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:30.178 05:37:44 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:09:30.178 ************************************ 00:09:30.178 END TEST scheduler_create_thread 00:09:30.178 ************************************ 00:09:30.178 05:37:44 event.event_scheduler -- common/autotest_common.sh@1142 -- # return 0 00:09:30.178 05:37:44 event.event_scheduler -- scheduler/scheduler.sh@45 -- # trap - SIGINT SIGTERM EXIT 00:09:30.178 05:37:44 event.event_scheduler -- scheduler/scheduler.sh@46 -- # killprocess 1095613 00:09:30.178 05:37:44 event.event_scheduler -- common/autotest_common.sh@948 -- # '[' -z 1095613 ']' 00:09:30.178 05:37:44 event.event_scheduler -- common/autotest_common.sh@952 -- # kill -0 1095613 00:09:30.178 05:37:44 event.event_scheduler -- common/autotest_common.sh@953 -- # uname 00:09:30.178 05:37:45 event.event_scheduler -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:09:30.178 05:37:45 event.event_scheduler -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1095613 00:09:30.178 05:37:45 event.event_scheduler -- 
common/autotest_common.sh@954 -- # process_name=reactor_2 00:09:30.178 05:37:45 event.event_scheduler -- common/autotest_common.sh@958 -- # '[' reactor_2 = sudo ']' 00:09:30.178 05:37:45 event.event_scheduler -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1095613' 00:09:30.178 killing process with pid 1095613 00:09:30.178 05:37:45 event.event_scheduler -- common/autotest_common.sh@967 -- # kill 1095613 00:09:30.178 05:37:45 event.event_scheduler -- common/autotest_common.sh@972 -- # wait 1095613 00:09:30.750 [2024-07-26 05:37:45.432293] scheduler.c: 360:test_shutdown: *NOTICE*: Scheduler test application stopped. 00:09:31.009 00:09:31.009 real 0m4.562s 00:09:31.009 user 0m8.742s 00:09:31.009 sys 0m0.495s 00:09:31.009 05:37:45 event.event_scheduler -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:31.009 05:37:45 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:09:31.009 ************************************ 00:09:31.009 END TEST event_scheduler 00:09:31.009 ************************************ 00:09:31.009 05:37:45 event -- common/autotest_common.sh@1142 -- # return 0 00:09:31.009 05:37:45 event -- event/event.sh@51 -- # modprobe -n nbd 00:09:31.009 05:37:45 event -- event/event.sh@52 -- # run_test app_repeat app_repeat_test 00:09:31.009 05:37:45 event -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:09:31.009 05:37:45 event -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:31.009 05:37:45 event -- common/autotest_common.sh@10 -- # set +x 00:09:31.009 ************************************ 00:09:31.009 START TEST app_repeat 00:09:31.009 ************************************ 00:09:31.009 05:37:45 event.app_repeat -- common/autotest_common.sh@1123 -- # app_repeat_test 00:09:31.009 05:37:45 event.app_repeat -- event/event.sh@12 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:09:31.009 05:37:45 event.app_repeat -- event/event.sh@13 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:09:31.009 05:37:45 
event.app_repeat -- event/event.sh@13 -- # local nbd_list 00:09:31.009 05:37:45 event.app_repeat -- event/event.sh@14 -- # bdev_list=('Malloc0' 'Malloc1') 00:09:31.009 05:37:45 event.app_repeat -- event/event.sh@14 -- # local bdev_list 00:09:31.009 05:37:45 event.app_repeat -- event/event.sh@15 -- # local repeat_times=4 00:09:31.009 05:37:45 event.app_repeat -- event/event.sh@17 -- # modprobe nbd 00:09:31.009 05:37:45 event.app_repeat -- event/event.sh@19 -- # repeat_pid=1096191 00:09:31.009 05:37:45 event.app_repeat -- event/event.sh@20 -- # trap 'killprocess $repeat_pid; exit 1' SIGINT SIGTERM EXIT 00:09:31.009 05:37:45 event.app_repeat -- event/event.sh@18 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/app_repeat/app_repeat -r /var/tmp/spdk-nbd.sock -m 0x3 -t 4 00:09:31.009 05:37:45 event.app_repeat -- event/event.sh@21 -- # echo 'Process app_repeat pid: 1096191' 00:09:31.009 Process app_repeat pid: 1096191 00:09:31.009 05:37:45 event.app_repeat -- event/event.sh@23 -- # for i in {0..2} 00:09:31.009 05:37:45 event.app_repeat -- event/event.sh@24 -- # echo 'spdk_app_start Round 0' 00:09:31.009 spdk_app_start Round 0 00:09:31.009 05:37:45 event.app_repeat -- event/event.sh@25 -- # waitforlisten 1096191 /var/tmp/spdk-nbd.sock 00:09:31.009 05:37:45 event.app_repeat -- common/autotest_common.sh@829 -- # '[' -z 1096191 ']' 00:09:31.009 05:37:45 event.app_repeat -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:09:31.009 05:37:45 event.app_repeat -- common/autotest_common.sh@834 -- # local max_retries=100 00:09:31.009 05:37:45 event.app_repeat -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:09:31.009 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 
00:09:31.009 05:37:45 event.app_repeat -- common/autotest_common.sh@838 -- # xtrace_disable 00:09:31.009 05:37:45 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:09:31.009 [2024-07-26 05:37:45.798500] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 00:09:31.009 [2024-07-26 05:37:45.798572] [ DPDK EAL parameters: app_repeat --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1096191 ] 00:09:31.268 [2024-07-26 05:37:45.931948] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:09:31.268 [2024-07-26 05:37:46.036742] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:09:31.268 [2024-07-26 05:37:46.036747] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:32.204 05:37:46 event.app_repeat -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:09:32.204 05:37:46 event.app_repeat -- common/autotest_common.sh@862 -- # return 0 00:09:32.204 05:37:46 event.app_repeat -- event/event.sh@27 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:09:32.204 Malloc0 00:09:32.204 05:37:47 event.app_repeat -- event/event.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:09:32.463 Malloc1 00:09:32.463 05:37:47 event.app_repeat -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:09:32.463 05:37:47 event.app_repeat -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:09:32.463 05:37:47 event.app_repeat -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:09:32.463 05:37:47 event.app_repeat -- bdev/nbd_common.sh@91 -- # local bdev_list 00:09:32.463 05:37:47 event.app_repeat -- bdev/nbd_common.sh@92 
-- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:09:32.463 05:37:47 event.app_repeat -- bdev/nbd_common.sh@92 -- # local nbd_list 00:09:32.463 05:37:47 event.app_repeat -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:09:32.463 05:37:47 event.app_repeat -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:09:32.463 05:37:47 event.app_repeat -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:09:32.463 05:37:47 event.app_repeat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:09:32.463 05:37:47 event.app_repeat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:09:32.463 05:37:47 event.app_repeat -- bdev/nbd_common.sh@11 -- # local nbd_list 00:09:32.463 05:37:47 event.app_repeat -- bdev/nbd_common.sh@12 -- # local i 00:09:32.463 05:37:47 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:09:32.463 05:37:47 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:09:32.463 05:37:47 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:09:32.722 /dev/nbd0 00:09:32.722 05:37:47 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:09:32.722 05:37:47 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:09:32.722 05:37:47 event.app_repeat -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:09:32.722 05:37:47 event.app_repeat -- common/autotest_common.sh@867 -- # local i 00:09:32.722 05:37:47 event.app_repeat -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:32.722 05:37:47 event.app_repeat -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:32.722 05:37:47 event.app_repeat -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:09:32.722 05:37:47 event.app_repeat -- common/autotest_common.sh@871 -- # break 00:09:32.722 05:37:47 event.app_repeat -- 
common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:32.722 05:37:47 event.app_repeat -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:32.722 05:37:47 event.app_repeat -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:09:32.722 1+0 records in 00:09:32.722 1+0 records out 00:09:32.722 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000258397 s, 15.9 MB/s 00:09:32.722 05:37:47 event.app_repeat -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:09:32.722 05:37:47 event.app_repeat -- common/autotest_common.sh@884 -- # size=4096 00:09:32.722 05:37:47 event.app_repeat -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:09:32.722 05:37:47 event.app_repeat -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:32.722 05:37:47 event.app_repeat -- common/autotest_common.sh@887 -- # return 0 00:09:32.722 05:37:47 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:09:32.722 05:37:47 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:09:32.722 05:37:47 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:09:32.980 /dev/nbd1 00:09:32.980 05:37:47 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:09:32.980 05:37:47 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:09:32.980 05:37:47 event.app_repeat -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:09:32.980 05:37:47 event.app_repeat -- common/autotest_common.sh@867 -- # local i 00:09:32.980 05:37:47 event.app_repeat -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:32.980 05:37:47 event.app_repeat -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:32.980 05:37:47 event.app_repeat -- 
common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:09:32.980 05:37:47 event.app_repeat -- common/autotest_common.sh@871 -- # break 00:09:32.980 05:37:47 event.app_repeat -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:32.980 05:37:47 event.app_repeat -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:32.980 05:37:47 event.app_repeat -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:09:32.980 1+0 records in 00:09:32.980 1+0 records out 00:09:32.980 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000275863 s, 14.8 MB/s 00:09:32.980 05:37:47 event.app_repeat -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:09:32.980 05:37:47 event.app_repeat -- common/autotest_common.sh@884 -- # size=4096 00:09:32.980 05:37:47 event.app_repeat -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:09:32.980 05:37:47 event.app_repeat -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:32.980 05:37:47 event.app_repeat -- common/autotest_common.sh@887 -- # return 0 00:09:32.980 05:37:47 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:09:32.980 05:37:47 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:09:32.980 05:37:47 event.app_repeat -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:09:32.980 05:37:47 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:09:32.980 05:37:47 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:09:33.240 05:37:48 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:09:33.240 { 00:09:33.240 "nbd_device": "/dev/nbd0", 00:09:33.240 "bdev_name": "Malloc0" 00:09:33.240 }, 00:09:33.240 { 00:09:33.240 
"nbd_device": "/dev/nbd1", 00:09:33.240 "bdev_name": "Malloc1" 00:09:33.240 } 00:09:33.240 ]' 00:09:33.240 05:37:48 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[ 00:09:33.240 { 00:09:33.240 "nbd_device": "/dev/nbd0", 00:09:33.240 "bdev_name": "Malloc0" 00:09:33.240 }, 00:09:33.240 { 00:09:33.240 "nbd_device": "/dev/nbd1", 00:09:33.240 "bdev_name": "Malloc1" 00:09:33.240 } 00:09:33.240 ]' 00:09:33.240 05:37:48 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:09:33.240 05:37:48 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:09:33.240 /dev/nbd1' 00:09:33.240 05:37:48 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:09:33.240 /dev/nbd1' 00:09:33.240 05:37:48 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:09:33.240 05:37:48 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=2 00:09:33.240 05:37:48 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 2 00:09:33.240 05:37:48 event.app_repeat -- bdev/nbd_common.sh@95 -- # count=2 00:09:33.240 05:37:48 event.app_repeat -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:09:33.240 05:37:48 event.app_repeat -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:09:33.240 05:37:48 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:09:33.240 05:37:48 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:09:33.240 05:37:48 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=write 00:09:33.240 05:37:48 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:09:33.240 05:37:48 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:09:33.240 05:37:48 event.app_repeat -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:09:33.240 256+0 records in 00:09:33.240 256+0 
records out 00:09:33.240 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0115057 s, 91.1 MB/s 00:09:33.240 05:37:48 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:09:33.240 05:37:48 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:09:33.240 256+0 records in 00:09:33.240 256+0 records out 00:09:33.240 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0181209 s, 57.9 MB/s 00:09:33.240 05:37:48 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:09:33.240 05:37:48 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:09:33.240 256+0 records in 00:09:33.240 256+0 records out 00:09:33.240 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0290159 s, 36.1 MB/s 00:09:33.240 05:37:48 event.app_repeat -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:09:33.240 05:37:48 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:09:33.240 05:37:48 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:09:33.240 05:37:48 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=verify 00:09:33.240 05:37:48 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:09:33.240 05:37:48 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:09:33.240 05:37:48 event.app_repeat -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:09:33.240 05:37:48 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:09:33.240 05:37:48 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:09:33.240 05:37:48 event.app_repeat -- bdev/nbd_common.sh@82 -- # 
for i in "${nbd_list[@]}" 00:09:33.240 05:37:48 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:09:33.498 05:37:48 event.app_repeat -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:09:33.498 05:37:48 event.app_repeat -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:09:33.498 05:37:48 event.app_repeat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:09:33.498 05:37:48 event.app_repeat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:09:33.498 05:37:48 event.app_repeat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:09:33.498 05:37:48 event.app_repeat -- bdev/nbd_common.sh@51 -- # local i 00:09:33.498 05:37:48 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:33.498 05:37:48 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:09:33.756 05:37:48 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:09:33.756 05:37:48 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:09:33.756 05:37:48 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:09:33.756 05:37:48 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:33.756 05:37:48 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:33.756 05:37:48 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:09:33.756 05:37:48 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:09:33.756 05:37:48 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:09:33.756 05:37:48 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:33.756 05:37:48 event.app_repeat -- bdev/nbd_common.sh@54 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:09:34.015 05:37:48 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:09:34.015 05:37:48 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:09:34.015 05:37:48 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:09:34.015 05:37:48 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:34.015 05:37:48 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:34.015 05:37:48 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:09:34.015 05:37:48 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:09:34.015 05:37:48 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:09:34.015 05:37:48 event.app_repeat -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:09:34.015 05:37:48 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:09:34.015 05:37:48 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:09:34.015 05:37:48 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:09:34.015 05:37:48 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:09:34.015 05:37:48 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:09:34.015 05:37:48 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:09:34.015 05:37:48 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '' 00:09:34.015 05:37:48 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:09:34.276 05:37:48 event.app_repeat -- bdev/nbd_common.sh@65 -- # true 00:09:34.276 05:37:48 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=0 00:09:34.276 05:37:48 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 0 00:09:34.276 05:37:48 event.app_repeat -- bdev/nbd_common.sh@104 -- # 
count=0 00:09:34.276 05:37:48 event.app_repeat -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:09:34.276 05:37:48 event.app_repeat -- bdev/nbd_common.sh@109 -- # return 0 00:09:34.276 05:37:48 event.app_repeat -- event/event.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:09:34.600 05:37:49 event.app_repeat -- event/event.sh@35 -- # sleep 3 00:09:34.600 [2024-07-26 05:37:49.430888] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:09:34.868 [2024-07-26 05:37:49.531028] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:09:34.868 [2024-07-26 05:37:49.531034] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:34.868 [2024-07-26 05:37:49.583285] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:09:34.868 [2024-07-26 05:37:49.583340] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:09:37.400 05:37:52 event.app_repeat -- event/event.sh@23 -- # for i in {0..2} 00:09:37.400 05:37:52 event.app_repeat -- event/event.sh@24 -- # echo 'spdk_app_start Round 1' 00:09:37.400 spdk_app_start Round 1 00:09:37.400 05:37:52 event.app_repeat -- event/event.sh@25 -- # waitforlisten 1096191 /var/tmp/spdk-nbd.sock 00:09:37.400 05:37:52 event.app_repeat -- common/autotest_common.sh@829 -- # '[' -z 1096191 ']' 00:09:37.400 05:37:52 event.app_repeat -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:09:37.400 05:37:52 event.app_repeat -- common/autotest_common.sh@834 -- # local max_retries=100 00:09:37.400 05:37:52 event.app_repeat -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:09:37.400 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 
00:09:37.400 05:37:52 event.app_repeat -- common/autotest_common.sh@838 -- # xtrace_disable 00:09:37.400 05:37:52 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:09:37.658 05:37:52 event.app_repeat -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:09:37.658 05:37:52 event.app_repeat -- common/autotest_common.sh@862 -- # return 0 00:09:37.658 05:37:52 event.app_repeat -- event/event.sh@27 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:09:37.917 Malloc0 00:09:37.917 05:37:52 event.app_repeat -- event/event.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:09:38.176 Malloc1 00:09:38.176 05:37:52 event.app_repeat -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:09:38.176 05:37:52 event.app_repeat -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:09:38.176 05:37:52 event.app_repeat -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:09:38.176 05:37:52 event.app_repeat -- bdev/nbd_common.sh@91 -- # local bdev_list 00:09:38.176 05:37:52 event.app_repeat -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:09:38.176 05:37:52 event.app_repeat -- bdev/nbd_common.sh@92 -- # local nbd_list 00:09:38.176 05:37:52 event.app_repeat -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:09:38.176 05:37:52 event.app_repeat -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:09:38.176 05:37:52 event.app_repeat -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:09:38.176 05:37:52 event.app_repeat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:09:38.176 05:37:52 event.app_repeat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:09:38.176 05:37:52 event.app_repeat -- bdev/nbd_common.sh@11 -- 
# local nbd_list 00:09:38.176 05:37:52 event.app_repeat -- bdev/nbd_common.sh@12 -- # local i 00:09:38.176 05:37:52 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:09:38.176 05:37:52 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:09:38.176 05:37:52 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:09:38.434 /dev/nbd0 00:09:38.434 05:37:53 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:09:38.434 05:37:53 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:09:38.434 05:37:53 event.app_repeat -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:09:38.434 05:37:53 event.app_repeat -- common/autotest_common.sh@867 -- # local i 00:09:38.434 05:37:53 event.app_repeat -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:38.434 05:37:53 event.app_repeat -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:38.434 05:37:53 event.app_repeat -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:09:38.434 05:37:53 event.app_repeat -- common/autotest_common.sh@871 -- # break 00:09:38.434 05:37:53 event.app_repeat -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:38.434 05:37:53 event.app_repeat -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:38.434 05:37:53 event.app_repeat -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:09:38.434 1+0 records in 00:09:38.434 1+0 records out 00:09:38.434 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000234358 s, 17.5 MB/s 00:09:38.434 05:37:53 event.app_repeat -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:09:38.434 05:37:53 event.app_repeat -- common/autotest_common.sh@884 -- # size=4096 00:09:38.434 05:37:53 event.app_repeat -- 
common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:09:38.434 05:37:53 event.app_repeat -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:38.434 05:37:53 event.app_repeat -- common/autotest_common.sh@887 -- # return 0 00:09:38.434 05:37:53 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:09:38.434 05:37:53 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:09:38.434 05:37:53 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:09:38.692 /dev/nbd1 00:09:38.692 05:37:53 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:09:38.692 05:37:53 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:09:38.692 05:37:53 event.app_repeat -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:09:38.692 05:37:53 event.app_repeat -- common/autotest_common.sh@867 -- # local i 00:09:38.692 05:37:53 event.app_repeat -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:38.692 05:37:53 event.app_repeat -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:38.692 05:37:53 event.app_repeat -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:09:38.692 05:37:53 event.app_repeat -- common/autotest_common.sh@871 -- # break 00:09:38.692 05:37:53 event.app_repeat -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:38.692 05:37:53 event.app_repeat -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:38.692 05:37:53 event.app_repeat -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:09:38.692 1+0 records in 00:09:38.692 1+0 records out 00:09:38.692 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000269122 s, 15.2 MB/s 00:09:38.692 05:37:53 event.app_repeat -- common/autotest_common.sh@884 -- # stat -c %s 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:09:38.692 05:37:53 event.app_repeat -- common/autotest_common.sh@884 -- # size=4096 00:09:38.692 05:37:53 event.app_repeat -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:09:38.692 05:37:53 event.app_repeat -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:38.692 05:37:53 event.app_repeat -- common/autotest_common.sh@887 -- # return 0 00:09:38.692 05:37:53 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:09:38.692 05:37:53 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:09:38.692 05:37:53 event.app_repeat -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:09:38.692 05:37:53 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:09:38.692 05:37:53 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:09:38.950 05:37:53 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:09:38.950 { 00:09:38.950 "nbd_device": "/dev/nbd0", 00:09:38.950 "bdev_name": "Malloc0" 00:09:38.950 }, 00:09:38.950 { 00:09:38.950 "nbd_device": "/dev/nbd1", 00:09:38.950 "bdev_name": "Malloc1" 00:09:38.950 } 00:09:38.950 ]' 00:09:38.950 05:37:53 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[ 00:09:38.950 { 00:09:38.950 "nbd_device": "/dev/nbd0", 00:09:38.950 "bdev_name": "Malloc0" 00:09:38.950 }, 00:09:38.950 { 00:09:38.950 "nbd_device": "/dev/nbd1", 00:09:38.950 "bdev_name": "Malloc1" 00:09:38.950 } 00:09:38.950 ]' 00:09:38.950 05:37:53 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:09:38.950 05:37:53 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:09:38.950 /dev/nbd1' 00:09:38.950 05:37:53 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:09:38.950 /dev/nbd1' 00:09:38.950 
05:37:53 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:09:38.950 05:37:53 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=2 00:09:38.950 05:37:53 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 2 00:09:38.950 05:37:53 event.app_repeat -- bdev/nbd_common.sh@95 -- # count=2 00:09:38.950 05:37:53 event.app_repeat -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:09:38.950 05:37:53 event.app_repeat -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:09:38.950 05:37:53 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:09:38.950 05:37:53 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:09:38.950 05:37:53 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=write 00:09:38.950 05:37:53 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:09:38.950 05:37:53 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:09:38.950 05:37:53 event.app_repeat -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:09:38.950 256+0 records in 00:09:38.950 256+0 records out 00:09:38.950 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.011412 s, 91.9 MB/s 00:09:38.950 05:37:53 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:09:38.950 05:37:53 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:09:38.950 256+0 records in 00:09:38.950 256+0 records out 00:09:38.950 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0183849 s, 57.0 MB/s 00:09:38.950 05:37:53 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:09:38.950 05:37:53 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd 
if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:09:38.950 256+0 records in 00:09:38.950 256+0 records out 00:09:38.950 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0202598 s, 51.8 MB/s 00:09:38.950 05:37:53 event.app_repeat -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:09:38.950 05:37:53 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:09:38.950 05:37:53 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:09:38.950 05:37:53 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=verify 00:09:38.950 05:37:53 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:09:38.950 05:37:53 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:09:38.950 05:37:53 event.app_repeat -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:09:38.950 05:37:53 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:09:38.950 05:37:53 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:09:38.950 05:37:53 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:09:38.950 05:37:53 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:09:38.950 05:37:53 event.app_repeat -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:09:38.950 05:37:53 event.app_repeat -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:09:38.950 05:37:53 event.app_repeat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:09:38.950 05:37:53 event.app_repeat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 
00:09:38.950 05:37:53 event.app_repeat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:09:38.950 05:37:53 event.app_repeat -- bdev/nbd_common.sh@51 -- # local i 00:09:38.950 05:37:53 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:38.951 05:37:53 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:09:39.210 05:37:53 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:09:39.210 05:37:53 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:09:39.210 05:37:53 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:09:39.210 05:37:53 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:39.210 05:37:53 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:39.210 05:37:53 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:09:39.210 05:37:53 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:09:39.210 05:37:53 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:09:39.210 05:37:53 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:39.210 05:37:53 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:09:39.468 05:37:54 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:09:39.468 05:37:54 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:09:39.468 05:37:54 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:09:39.468 05:37:54 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:39.468 05:37:54 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:39.469 05:37:54 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:09:39.469 05:37:54 event.app_repeat -- bdev/nbd_common.sh@41 
-- # break 00:09:39.469 05:37:54 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:09:39.469 05:37:54 event.app_repeat -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:09:39.469 05:37:54 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:09:39.469 05:37:54 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:09:39.728 05:37:54 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:09:39.728 05:37:54 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:09:39.728 05:37:54 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:09:39.728 05:37:54 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:09:39.728 05:37:54 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '' 00:09:39.728 05:37:54 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:09:39.728 05:37:54 event.app_repeat -- bdev/nbd_common.sh@65 -- # true 00:09:39.728 05:37:54 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=0 00:09:39.728 05:37:54 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 0 00:09:39.728 05:37:54 event.app_repeat -- bdev/nbd_common.sh@104 -- # count=0 00:09:39.728 05:37:54 event.app_repeat -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:09:39.728 05:37:54 event.app_repeat -- bdev/nbd_common.sh@109 -- # return 0 00:09:39.728 05:37:54 event.app_repeat -- event/event.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:09:39.987 05:37:54 event.app_repeat -- event/event.sh@35 -- # sleep 3 00:09:40.246 [2024-07-26 05:37:55.032221] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:09:40.246 [2024-07-26 05:37:55.127405] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:09:40.246 [2024-07-26 05:37:55.127409] reactor.c: 941:reactor_run: 
*NOTICE*: Reactor started on core 0 00:09:40.505 [2024-07-26 05:37:55.176219] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:09:40.505 [2024-07-26 05:37:55.176277] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:09:43.040 05:37:57 event.app_repeat -- event/event.sh@23 -- # for i in {0..2} 00:09:43.040 05:37:57 event.app_repeat -- event/event.sh@24 -- # echo 'spdk_app_start Round 2' 00:09:43.040 spdk_app_start Round 2 00:09:43.040 05:37:57 event.app_repeat -- event/event.sh@25 -- # waitforlisten 1096191 /var/tmp/spdk-nbd.sock 00:09:43.040 05:37:57 event.app_repeat -- common/autotest_common.sh@829 -- # '[' -z 1096191 ']' 00:09:43.040 05:37:57 event.app_repeat -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:09:43.040 05:37:57 event.app_repeat -- common/autotest_common.sh@834 -- # local max_retries=100 00:09:43.040 05:37:57 event.app_repeat -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:09:43.040 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 
00:09:43.040 05:37:57 event.app_repeat -- common/autotest_common.sh@838 -- # xtrace_disable 00:09:43.040 05:37:57 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:09:43.299 05:37:58 event.app_repeat -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:09:43.299 05:37:58 event.app_repeat -- common/autotest_common.sh@862 -- # return 0 00:09:43.299 05:37:58 event.app_repeat -- event/event.sh@27 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:09:43.558 Malloc0 00:09:43.558 05:37:58 event.app_repeat -- event/event.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:09:43.817 Malloc1 00:09:43.817 05:37:58 event.app_repeat -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:09:43.817 05:37:58 event.app_repeat -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:09:43.817 05:37:58 event.app_repeat -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:09:43.817 05:37:58 event.app_repeat -- bdev/nbd_common.sh@91 -- # local bdev_list 00:09:43.817 05:37:58 event.app_repeat -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:09:43.817 05:37:58 event.app_repeat -- bdev/nbd_common.sh@92 -- # local nbd_list 00:09:43.817 05:37:58 event.app_repeat -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:09:43.817 05:37:58 event.app_repeat -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:09:43.817 05:37:58 event.app_repeat -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:09:43.817 05:37:58 event.app_repeat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:09:43.817 05:37:58 event.app_repeat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:09:43.817 05:37:58 event.app_repeat -- bdev/nbd_common.sh@11 -- 
# local nbd_list 00:09:43.817 05:37:58 event.app_repeat -- bdev/nbd_common.sh@12 -- # local i 00:09:43.817 05:37:58 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:09:43.817 05:37:58 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:09:43.817 05:37:58 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:09:44.078 /dev/nbd0 00:09:44.078 05:37:58 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:09:44.078 05:37:58 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:09:44.078 05:37:58 event.app_repeat -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:09:44.078 05:37:58 event.app_repeat -- common/autotest_common.sh@867 -- # local i 00:09:44.078 05:37:58 event.app_repeat -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:44.078 05:37:58 event.app_repeat -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:44.078 05:37:58 event.app_repeat -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:09:44.078 05:37:58 event.app_repeat -- common/autotest_common.sh@871 -- # break 00:09:44.078 05:37:58 event.app_repeat -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:44.078 05:37:58 event.app_repeat -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:44.078 05:37:58 event.app_repeat -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:09:44.078 1+0 records in 00:09:44.078 1+0 records out 00:09:44.078 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000244434 s, 16.8 MB/s 00:09:44.078 05:37:58 event.app_repeat -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:09:44.078 05:37:58 event.app_repeat -- common/autotest_common.sh@884 -- # size=4096 00:09:44.078 05:37:58 event.app_repeat -- 
common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:09:44.078 05:37:58 event.app_repeat -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:44.078 05:37:58 event.app_repeat -- common/autotest_common.sh@887 -- # return 0 00:09:44.078 05:37:58 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:09:44.078 05:37:58 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:09:44.078 05:37:58 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:09:44.338 /dev/nbd1 00:09:44.338 05:37:59 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:09:44.338 05:37:59 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:09:44.338 05:37:59 event.app_repeat -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:09:44.338 05:37:59 event.app_repeat -- common/autotest_common.sh@867 -- # local i 00:09:44.338 05:37:59 event.app_repeat -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:44.338 05:37:59 event.app_repeat -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:44.338 05:37:59 event.app_repeat -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:09:44.338 05:37:59 event.app_repeat -- common/autotest_common.sh@871 -- # break 00:09:44.338 05:37:59 event.app_repeat -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:44.338 05:37:59 event.app_repeat -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:44.338 05:37:59 event.app_repeat -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:09:44.338 1+0 records in 00:09:44.338 1+0 records out 00:09:44.338 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000244183 s, 16.8 MB/s 00:09:44.338 05:37:59 event.app_repeat -- common/autotest_common.sh@884 -- # stat -c %s 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:09:44.338 05:37:59 event.app_repeat -- common/autotest_common.sh@884 -- # size=4096 00:09:44.338 05:37:59 event.app_repeat -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:09:44.338 05:37:59 event.app_repeat -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:44.338 05:37:59 event.app_repeat -- common/autotest_common.sh@887 -- # return 0 00:09:44.338 05:37:59 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:09:44.338 05:37:59 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:09:44.338 05:37:59 event.app_repeat -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:09:44.338 05:37:59 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:09:44.338 05:37:59 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:09:44.597 05:37:59 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:09:44.597 { 00:09:44.597 "nbd_device": "/dev/nbd0", 00:09:44.597 "bdev_name": "Malloc0" 00:09:44.597 }, 00:09:44.597 { 00:09:44.597 "nbd_device": "/dev/nbd1", 00:09:44.597 "bdev_name": "Malloc1" 00:09:44.597 } 00:09:44.597 ]' 00:09:44.597 05:37:59 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[ 00:09:44.597 { 00:09:44.597 "nbd_device": "/dev/nbd0", 00:09:44.597 "bdev_name": "Malloc0" 00:09:44.597 }, 00:09:44.597 { 00:09:44.597 "nbd_device": "/dev/nbd1", 00:09:44.597 "bdev_name": "Malloc1" 00:09:44.597 } 00:09:44.597 ]' 00:09:44.597 05:37:59 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:09:44.597 05:37:59 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:09:44.597 /dev/nbd1' 00:09:44.597 05:37:59 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:09:44.597 05:37:59 event.app_repeat -- 
bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:09:44.597 /dev/nbd1' 00:09:44.597 05:37:59 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=2 00:09:44.597 05:37:59 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 2 00:09:44.597 05:37:59 event.app_repeat -- bdev/nbd_common.sh@95 -- # count=2 00:09:44.597 05:37:59 event.app_repeat -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:09:44.597 05:37:59 event.app_repeat -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:09:44.597 05:37:59 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:09:44.597 05:37:59 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:09:44.597 05:37:59 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=write 00:09:44.597 05:37:59 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:09:44.597 05:37:59 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:09:44.597 05:37:59 event.app_repeat -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:09:44.597 256+0 records in 00:09:44.597 256+0 records out 00:09:44.597 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0114658 s, 91.5 MB/s 00:09:44.597 05:37:59 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:09:44.597 05:37:59 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:09:44.597 256+0 records in 00:09:44.597 256+0 records out 00:09:44.597 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0220501 s, 47.6 MB/s 00:09:44.597 05:37:59 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:09:44.597 05:37:59 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd 
if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:09:44.597 256+0 records in 00:09:44.597 256+0 records out 00:09:44.597 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0208195 s, 50.4 MB/s 00:09:44.597 05:37:59 event.app_repeat -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:09:44.597 05:37:59 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:09:44.597 05:37:59 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:09:44.597 05:37:59 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=verify 00:09:44.597 05:37:59 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:09:44.597 05:37:59 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:09:44.597 05:37:59 event.app_repeat -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:09:44.597 05:37:59 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:09:44.597 05:37:59 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:09:44.597 05:37:59 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:09:44.597 05:37:59 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:09:44.597 05:37:59 event.app_repeat -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:09:44.597 05:37:59 event.app_repeat -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:09:44.597 05:37:59 event.app_repeat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:09:44.597 05:37:59 event.app_repeat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 
00:09:44.597 05:37:59 event.app_repeat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:09:44.597 05:37:59 event.app_repeat -- bdev/nbd_common.sh@51 -- # local i 00:09:44.597 05:37:59 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:44.597 05:37:59 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:09:44.857 05:37:59 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:09:44.857 05:37:59 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:09:44.857 05:37:59 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:09:44.857 05:37:59 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:44.857 05:37:59 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:44.857 05:37:59 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:09:44.857 05:37:59 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:09:44.857 05:37:59 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:09:44.857 05:37:59 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:44.857 05:37:59 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:09:45.116 05:37:59 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:09:45.116 05:37:59 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:09:45.116 05:37:59 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:09:45.116 05:37:59 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:45.116 05:37:59 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:45.116 05:37:59 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:09:45.116 05:37:59 event.app_repeat -- bdev/nbd_common.sh@41 
-- # break 00:09:45.116 05:37:59 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:09:45.116 05:37:59 event.app_repeat -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:09:45.116 05:37:59 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:09:45.116 05:37:59 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:09:45.374 05:38:00 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:09:45.374 05:38:00 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:09:45.374 05:38:00 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:09:45.374 05:38:00 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:09:45.374 05:38:00 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:09:45.374 05:38:00 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '' 00:09:45.374 05:38:00 event.app_repeat -- bdev/nbd_common.sh@65 -- # true 00:09:45.374 05:38:00 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=0 00:09:45.374 05:38:00 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 0 00:09:45.374 05:38:00 event.app_repeat -- bdev/nbd_common.sh@104 -- # count=0 00:09:45.374 05:38:00 event.app_repeat -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:09:45.374 05:38:00 event.app_repeat -- bdev/nbd_common.sh@109 -- # return 0 00:09:45.374 05:38:00 event.app_repeat -- event/event.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:09:45.633 05:38:00 event.app_repeat -- event/event.sh@35 -- # sleep 3 00:09:45.892 [2024-07-26 05:38:00.705899] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:09:46.151 [2024-07-26 05:38:00.808254] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:09:46.151 [2024-07-26 05:38:00.808258] reactor.c: 941:reactor_run: 
*NOTICE*: Reactor started on core 0 00:09:46.151 [2024-07-26 05:38:00.860918] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:09:46.151 [2024-07-26 05:38:00.860971] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:09:48.685 05:38:03 event.app_repeat -- event/event.sh@38 -- # waitforlisten 1096191 /var/tmp/spdk-nbd.sock 00:09:48.685 05:38:03 event.app_repeat -- common/autotest_common.sh@829 -- # '[' -z 1096191 ']' 00:09:48.685 05:38:03 event.app_repeat -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:09:48.685 05:38:03 event.app_repeat -- common/autotest_common.sh@834 -- # local max_retries=100 00:09:48.685 05:38:03 event.app_repeat -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:09:48.685 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 
00:09:48.685 05:38:03 event.app_repeat -- common/autotest_common.sh@838 -- # xtrace_disable 00:09:48.685 05:38:03 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:09:48.944 05:38:03 event.app_repeat -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:09:48.944 05:38:03 event.app_repeat -- common/autotest_common.sh@862 -- # return 0 00:09:48.944 05:38:03 event.app_repeat -- event/event.sh@39 -- # killprocess 1096191 00:09:48.944 05:38:03 event.app_repeat -- common/autotest_common.sh@948 -- # '[' -z 1096191 ']' 00:09:48.944 05:38:03 event.app_repeat -- common/autotest_common.sh@952 -- # kill -0 1096191 00:09:48.944 05:38:03 event.app_repeat -- common/autotest_common.sh@953 -- # uname 00:09:48.944 05:38:03 event.app_repeat -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:09:48.944 05:38:03 event.app_repeat -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1096191 00:09:48.944 05:38:03 event.app_repeat -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:09:48.944 05:38:03 event.app_repeat -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:09:48.944 05:38:03 event.app_repeat -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1096191' 00:09:48.944 killing process with pid 1096191 00:09:48.944 05:38:03 event.app_repeat -- common/autotest_common.sh@967 -- # kill 1096191 00:09:48.944 05:38:03 event.app_repeat -- common/autotest_common.sh@972 -- # wait 1096191 00:09:49.203 spdk_app_start is called in Round 0. 00:09:49.203 Shutdown signal received, stop current app iteration 00:09:49.203 Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 reinitialization... 00:09:49.203 spdk_app_start is called in Round 1. 00:09:49.203 Shutdown signal received, stop current app iteration 00:09:49.203 Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 reinitialization... 00:09:49.203 spdk_app_start is called in Round 2. 
00:09:49.203 Shutdown signal received, stop current app iteration 00:09:49.203 Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 reinitialization... 00:09:49.203 spdk_app_start is called in Round 3. 00:09:49.203 Shutdown signal received, stop current app iteration 00:09:49.203 05:38:03 event.app_repeat -- event/event.sh@40 -- # trap - SIGINT SIGTERM EXIT 00:09:49.203 05:38:03 event.app_repeat -- event/event.sh@42 -- # return 0 00:09:49.203 00:09:49.203 real 0m18.206s 00:09:49.203 user 0m39.267s 00:09:49.203 sys 0m3.675s 00:09:49.203 05:38:03 event.app_repeat -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:49.203 05:38:03 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:09:49.203 ************************************ 00:09:49.203 END TEST app_repeat 00:09:49.203 ************************************ 00:09:49.203 05:38:04 event -- common/autotest_common.sh@1142 -- # return 0 00:09:49.203 05:38:04 event -- event/event.sh@54 -- # (( SPDK_TEST_CRYPTO == 0 )) 00:09:49.203 00:09:49.203 real 0m27.360s 00:09:49.203 user 0m54.820s 00:09:49.203 sys 0m4.969s 00:09:49.203 05:38:04 event -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:49.203 05:38:04 event -- common/autotest_common.sh@10 -- # set +x 00:09:49.203 ************************************ 00:09:49.203 END TEST event 00:09:49.203 ************************************ 00:09:49.203 05:38:04 -- common/autotest_common.sh@1142 -- # return 0 00:09:49.203 05:38:04 -- spdk/autotest.sh@182 -- # run_test thread /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/thread.sh 00:09:49.203 05:38:04 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:09:49.203 05:38:04 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:49.203 05:38:04 -- common/autotest_common.sh@10 -- # set +x 00:09:49.203 ************************************ 00:09:49.203 START TEST thread 00:09:49.203 ************************************ 00:09:49.203 05:38:04 thread -- common/autotest_common.sh@1123 
-- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/thread.sh 00:09:49.462 * Looking for test storage... 00:09:49.462 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread 00:09:49.462 05:38:04 thread -- thread/thread.sh@11 -- # run_test thread_poller_perf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 1 -t 1 00:09:49.462 05:38:04 thread -- common/autotest_common.sh@1099 -- # '[' 8 -le 1 ']' 00:09:49.462 05:38:04 thread -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:49.462 05:38:04 thread -- common/autotest_common.sh@10 -- # set +x 00:09:49.462 ************************************ 00:09:49.462 START TEST thread_poller_perf 00:09:49.462 ************************************ 00:09:49.462 05:38:04 thread.thread_poller_perf -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 1 -t 1 00:09:49.462 [2024-07-26 05:38:04.241737] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 00:09:49.462 [2024-07-26 05:38:04.241804] [ DPDK EAL parameters: poller_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1098886 ] 00:09:49.462 [2024-07-26 05:38:04.355988] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:49.721 [2024-07-26 05:38:04.458151] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:49.721 Running 1000 pollers for 1 seconds with 1 microseconds period. 
00:09:50.658 ======================================
00:09:50.658 busy:2308548756 (cyc)
00:09:50.658 total_run_count: 266000
00:09:50.658 tsc_hz: 2300000000 (cyc)
00:09:50.658 ======================================
00:09:50.658 poller_cost: 8678 (cyc), 3773 (nsec)
00:09:50.658 
00:09:50.658 real 0m1.347s
00:09:50.658 user 0m1.200s
00:09:50.658 sys 0m0.139s
00:09:50.658 05:38:05 thread.thread_poller_perf -- common/autotest_common.sh@1124 -- # xtrace_disable
00:09:50.658 05:38:05 thread.thread_poller_perf -- common/autotest_common.sh@10 -- # set +x
00:09:50.658 ************************************
00:09:50.658 END TEST thread_poller_perf
00:09:50.658 ************************************
00:09:50.917 05:38:05 thread -- common/autotest_common.sh@1142 -- # return 0
00:09:50.917 05:38:05 thread -- thread/thread.sh@12 -- # run_test thread_poller_perf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 0 -t 1
00:09:50.917 05:38:05 thread -- common/autotest_common.sh@1099 -- # '[' 8 -le 1 ']'
00:09:50.917 05:38:05 thread -- common/autotest_common.sh@1105 -- # xtrace_disable
00:09:50.917 05:38:05 thread -- common/autotest_common.sh@10 -- # set +x
00:09:50.917 ************************************
00:09:50.917 START TEST thread_poller_perf
00:09:50.917 ************************************
00:09:50.917 05:38:05 thread.thread_poller_perf -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 0 -t 1
00:09:50.917 [2024-07-26 05:38:05.677780] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization...
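The poller_cost figure in the summary above follows directly from the counters the tool prints: busy cycles divided by total_run_count gives the per-run cost in cycles, which converts to nanoseconds at the reported TSC rate. A minimal sketch (plain Python, not SPDK code; the function name is mine, and the constants are copied from this run's output):

```python
def poller_cost(busy_cyc: int, total_run_count: int, tsc_hz: int) -> tuple:
    """Return (cost in cycles, cost in nanoseconds) per poller run."""
    cyc = busy_cyc // total_run_count         # cycles spent per poller invocation
    nsec = cyc * 1_000_000_000 // tsc_hz      # convert cycles to ns at the TSC rate
    return cyc, nsec

# Counters from the first poller_perf run above
cyc, nsec = poller_cost(busy_cyc=2308548756, total_run_count=266000, tsc_hz=2300000000)
print(cyc, nsec)  # 8678 3773, matching "poller_cost: 8678 (cyc), 3773 (nsec)"
```

The same arithmetic reproduces the second run's 659 (cyc) / 286 (nsec) from its counters.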
00:09:50.917 [2024-07-26 05:38:05.677861] [ DPDK EAL parameters: poller_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1099088 ]
00:09:50.917 [2024-07-26 05:38:05.792695] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:09:51.175 [2024-07-26 05:38:05.897461] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:09:51.175 Running 1000 pollers for 1 seconds with 0 microseconds period.
00:09:52.149 ======================================
00:09:52.149 busy:2302308432 (cyc)
00:09:52.149 total_run_count: 3491000
00:09:52.149 tsc_hz: 2300000000 (cyc)
00:09:52.149 ======================================
00:09:52.149 poller_cost: 659 (cyc), 286 (nsec)
00:09:52.149 
00:09:52.149 real 0m1.346s
00:09:52.149 user 0m1.204s
00:09:52.149 sys 0m0.136s
00:09:52.149 05:38:06 thread.thread_poller_perf -- common/autotest_common.sh@1124 -- # xtrace_disable
00:09:52.149 05:38:06 thread.thread_poller_perf -- common/autotest_common.sh@10 -- # set +x
00:09:52.149 ************************************
00:09:52.149 END TEST thread_poller_perf
00:09:52.149 ************************************
00:09:52.149 05:38:07 thread -- common/autotest_common.sh@1142 -- # return 0
00:09:52.149 05:38:07 thread -- thread/thread.sh@17 -- # [[ y != \y ]]
00:09:52.149 
00:09:52.149 real 0m2.967s
00:09:52.149 user 0m2.504s
00:09:52.149 sys 0m0.472s
00:09:52.149 05:38:07 thread -- common/autotest_common.sh@1124 -- # xtrace_disable
00:09:52.149 05:38:07 thread -- common/autotest_common.sh@10 -- # set +x
00:09:52.149 ************************************
00:09:52.149 END TEST thread
00:09:52.149 ************************************
00:09:52.408 05:38:07 -- common/autotest_common.sh@1142 -- # return 0
00:09:52.408 05:38:07 -- spdk/autotest.sh@183 -- # run_test accel
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/accel.sh 00:09:52.408 05:38:07 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:09:52.408 05:38:07 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:52.408 05:38:07 -- common/autotest_common.sh@10 -- # set +x 00:09:52.408 ************************************ 00:09:52.408 START TEST accel 00:09:52.408 ************************************ 00:09:52.408 05:38:07 accel -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/accel.sh 00:09:52.408 * Looking for test storage... 00:09:52.408 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel 00:09:52.408 05:38:07 accel -- accel/accel.sh@81 -- # declare -A expected_opcs 00:09:52.408 05:38:07 accel -- accel/accel.sh@82 -- # get_expected_opcs 00:09:52.408 05:38:07 accel -- accel/accel.sh@60 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:09:52.408 05:38:07 accel -- accel/accel.sh@62 -- # spdk_tgt_pid=1099323 00:09:52.408 05:38:07 accel -- accel/accel.sh@63 -- # waitforlisten 1099323 00:09:52.408 05:38:07 accel -- common/autotest_common.sh@829 -- # '[' -z 1099323 ']' 00:09:52.408 05:38:07 accel -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:09:52.408 05:38:07 accel -- common/autotest_common.sh@834 -- # local max_retries=100 00:09:52.408 05:38:07 accel -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:09:52.408 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
00:09:52.408 05:38:07 accel -- common/autotest_common.sh@838 -- # xtrace_disable 00:09:52.408 05:38:07 accel -- common/autotest_common.sh@10 -- # set +x 00:09:52.408 05:38:07 accel -- accel/accel.sh@61 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -c /dev/fd/63 00:09:52.408 05:38:07 accel -- accel/accel.sh@61 -- # build_accel_config 00:09:52.408 05:38:07 accel -- accel/accel.sh@31 -- # accel_json_cfg=() 00:09:52.408 05:38:07 accel -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:09:52.408 05:38:07 accel -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:09:52.408 05:38:07 accel -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:09:52.408 05:38:07 accel -- accel/accel.sh@36 -- # [[ -n '' ]] 00:09:52.408 05:38:07 accel -- accel/accel.sh@40 -- # local IFS=, 00:09:52.408 05:38:07 accel -- accel/accel.sh@41 -- # jq -r . 00:09:52.666 [2024-07-26 05:38:07.361655] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 00:09:52.666 [2024-07-26 05:38:07.361792] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1099323 ] 00:09:52.666 [2024-07-26 05:38:07.556904] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:52.925 [2024-07-26 05:38:07.659392] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:53.491 05:38:08 accel -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:09:53.491 05:38:08 accel -- common/autotest_common.sh@862 -- # return 0 00:09:53.491 05:38:08 accel -- accel/accel.sh@65 -- # [[ 0 -gt 0 ]] 00:09:53.491 05:38:08 accel -- accel/accel.sh@66 -- # [[ 0 -gt 0 ]] 00:09:53.491 05:38:08 accel -- accel/accel.sh@67 -- # [[ 0 -gt 0 ]] 00:09:53.491 05:38:08 accel -- accel/accel.sh@68 -- # [[ -n '' ]] 00:09:53.491 05:38:08 accel -- accel/accel.sh@70 -- # exp_opcs=($($rpc_py accel_get_opc_assignments | jq -r ". 
| to_entries | map(\"\(.key)=\(.value)\") | .[]")) 00:09:53.491 05:38:08 accel -- accel/accel.sh@70 -- # rpc_cmd accel_get_opc_assignments 00:09:53.491 05:38:08 accel -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:53.491 05:38:08 accel -- accel/accel.sh@70 -- # jq -r '. | to_entries | map("\(.key)=\(.value)") | .[]' 00:09:53.491 05:38:08 accel -- common/autotest_common.sh@10 -- # set +x 00:09:53.491 05:38:08 accel -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:53.491 05:38:08 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:09:53.491 05:38:08 accel -- accel/accel.sh@72 -- # IFS== 00:09:53.491 05:38:08 accel -- accel/accel.sh@72 -- # read -r opc module 00:09:53.491 05:38:08 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:09:53.491 05:38:08 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:09:53.491 05:38:08 accel -- accel/accel.sh@72 -- # IFS== 00:09:53.491 05:38:08 accel -- accel/accel.sh@72 -- # read -r opc module 00:09:53.491 05:38:08 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:09:53.491 05:38:08 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:09:53.491 05:38:08 accel -- accel/accel.sh@72 -- # IFS== 00:09:53.491 05:38:08 accel -- accel/accel.sh@72 -- # read -r opc module 00:09:53.491 05:38:08 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:09:53.491 05:38:08 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:09:53.491 05:38:08 accel -- accel/accel.sh@72 -- # IFS== 00:09:53.491 05:38:08 accel -- accel/accel.sh@72 -- # read -r opc module 00:09:53.491 05:38:08 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:09:53.491 05:38:08 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:09:53.491 05:38:08 accel -- accel/accel.sh@72 -- # IFS== 00:09:53.491 05:38:08 accel -- accel/accel.sh@72 -- # read -r opc module 00:09:53.491 05:38:08 accel -- accel/accel.sh@73 -- # 
expected_opcs["$opc"]=software 00:09:53.491 05:38:08 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:09:53.491 05:38:08 accel -- accel/accel.sh@72 -- # IFS== 00:09:53.491 05:38:08 accel -- accel/accel.sh@72 -- # read -r opc module 00:09:53.491 05:38:08 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:09:53.491 05:38:08 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:09:53.491 05:38:08 accel -- accel/accel.sh@72 -- # IFS== 00:09:53.491 05:38:08 accel -- accel/accel.sh@72 -- # read -r opc module 00:09:53.491 05:38:08 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:09:53.491 05:38:08 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:09:53.491 05:38:08 accel -- accel/accel.sh@72 -- # IFS== 00:09:53.491 05:38:08 accel -- accel/accel.sh@72 -- # read -r opc module 00:09:53.491 05:38:08 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:09:53.491 05:38:08 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:09:53.491 05:38:08 accel -- accel/accel.sh@72 -- # IFS== 00:09:53.491 05:38:08 accel -- accel/accel.sh@72 -- # read -r opc module 00:09:53.492 05:38:08 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:09:53.492 05:38:08 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:09:53.492 05:38:08 accel -- accel/accel.sh@72 -- # IFS== 00:09:53.492 05:38:08 accel -- accel/accel.sh@72 -- # read -r opc module 00:09:53.492 05:38:08 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:09:53.492 05:38:08 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:09:53.492 05:38:08 accel -- accel/accel.sh@72 -- # IFS== 00:09:53.492 05:38:08 accel -- accel/accel.sh@72 -- # read -r opc module 00:09:53.492 05:38:08 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:09:53.492 05:38:08 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:09:53.492 05:38:08 accel -- accel/accel.sh@72 -- # 
IFS== 00:09:53.492 05:38:08 accel -- accel/accel.sh@72 -- # read -r opc module 00:09:53.492 05:38:08 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:09:53.492 05:38:08 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:09:53.492 05:38:08 accel -- accel/accel.sh@72 -- # IFS== 00:09:53.492 05:38:08 accel -- accel/accel.sh@72 -- # read -r opc module 00:09:53.492 05:38:08 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:09:53.492 05:38:08 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:09:53.492 05:38:08 accel -- accel/accel.sh@72 -- # IFS== 00:09:53.492 05:38:08 accel -- accel/accel.sh@72 -- # read -r opc module 00:09:53.492 05:38:08 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:09:53.492 05:38:08 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:09:53.492 05:38:08 accel -- accel/accel.sh@72 -- # IFS== 00:09:53.492 05:38:08 accel -- accel/accel.sh@72 -- # read -r opc module 00:09:53.492 05:38:08 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:09:53.492 05:38:08 accel -- accel/accel.sh@75 -- # killprocess 1099323 00:09:53.492 05:38:08 accel -- common/autotest_common.sh@948 -- # '[' -z 1099323 ']' 00:09:53.492 05:38:08 accel -- common/autotest_common.sh@952 -- # kill -0 1099323 00:09:53.492 05:38:08 accel -- common/autotest_common.sh@953 -- # uname 00:09:53.492 05:38:08 accel -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:09:53.492 05:38:08 accel -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1099323 00:09:53.492 05:38:08 accel -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:09:53.492 05:38:08 accel -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:09:53.492 05:38:08 accel -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1099323' 00:09:53.492 killing process with pid 1099323 00:09:53.492 05:38:08 accel -- common/autotest_common.sh@967 -- # kill 1099323 00:09:53.492 
05:38:08 accel -- common/autotest_common.sh@972 -- # wait 1099323 00:09:53.750 05:38:08 accel -- accel/accel.sh@76 -- # trap - ERR 00:09:53.750 05:38:08 accel -- accel/accel.sh@89 -- # run_test accel_help accel_perf -h 00:09:53.750 05:38:08 accel -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:09:53.750 05:38:08 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:53.750 05:38:08 accel -- common/autotest_common.sh@10 -- # set +x 00:09:54.009 05:38:08 accel.accel_help -- common/autotest_common.sh@1123 -- # accel_perf -h 00:09:54.009 05:38:08 accel.accel_help -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -h 00:09:54.009 05:38:08 accel.accel_help -- accel/accel.sh@12 -- # build_accel_config 00:09:54.009 05:38:08 accel.accel_help -- accel/accel.sh@31 -- # accel_json_cfg=() 00:09:54.009 05:38:08 accel.accel_help -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:09:54.009 05:38:08 accel.accel_help -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:09:54.009 05:38:08 accel.accel_help -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:09:54.009 05:38:08 accel.accel_help -- accel/accel.sh@36 -- # [[ -n '' ]] 00:09:54.009 05:38:08 accel.accel_help -- accel/accel.sh@40 -- # local IFS=, 00:09:54.009 05:38:08 accel.accel_help -- accel/accel.sh@41 -- # jq -r . 
00:09:54.009 05:38:08 accel.accel_help -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:54.009 05:38:08 accel.accel_help -- common/autotest_common.sh@10 -- # set +x 00:09:54.009 05:38:08 accel -- common/autotest_common.sh@1142 -- # return 0 00:09:54.009 05:38:08 accel -- accel/accel.sh@91 -- # run_test accel_missing_filename NOT accel_perf -t 1 -w compress 00:09:54.009 05:38:08 accel -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:09:54.009 05:38:08 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:54.009 05:38:08 accel -- common/autotest_common.sh@10 -- # set +x 00:09:54.009 ************************************ 00:09:54.009 START TEST accel_missing_filename 00:09:54.009 ************************************ 00:09:54.009 05:38:08 accel.accel_missing_filename -- common/autotest_common.sh@1123 -- # NOT accel_perf -t 1 -w compress 00:09:54.009 05:38:08 accel.accel_missing_filename -- common/autotest_common.sh@648 -- # local es=0 00:09:54.009 05:38:08 accel.accel_missing_filename -- common/autotest_common.sh@650 -- # valid_exec_arg accel_perf -t 1 -w compress 00:09:54.009 05:38:08 accel.accel_missing_filename -- common/autotest_common.sh@636 -- # local arg=accel_perf 00:09:54.009 05:38:08 accel.accel_missing_filename -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:09:54.009 05:38:08 accel.accel_missing_filename -- common/autotest_common.sh@640 -- # type -t accel_perf 00:09:54.009 05:38:08 accel.accel_missing_filename -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:09:54.009 05:38:08 accel.accel_missing_filename -- common/autotest_common.sh@651 -- # accel_perf -t 1 -w compress 00:09:54.009 05:38:08 accel.accel_missing_filename -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress 00:09:54.009 05:38:08 accel.accel_missing_filename -- accel/accel.sh@12 -- # build_accel_config 00:09:54.009 05:38:08 
accel.accel_missing_filename -- accel/accel.sh@31 -- # accel_json_cfg=() 00:09:54.009 05:38:08 accel.accel_missing_filename -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:09:54.009 05:38:08 accel.accel_missing_filename -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:09:54.009 05:38:08 accel.accel_missing_filename -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:09:54.009 05:38:08 accel.accel_missing_filename -- accel/accel.sh@36 -- # [[ -n '' ]] 00:09:54.009 05:38:08 accel.accel_missing_filename -- accel/accel.sh@40 -- # local IFS=, 00:09:54.009 05:38:08 accel.accel_missing_filename -- accel/accel.sh@41 -- # jq -r . 00:09:54.009 [2024-07-26 05:38:08.833222] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 00:09:54.009 [2024-07-26 05:38:08.833296] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1099546 ] 00:09:54.268 [2024-07-26 05:38:08.953339] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:54.268 [2024-07-26 05:38:09.053744] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:54.268 [2024-07-26 05:38:09.123654] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:09:54.527 [2024-07-26 05:38:09.187843] accel_perf.c:1463:main: *ERROR*: ERROR starting application 00:09:54.527 A filename is required. 
00:09:54.527 05:38:09 accel.accel_missing_filename -- common/autotest_common.sh@651 -- # es=234 00:09:54.527 05:38:09 accel.accel_missing_filename -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:09:54.527 05:38:09 accel.accel_missing_filename -- common/autotest_common.sh@660 -- # es=106 00:09:54.527 05:38:09 accel.accel_missing_filename -- common/autotest_common.sh@661 -- # case "$es" in 00:09:54.527 05:38:09 accel.accel_missing_filename -- common/autotest_common.sh@668 -- # es=1 00:09:54.527 05:38:09 accel.accel_missing_filename -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:09:54.527 00:09:54.527 real 0m0.485s 00:09:54.527 user 0m0.314s 00:09:54.527 sys 0m0.195s 00:09:54.527 05:38:09 accel.accel_missing_filename -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:54.527 05:38:09 accel.accel_missing_filename -- common/autotest_common.sh@10 -- # set +x 00:09:54.527 ************************************ 00:09:54.527 END TEST accel_missing_filename 00:09:54.527 ************************************ 00:09:54.527 05:38:09 accel -- common/autotest_common.sh@1142 -- # return 0 00:09:54.527 05:38:09 accel -- accel/accel.sh@93 -- # run_test accel_compress_verify NOT accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:09:54.527 05:38:09 accel -- common/autotest_common.sh@1099 -- # '[' 10 -le 1 ']' 00:09:54.527 05:38:09 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:54.527 05:38:09 accel -- common/autotest_common.sh@10 -- # set +x 00:09:54.527 ************************************ 00:09:54.527 START TEST accel_compress_verify 00:09:54.527 ************************************ 00:09:54.527 05:38:09 accel.accel_compress_verify -- common/autotest_common.sh@1123 -- # NOT accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:09:54.527 05:38:09 accel.accel_compress_verify -- common/autotest_common.sh@648 -- # local es=0 00:09:54.527 05:38:09 
accel.accel_compress_verify -- common/autotest_common.sh@650 -- # valid_exec_arg accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:09:54.527 05:38:09 accel.accel_compress_verify -- common/autotest_common.sh@636 -- # local arg=accel_perf 00:09:54.527 05:38:09 accel.accel_compress_verify -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:09:54.527 05:38:09 accel.accel_compress_verify -- common/autotest_common.sh@640 -- # type -t accel_perf 00:09:54.527 05:38:09 accel.accel_compress_verify -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:09:54.527 05:38:09 accel.accel_compress_verify -- common/autotest_common.sh@651 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:09:54.527 05:38:09 accel.accel_compress_verify -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:09:54.527 05:38:09 accel.accel_compress_verify -- accel/accel.sh@12 -- # build_accel_config 00:09:54.527 05:38:09 accel.accel_compress_verify -- accel/accel.sh@31 -- # accel_json_cfg=() 00:09:54.527 05:38:09 accel.accel_compress_verify -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:09:54.527 05:38:09 accel.accel_compress_verify -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:09:54.527 05:38:09 accel.accel_compress_verify -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:09:54.527 05:38:09 accel.accel_compress_verify -- accel/accel.sh@36 -- # [[ -n '' ]] 00:09:54.527 05:38:09 accel.accel_compress_verify -- accel/accel.sh@40 -- # local IFS=, 00:09:54.527 05:38:09 accel.accel_compress_verify -- accel/accel.sh@41 -- # jq -r . 00:09:54.527 [2024-07-26 05:38:09.401584] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 
00:09:54.527 [2024-07-26 05:38:09.401658] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1099725 ] 00:09:54.786 [2024-07-26 05:38:09.533304] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:54.786 [2024-07-26 05:38:09.637199] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:55.044 [2024-07-26 05:38:09.703690] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:09:55.044 [2024-07-26 05:38:09.777720] accel_perf.c:1463:main: *ERROR*: ERROR starting application 00:09:55.044 00:09:55.044 Compression does not support the verify option, aborting. 00:09:55.044 05:38:09 accel.accel_compress_verify -- common/autotest_common.sh@651 -- # es=161 00:09:55.044 05:38:09 accel.accel_compress_verify -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:09:55.044 05:38:09 accel.accel_compress_verify -- common/autotest_common.sh@660 -- # es=33 00:09:55.044 05:38:09 accel.accel_compress_verify -- common/autotest_common.sh@661 -- # case "$es" in 00:09:55.044 05:38:09 accel.accel_compress_verify -- common/autotest_common.sh@668 -- # es=1 00:09:55.044 05:38:09 accel.accel_compress_verify -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:09:55.044 00:09:55.044 real 0m0.508s 00:09:55.044 user 0m0.332s 00:09:55.044 sys 0m0.203s 00:09:55.044 05:38:09 accel.accel_compress_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:55.044 05:38:09 accel.accel_compress_verify -- common/autotest_common.sh@10 -- # set +x 00:09:55.044 ************************************ 00:09:55.044 END TEST accel_compress_verify 00:09:55.044 ************************************ 00:09:55.044 05:38:09 accel -- common/autotest_common.sh@1142 -- # return 0 00:09:55.044 05:38:09 accel -- accel/accel.sh@95 -- # run_test accel_wrong_workload NOT 
accel_perf -t 1 -w foobar 00:09:55.044 05:38:09 accel -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:09:55.044 05:38:09 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:55.044 05:38:09 accel -- common/autotest_common.sh@10 -- # set +x 00:09:55.304 ************************************ 00:09:55.304 START TEST accel_wrong_workload 00:09:55.304 ************************************ 00:09:55.304 05:38:09 accel.accel_wrong_workload -- common/autotest_common.sh@1123 -- # NOT accel_perf -t 1 -w foobar 00:09:55.304 05:38:09 accel.accel_wrong_workload -- common/autotest_common.sh@648 -- # local es=0 00:09:55.304 05:38:09 accel.accel_wrong_workload -- common/autotest_common.sh@650 -- # valid_exec_arg accel_perf -t 1 -w foobar 00:09:55.304 05:38:09 accel.accel_wrong_workload -- common/autotest_common.sh@636 -- # local arg=accel_perf 00:09:55.304 05:38:09 accel.accel_wrong_workload -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:09:55.304 05:38:09 accel.accel_wrong_workload -- common/autotest_common.sh@640 -- # type -t accel_perf 00:09:55.304 05:38:09 accel.accel_wrong_workload -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:09:55.304 05:38:09 accel.accel_wrong_workload -- common/autotest_common.sh@651 -- # accel_perf -t 1 -w foobar 00:09:55.304 05:38:09 accel.accel_wrong_workload -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w foobar 00:09:55.304 05:38:09 accel.accel_wrong_workload -- accel/accel.sh@12 -- # build_accel_config 00:09:55.304 05:38:09 accel.accel_wrong_workload -- accel/accel.sh@31 -- # accel_json_cfg=() 00:09:55.304 05:38:09 accel.accel_wrong_workload -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:09:55.304 05:38:09 accel.accel_wrong_workload -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:09:55.304 05:38:09 accel.accel_wrong_workload -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:09:55.304 05:38:09 accel.accel_wrong_workload -- 
accel/accel.sh@36 -- # [[ -n '' ]]
00:09:55.304 05:38:09 accel.accel_wrong_workload -- accel/accel.sh@40 -- # local IFS=,
00:09:55.304 05:38:09 accel.accel_wrong_workload -- accel/accel.sh@41 -- # jq -r .
00:09:55.304 Unsupported workload type: foobar
00:09:55.304 [2024-07-26 05:38:09.993692] app.c:1451:spdk_app_parse_args: *ERROR*: Parsing app-specific command line parameter 'w' failed: 1
00:09:55.304 accel_perf options:
00:09:55.304 [-h help message]
00:09:55.304 [-q queue depth per core]
00:09:55.304 [-C for supported workloads, use this value to configure the io vector size to test (default 1)
00:09:55.304 [-T number of threads per core
00:09:55.304 [-o transfer size in bytes (default: 4KiB. For compress/decompress, 0 means the input file size)]
00:09:55.304 [-t time in seconds]
00:09:55.304 [-w workload type must be one of these: copy, fill, crc32c, copy_crc32c, compare, compress, decompress, dualcast, xor,
00:09:55.304 [ dif_verify, dif_verify_copy, dif_generate, dif_generate_copy
00:09:55.304 [-M assign module to the operation, not compatible with accel_assign_opc RPC
00:09:55.304 [-l for compress/decompress workloads, name of uncompressed input file
00:09:55.304 [-S for crc32c workload, use this seed value (default 0)
00:09:55.304 [-P for compare workload, percentage of operations that should miscompare (percent, default 0)
00:09:55.304 [-f for fill workload, use this BYTE value (default 255)
00:09:55.304 [-x for xor workload, use this number of source buffers (default, minimum: 2)]
00:09:55.304 [-y verify result if this switch is on]
00:09:55.304 [-a tasks to allocate per core (default: same value as -q)]
00:09:55.304 Can be used to spread operations across a wider range of memory.
00:09:55.304 05:38:09 accel.accel_wrong_workload -- common/autotest_common.sh@651 -- # es=1 00:09:55.304 05:38:09 accel.accel_wrong_workload -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:09:55.304 05:38:09 accel.accel_wrong_workload -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:09:55.304 05:38:09 accel.accel_wrong_workload -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:09:55.304 00:09:55.304 real 0m0.043s 00:09:55.304 user 0m0.020s 00:09:55.304 sys 0m0.023s 00:09:55.304 05:38:09 accel.accel_wrong_workload -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:55.304 05:38:10 accel.accel_wrong_workload -- common/autotest_common.sh@10 -- # set +x 00:09:55.304 ************************************ 00:09:55.304 END TEST accel_wrong_workload 00:09:55.304 ************************************ 00:09:55.304 Error: writing output failed: Broken pipe 00:09:55.304 05:38:10 accel -- common/autotest_common.sh@1142 -- # return 0 00:09:55.304 05:38:10 accel -- accel/accel.sh@97 -- # run_test accel_negative_buffers NOT accel_perf -t 1 -w xor -y -x -1 00:09:55.304 05:38:10 accel -- common/autotest_common.sh@1099 -- # '[' 10 -le 1 ']' 00:09:55.304 05:38:10 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:55.304 05:38:10 accel -- common/autotest_common.sh@10 -- # set +x 00:09:55.304 ************************************ 00:09:55.304 START TEST accel_negative_buffers 00:09:55.304 ************************************ 00:09:55.304 05:38:10 accel.accel_negative_buffers -- common/autotest_common.sh@1123 -- # NOT accel_perf -t 1 -w xor -y -x -1 00:09:55.304 05:38:10 accel.accel_negative_buffers -- common/autotest_common.sh@648 -- # local es=0 00:09:55.304 05:38:10 accel.accel_negative_buffers -- common/autotest_common.sh@650 -- # valid_exec_arg accel_perf -t 1 -w xor -y -x -1 00:09:55.304 05:38:10 accel.accel_negative_buffers -- common/autotest_common.sh@636 -- # local arg=accel_perf 00:09:55.304 05:38:10 accel.accel_negative_buffers -- 
common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in
00:09:55.304 05:38:10 accel.accel_negative_buffers -- common/autotest_common.sh@640 -- # type -t accel_perf
00:09:55.304 05:38:10 accel.accel_negative_buffers -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in
00:09:55.304 05:38:10 accel.accel_negative_buffers -- common/autotest_common.sh@651 -- # accel_perf -t 1 -w xor -y -x -1
00:09:55.304 05:38:10 accel.accel_negative_buffers -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y -x -1
00:09:55.304 05:38:10 accel.accel_negative_buffers -- accel/accel.sh@12 -- # build_accel_config
00:09:55.304 05:38:10 accel.accel_negative_buffers -- accel/accel.sh@31 -- # accel_json_cfg=()
00:09:55.304 05:38:10 accel.accel_negative_buffers -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]]
00:09:55.304 05:38:10 accel.accel_negative_buffers -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]]
00:09:55.304 05:38:10 accel.accel_negative_buffers -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]]
00:09:55.304 05:38:10 accel.accel_negative_buffers -- accel/accel.sh@36 -- # [[ -n '' ]]
00:09:55.304 05:38:10 accel.accel_negative_buffers -- accel/accel.sh@40 -- # local IFS=,
00:09:55.304 05:38:10 accel.accel_negative_buffers -- accel/accel.sh@41 -- # jq -r .
00:09:55.304 -x option must be non-negative.
00:09:55.304 [2024-07-26 05:38:10.102238] app.c:1451:spdk_app_parse_args: *ERROR*: Parsing app-specific command line parameter 'x' failed: 1
00:09:55.304 accel_perf options:
00:09:55.304 [-h help message]
00:09:55.304 [-q queue depth per core]
00:09:55.304 [-C for supported workloads, use this value to configure the io vector size to test (default 1)
00:09:55.304 [-T number of threads per core
00:09:55.305 [-o transfer size in bytes (default: 4KiB. For compress/decompress, 0 means the input file size)]
00:09:55.305 [-t time in seconds]
00:09:55.305 [-w workload type must be one of these: copy, fill, crc32c, copy_crc32c, compare, compress, decompress, dualcast, xor,
00:09:55.305 [ dif_verify, dif_verify_copy, dif_generate, dif_generate_copy
00:09:55.305 [-M assign module to the operation, not compatible with accel_assign_opc RPC
00:09:55.305 [-l for compress/decompress workloads, name of uncompressed input file
00:09:55.305 [-S for crc32c workload, use this seed value (default 0)
00:09:55.305 [-P for compare workload, percentage of operations that should miscompare (percent, default 0)
00:09:55.305 [-f for fill workload, use this BYTE value (default 255)
00:09:55.305 [-x for xor workload, use this number of source buffers (default, minimum: 2)]
00:09:55.305 [-y verify result if this switch is on]
00:09:55.305 [-a tasks to allocate per core (default: same value as -q)]
00:09:55.305 Can be used to spread operations across a wider range of memory.
00:09:55.305 05:38:10 accel.accel_negative_buffers -- common/autotest_common.sh@651 -- # es=1 00:09:55.305 05:38:10 accel.accel_negative_buffers -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:09:55.305 05:38:10 accel.accel_negative_buffers -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:09:55.305 05:38:10 accel.accel_negative_buffers -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:09:55.305 00:09:55.305 real 0m0.041s 00:09:55.305 user 0m0.052s 00:09:55.305 sys 0m0.020s 00:09:55.305 05:38:10 accel.accel_negative_buffers -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:55.305 05:38:10 accel.accel_negative_buffers -- common/autotest_common.sh@10 -- # set +x 00:09:55.305 ************************************ 00:09:55.305 END TEST accel_negative_buffers 00:09:55.305 ************************************ 00:09:55.305 05:38:10 accel -- common/autotest_common.sh@1142 -- # return 0 00:09:55.305 05:38:10 accel -- accel/accel.sh@101 -- # run_test accel_crc32c accel_test -t 1 -w crc32c -S 32 -y 00:09:55.305 05:38:10 accel -- common/autotest_common.sh@1099 -- # '[' 9 -le 1 ']' 00:09:55.305 05:38:10 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:55.305 05:38:10 accel -- common/autotest_common.sh@10 -- # set +x 00:09:55.305 ************************************ 00:09:55.305 START TEST accel_crc32c 00:09:55.305 ************************************ 00:09:55.305 05:38:10 accel.accel_crc32c -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w crc32c -S 32 -y 00:09:55.305 05:38:10 accel.accel_crc32c -- accel/accel.sh@16 -- # local accel_opc 00:09:55.305 05:38:10 accel.accel_crc32c -- accel/accel.sh@17 -- # local accel_module 00:09:55.305 05:38:10 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:55.305 05:38:10 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:09:55.305 05:38:10 accel.accel_crc32c -- accel/accel.sh@15 -- # accel_perf -t 1 -w crc32c -S 32 -y 00:09:55.305 05:38:10 accel.accel_crc32c -- 
accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -S 32 -y 00:09:55.305 05:38:10 accel.accel_crc32c -- accel/accel.sh@12 -- # build_accel_config 00:09:55.305 05:38:10 accel.accel_crc32c -- accel/accel.sh@31 -- # accel_json_cfg=() 00:09:55.305 05:38:10 accel.accel_crc32c -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:09:55.305 05:38:10 accel.accel_crc32c -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:09:55.305 05:38:10 accel.accel_crc32c -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:09:55.305 05:38:10 accel.accel_crc32c -- accel/accel.sh@36 -- # [[ -n '' ]] 00:09:55.305 05:38:10 accel.accel_crc32c -- accel/accel.sh@40 -- # local IFS=, 00:09:55.305 05:38:10 accel.accel_crc32c -- accel/accel.sh@41 -- # jq -r . 00:09:55.305 [2024-07-26 05:38:10.198801] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 00:09:55.305 [2024-07-26 05:38:10.198862] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1099802 ] 00:09:55.564 [2024-07-26 05:38:10.329978] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:55.564 [2024-07-26 05:38:10.436470] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:55.823 05:38:10 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:09:55.823 05:38:10 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:09:55.823 05:38:10 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:55.823 05:38:10 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:09:55.823 05:38:10 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:09:55.823 05:38:10 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:09:55.823 05:38:10 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:55.823 05:38:10 accel.accel_crc32c -- 
accel/accel.sh@19 -- # read -r var val 00:09:55.823 05:38:10 accel.accel_crc32c -- accel/accel.sh@20 -- # val=0x1 00:09:55.823 05:38:10 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:09:55.823 05:38:10 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:55.823 05:38:10 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:09:55.823 05:38:10 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:09:55.823 05:38:10 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:09:55.823 05:38:10 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:55.823 05:38:10 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:09:55.823 05:38:10 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:09:55.823 05:38:10 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:09:55.823 05:38:10 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:55.823 05:38:10 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:09:55.823 05:38:10 accel.accel_crc32c -- accel/accel.sh@20 -- # val=crc32c 00:09:55.823 05:38:10 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:09:55.823 05:38:10 accel.accel_crc32c -- accel/accel.sh@23 -- # accel_opc=crc32c 00:09:55.823 05:38:10 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:55.823 05:38:10 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:09:55.823 05:38:10 accel.accel_crc32c -- accel/accel.sh@20 -- # val=32 00:09:55.823 05:38:10 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:09:55.823 05:38:10 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:55.823 05:38:10 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:09:55.823 05:38:10 accel.accel_crc32c -- accel/accel.sh@20 -- # val='4096 bytes' 00:09:55.823 05:38:10 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:09:55.823 05:38:10 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:55.823 05:38:10 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r 
var val 00:09:55.823 05:38:10 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:09:55.823 05:38:10 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:09:55.823 05:38:10 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:55.823 05:38:10 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:09:55.823 05:38:10 accel.accel_crc32c -- accel/accel.sh@20 -- # val=software 00:09:55.823 05:38:10 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:09:55.823 05:38:10 accel.accel_crc32c -- accel/accel.sh@22 -- # accel_module=software 00:09:55.823 05:38:10 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:55.823 05:38:10 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:09:55.823 05:38:10 accel.accel_crc32c -- accel/accel.sh@20 -- # val=32 00:09:55.823 05:38:10 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:09:55.823 05:38:10 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:55.823 05:38:10 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:09:55.823 05:38:10 accel.accel_crc32c -- accel/accel.sh@20 -- # val=32 00:09:55.823 05:38:10 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:09:55.823 05:38:10 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:55.823 05:38:10 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:09:55.823 05:38:10 accel.accel_crc32c -- accel/accel.sh@20 -- # val=1 00:09:55.823 05:38:10 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:09:55.823 05:38:10 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:55.823 05:38:10 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:09:55.823 05:38:10 accel.accel_crc32c -- accel/accel.sh@20 -- # val='1 seconds' 00:09:55.823 05:38:10 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:09:55.823 05:38:10 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:55.823 05:38:10 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:09:55.823 
05:38:10 accel.accel_crc32c -- accel/accel.sh@20 -- # val=Yes 00:09:55.823 05:38:10 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:09:55.823 05:38:10 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:55.823 05:38:10 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:09:55.823 05:38:10 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:09:55.823 05:38:10 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:09:55.823 05:38:10 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:55.823 05:38:10 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:09:55.823 05:38:10 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:09:55.823 05:38:10 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:09:55.823 05:38:10 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:55.823 05:38:10 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:09:57.198 05:38:11 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:09:57.198 05:38:11 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:09:57.198 05:38:11 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:57.198 05:38:11 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:09:57.198 05:38:11 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:09:57.198 05:38:11 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:09:57.198 05:38:11 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:57.198 05:38:11 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:09:57.198 05:38:11 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:09:57.198 05:38:11 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:09:57.198 05:38:11 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:57.198 05:38:11 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:09:57.198 05:38:11 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:09:57.198 05:38:11 accel.accel_crc32c -- accel/accel.sh@21 -- # case 
"$var" in 00:09:57.198 05:38:11 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:57.198 05:38:11 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:09:57.198 05:38:11 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:09:57.198 05:38:11 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:09:57.198 05:38:11 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:57.198 05:38:11 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:09:57.198 05:38:11 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:09:57.198 05:38:11 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:09:57.198 05:38:11 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:57.198 05:38:11 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:09:57.198 05:38:11 accel.accel_crc32c -- accel/accel.sh@27 -- # [[ -n software ]] 00:09:57.198 05:38:11 accel.accel_crc32c -- accel/accel.sh@27 -- # [[ -n crc32c ]] 00:09:57.198 05:38:11 accel.accel_crc32c -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:09:57.198 00:09:57.198 real 0m1.504s 00:09:57.198 user 0m0.008s 00:09:57.198 sys 0m0.004s 00:09:57.198 05:38:11 accel.accel_crc32c -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:57.198 05:38:11 accel.accel_crc32c -- common/autotest_common.sh@10 -- # set +x 00:09:57.198 ************************************ 00:09:57.198 END TEST accel_crc32c 00:09:57.198 ************************************ 00:09:57.198 05:38:11 accel -- common/autotest_common.sh@1142 -- # return 0 00:09:57.198 05:38:11 accel -- accel/accel.sh@102 -- # run_test accel_crc32c_C2 accel_test -t 1 -w crc32c -y -C 2 00:09:57.198 05:38:11 accel -- common/autotest_common.sh@1099 -- # '[' 9 -le 1 ']' 00:09:57.198 05:38:11 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:57.198 05:38:11 accel -- common/autotest_common.sh@10 -- # set +x 00:09:57.198 ************************************ 00:09:57.198 START TEST accel_crc32c_C2 00:09:57.198 
************************************ 00:09:57.198 05:38:11 accel.accel_crc32c_C2 -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w crc32c -y -C 2 00:09:57.198 05:38:11 accel.accel_crc32c_C2 -- accel/accel.sh@16 -- # local accel_opc 00:09:57.198 05:38:11 accel.accel_crc32c_C2 -- accel/accel.sh@17 -- # local accel_module 00:09:57.198 05:38:11 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:57.198 05:38:11 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:57.198 05:38:11 accel.accel_crc32c_C2 -- accel/accel.sh@15 -- # accel_perf -t 1 -w crc32c -y -C 2 00:09:57.198 05:38:11 accel.accel_crc32c_C2 -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -y -C 2 00:09:57.198 05:38:11 accel.accel_crc32c_C2 -- accel/accel.sh@12 -- # build_accel_config 00:09:57.198 05:38:11 accel.accel_crc32c_C2 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:09:57.198 05:38:11 accel.accel_crc32c_C2 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:09:57.198 05:38:11 accel.accel_crc32c_C2 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:09:57.198 05:38:11 accel.accel_crc32c_C2 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:09:57.198 05:38:11 accel.accel_crc32c_C2 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:09:57.198 05:38:11 accel.accel_crc32c_C2 -- accel/accel.sh@40 -- # local IFS=, 00:09:57.198 05:38:11 accel.accel_crc32c_C2 -- accel/accel.sh@41 -- # jq -r . 00:09:57.198 [2024-07-26 05:38:11.781986] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 
00:09:57.198 [2024-07-26 05:38:11.782056] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1100097 ] 00:09:57.198 [2024-07-26 05:38:11.912831] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:57.198 [2024-07-26 05:38:12.013057] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:57.198 05:38:12 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:09:57.198 05:38:12 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:09:57.198 05:38:12 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:57.198 05:38:12 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:57.198 05:38:12 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:09:57.198 05:38:12 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:09:57.198 05:38:12 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:57.198 05:38:12 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:57.198 05:38:12 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=0x1 00:09:57.198 05:38:12 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:09:57.198 05:38:12 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:57.198 05:38:12 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:57.198 05:38:12 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:09:57.198 05:38:12 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:09:57.198 05:38:12 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:57.198 05:38:12 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:57.198 05:38:12 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:09:57.198 05:38:12 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:09:57.198 05:38:12 
accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:57.198 05:38:12 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:57.198 05:38:12 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=crc32c 00:09:57.198 05:38:12 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:09:57.198 05:38:12 accel.accel_crc32c_C2 -- accel/accel.sh@23 -- # accel_opc=crc32c 00:09:57.198 05:38:12 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:57.198 05:38:12 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:57.198 05:38:12 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=0 00:09:57.198 05:38:12 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:09:57.198 05:38:12 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:57.198 05:38:12 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:57.198 05:38:12 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val='4096 bytes' 00:09:57.198 05:38:12 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:09:57.198 05:38:12 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:57.198 05:38:12 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:57.198 05:38:12 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:09:57.198 05:38:12 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:09:57.198 05:38:12 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:57.198 05:38:12 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:57.198 05:38:12 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=software 00:09:57.198 05:38:12 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:09:57.199 05:38:12 accel.accel_crc32c_C2 -- accel/accel.sh@22 -- # accel_module=software 00:09:57.199 05:38:12 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:57.199 05:38:12 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:57.199 05:38:12 accel.accel_crc32c_C2 
-- accel/accel.sh@20 -- # val=32 00:09:57.199 05:38:12 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:09:57.199 05:38:12 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:57.199 05:38:12 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:57.199 05:38:12 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=32 00:09:57.199 05:38:12 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:09:57.199 05:38:12 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:57.199 05:38:12 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:57.199 05:38:12 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=1 00:09:57.199 05:38:12 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:09:57.199 05:38:12 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:57.199 05:38:12 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:57.199 05:38:12 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val='1 seconds' 00:09:57.199 05:38:12 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:09:57.199 05:38:12 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:57.199 05:38:12 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:57.199 05:38:12 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=Yes 00:09:57.199 05:38:12 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:09:57.199 05:38:12 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:57.199 05:38:12 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:57.199 05:38:12 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:09:57.199 05:38:12 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:09:57.199 05:38:12 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:57.199 05:38:12 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:57.199 05:38:12 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:09:57.199 
05:38:12 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:09:57.199 05:38:12 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:57.199 05:38:12 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:58.575 05:38:13 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:09:58.575 05:38:13 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:09:58.575 05:38:13 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:58.575 05:38:13 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:58.575 05:38:13 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:09:58.575 05:38:13 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:09:58.575 05:38:13 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:58.575 05:38:13 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:58.575 05:38:13 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:09:58.575 05:38:13 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:09:58.575 05:38:13 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:58.575 05:38:13 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:58.575 05:38:13 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:09:58.575 05:38:13 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:09:58.575 05:38:13 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:58.575 05:38:13 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:58.575 05:38:13 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:09:58.575 05:38:13 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:09:58.575 05:38:13 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:58.575 05:38:13 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:58.575 05:38:13 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:09:58.575 05:38:13 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case 
"$var" in 00:09:58.575 05:38:13 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:58.575 05:38:13 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:58.575 05:38:13 accel.accel_crc32c_C2 -- accel/accel.sh@27 -- # [[ -n software ]] 00:09:58.575 05:38:13 accel.accel_crc32c_C2 -- accel/accel.sh@27 -- # [[ -n crc32c ]] 00:09:58.575 05:38:13 accel.accel_crc32c_C2 -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:09:58.575 00:09:58.575 real 0m1.510s 00:09:58.575 user 0m0.011s 00:09:58.575 sys 0m0.002s 00:09:58.575 05:38:13 accel.accel_crc32c_C2 -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:58.575 05:38:13 accel.accel_crc32c_C2 -- common/autotest_common.sh@10 -- # set +x 00:09:58.575 ************************************ 00:09:58.575 END TEST accel_crc32c_C2 00:09:58.575 ************************************ 00:09:58.575 05:38:13 accel -- common/autotest_common.sh@1142 -- # return 0 00:09:58.575 05:38:13 accel -- accel/accel.sh@103 -- # run_test accel_copy accel_test -t 1 -w copy -y 00:09:58.575 05:38:13 accel -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:09:58.575 05:38:13 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:58.575 05:38:13 accel -- common/autotest_common.sh@10 -- # set +x 00:09:58.575 ************************************ 00:09:58.575 START TEST accel_copy 00:09:58.575 ************************************ 00:09:58.575 05:38:13 accel.accel_copy -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w copy -y 00:09:58.575 05:38:13 accel.accel_copy -- accel/accel.sh@16 -- # local accel_opc 00:09:58.575 05:38:13 accel.accel_copy -- accel/accel.sh@17 -- # local accel_module 00:09:58.575 05:38:13 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:09:58.575 05:38:13 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:09:58.575 05:38:13 accel.accel_copy -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy -y 00:09:58.575 05:38:13 accel.accel_copy -- accel/accel.sh@12 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy -y 00:09:58.575 05:38:13 accel.accel_copy -- accel/accel.sh@12 -- # build_accel_config 00:09:58.575 05:38:13 accel.accel_copy -- accel/accel.sh@31 -- # accel_json_cfg=() 00:09:58.575 05:38:13 accel.accel_copy -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:09:58.575 05:38:13 accel.accel_copy -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:09:58.575 05:38:13 accel.accel_copy -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:09:58.575 05:38:13 accel.accel_copy -- accel/accel.sh@36 -- # [[ -n '' ]] 00:09:58.575 05:38:13 accel.accel_copy -- accel/accel.sh@40 -- # local IFS=, 00:09:58.575 05:38:13 accel.accel_copy -- accel/accel.sh@41 -- # jq -r . 00:09:58.575 [2024-07-26 05:38:13.373591] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 00:09:58.575 [2024-07-26 05:38:13.373667] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1100350 ] 00:09:58.834 [2024-07-26 05:38:13.505349] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:58.834 [2024-07-26 05:38:13.610704] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:58.834 05:38:13 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:09:58.834 05:38:13 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:58.834 05:38:13 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:09:58.834 05:38:13 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:09:58.834 05:38:13 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:09:58.834 05:38:13 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:58.834 05:38:13 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:09:58.834 05:38:13 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:09:58.834 05:38:13 
accel.accel_copy -- accel/accel.sh@20 -- # val=0x1 00:09:58.834 05:38:13 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:58.834 05:38:13 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:09:58.834 05:38:13 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:09:58.834 05:38:13 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:09:58.834 05:38:13 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:58.834 05:38:13 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:09:58.834 05:38:13 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:09:58.834 05:38:13 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:09:58.834 05:38:13 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:58.834 05:38:13 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:09:58.834 05:38:13 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:09:58.834 05:38:13 accel.accel_copy -- accel/accel.sh@20 -- # val=copy 00:09:58.834 05:38:13 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:58.834 05:38:13 accel.accel_copy -- accel/accel.sh@23 -- # accel_opc=copy 00:09:58.834 05:38:13 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:09:58.834 05:38:13 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:09:58.834 05:38:13 accel.accel_copy -- accel/accel.sh@20 -- # val='4096 bytes' 00:09:58.834 05:38:13 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:58.834 05:38:13 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:09:58.834 05:38:13 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:09:58.834 05:38:13 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:09:58.834 05:38:13 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:58.834 05:38:13 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:09:58.834 05:38:13 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:09:58.834 05:38:13 accel.accel_copy -- accel/accel.sh@20 -- # val=software 00:09:58.834 05:38:13 
accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:58.834 05:38:13 accel.accel_copy -- accel/accel.sh@22 -- # accel_module=software 00:09:58.834 05:38:13 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:09:58.834 05:38:13 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:09:58.834 05:38:13 accel.accel_copy -- accel/accel.sh@20 -- # val=32 00:09:58.834 05:38:13 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:58.834 05:38:13 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:09:58.834 05:38:13 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:09:58.834 05:38:13 accel.accel_copy -- accel/accel.sh@20 -- # val=32 00:09:58.834 05:38:13 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:58.834 05:38:13 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:09:58.834 05:38:13 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:09:58.834 05:38:13 accel.accel_copy -- accel/accel.sh@20 -- # val=1 00:09:58.834 05:38:13 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:58.834 05:38:13 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:09:58.834 05:38:13 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:09:58.834 05:38:13 accel.accel_copy -- accel/accel.sh@20 -- # val='1 seconds' 00:09:58.834 05:38:13 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:58.834 05:38:13 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:09:58.834 05:38:13 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:09:58.834 05:38:13 accel.accel_copy -- accel/accel.sh@20 -- # val=Yes 00:09:58.834 05:38:13 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:58.834 05:38:13 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:09:58.834 05:38:13 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:09:58.834 05:38:13 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:09:58.834 05:38:13 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:58.834 05:38:13 
accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:09:58.834 05:38:13 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:09:58.834 05:38:13 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:09:58.834 05:38:13 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:58.834 05:38:13 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:09:58.834 05:38:13 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:10:00.210 05:38:14 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:10:00.210 05:38:14 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:10:00.210 05:38:14 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:10:00.210 05:38:14 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:10:00.210 05:38:14 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:10:00.210 05:38:14 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:10:00.210 05:38:14 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:10:00.210 05:38:14 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:10:00.210 05:38:14 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:10:00.210 05:38:14 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:10:00.210 05:38:14 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:10:00.210 05:38:14 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:10:00.210 05:38:14 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:10:00.210 05:38:14 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:10:00.210 05:38:14 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:10:00.210 05:38:14 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:10:00.210 05:38:14 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:10:00.210 05:38:14 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:10:00.210 05:38:14 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:10:00.210 05:38:14 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:10:00.210 05:38:14 accel.accel_copy -- 
accel/accel.sh@20 -- # val= 00:10:00.210 05:38:14 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:10:00.210 05:38:14 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:10:00.210 05:38:14 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:10:00.210 05:38:14 accel.accel_copy -- accel/accel.sh@27 -- # [[ -n software ]] 00:10:00.210 05:38:14 accel.accel_copy -- accel/accel.sh@27 -- # [[ -n copy ]] 00:10:00.210 05:38:14 accel.accel_copy -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:10:00.210 00:10:00.210 real 0m1.511s 00:10:00.210 user 0m0.012s 00:10:00.210 sys 0m0.001s 00:10:00.210 05:38:14 accel.accel_copy -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:00.210 05:38:14 accel.accel_copy -- common/autotest_common.sh@10 -- # set +x 00:10:00.210 ************************************ 00:10:00.210 END TEST accel_copy 00:10:00.210 ************************************ 00:10:00.210 05:38:14 accel -- common/autotest_common.sh@1142 -- # return 0 00:10:00.210 05:38:14 accel -- accel/accel.sh@104 -- # run_test accel_fill accel_test -t 1 -w fill -f 128 -q 64 -a 64 -y 00:10:00.210 05:38:14 accel -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:10:00.210 05:38:14 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:00.210 05:38:14 accel -- common/autotest_common.sh@10 -- # set +x 00:10:00.210 ************************************ 00:10:00.210 START TEST accel_fill 00:10:00.210 ************************************ 00:10:00.210 05:38:14 accel.accel_fill -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w fill -f 128 -q 64 -a 64 -y 00:10:00.210 05:38:14 accel.accel_fill -- accel/accel.sh@16 -- # local accel_opc 00:10:00.210 05:38:14 accel.accel_fill -- accel/accel.sh@17 -- # local accel_module 00:10:00.210 05:38:14 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:10:00.210 05:38:14 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:10:00.210 05:38:14 accel.accel_fill -- accel/accel.sh@15 -- # 
accel_perf -t 1 -w fill -f 128 -q 64 -a 64 -y 00:10:00.210 05:38:14 accel.accel_fill -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w fill -f 128 -q 64 -a 64 -y 00:10:00.210 05:38:14 accel.accel_fill -- accel/accel.sh@12 -- # build_accel_config 00:10:00.210 05:38:14 accel.accel_fill -- accel/accel.sh@31 -- # accel_json_cfg=() 00:10:00.210 05:38:14 accel.accel_fill -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:10:00.210 05:38:14 accel.accel_fill -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:10:00.210 05:38:14 accel.accel_fill -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:10:00.210 05:38:14 accel.accel_fill -- accel/accel.sh@36 -- # [[ -n '' ]] 00:10:00.210 05:38:14 accel.accel_fill -- accel/accel.sh@40 -- # local IFS=, 00:10:00.210 05:38:14 accel.accel_fill -- accel/accel.sh@41 -- # jq -r . 00:10:00.210 [2024-07-26 05:38:14.943608] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 00:10:00.210 [2024-07-26 05:38:14.943672] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1100547 ] 00:10:00.210 [2024-07-26 05:38:15.073032] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:00.470 [2024-07-26 05:38:15.175842] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:00.470 05:38:15 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:10:00.470 05:38:15 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:10:00.470 05:38:15 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:10:00.470 05:38:15 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:10:00.470 05:38:15 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:10:00.470 05:38:15 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:10:00.470 05:38:15 accel.accel_fill -- 
accel/accel.sh@19 -- # IFS=: 00:10:00.470 05:38:15 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:10:00.470 05:38:15 accel.accel_fill -- accel/accel.sh@20 -- # val=0x1 00:10:00.470 05:38:15 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:10:00.470 05:38:15 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:10:00.470 05:38:15 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:10:00.470 05:38:15 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:10:00.470 05:38:15 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:10:00.470 05:38:15 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:10:00.470 05:38:15 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:10:00.470 05:38:15 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:10:00.470 05:38:15 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:10:00.470 05:38:15 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:10:00.470 05:38:15 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:10:00.470 05:38:15 accel.accel_fill -- accel/accel.sh@20 -- # val=fill 00:10:00.470 05:38:15 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:10:00.470 05:38:15 accel.accel_fill -- accel/accel.sh@23 -- # accel_opc=fill 00:10:00.470 05:38:15 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:10:00.470 05:38:15 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:10:00.470 05:38:15 accel.accel_fill -- accel/accel.sh@20 -- # val=0x80 00:10:00.470 05:38:15 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:10:00.470 05:38:15 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:10:00.470 05:38:15 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:10:00.470 05:38:15 accel.accel_fill -- accel/accel.sh@20 -- # val='4096 bytes' 00:10:00.470 05:38:15 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:10:00.470 05:38:15 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:10:00.470 05:38:15 accel.accel_fill -- 
accel/accel.sh@19 -- # read -r var val 00:10:00.470 05:38:15 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:10:00.470 05:38:15 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:10:00.470 05:38:15 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:10:00.470 05:38:15 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:10:00.470 05:38:15 accel.accel_fill -- accel/accel.sh@20 -- # val=software 00:10:00.470 05:38:15 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:10:00.470 05:38:15 accel.accel_fill -- accel/accel.sh@22 -- # accel_module=software 00:10:00.470 05:38:15 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:10:00.470 05:38:15 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:10:00.470 05:38:15 accel.accel_fill -- accel/accel.sh@20 -- # val=64 00:10:00.470 05:38:15 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:10:00.470 05:38:15 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:10:00.470 05:38:15 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:10:00.470 05:38:15 accel.accel_fill -- accel/accel.sh@20 -- # val=64 00:10:00.470 05:38:15 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:10:00.470 05:38:15 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:10:00.470 05:38:15 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:10:00.470 05:38:15 accel.accel_fill -- accel/accel.sh@20 -- # val=1 00:10:00.470 05:38:15 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:10:00.470 05:38:15 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:10:00.470 05:38:15 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:10:00.470 05:38:15 accel.accel_fill -- accel/accel.sh@20 -- # val='1 seconds' 00:10:00.470 05:38:15 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:10:00.470 05:38:15 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:10:00.470 05:38:15 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:10:00.470 05:38:15 
accel.accel_fill -- accel/accel.sh@20 -- # val=Yes 00:10:00.470 05:38:15 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:10:00.470 05:38:15 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:10:00.470 05:38:15 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:10:00.470 05:38:15 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:10:00.470 05:38:15 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:10:00.470 05:38:15 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:10:00.470 05:38:15 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:10:00.470 05:38:15 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:10:00.470 05:38:15 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:10:00.470 05:38:15 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:10:00.470 05:38:15 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:10:01.847 05:38:16 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:10:01.847 05:38:16 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:10:01.847 05:38:16 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:10:01.847 05:38:16 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:10:01.847 05:38:16 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:10:01.847 05:38:16 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:10:01.847 05:38:16 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:10:01.847 05:38:16 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:10:01.847 05:38:16 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:10:01.847 05:38:16 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:10:01.847 05:38:16 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:10:01.847 05:38:16 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:10:01.847 05:38:16 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:10:01.847 05:38:16 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:10:01.847 05:38:16 accel.accel_fill -- 
accel/accel.sh@19 -- # IFS=: 00:10:01.847 05:38:16 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:10:01.847 05:38:16 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:10:01.847 05:38:16 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:10:01.847 05:38:16 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:10:01.847 05:38:16 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:10:01.847 05:38:16 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:10:01.847 05:38:16 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:10:01.847 05:38:16 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:10:01.847 05:38:16 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:10:01.847 05:38:16 accel.accel_fill -- accel/accel.sh@27 -- # [[ -n software ]] 00:10:01.847 05:38:16 accel.accel_fill -- accel/accel.sh@27 -- # [[ -n fill ]] 00:10:01.847 05:38:16 accel.accel_fill -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:10:01.847 00:10:01.847 real 0m1.499s 00:10:01.847 user 0m0.010s 00:10:01.847 sys 0m0.001s 00:10:01.847 05:38:16 accel.accel_fill -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:01.848 05:38:16 accel.accel_fill -- common/autotest_common.sh@10 -- # set +x 00:10:01.848 ************************************ 00:10:01.848 END TEST accel_fill 00:10:01.848 ************************************ 00:10:01.848 05:38:16 accel -- common/autotest_common.sh@1142 -- # return 0 00:10:01.848 05:38:16 accel -- accel/accel.sh@105 -- # run_test accel_copy_crc32c accel_test -t 1 -w copy_crc32c -y 00:10:01.848 05:38:16 accel -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:10:01.848 05:38:16 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:01.848 05:38:16 accel -- common/autotest_common.sh@10 -- # set +x 00:10:01.848 ************************************ 00:10:01.848 START TEST accel_copy_crc32c 00:10:01.848 ************************************ 00:10:01.848 05:38:16 accel.accel_copy_crc32c -- 
common/autotest_common.sh@1123 -- # accel_test -t 1 -w copy_crc32c -y 00:10:01.848 05:38:16 accel.accel_copy_crc32c -- accel/accel.sh@16 -- # local accel_opc 00:10:01.848 05:38:16 accel.accel_copy_crc32c -- accel/accel.sh@17 -- # local accel_module 00:10:01.848 05:38:16 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:10:01.848 05:38:16 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:10:01.848 05:38:16 accel.accel_copy_crc32c -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy_crc32c -y 00:10:01.848 05:38:16 accel.accel_copy_crc32c -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y 00:10:01.848 05:38:16 accel.accel_copy_crc32c -- accel/accel.sh@12 -- # build_accel_config 00:10:01.848 05:38:16 accel.accel_copy_crc32c -- accel/accel.sh@31 -- # accel_json_cfg=() 00:10:01.848 05:38:16 accel.accel_copy_crc32c -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:10:01.848 05:38:16 accel.accel_copy_crc32c -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:10:01.848 05:38:16 accel.accel_copy_crc32c -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:10:01.848 05:38:16 accel.accel_copy_crc32c -- accel/accel.sh@36 -- # [[ -n '' ]] 00:10:01.848 05:38:16 accel.accel_copy_crc32c -- accel/accel.sh@40 -- # local IFS=, 00:10:01.848 05:38:16 accel.accel_copy_crc32c -- accel/accel.sh@41 -- # jq -r . 00:10:01.848 [2024-07-26 05:38:16.533628] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 
00:10:01.848 [2024-07-26 05:38:16.533722] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1100743 ] 00:10:01.848 [2024-07-26 05:38:16.666496] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:02.107 [2024-07-26 05:38:16.772256] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:02.107 05:38:16 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:10:02.107 05:38:16 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:10:02.107 05:38:16 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:10:02.107 05:38:16 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:10:02.107 05:38:16 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:10:02.107 05:38:16 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:10:02.107 05:38:16 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:10:02.107 05:38:16 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:10:02.107 05:38:16 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=0x1 00:10:02.107 05:38:16 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:10:02.107 05:38:16 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:10:02.107 05:38:16 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:10:02.107 05:38:16 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:10:02.107 05:38:16 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:10:02.107 05:38:16 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:10:02.107 05:38:16 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:10:02.107 05:38:16 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:10:02.107 05:38:16 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 
00:10:02.107 05:38:16 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:10:02.107 05:38:16 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:10:02.107 05:38:16 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=copy_crc32c 00:10:02.107 05:38:16 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:10:02.107 05:38:16 accel.accel_copy_crc32c -- accel/accel.sh@23 -- # accel_opc=copy_crc32c 00:10:02.107 05:38:16 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:10:02.107 05:38:16 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:10:02.107 05:38:16 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=0 00:10:02.107 05:38:16 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:10:02.107 05:38:16 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:10:02.107 05:38:16 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:10:02.107 05:38:16 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val='4096 bytes' 00:10:02.107 05:38:16 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:10:02.107 05:38:16 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:10:02.107 05:38:16 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:10:02.107 05:38:16 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val='4096 bytes' 00:10:02.107 05:38:16 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:10:02.107 05:38:16 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:10:02.107 05:38:16 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:10:02.107 05:38:16 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:10:02.107 05:38:16 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:10:02.107 05:38:16 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:10:02.107 05:38:16 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:10:02.107 05:38:16 accel.accel_copy_crc32c -- 
accel/accel.sh@20 -- # val=software 00:10:02.107 05:38:16 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:10:02.107 05:38:16 accel.accel_copy_crc32c -- accel/accel.sh@22 -- # accel_module=software 00:10:02.107 05:38:16 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:10:02.107 05:38:16 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:10:02.107 05:38:16 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=32 00:10:02.107 05:38:16 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:10:02.107 05:38:16 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:10:02.107 05:38:16 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:10:02.107 05:38:16 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=32 00:10:02.107 05:38:16 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:10:02.107 05:38:16 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:10:02.107 05:38:16 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:10:02.107 05:38:16 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=1 00:10:02.107 05:38:16 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:10:02.107 05:38:16 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:10:02.107 05:38:16 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:10:02.107 05:38:16 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val='1 seconds' 00:10:02.107 05:38:16 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:10:02.107 05:38:16 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:10:02.107 05:38:16 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:10:02.107 05:38:16 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=Yes 00:10:02.107 05:38:16 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:10:02.107 05:38:16 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:10:02.107 05:38:16 
accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:10:02.107 05:38:16 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:10:02.107 05:38:16 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:10:02.107 05:38:16 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:10:02.107 05:38:16 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:10:02.107 05:38:16 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:10:02.107 05:38:16 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:10:02.107 05:38:16 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:10:02.107 05:38:16 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:10:03.485 05:38:18 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:10:03.485 05:38:18 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:10:03.485 05:38:18 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:10:03.485 05:38:18 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:10:03.485 05:38:18 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:10:03.485 05:38:18 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:10:03.485 05:38:18 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:10:03.485 05:38:18 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:10:03.485 05:38:18 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:10:03.485 05:38:18 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:10:03.485 05:38:18 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:10:03.485 05:38:18 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:10:03.485 05:38:18 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:10:03.485 05:38:18 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:10:03.485 05:38:18 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:10:03.485 05:38:18 
accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:10:03.485 05:38:18 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:10:03.485 05:38:18 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:10:03.485 05:38:18 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:10:03.485 05:38:18 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:10:03.485 05:38:18 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:10:03.485 05:38:18 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:10:03.485 05:38:18 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:10:03.485 05:38:18 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:10:03.485 05:38:18 accel.accel_copy_crc32c -- accel/accel.sh@27 -- # [[ -n software ]] 00:10:03.485 05:38:18 accel.accel_copy_crc32c -- accel/accel.sh@27 -- # [[ -n copy_crc32c ]] 00:10:03.485 05:38:18 accel.accel_copy_crc32c -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:10:03.485 00:10:03.485 real 0m1.517s 00:10:03.485 user 0m0.010s 00:10:03.485 sys 0m0.004s 00:10:03.485 05:38:18 accel.accel_copy_crc32c -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:03.485 05:38:18 accel.accel_copy_crc32c -- common/autotest_common.sh@10 -- # set +x 00:10:03.485 ************************************ 00:10:03.485 END TEST accel_copy_crc32c 00:10:03.485 ************************************ 00:10:03.485 05:38:18 accel -- common/autotest_common.sh@1142 -- # return 0 00:10:03.485 05:38:18 accel -- accel/accel.sh@106 -- # run_test accel_copy_crc32c_C2 accel_test -t 1 -w copy_crc32c -y -C 2 00:10:03.485 05:38:18 accel -- common/autotest_common.sh@1099 -- # '[' 9 -le 1 ']' 00:10:03.485 05:38:18 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:03.485 05:38:18 accel -- common/autotest_common.sh@10 -- # set +x 00:10:03.485 ************************************ 00:10:03.485 START TEST accel_copy_crc32c_C2 00:10:03.485 
************************************ 00:10:03.485 05:38:18 accel.accel_copy_crc32c_C2 -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w copy_crc32c -y -C 2 00:10:03.485 05:38:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@16 -- # local accel_opc 00:10:03.485 05:38:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@17 -- # local accel_module 00:10:03.485 05:38:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:10:03.485 05:38:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:10:03.485 05:38:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy_crc32c -y -C 2 00:10:03.485 05:38:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y -C 2 00:10:03.485 05:38:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@12 -- # build_accel_config 00:10:03.485 05:38:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:10:03.485 05:38:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:10:03.485 05:38:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:10:03.485 05:38:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:10:03.485 05:38:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:10:03.485 05:38:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@40 -- # local IFS=, 00:10:03.485 05:38:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@41 -- # jq -r . 00:10:03.485 [2024-07-26 05:38:18.135704] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 
00:10:03.485 [2024-07-26 05:38:18.135828] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1100944 ] 00:10:03.485 [2024-07-26 05:38:18.332678] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:03.752 [2024-07-26 05:38:18.436036] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:03.752 05:38:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:10:03.752 05:38:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:10:03.752 05:38:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:10:03.752 05:38:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:10:03.752 05:38:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:10:03.752 05:38:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:10:03.752 05:38:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:10:03.752 05:38:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:10:03.752 05:38:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=0x1 00:10:03.752 05:38:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:10:03.752 05:38:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:10:03.752 05:38:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:10:03.752 05:38:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:10:03.752 05:38:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:10:03.752 05:38:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:10:03.752 05:38:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:10:03.752 05:38:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:10:03.752 05:38:18 
accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:10:03.752 05:38:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:10:03.752 05:38:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:10:03.752 05:38:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=copy_crc32c 00:10:03.752 05:38:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:10:03.752 05:38:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@23 -- # accel_opc=copy_crc32c 00:10:03.752 05:38:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:10:03.752 05:38:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:10:03.752 05:38:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=0 00:10:03.752 05:38:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:10:03.752 05:38:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:10:03.752 05:38:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:10:03.752 05:38:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val='4096 bytes' 00:10:03.753 05:38:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:10:03.753 05:38:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:10:03.753 05:38:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:10:03.753 05:38:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val='8192 bytes' 00:10:03.753 05:38:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:10:03.753 05:38:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:10:03.753 05:38:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:10:03.753 05:38:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:10:03.753 05:38:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:10:03.753 05:38:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 
00:10:03.753 05:38:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:10:03.753 05:38:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=software 00:10:03.753 05:38:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:10:03.753 05:38:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@22 -- # accel_module=software 00:10:03.753 05:38:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:10:03.753 05:38:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:10:03.753 05:38:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=32 00:10:03.753 05:38:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:10:03.753 05:38:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:10:03.753 05:38:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:10:03.753 05:38:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=32 00:10:03.753 05:38:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:10:03.753 05:38:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:10:03.753 05:38:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:10:03.753 05:38:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=1 00:10:03.753 05:38:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:10:03.753 05:38:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:10:03.753 05:38:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:10:03.753 05:38:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val='1 seconds' 00:10:03.753 05:38:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:10:03.753 05:38:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:10:03.753 05:38:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:10:03.753 05:38:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # 
val=Yes 00:10:03.753 05:38:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:10:03.753 05:38:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:10:03.753 05:38:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:10:03.753 05:38:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:10:03.753 05:38:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:10:03.753 05:38:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:10:03.753 05:38:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:10:03.753 05:38:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:10:03.753 05:38:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:10:03.753 05:38:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:10:03.753 05:38:18 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:10:05.131 05:38:19 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:10:05.131 05:38:19 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:10:05.131 05:38:19 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:10:05.131 05:38:19 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:10:05.131 05:38:19 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:10:05.131 05:38:19 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:10:05.131 05:38:19 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:10:05.131 05:38:19 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:10:05.131 05:38:19 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:10:05.131 05:38:19 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:10:05.131 05:38:19 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:10:05.131 05:38:19 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:10:05.131 
05:38:19 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:10:05.131 05:38:19 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:10:05.131 05:38:19 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:10:05.131 05:38:19 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:10:05.131 05:38:19 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:10:05.131 05:38:19 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:10:05.131 05:38:19 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:10:05.131 05:38:19 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:10:05.131 05:38:19 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:10:05.131 05:38:19 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:10:05.131 05:38:19 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:10:05.131 05:38:19 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:10:05.131 05:38:19 accel.accel_copy_crc32c_C2 -- accel/accel.sh@27 -- # [[ -n software ]] 00:10:05.131 05:38:19 accel.accel_copy_crc32c_C2 -- accel/accel.sh@27 -- # [[ -n copy_crc32c ]] 00:10:05.131 05:38:19 accel.accel_copy_crc32c_C2 -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:10:05.131 00:10:05.131 real 0m1.566s 00:10:05.131 user 0m0.011s 00:10:05.131 sys 0m0.002s 00:10:05.131 05:38:19 accel.accel_copy_crc32c_C2 -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:05.131 05:38:19 accel.accel_copy_crc32c_C2 -- common/autotest_common.sh@10 -- # set +x 00:10:05.131 ************************************ 00:10:05.131 END TEST accel_copy_crc32c_C2 00:10:05.131 ************************************ 00:10:05.131 05:38:19 accel -- common/autotest_common.sh@1142 -- # return 0 00:10:05.131 05:38:19 accel -- accel/accel.sh@107 -- # run_test accel_dualcast accel_test -t 1 -w dualcast -y 00:10:05.131 05:38:19 accel -- common/autotest_common.sh@1099 -- # 
'[' 7 -le 1 ']' 00:10:05.131 05:38:19 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:05.131 05:38:19 accel -- common/autotest_common.sh@10 -- # set +x 00:10:05.131 ************************************ 00:10:05.131 START TEST accel_dualcast 00:10:05.131 ************************************ 00:10:05.131 05:38:19 accel.accel_dualcast -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w dualcast -y 00:10:05.131 05:38:19 accel.accel_dualcast -- accel/accel.sh@16 -- # local accel_opc 00:10:05.131 05:38:19 accel.accel_dualcast -- accel/accel.sh@17 -- # local accel_module 00:10:05.131 05:38:19 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:10:05.131 05:38:19 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:10:05.131 05:38:19 accel.accel_dualcast -- accel/accel.sh@15 -- # accel_perf -t 1 -w dualcast -y 00:10:05.131 05:38:19 accel.accel_dualcast -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dualcast -y 00:10:05.131 05:38:19 accel.accel_dualcast -- accel/accel.sh@12 -- # build_accel_config 00:10:05.131 05:38:19 accel.accel_dualcast -- accel/accel.sh@31 -- # accel_json_cfg=() 00:10:05.131 05:38:19 accel.accel_dualcast -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:10:05.131 05:38:19 accel.accel_dualcast -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:10:05.131 05:38:19 accel.accel_dualcast -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:10:05.131 05:38:19 accel.accel_dualcast -- accel/accel.sh@36 -- # [[ -n '' ]] 00:10:05.131 05:38:19 accel.accel_dualcast -- accel/accel.sh@40 -- # local IFS=, 00:10:05.131 05:38:19 accel.accel_dualcast -- accel/accel.sh@41 -- # jq -r . 00:10:05.131 [2024-07-26 05:38:19.759035] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 
00:10:05.131 [2024-07-26 05:38:19.759095] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1101180 ] 00:10:05.131 [2024-07-26 05:38:19.888060] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:05.131 [2024-07-26 05:38:19.994340] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:05.390 05:38:20 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:10:05.390 05:38:20 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:10:05.390 05:38:20 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:10:05.390 05:38:20 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:10:05.390 05:38:20 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:10:05.390 05:38:20 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:10:05.390 05:38:20 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:10:05.390 05:38:20 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:10:05.390 05:38:20 accel.accel_dualcast -- accel/accel.sh@20 -- # val=0x1 00:10:05.390 05:38:20 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:10:05.390 05:38:20 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:10:05.390 05:38:20 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:10:05.390 05:38:20 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:10:05.390 05:38:20 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:10:05.390 05:38:20 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:10:05.390 05:38:20 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:10:05.390 05:38:20 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:10:05.390 05:38:20 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:10:05.390 05:38:20 accel.accel_dualcast -- 
accel/accel.sh@19 -- # IFS=: 00:10:05.390 05:38:20 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:10:05.390 05:38:20 accel.accel_dualcast -- accel/accel.sh@20 -- # val=dualcast 00:10:05.390 05:38:20 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:10:05.390 05:38:20 accel.accel_dualcast -- accel/accel.sh@23 -- # accel_opc=dualcast 00:10:05.390 05:38:20 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:10:05.390 05:38:20 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:10:05.390 05:38:20 accel.accel_dualcast -- accel/accel.sh@20 -- # val='4096 bytes' 00:10:05.390 05:38:20 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:10:05.390 05:38:20 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:10:05.390 05:38:20 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:10:05.390 05:38:20 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:10:05.390 05:38:20 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:10:05.390 05:38:20 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:10:05.390 05:38:20 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:10:05.390 05:38:20 accel.accel_dualcast -- accel/accel.sh@20 -- # val=software 00:10:05.390 05:38:20 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:10:05.390 05:38:20 accel.accel_dualcast -- accel/accel.sh@22 -- # accel_module=software 00:10:05.390 05:38:20 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:10:05.390 05:38:20 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:10:05.390 05:38:20 accel.accel_dualcast -- accel/accel.sh@20 -- # val=32 00:10:05.390 05:38:20 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:10:05.390 05:38:20 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:10:05.390 05:38:20 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:10:05.390 05:38:20 accel.accel_dualcast -- accel/accel.sh@20 -- # val=32 
00:10:05.390 05:38:20 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:10:05.390 05:38:20 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:10:05.390 05:38:20 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:10:05.390 05:38:20 accel.accel_dualcast -- accel/accel.sh@20 -- # val=1 00:10:05.390 05:38:20 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:10:05.390 05:38:20 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:10:05.390 05:38:20 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:10:05.390 05:38:20 accel.accel_dualcast -- accel/accel.sh@20 -- # val='1 seconds' 00:10:05.390 05:38:20 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:10:05.390 05:38:20 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:10:05.390 05:38:20 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:10:05.390 05:38:20 accel.accel_dualcast -- accel/accel.sh@20 -- # val=Yes 00:10:05.390 05:38:20 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:10:05.390 05:38:20 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:10:05.390 05:38:20 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:10:05.390 05:38:20 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:10:05.390 05:38:20 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:10:05.390 05:38:20 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:10:05.390 05:38:20 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:10:05.390 05:38:20 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:10:05.390 05:38:20 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:10:05.390 05:38:20 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:10:05.390 05:38:20 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:10:06.325 05:38:21 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:10:06.325 05:38:21 accel.accel_dualcast -- accel/accel.sh@21 -- # case 
"$var" in 00:10:06.325 05:38:21 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:10:06.325 05:38:21 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:10:06.325 05:38:21 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:10:06.325 05:38:21 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:10:06.325 05:38:21 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:10:06.325 05:38:21 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:10:06.325 05:38:21 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:10:06.325 05:38:21 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:10:06.325 05:38:21 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:10:06.325 05:38:21 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:10:06.325 05:38:21 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:10:06.325 05:38:21 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:10:06.325 05:38:21 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:10:06.325 05:38:21 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:10:06.325 05:38:21 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:10:06.325 05:38:21 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:10:06.325 05:38:21 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:10:06.325 05:38:21 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:10:06.325 05:38:21 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:10:06.325 05:38:21 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:10:06.325 05:38:21 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:10:06.325 05:38:21 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:10:06.325 05:38:21 accel.accel_dualcast -- accel/accel.sh@27 -- # [[ -n software ]] 00:10:06.325 05:38:21 accel.accel_dualcast -- accel/accel.sh@27 -- # [[ -n dualcast ]] 00:10:06.325 05:38:21 accel.accel_dualcast -- accel/accel.sh@27 
-- # [[ software == \s\o\f\t\w\a\r\e ]] 00:10:06.325 00:10:06.325 real 0m1.502s 00:10:06.325 user 0m0.010s 00:10:06.325 sys 0m0.002s 00:10:06.326 05:38:21 accel.accel_dualcast -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:06.326 05:38:21 accel.accel_dualcast -- common/autotest_common.sh@10 -- # set +x 00:10:06.326 ************************************ 00:10:06.326 END TEST accel_dualcast 00:10:06.326 ************************************ 00:10:06.584 05:38:21 accel -- common/autotest_common.sh@1142 -- # return 0 00:10:06.584 05:38:21 accel -- accel/accel.sh@108 -- # run_test accel_compare accel_test -t 1 -w compare -y 00:10:06.584 05:38:21 accel -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:10:06.584 05:38:21 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:06.584 05:38:21 accel -- common/autotest_common.sh@10 -- # set +x 00:10:06.584 ************************************ 00:10:06.584 START TEST accel_compare 00:10:06.584 ************************************ 00:10:06.584 05:38:21 accel.accel_compare -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w compare -y 00:10:06.584 05:38:21 accel.accel_compare -- accel/accel.sh@16 -- # local accel_opc 00:10:06.584 05:38:21 accel.accel_compare -- accel/accel.sh@17 -- # local accel_module 00:10:06.584 05:38:21 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:10:06.584 05:38:21 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:10:06.584 05:38:21 accel.accel_compare -- accel/accel.sh@15 -- # accel_perf -t 1 -w compare -y 00:10:06.584 05:38:21 accel.accel_compare -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compare -y 00:10:06.584 05:38:21 accel.accel_compare -- accel/accel.sh@12 -- # build_accel_config 00:10:06.584 05:38:21 accel.accel_compare -- accel/accel.sh@31 -- # accel_json_cfg=() 00:10:06.584 05:38:21 accel.accel_compare -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:10:06.584 
05:38:21 accel.accel_compare -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:10:06.584 05:38:21 accel.accel_compare -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:10:06.584 05:38:21 accel.accel_compare -- accel/accel.sh@36 -- # [[ -n '' ]] 00:10:06.584 05:38:21 accel.accel_compare -- accel/accel.sh@40 -- # local IFS=, 00:10:06.584 05:38:21 accel.accel_compare -- accel/accel.sh@41 -- # jq -r . 00:10:06.584 [2024-07-26 05:38:21.335421] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 00:10:06.584 [2024-07-26 05:38:21.335479] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1101495 ] 00:10:06.584 [2024-07-26 05:38:21.462541] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:06.844 [2024-07-26 05:38:21.564124] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:06.844 05:38:21 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:10:06.844 05:38:21 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:10:06.844 05:38:21 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:10:06.844 05:38:21 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:10:06.844 05:38:21 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:10:06.844 05:38:21 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:10:06.844 05:38:21 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:10:06.844 05:38:21 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:10:06.844 05:38:21 accel.accel_compare -- accel/accel.sh@20 -- # val=0x1 00:10:06.844 05:38:21 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:10:06.844 05:38:21 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:10:06.844 05:38:21 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:10:06.844 05:38:21 
accel.accel_compare -- accel/accel.sh@20 -- # val= 00:10:06.844 05:38:21 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:10:06.844 05:38:21 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:10:06.844 05:38:21 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:10:06.844 05:38:21 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:10:06.844 05:38:21 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:10:06.844 05:38:21 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:10:06.844 05:38:21 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:10:06.844 05:38:21 accel.accel_compare -- accel/accel.sh@20 -- # val=compare 00:10:06.844 05:38:21 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:10:06.844 05:38:21 accel.accel_compare -- accel/accel.sh@23 -- # accel_opc=compare 00:10:06.844 05:38:21 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:10:06.844 05:38:21 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:10:06.844 05:38:21 accel.accel_compare -- accel/accel.sh@20 -- # val='4096 bytes' 00:10:06.844 05:38:21 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:10:06.844 05:38:21 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:10:06.844 05:38:21 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:10:06.844 05:38:21 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:10:06.844 05:38:21 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:10:06.844 05:38:21 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:10:06.844 05:38:21 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:10:06.844 05:38:21 accel.accel_compare -- accel/accel.sh@20 -- # val=software 00:10:06.844 05:38:21 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:10:06.844 05:38:21 accel.accel_compare -- accel/accel.sh@22 -- # accel_module=software 00:10:06.844 05:38:21 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:10:06.844 
05:38:21 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:10:06.844 05:38:21 accel.accel_compare -- accel/accel.sh@20 -- # val=32 00:10:06.844 05:38:21 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:10:06.844 05:38:21 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:10:06.844 05:38:21 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:10:06.844 05:38:21 accel.accel_compare -- accel/accel.sh@20 -- # val=32 00:10:06.844 05:38:21 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:10:06.844 05:38:21 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:10:06.844 05:38:21 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:10:06.844 05:38:21 accel.accel_compare -- accel/accel.sh@20 -- # val=1 00:10:06.844 05:38:21 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:10:06.844 05:38:21 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:10:06.844 05:38:21 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:10:06.844 05:38:21 accel.accel_compare -- accel/accel.sh@20 -- # val='1 seconds' 00:10:06.844 05:38:21 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:10:06.844 05:38:21 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:10:06.844 05:38:21 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:10:06.844 05:38:21 accel.accel_compare -- accel/accel.sh@20 -- # val=Yes 00:10:06.844 05:38:21 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:10:06.844 05:38:21 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:10:06.844 05:38:21 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:10:06.844 05:38:21 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:10:06.844 05:38:21 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:10:06.844 05:38:21 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:10:06.844 05:38:21 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:10:06.844 05:38:21 
accel.accel_compare -- accel/accel.sh@20 -- # val= 00:10:06.844 05:38:21 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:10:06.844 05:38:21 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:10:06.844 05:38:21 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:10:08.261 05:38:22 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:10:08.261 05:38:22 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:10:08.261 05:38:22 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:10:08.261 05:38:22 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:10:08.261 05:38:22 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:10:08.261 05:38:22 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:10:08.261 05:38:22 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:10:08.261 05:38:22 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:10:08.261 05:38:22 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:10:08.261 05:38:22 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:10:08.262 05:38:22 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:10:08.262 05:38:22 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:10:08.262 05:38:22 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:10:08.262 05:38:22 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:10:08.262 05:38:22 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:10:08.262 05:38:22 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:10:08.262 05:38:22 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:10:08.262 05:38:22 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:10:08.262 05:38:22 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:10:08.262 05:38:22 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:10:08.262 05:38:22 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:10:08.262 05:38:22 accel.accel_compare -- accel/accel.sh@21 
-- # case "$var" in 00:10:08.262 05:38:22 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:10:08.262 05:38:22 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:10:08.262 05:38:22 accel.accel_compare -- accel/accel.sh@27 -- # [[ -n software ]] 00:10:08.262 05:38:22 accel.accel_compare -- accel/accel.sh@27 -- # [[ -n compare ]] 00:10:08.262 05:38:22 accel.accel_compare -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:10:08.262 00:10:08.262 real 0m1.501s 00:10:08.262 user 0m0.009s 00:10:08.262 sys 0m0.002s 00:10:08.262 05:38:22 accel.accel_compare -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:08.262 05:38:22 accel.accel_compare -- common/autotest_common.sh@10 -- # set +x 00:10:08.262 ************************************ 00:10:08.262 END TEST accel_compare 00:10:08.262 ************************************ 00:10:08.262 05:38:22 accel -- common/autotest_common.sh@1142 -- # return 0 00:10:08.262 05:38:22 accel -- accel/accel.sh@109 -- # run_test accel_xor accel_test -t 1 -w xor -y 00:10:08.262 05:38:22 accel -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:10:08.262 05:38:22 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:08.262 05:38:22 accel -- common/autotest_common.sh@10 -- # set +x 00:10:08.262 ************************************ 00:10:08.262 START TEST accel_xor 00:10:08.262 ************************************ 00:10:08.262 05:38:22 accel.accel_xor -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w xor -y 00:10:08.262 05:38:22 accel.accel_xor -- accel/accel.sh@16 -- # local accel_opc 00:10:08.262 05:38:22 accel.accel_xor -- accel/accel.sh@17 -- # local accel_module 00:10:08.262 05:38:22 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:10:08.262 05:38:22 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:10:08.262 05:38:22 accel.accel_xor -- accel/accel.sh@15 -- # accel_perf -t 1 -w xor -y 00:10:08.262 05:38:22 accel.accel_xor -- accel/accel.sh@12 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y 00:10:08.262 05:38:22 accel.accel_xor -- accel/accel.sh@12 -- # build_accel_config 00:10:08.262 05:38:22 accel.accel_xor -- accel/accel.sh@31 -- # accel_json_cfg=() 00:10:08.262 05:38:22 accel.accel_xor -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:10:08.262 05:38:22 accel.accel_xor -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:10:08.262 05:38:22 accel.accel_xor -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:10:08.262 05:38:22 accel.accel_xor -- accel/accel.sh@36 -- # [[ -n '' ]] 00:10:08.262 05:38:22 accel.accel_xor -- accel/accel.sh@40 -- # local IFS=, 00:10:08.262 05:38:22 accel.accel_xor -- accel/accel.sh@41 -- # jq -r . 00:10:08.262 [2024-07-26 05:38:22.917998] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 00:10:08.262 [2024-07-26 05:38:22.918065] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1101697 ] 00:10:08.262 [2024-07-26 05:38:23.050734] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:08.262 [2024-07-26 05:38:23.155508] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:08.521 05:38:23 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:10:08.521 05:38:23 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:10:08.521 05:38:23 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:10:08.521 05:38:23 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:10:08.521 05:38:23 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:10:08.521 05:38:23 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:10:08.521 05:38:23 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:10:08.521 05:38:23 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:10:08.521 05:38:23 accel.accel_xor -- 
accel/accel.sh@20 -- # val=0x1 00:10:08.521 05:38:23 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:10:08.521 05:38:23 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:10:08.521 05:38:23 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:10:08.521 05:38:23 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:10:08.521 05:38:23 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:10:08.521 05:38:23 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:10:08.521 05:38:23 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:10:08.521 05:38:23 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:10:08.521 05:38:23 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:10:08.521 05:38:23 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:10:08.521 05:38:23 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:10:08.521 05:38:23 accel.accel_xor -- accel/accel.sh@20 -- # val=xor 00:10:08.521 05:38:23 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:10:08.522 05:38:23 accel.accel_xor -- accel/accel.sh@23 -- # accel_opc=xor 00:10:08.522 05:38:23 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:10:08.522 05:38:23 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:10:08.522 05:38:23 accel.accel_xor -- accel/accel.sh@20 -- # val=2 00:10:08.522 05:38:23 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:10:08.522 05:38:23 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:10:08.522 05:38:23 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:10:08.522 05:38:23 accel.accel_xor -- accel/accel.sh@20 -- # val='4096 bytes' 00:10:08.522 05:38:23 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:10:08.522 05:38:23 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:10:08.522 05:38:23 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:10:08.522 05:38:23 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:10:08.522 05:38:23 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 
00:10:08.522 05:38:23 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:10:08.522 05:38:23 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:10:08.522 05:38:23 accel.accel_xor -- accel/accel.sh@20 -- # val=software 00:10:08.522 05:38:23 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:10:08.522 05:38:23 accel.accel_xor -- accel/accel.sh@22 -- # accel_module=software 00:10:08.522 05:38:23 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:10:08.522 05:38:23 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:10:08.522 05:38:23 accel.accel_xor -- accel/accel.sh@20 -- # val=32 00:10:08.522 05:38:23 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:10:08.522 05:38:23 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:10:08.522 05:38:23 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:10:08.522 05:38:23 accel.accel_xor -- accel/accel.sh@20 -- # val=32 00:10:08.522 05:38:23 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:10:08.522 05:38:23 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:10:08.522 05:38:23 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:10:08.522 05:38:23 accel.accel_xor -- accel/accel.sh@20 -- # val=1 00:10:08.522 05:38:23 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:10:08.522 05:38:23 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:10:08.522 05:38:23 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:10:08.522 05:38:23 accel.accel_xor -- accel/accel.sh@20 -- # val='1 seconds' 00:10:08.522 05:38:23 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:10:08.522 05:38:23 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:10:08.522 05:38:23 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:10:08.522 05:38:23 accel.accel_xor -- accel/accel.sh@20 -- # val=Yes 00:10:08.522 05:38:23 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:10:08.522 05:38:23 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:10:08.522 05:38:23 
accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:10:08.522 05:38:23 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:10:08.523 05:38:23 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:10:08.523 05:38:23 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:10:08.523 05:38:23 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:10:08.523 05:38:23 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:10:08.523 05:38:23 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:10:08.523 05:38:23 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:10:08.523 05:38:23 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:10:09.898 05:38:24 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:10:09.898 05:38:24 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:10:09.898 05:38:24 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:10:09.898 05:38:24 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:10:09.898 05:38:24 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:10:09.898 05:38:24 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:10:09.898 05:38:24 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:10:09.898 05:38:24 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:10:09.898 05:38:24 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:10:09.898 05:38:24 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:10:09.898 05:38:24 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:10:09.898 05:38:24 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:10:09.898 05:38:24 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:10:09.898 05:38:24 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:10:09.898 05:38:24 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:10:09.898 05:38:24 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:10:09.898 05:38:24 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:10:09.898 05:38:24 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 
00:10:09.898 05:38:24 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:10:09.898 05:38:24 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:10:09.898 05:38:24 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:10:09.898 05:38:24 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:10:09.898 05:38:24 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:10:09.898 05:38:24 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:10:09.898 05:38:24 accel.accel_xor -- accel/accel.sh@27 -- # [[ -n software ]] 00:10:09.898 05:38:24 accel.accel_xor -- accel/accel.sh@27 -- # [[ -n xor ]] 00:10:09.898 05:38:24 accel.accel_xor -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:10:09.898 00:10:09.898 real 0m1.506s 00:10:09.898 user 0m0.008s 00:10:09.898 sys 0m0.006s 00:10:09.898 05:38:24 accel.accel_xor -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:09.898 05:38:24 accel.accel_xor -- common/autotest_common.sh@10 -- # set +x 00:10:09.898 ************************************ 00:10:09.898 END TEST accel_xor 00:10:09.898 ************************************ 00:10:09.898 05:38:24 accel -- common/autotest_common.sh@1142 -- # return 0 00:10:09.898 05:38:24 accel -- accel/accel.sh@110 -- # run_test accel_xor accel_test -t 1 -w xor -y -x 3 00:10:09.898 05:38:24 accel -- common/autotest_common.sh@1099 -- # '[' 9 -le 1 ']' 00:10:09.898 05:38:24 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:09.898 05:38:24 accel -- common/autotest_common.sh@10 -- # set +x 00:10:09.898 ************************************ 00:10:09.898 START TEST accel_xor 00:10:09.898 ************************************ 00:10:09.898 05:38:24 accel.accel_xor -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w xor -y -x 3 00:10:09.898 05:38:24 accel.accel_xor -- accel/accel.sh@16 -- # local accel_opc 00:10:09.898 05:38:24 accel.accel_xor -- accel/accel.sh@17 -- # local accel_module 00:10:09.898 05:38:24 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 
00:10:09.898 05:38:24 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:10:09.898 05:38:24 accel.accel_xor -- accel/accel.sh@15 -- # accel_perf -t 1 -w xor -y -x 3 00:10:09.898 05:38:24 accel.accel_xor -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y -x 3 00:10:09.898 05:38:24 accel.accel_xor -- accel/accel.sh@12 -- # build_accel_config 00:10:09.898 05:38:24 accel.accel_xor -- accel/accel.sh@31 -- # accel_json_cfg=() 00:10:09.898 05:38:24 accel.accel_xor -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:10:09.898 05:38:24 accel.accel_xor -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:10:09.898 05:38:24 accel.accel_xor -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:10:09.898 05:38:24 accel.accel_xor -- accel/accel.sh@36 -- # [[ -n '' ]] 00:10:09.898 05:38:24 accel.accel_xor -- accel/accel.sh@40 -- # local IFS=, 00:10:09.898 05:38:24 accel.accel_xor -- accel/accel.sh@41 -- # jq -r . 00:10:09.898 [2024-07-26 05:38:24.495332] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 
00:10:09.898 [2024-07-26 05:38:24.495389] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1101889 ] 00:10:09.898 [2024-07-26 05:38:24.624930] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:09.898 [2024-07-26 05:38:24.722736] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:09.898 05:38:24 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:10:09.898 05:38:24 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:10:09.898 05:38:24 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:10:09.899 05:38:24 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:10:09.899 05:38:24 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:10:09.899 05:38:24 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:10:09.899 05:38:24 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:10:09.899 05:38:24 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:10:09.899 05:38:24 accel.accel_xor -- accel/accel.sh@20 -- # val=0x1 00:10:09.899 05:38:24 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:10:09.899 05:38:24 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:10:09.899 05:38:24 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:10:09.899 05:38:24 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:10:09.899 05:38:24 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:10:09.899 05:38:24 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:10:09.899 05:38:24 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:10:09.899 05:38:24 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:10:09.899 05:38:24 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:10:09.899 05:38:24 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:10:09.899 05:38:24 accel.accel_xor -- accel/accel.sh@19 -- # read -r var 
val 00:10:09.899 05:38:24 accel.accel_xor -- accel/accel.sh@20 -- # val=xor 00:10:09.899 05:38:24 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:10:09.899 05:38:24 accel.accel_xor -- accel/accel.sh@23 -- # accel_opc=xor 00:10:09.899 05:38:24 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:10:09.899 05:38:24 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:10:09.899 05:38:24 accel.accel_xor -- accel/accel.sh@20 -- # val=3 00:10:09.899 05:38:24 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:10:09.899 05:38:24 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:10:09.899 05:38:24 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:10:09.899 05:38:24 accel.accel_xor -- accel/accel.sh@20 -- # val='4096 bytes' 00:10:09.899 05:38:24 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:10:09.899 05:38:24 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:10:09.899 05:38:24 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:10:09.899 05:38:24 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:10:09.899 05:38:24 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:10:09.899 05:38:24 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:10:09.899 05:38:24 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:10:09.899 05:38:24 accel.accel_xor -- accel/accel.sh@20 -- # val=software 00:10:09.899 05:38:24 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:10:09.899 05:38:24 accel.accel_xor -- accel/accel.sh@22 -- # accel_module=software 00:10:09.899 05:38:24 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:10:09.899 05:38:24 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:10:09.899 05:38:24 accel.accel_xor -- accel/accel.sh@20 -- # val=32 00:10:09.899 05:38:24 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:10:09.899 05:38:24 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:10:09.899 05:38:24 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:10:09.899 
05:38:24 accel.accel_xor -- accel/accel.sh@20 -- # val=32 00:10:09.899 05:38:24 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:10:09.899 05:38:24 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:10:09.899 05:38:24 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:10:09.899 05:38:24 accel.accel_xor -- accel/accel.sh@20 -- # val=1 00:10:09.899 05:38:24 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:10:09.899 05:38:24 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:10:09.899 05:38:24 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:10:09.899 05:38:24 accel.accel_xor -- accel/accel.sh@20 -- # val='1 seconds' 00:10:09.899 05:38:24 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:10:09.899 05:38:24 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:10:09.899 05:38:24 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:10:09.899 05:38:24 accel.accel_xor -- accel/accel.sh@20 -- # val=Yes 00:10:09.899 05:38:24 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:10:09.899 05:38:24 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:10:09.899 05:38:24 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:10:09.899 05:38:24 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:10:09.899 05:38:24 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:10:09.899 05:38:24 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:10:09.899 05:38:24 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:10:09.899 05:38:24 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:10:09.899 05:38:24 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:10:09.899 05:38:24 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:10:09.899 05:38:24 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:10:11.273 05:38:25 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:10:11.273 05:38:25 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:10:11.273 05:38:25 accel.accel_xor -- accel/accel.sh@19 
-- # IFS=: 00:10:11.273 05:38:25 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:10:11.273 05:38:25 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:10:11.273 05:38:25 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:10:11.273 05:38:25 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:10:11.273 05:38:25 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:10:11.273 05:38:25 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:10:11.273 05:38:25 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:10:11.273 05:38:25 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:10:11.273 05:38:25 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:10:11.273 05:38:25 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:10:11.273 05:38:25 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:10:11.273 05:38:25 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:10:11.273 05:38:25 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:10:11.273 05:38:25 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:10:11.273 05:38:25 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:10:11.273 05:38:25 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:10:11.273 05:38:25 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:10:11.273 05:38:25 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:10:11.273 05:38:25 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:10:11.273 05:38:25 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:10:11.273 05:38:25 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:10:11.273 05:38:25 accel.accel_xor -- accel/accel.sh@27 -- # [[ -n software ]] 00:10:11.273 05:38:25 accel.accel_xor -- accel/accel.sh@27 -- # [[ -n xor ]] 00:10:11.273 05:38:25 accel.accel_xor -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:10:11.273 00:10:11.273 real 0m1.500s 00:10:11.273 user 0m0.010s 00:10:11.273 sys 0m0.002s 00:10:11.273 05:38:25 accel.accel_xor -- 
common/autotest_common.sh@1124 -- # xtrace_disable 00:10:11.273 05:38:25 accel.accel_xor -- common/autotest_common.sh@10 -- # set +x 00:10:11.273 ************************************ 00:10:11.273 END TEST accel_xor 00:10:11.273 ************************************ 00:10:11.273 05:38:25 accel -- common/autotest_common.sh@1142 -- # return 0 00:10:11.273 05:38:25 accel -- accel/accel.sh@111 -- # run_test accel_dif_verify accel_test -t 1 -w dif_verify 00:10:11.273 05:38:25 accel -- common/autotest_common.sh@1099 -- # '[' 6 -le 1 ']' 00:10:11.273 05:38:25 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:11.273 05:38:25 accel -- common/autotest_common.sh@10 -- # set +x 00:10:11.273 ************************************ 00:10:11.273 START TEST accel_dif_verify 00:10:11.273 ************************************ 00:10:11.273 05:38:26 accel.accel_dif_verify -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w dif_verify 00:10:11.273 05:38:26 accel.accel_dif_verify -- accel/accel.sh@16 -- # local accel_opc 00:10:11.273 05:38:26 accel.accel_dif_verify -- accel/accel.sh@17 -- # local accel_module 00:10:11.273 05:38:26 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:10:11.273 05:38:26 accel.accel_dif_verify -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_verify 00:10:11.273 05:38:26 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:10:11.273 05:38:26 accel.accel_dif_verify -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_verify 00:10:11.273 05:38:26 accel.accel_dif_verify -- accel/accel.sh@12 -- # build_accel_config 00:10:11.273 05:38:26 accel.accel_dif_verify -- accel/accel.sh@31 -- # accel_json_cfg=() 00:10:11.273 05:38:26 accel.accel_dif_verify -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:10:11.273 05:38:26 accel.accel_dif_verify -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:10:11.273 05:38:26 accel.accel_dif_verify -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 
00:10:11.273 05:38:26 accel.accel_dif_verify -- accel/accel.sh@36 -- # [[ -n '' ]] 00:10:11.274 05:38:26 accel.accel_dif_verify -- accel/accel.sh@40 -- # local IFS=, 00:10:11.274 05:38:26 accel.accel_dif_verify -- accel/accel.sh@41 -- # jq -r . 00:10:11.274 [2024-07-26 05:38:26.047878] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 00:10:11.274 [2024-07-26 05:38:26.047923] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1102095 ] 00:10:11.274 [2024-07-26 05:38:26.159597] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:11.533 [2024-07-26 05:38:26.258608] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:11.533 05:38:26 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:10:11.533 05:38:26 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:10:11.533 05:38:26 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:10:11.533 05:38:26 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:10:11.533 05:38:26 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:10:11.533 05:38:26 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:10:11.533 05:38:26 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:10:11.533 05:38:26 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:10:11.533 05:38:26 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=0x1 00:10:11.533 05:38:26 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:10:11.533 05:38:26 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:10:11.533 05:38:26 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:10:11.533 05:38:26 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:10:11.533 05:38:26 accel.accel_dif_verify -- accel/accel.sh@21 
-- # case "$var" in 00:10:11.533 05:38:26 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:10:11.533 05:38:26 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:10:11.533 05:38:26 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:10:11.533 05:38:26 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:10:11.533 05:38:26 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:10:11.533 05:38:26 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:10:11.533 05:38:26 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=dif_verify 00:10:11.533 05:38:26 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:10:11.533 05:38:26 accel.accel_dif_verify -- accel/accel.sh@23 -- # accel_opc=dif_verify 00:10:11.533 05:38:26 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:10:11.533 05:38:26 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:10:11.533 05:38:26 accel.accel_dif_verify -- accel/accel.sh@20 -- # val='4096 bytes' 00:10:11.533 05:38:26 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:10:11.533 05:38:26 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:10:11.534 05:38:26 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:10:11.534 05:38:26 accel.accel_dif_verify -- accel/accel.sh@20 -- # val='4096 bytes' 00:10:11.534 05:38:26 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:10:11.534 05:38:26 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:10:11.534 05:38:26 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:10:11.534 05:38:26 accel.accel_dif_verify -- accel/accel.sh@20 -- # val='512 bytes' 00:10:11.534 05:38:26 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:10:11.534 05:38:26 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:10:11.534 05:38:26 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:10:11.534 05:38:26 accel.accel_dif_verify -- 
accel/accel.sh@20 -- # val='8 bytes' 00:10:11.534 05:38:26 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:10:11.534 05:38:26 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:10:11.534 05:38:26 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:10:11.534 05:38:26 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:10:11.534 05:38:26 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:10:11.534 05:38:26 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:10:11.534 05:38:26 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:10:11.534 05:38:26 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=software 00:10:11.534 05:38:26 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:10:11.534 05:38:26 accel.accel_dif_verify -- accel/accel.sh@22 -- # accel_module=software 00:10:11.534 05:38:26 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:10:11.534 05:38:26 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:10:11.534 05:38:26 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=32 00:10:11.534 05:38:26 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:10:11.534 05:38:26 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:10:11.534 05:38:26 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:10:11.534 05:38:26 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=32 00:10:11.534 05:38:26 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:10:11.534 05:38:26 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:10:11.534 05:38:26 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:10:11.534 05:38:26 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=1 00:10:11.534 05:38:26 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:10:11.534 05:38:26 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:10:11.534 05:38:26 accel.accel_dif_verify -- 
accel/accel.sh@19 -- # read -r var val 00:10:11.534 05:38:26 accel.accel_dif_verify -- accel/accel.sh@20 -- # val='1 seconds' 00:10:11.534 05:38:26 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:10:11.534 05:38:26 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:10:11.534 05:38:26 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:10:11.534 05:38:26 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=No 00:10:11.534 05:38:26 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:10:11.534 05:38:26 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:10:11.534 05:38:26 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:10:11.534 05:38:26 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:10:11.534 05:38:26 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:10:11.534 05:38:26 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:10:11.534 05:38:26 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:10:11.534 05:38:26 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:10:11.534 05:38:26 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:10:11.534 05:38:26 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:10:11.534 05:38:26 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:10:12.911 05:38:27 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:10:12.911 05:38:27 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:10:12.911 05:38:27 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:10:12.911 05:38:27 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:10:12.911 05:38:27 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:10:12.911 05:38:27 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:10:12.911 05:38:27 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:10:12.911 05:38:27 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r 
var val 00:10:12.911 05:38:27 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:10:12.911 05:38:27 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:10:12.911 05:38:27 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:10:12.911 05:38:27 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:10:12.911 05:38:27 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:10:12.911 05:38:27 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:10:12.911 05:38:27 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:10:12.911 05:38:27 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:10:12.911 05:38:27 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:10:12.911 05:38:27 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:10:12.911 05:38:27 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:10:12.911 05:38:27 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:10:12.911 05:38:27 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:10:12.911 05:38:27 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:10:12.911 05:38:27 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:10:12.911 05:38:27 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:10:12.911 05:38:27 accel.accel_dif_verify -- accel/accel.sh@27 -- # [[ -n software ]] 00:10:12.911 05:38:27 accel.accel_dif_verify -- accel/accel.sh@27 -- # [[ -n dif_verify ]] 00:10:12.911 05:38:27 accel.accel_dif_verify -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:10:12.911 00:10:12.911 real 0m1.453s 00:10:12.911 user 0m0.012s 00:10:12.911 sys 0m0.001s 00:10:12.911 05:38:27 accel.accel_dif_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:12.911 05:38:27 accel.accel_dif_verify -- common/autotest_common.sh@10 -- # set +x 00:10:12.911 ************************************ 00:10:12.911 END TEST accel_dif_verify 00:10:12.911 
************************************ 00:10:12.911 05:38:27 accel -- common/autotest_common.sh@1142 -- # return 0 00:10:12.911 05:38:27 accel -- accel/accel.sh@112 -- # run_test accel_dif_generate accel_test -t 1 -w dif_generate 00:10:12.911 05:38:27 accel -- common/autotest_common.sh@1099 -- # '[' 6 -le 1 ']' 00:10:12.911 05:38:27 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:12.911 05:38:27 accel -- common/autotest_common.sh@10 -- # set +x 00:10:12.911 ************************************ 00:10:12.911 START TEST accel_dif_generate 00:10:12.911 ************************************ 00:10:12.911 05:38:27 accel.accel_dif_generate -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w dif_generate 00:10:12.911 05:38:27 accel.accel_dif_generate -- accel/accel.sh@16 -- # local accel_opc 00:10:12.911 05:38:27 accel.accel_dif_generate -- accel/accel.sh@17 -- # local accel_module 00:10:12.911 05:38:27 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:10:12.911 05:38:27 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:10:12.911 05:38:27 accel.accel_dif_generate -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_generate 00:10:12.911 05:38:27 accel.accel_dif_generate -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate 00:10:12.911 05:38:27 accel.accel_dif_generate -- accel/accel.sh@12 -- # build_accel_config 00:10:12.911 05:38:27 accel.accel_dif_generate -- accel/accel.sh@31 -- # accel_json_cfg=() 00:10:12.911 05:38:27 accel.accel_dif_generate -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:10:12.911 05:38:27 accel.accel_dif_generate -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:10:12.911 05:38:27 accel.accel_dif_generate -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:10:12.911 05:38:27 accel.accel_dif_generate -- accel/accel.sh@36 -- # [[ -n '' ]] 00:10:12.911 05:38:27 accel.accel_dif_generate -- accel/accel.sh@40 -- # local IFS=, 00:10:12.911 05:38:27 
accel.accel_dif_generate -- accel/accel.sh@41 -- # jq -r . 00:10:12.911 [2024-07-26 05:38:27.602920] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 00:10:12.911 [2024-07-26 05:38:27.603051] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1102287 ] 00:10:12.911 [2024-07-26 05:38:27.801840] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:13.170 [2024-07-26 05:38:27.909210] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:13.170 05:38:27 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:10:13.170 05:38:27 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:10:13.170 05:38:27 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:10:13.170 05:38:27 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:10:13.170 05:38:27 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:10:13.170 05:38:27 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:10:13.170 05:38:27 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:10:13.170 05:38:27 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:10:13.170 05:38:27 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=0x1 00:10:13.170 05:38:27 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:10:13.170 05:38:27 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:10:13.170 05:38:27 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:10:13.170 05:38:27 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:10:13.170 05:38:27 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:10:13.170 05:38:27 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:10:13.170 05:38:27 accel.accel_dif_generate -- 
accel/accel.sh@19 -- # read -r var val 00:10:13.170 05:38:27 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:10:13.170 05:38:27 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:10:13.170 05:38:27 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:10:13.170 05:38:27 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:10:13.170 05:38:27 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=dif_generate 00:10:13.170 05:38:27 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:10:13.170 05:38:27 accel.accel_dif_generate -- accel/accel.sh@23 -- # accel_opc=dif_generate 00:10:13.170 05:38:27 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:10:13.170 05:38:27 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:10:13.170 05:38:27 accel.accel_dif_generate -- accel/accel.sh@20 -- # val='4096 bytes' 00:10:13.170 05:38:27 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:10:13.170 05:38:27 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:10:13.170 05:38:27 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:10:13.170 05:38:27 accel.accel_dif_generate -- accel/accel.sh@20 -- # val='4096 bytes' 00:10:13.170 05:38:27 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:10:13.170 05:38:27 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:10:13.170 05:38:27 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:10:13.170 05:38:27 accel.accel_dif_generate -- accel/accel.sh@20 -- # val='512 bytes' 00:10:13.170 05:38:27 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:10:13.170 05:38:27 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:10:13.170 05:38:27 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:10:13.170 05:38:27 accel.accel_dif_generate -- accel/accel.sh@20 -- # val='8 bytes' 00:10:13.170 05:38:27 accel.accel_dif_generate -- 
accel/accel.sh@21 -- # case "$var" in 00:10:13.170 05:38:27 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:10:13.170 05:38:27 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:10:13.170 05:38:27 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:10:13.170 05:38:27 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:10:13.170 05:38:27 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:10:13.170 05:38:27 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:10:13.170 05:38:27 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=software 00:10:13.170 05:38:27 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:10:13.170 05:38:27 accel.accel_dif_generate -- accel/accel.sh@22 -- # accel_module=software 00:10:13.170 05:38:27 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:10:13.170 05:38:27 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:10:13.170 05:38:27 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=32 00:10:13.170 05:38:27 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:10:13.170 05:38:27 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:10:13.170 05:38:27 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:10:13.170 05:38:27 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=32 00:10:13.170 05:38:27 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:10:13.170 05:38:27 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:10:13.170 05:38:27 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:10:13.170 05:38:27 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=1 00:10:13.170 05:38:27 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:10:13.170 05:38:27 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:10:13.170 05:38:27 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:10:13.170 
05:38:27 accel.accel_dif_generate -- accel/accel.sh@20 -- # val='1 seconds' 00:10:13.170 05:38:27 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:10:13.170 05:38:27 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:10:13.170 05:38:27 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:10:13.170 05:38:27 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=No 00:10:13.170 05:38:27 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:10:13.170 05:38:27 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:10:13.170 05:38:27 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:10:13.170 05:38:27 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:10:13.170 05:38:27 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:10:13.170 05:38:27 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:10:13.170 05:38:27 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:10:13.170 05:38:27 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:10:13.170 05:38:27 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:10:13.170 05:38:27 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:10:13.170 05:38:27 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:10:14.546 05:38:29 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:10:14.546 05:38:29 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:10:14.546 05:38:29 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:10:14.546 05:38:29 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:10:14.546 05:38:29 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:10:14.546 05:38:29 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:10:14.546 05:38:29 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:10:14.546 05:38:29 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var 
val 00:10:14.546 05:38:29 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:10:14.546 05:38:29 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:10:14.546 05:38:29 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:10:14.546 05:38:29 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:10:14.546 05:38:29 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:10:14.546 05:38:29 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:10:14.546 05:38:29 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:10:14.546 05:38:29 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:10:14.546 05:38:29 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:10:14.546 05:38:29 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:10:14.546 05:38:29 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:10:14.546 05:38:29 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:10:14.546 05:38:29 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:10:14.546 05:38:29 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:10:14.546 05:38:29 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:10:14.546 05:38:29 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:10:14.546 05:38:29 accel.accel_dif_generate -- accel/accel.sh@27 -- # [[ -n software ]] 00:10:14.546 05:38:29 accel.accel_dif_generate -- accel/accel.sh@27 -- # [[ -n dif_generate ]] 00:10:14.546 05:38:29 accel.accel_dif_generate -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:10:14.546 00:10:14.546 real 0m1.598s 00:10:14.546 user 0m0.012s 00:10:14.546 sys 0m0.001s 00:10:14.546 05:38:29 accel.accel_dif_generate -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:14.546 05:38:29 accel.accel_dif_generate -- common/autotest_common.sh@10 -- # set +x 00:10:14.546 ************************************ 00:10:14.546 END TEST 
accel_dif_generate 00:10:14.546 ************************************ 00:10:14.546 05:38:29 accel -- common/autotest_common.sh@1142 -- # return 0 00:10:14.546 05:38:29 accel -- accel/accel.sh@113 -- # run_test accel_dif_generate_copy accel_test -t 1 -w dif_generate_copy 00:10:14.546 05:38:29 accel -- common/autotest_common.sh@1099 -- # '[' 6 -le 1 ']' 00:10:14.546 05:38:29 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:14.546 05:38:29 accel -- common/autotest_common.sh@10 -- # set +x 00:10:14.546 ************************************ 00:10:14.546 START TEST accel_dif_generate_copy 00:10:14.546 ************************************ 00:10:14.546 05:38:29 accel.accel_dif_generate_copy -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w dif_generate_copy 00:10:14.546 05:38:29 accel.accel_dif_generate_copy -- accel/accel.sh@16 -- # local accel_opc 00:10:14.546 05:38:29 accel.accel_dif_generate_copy -- accel/accel.sh@17 -- # local accel_module 00:10:14.546 05:38:29 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:10:14.546 05:38:29 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:10:14.546 05:38:29 accel.accel_dif_generate_copy -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_generate_copy 00:10:14.546 05:38:29 accel.accel_dif_generate_copy -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate_copy 00:10:14.546 05:38:29 accel.accel_dif_generate_copy -- accel/accel.sh@12 -- # build_accel_config 00:10:14.546 05:38:29 accel.accel_dif_generate_copy -- accel/accel.sh@31 -- # accel_json_cfg=() 00:10:14.546 05:38:29 accel.accel_dif_generate_copy -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:10:14.546 05:38:29 accel.accel_dif_generate_copy -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:10:14.546 05:38:29 accel.accel_dif_generate_copy -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:10:14.546 05:38:29 accel.accel_dif_generate_copy -- 
accel/accel.sh@36 -- # [[ -n '' ]] 00:10:14.546 05:38:29 accel.accel_dif_generate_copy -- accel/accel.sh@40 -- # local IFS=, 00:10:14.546 05:38:29 accel.accel_dif_generate_copy -- accel/accel.sh@41 -- # jq -r . 00:10:14.546 [2024-07-26 05:38:29.272002] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 00:10:14.546 [2024-07-26 05:38:29.272070] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1102540 ] 00:10:14.546 [2024-07-26 05:38:29.401620] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:14.806 [2024-07-26 05:38:29.507265] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:14.806 05:38:29 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:10:14.806 05:38:29 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:10:14.806 05:38:29 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:10:14.806 05:38:29 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:10:14.806 05:38:29 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:10:14.806 05:38:29 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:10:14.806 05:38:29 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:10:14.806 05:38:29 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:10:14.806 05:38:29 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=0x1 00:10:14.806 05:38:29 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:10:14.806 05:38:29 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:10:14.806 05:38:29 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:10:14.806 05:38:29 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 
00:10:14.806 05:38:29 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:10:14.806 05:38:29 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:10:14.806 05:38:29 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:10:14.806 05:38:29 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:10:14.806 05:38:29 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:10:14.806 05:38:29 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:10:14.806 05:38:29 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:10:14.806 05:38:29 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=dif_generate_copy 00:10:14.806 05:38:29 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:10:14.806 05:38:29 accel.accel_dif_generate_copy -- accel/accel.sh@23 -- # accel_opc=dif_generate_copy 00:10:14.806 05:38:29 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:10:14.806 05:38:29 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:10:14.806 05:38:29 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val='4096 bytes' 00:10:14.806 05:38:29 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:10:14.806 05:38:29 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:10:14.806 05:38:29 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:10:14.806 05:38:29 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val='4096 bytes' 00:10:14.806 05:38:29 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:10:14.806 05:38:29 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:10:14.806 05:38:29 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:10:14.806 05:38:29 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:10:14.806 05:38:29 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case 
"$var" in 00:10:14.806 05:38:29 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:10:14.806 05:38:29 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:10:14.806 05:38:29 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=software 00:10:14.806 05:38:29 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:10:14.806 05:38:29 accel.accel_dif_generate_copy -- accel/accel.sh@22 -- # accel_module=software 00:10:14.806 05:38:29 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:10:14.806 05:38:29 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:10:14.806 05:38:29 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=32 00:10:14.806 05:38:29 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:10:14.806 05:38:29 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:10:14.806 05:38:29 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:10:14.806 05:38:29 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=32 00:10:14.806 05:38:29 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:10:14.806 05:38:29 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:10:14.806 05:38:29 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:10:14.806 05:38:29 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=1 00:10:14.806 05:38:29 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:10:14.806 05:38:29 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:10:14.806 05:38:29 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:10:14.806 05:38:29 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val='1 seconds' 00:10:14.806 05:38:29 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:10:14.806 05:38:29 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 
00:10:14.806 05:38:29 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:10:14.806 05:38:29 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=No 00:10:14.806 05:38:29 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:10:14.806 05:38:29 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:10:14.806 05:38:29 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:10:14.806 05:38:29 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:10:14.806 05:38:29 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:10:14.806 05:38:29 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:10:14.806 05:38:29 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:10:14.806 05:38:29 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:10:14.806 05:38:29 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:10:14.806 05:38:29 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:10:14.806 05:38:29 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:10:16.182 05:38:30 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:10:16.182 05:38:30 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:10:16.182 05:38:30 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:10:16.182 05:38:30 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:10:16.182 05:38:30 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:10:16.182 05:38:30 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:10:16.182 05:38:30 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:10:16.182 05:38:30 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:10:16.182 05:38:30 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:10:16.182 05:38:30 
accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:10:16.182 05:38:30 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:10:16.182 05:38:30 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:10:16.182 05:38:30 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:10:16.182 05:38:30 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:10:16.182 05:38:30 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:10:16.182 05:38:30 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:10:16.182 05:38:30 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:10:16.182 05:38:30 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:10:16.182 05:38:30 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:10:16.182 05:38:30 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:10:16.182 05:38:30 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:10:16.182 05:38:30 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:10:16.182 05:38:30 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:10:16.182 05:38:30 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:10:16.182 05:38:30 accel.accel_dif_generate_copy -- accel/accel.sh@27 -- # [[ -n software ]] 00:10:16.182 05:38:30 accel.accel_dif_generate_copy -- accel/accel.sh@27 -- # [[ -n dif_generate_copy ]] 00:10:16.182 05:38:30 accel.accel_dif_generate_copy -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:10:16.182 00:10:16.182 real 0m1.515s 00:10:16.182 user 0m1.328s 00:10:16.182 sys 0m0.193s 00:10:16.182 05:38:30 accel.accel_dif_generate_copy -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:16.182 05:38:30 accel.accel_dif_generate_copy -- common/autotest_common.sh@10 -- # set +x 00:10:16.182 ************************************ 00:10:16.182 END TEST 
accel_dif_generate_copy 00:10:16.182 ************************************ 00:10:16.182 05:38:30 accel -- common/autotest_common.sh@1142 -- # return 0 00:10:16.182 05:38:30 accel -- accel/accel.sh@115 -- # [[ y == y ]] 00:10:16.182 05:38:30 accel -- accel/accel.sh@116 -- # run_test accel_comp accel_test -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:10:16.182 05:38:30 accel -- common/autotest_common.sh@1099 -- # '[' 8 -le 1 ']' 00:10:16.182 05:38:30 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:16.182 05:38:30 accel -- common/autotest_common.sh@10 -- # set +x 00:10:16.182 ************************************ 00:10:16.182 START TEST accel_comp 00:10:16.182 ************************************ 00:10:16.182 05:38:30 accel.accel_comp -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:10:16.182 05:38:30 accel.accel_comp -- accel/accel.sh@16 -- # local accel_opc 00:10:16.182 05:38:30 accel.accel_comp -- accel/accel.sh@17 -- # local accel_module 00:10:16.182 05:38:30 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:10:16.182 05:38:30 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:10:16.182 05:38:30 accel.accel_comp -- accel/accel.sh@15 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:10:16.182 05:38:30 accel.accel_comp -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:10:16.182 05:38:30 accel.accel_comp -- accel/accel.sh@12 -- # build_accel_config 00:10:16.182 05:38:30 accel.accel_comp -- accel/accel.sh@31 -- # accel_json_cfg=() 00:10:16.182 05:38:30 accel.accel_comp -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:10:16.182 05:38:30 accel.accel_comp -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:10:16.182 05:38:30 
accel.accel_comp -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:10:16.182 05:38:30 accel.accel_comp -- accel/accel.sh@36 -- # [[ -n '' ]] 00:10:16.182 05:38:30 accel.accel_comp -- accel/accel.sh@40 -- # local IFS=, 00:10:16.182 05:38:30 accel.accel_comp -- accel/accel.sh@41 -- # jq -r . 00:10:16.182 [2024-07-26 05:38:30.870227] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 00:10:16.182 [2024-07-26 05:38:30.870288] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1102837 ] 00:10:16.182 [2024-07-26 05:38:30.999617] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:16.441 [2024-07-26 05:38:31.102942] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:16.441 05:38:31 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:10:16.441 05:38:31 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:10:16.441 05:38:31 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:10:16.441 05:38:31 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:10:16.441 05:38:31 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:10:16.441 05:38:31 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:10:16.441 05:38:31 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:10:16.441 05:38:31 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:10:16.441 05:38:31 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:10:16.441 05:38:31 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:10:16.441 05:38:31 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:10:16.441 05:38:31 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:10:16.441 05:38:31 accel.accel_comp -- accel/accel.sh@20 -- # val=0x1 00:10:16.441 05:38:31 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:10:16.441 05:38:31 
accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:10:16.441 05:38:31 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:10:16.441 05:38:31 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:10:16.441 05:38:31 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:10:16.441 05:38:31 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:10:16.441 05:38:31 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:10:16.441 05:38:31 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:10:16.441 05:38:31 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:10:16.441 05:38:31 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:10:16.441 05:38:31 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:10:16.441 05:38:31 accel.accel_comp -- accel/accel.sh@20 -- # val=compress 00:10:16.441 05:38:31 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:10:16.441 05:38:31 accel.accel_comp -- accel/accel.sh@23 -- # accel_opc=compress 00:10:16.441 05:38:31 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:10:16.441 05:38:31 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:10:16.441 05:38:31 accel.accel_comp -- accel/accel.sh@20 -- # val='4096 bytes' 00:10:16.441 05:38:31 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:10:16.441 05:38:31 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:10:16.441 05:38:31 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:10:16.441 05:38:31 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:10:16.441 05:38:31 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:10:16.441 05:38:31 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:10:16.441 05:38:31 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:10:16.441 05:38:31 accel.accel_comp -- accel/accel.sh@20 -- # val=software 00:10:16.441 05:38:31 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:10:16.441 05:38:31 accel.accel_comp -- accel/accel.sh@22 -- # accel_module=software 
00:10:16.441 05:38:31 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:10:16.441 05:38:31 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:10:16.441 05:38:31 accel.accel_comp -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:10:16.441 05:38:31 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:10:16.441 05:38:31 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:10:16.441 05:38:31 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:10:16.441 05:38:31 accel.accel_comp -- accel/accel.sh@20 -- # val=32 00:10:16.441 05:38:31 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:10:16.441 05:38:31 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:10:16.441 05:38:31 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:10:16.441 05:38:31 accel.accel_comp -- accel/accel.sh@20 -- # val=32 00:10:16.441 05:38:31 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:10:16.441 05:38:31 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:10:16.441 05:38:31 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:10:16.441 05:38:31 accel.accel_comp -- accel/accel.sh@20 -- # val=1 00:10:16.441 05:38:31 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:10:16.441 05:38:31 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:10:16.441 05:38:31 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:10:16.441 05:38:31 accel.accel_comp -- accel/accel.sh@20 -- # val='1 seconds' 00:10:16.441 05:38:31 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:10:16.441 05:38:31 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:10:16.441 05:38:31 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:10:16.441 05:38:31 accel.accel_comp -- accel/accel.sh@20 -- # val=No 00:10:16.441 05:38:31 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:10:16.441 05:38:31 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:10:16.441 05:38:31 
accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:10:16.441 05:38:31 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:10:16.441 05:38:31 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:10:16.441 05:38:31 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:10:16.441 05:38:31 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:10:16.441 05:38:31 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:10:16.442 05:38:31 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:10:16.442 05:38:31 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:10:16.442 05:38:31 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:10:17.817 05:38:32 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:10:17.817 05:38:32 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:10:17.817 05:38:32 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:10:17.817 05:38:32 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:10:17.817 05:38:32 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:10:17.817 05:38:32 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:10:17.817 05:38:32 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:10:17.817 05:38:32 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:10:17.817 05:38:32 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:10:17.817 05:38:32 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:10:17.818 05:38:32 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:10:17.818 05:38:32 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:10:17.818 05:38:32 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:10:17.818 05:38:32 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:10:17.818 05:38:32 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:10:17.818 05:38:32 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:10:17.818 05:38:32 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:10:17.818 05:38:32 accel.accel_comp -- 
accel/accel.sh@21 -- # case "$var" in 00:10:17.818 05:38:32 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:10:17.818 05:38:32 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:10:17.818 05:38:32 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:10:17.818 05:38:32 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:10:17.818 05:38:32 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:10:17.818 05:38:32 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:10:17.818 05:38:32 accel.accel_comp -- accel/accel.sh@27 -- # [[ -n software ]] 00:10:17.818 05:38:32 accel.accel_comp -- accel/accel.sh@27 -- # [[ -n compress ]] 00:10:17.818 05:38:32 accel.accel_comp -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:10:17.818 00:10:17.818 real 0m1.515s 00:10:17.818 user 0m1.324s 00:10:17.818 sys 0m0.199s 00:10:17.818 05:38:32 accel.accel_comp -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:17.818 05:38:32 accel.accel_comp -- common/autotest_common.sh@10 -- # set +x 00:10:17.818 ************************************ 00:10:17.818 END TEST accel_comp 00:10:17.818 ************************************ 00:10:17.818 05:38:32 accel -- common/autotest_common.sh@1142 -- # return 0 00:10:17.818 05:38:32 accel -- accel/accel.sh@117 -- # run_test accel_decomp accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:10:17.818 05:38:32 accel -- common/autotest_common.sh@1099 -- # '[' 9 -le 1 ']' 00:10:17.818 05:38:32 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:17.818 05:38:32 accel -- common/autotest_common.sh@10 -- # set +x 00:10:17.818 ************************************ 00:10:17.818 START TEST accel_decomp 00:10:17.818 ************************************ 00:10:17.818 05:38:32 accel.accel_decomp -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:10:17.818 05:38:32 
accel.accel_decomp -- accel/accel.sh@16 -- # local accel_opc 00:10:17.818 05:38:32 accel.accel_decomp -- accel/accel.sh@17 -- # local accel_module 00:10:17.818 05:38:32 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:10:17.818 05:38:32 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:10:17.818 05:38:32 accel.accel_decomp -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:10:17.818 05:38:32 accel.accel_decomp -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:10:17.818 05:38:32 accel.accel_decomp -- accel/accel.sh@12 -- # build_accel_config 00:10:17.818 05:38:32 accel.accel_decomp -- accel/accel.sh@31 -- # accel_json_cfg=() 00:10:17.818 05:38:32 accel.accel_decomp -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:10:17.818 05:38:32 accel.accel_decomp -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:10:17.818 05:38:32 accel.accel_decomp -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:10:17.818 05:38:32 accel.accel_decomp -- accel/accel.sh@36 -- # [[ -n '' ]] 00:10:17.818 05:38:32 accel.accel_decomp -- accel/accel.sh@40 -- # local IFS=, 00:10:17.818 05:38:32 accel.accel_decomp -- accel/accel.sh@41 -- # jq -r . 00:10:17.818 [2024-07-26 05:38:32.466477] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 
00:10:17.818 [2024-07-26 05:38:32.466539] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1103042 ] 00:10:17.818 [2024-07-26 05:38:32.595889] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:17.818 [2024-07-26 05:38:32.693455] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:18.078 05:38:32 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:10:18.078 05:38:32 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:10:18.078 05:38:32 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:10:18.078 05:38:32 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:10:18.078 05:38:32 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:10:18.078 05:38:32 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:10:18.078 05:38:32 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:10:18.078 05:38:32 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:10:18.078 05:38:32 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:10:18.078 05:38:32 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:10:18.078 05:38:32 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:10:18.078 05:38:32 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:10:18.078 05:38:32 accel.accel_decomp -- accel/accel.sh@20 -- # val=0x1 00:10:18.078 05:38:32 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:10:18.078 05:38:32 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:10:18.078 05:38:32 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:10:18.078 05:38:32 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:10:18.078 05:38:32 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:10:18.078 05:38:32 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:10:18.078 
05:38:32 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:10:18.078 05:38:32 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:10:18.078 05:38:32 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:10:18.078 05:38:32 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:10:18.078 05:38:32 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:10:18.078 05:38:32 accel.accel_decomp -- accel/accel.sh@20 -- # val=decompress 00:10:18.078 05:38:32 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:10:18.078 05:38:32 accel.accel_decomp -- accel/accel.sh@23 -- # accel_opc=decompress 00:10:18.078 05:38:32 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:10:18.078 05:38:32 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:10:18.078 05:38:32 accel.accel_decomp -- accel/accel.sh@20 -- # val='4096 bytes' 00:10:18.078 05:38:32 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:10:18.078 05:38:32 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:10:18.078 05:38:32 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:10:18.078 05:38:32 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:10:18.078 05:38:32 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:10:18.078 05:38:32 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:10:18.078 05:38:32 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:10:18.078 05:38:32 accel.accel_decomp -- accel/accel.sh@20 -- # val=software 00:10:18.078 05:38:32 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:10:18.078 05:38:32 accel.accel_decomp -- accel/accel.sh@22 -- # accel_module=software 00:10:18.078 05:38:32 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:10:18.078 05:38:32 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:10:18.078 05:38:32 accel.accel_decomp -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:10:18.078 05:38:32 
accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:10:18.078 05:38:32 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:10:18.078 05:38:32 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:10:18.078 05:38:32 accel.accel_decomp -- accel/accel.sh@20 -- # val=32 00:10:18.078 05:38:32 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:10:18.078 05:38:32 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:10:18.078 05:38:32 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:10:18.078 05:38:32 accel.accel_decomp -- accel/accel.sh@20 -- # val=32 00:10:18.078 05:38:32 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:10:18.078 05:38:32 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:10:18.078 05:38:32 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:10:18.078 05:38:32 accel.accel_decomp -- accel/accel.sh@20 -- # val=1 00:10:18.078 05:38:32 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:10:18.078 05:38:32 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:10:18.078 05:38:32 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:10:18.078 05:38:32 accel.accel_decomp -- accel/accel.sh@20 -- # val='1 seconds' 00:10:18.078 05:38:32 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:10:18.078 05:38:32 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:10:18.078 05:38:32 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:10:18.078 05:38:32 accel.accel_decomp -- accel/accel.sh@20 -- # val=Yes 00:10:18.078 05:38:32 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:10:18.078 05:38:32 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:10:18.078 05:38:32 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:10:18.078 05:38:32 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:10:18.078 05:38:32 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:10:18.078 05:38:32 accel.accel_decomp -- 
accel/accel.sh@19 -- # IFS=: 00:10:18.078 05:38:32 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:10:18.078 05:38:32 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:10:18.078 05:38:32 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:10:18.078 05:38:32 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:10:18.078 05:38:32 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:10:19.011 05:38:33 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:10:19.011 05:38:33 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:10:19.011 05:38:33 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:10:19.011 05:38:33 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:10:19.011 05:38:33 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:10:19.011 05:38:33 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:10:19.011 05:38:33 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:10:19.011 05:38:33 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:10:19.011 05:38:33 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:10:19.011 05:38:33 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:10:19.011 05:38:33 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:10:19.011 05:38:33 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:10:19.011 05:38:33 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:10:19.269 05:38:33 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:10:19.269 05:38:33 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:10:19.269 05:38:33 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:10:19.269 05:38:33 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:10:19.269 05:38:33 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:10:19.269 05:38:33 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:10:19.269 05:38:33 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:10:19.269 05:38:33 
accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:10:19.269 05:38:33 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:10:19.269 05:38:33 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:10:19.269 05:38:33 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:10:19.269 05:38:33 accel.accel_decomp -- accel/accel.sh@27 -- # [[ -n software ]] 00:10:19.269 05:38:33 accel.accel_decomp -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:10:19.269 05:38:33 accel.accel_decomp -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:10:19.269 00:10:19.269 real 0m1.490s 00:10:19.269 user 0m1.312s 00:10:19.269 sys 0m0.187s 00:10:19.269 05:38:33 accel.accel_decomp -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:19.269 05:38:33 accel.accel_decomp -- common/autotest_common.sh@10 -- # set +x 00:10:19.269 ************************************ 00:10:19.269 END TEST accel_decomp 00:10:19.269 ************************************ 00:10:19.269 05:38:33 accel -- common/autotest_common.sh@1142 -- # return 0 00:10:19.269 05:38:33 accel -- accel/accel.sh@118 -- # run_test accel_decomp_full accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:10:19.269 05:38:33 accel -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:10:19.269 05:38:33 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:19.269 05:38:33 accel -- common/autotest_common.sh@10 -- # set +x 00:10:19.269 ************************************ 00:10:19.269 START TEST accel_decomp_full 00:10:19.269 ************************************ 00:10:19.269 05:38:34 accel.accel_decomp_full -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:10:19.269 05:38:34 accel.accel_decomp_full -- accel/accel.sh@16 -- # local accel_opc 00:10:19.269 05:38:34 accel.accel_decomp_full -- accel/accel.sh@17 -- # local accel_module 00:10:19.269 
05:38:34 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:10:19.269 05:38:34 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:10:19.270 05:38:34 accel.accel_decomp_full -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:10:19.270 05:38:34 accel.accel_decomp_full -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:10:19.270 05:38:34 accel.accel_decomp_full -- accel/accel.sh@12 -- # build_accel_config 00:10:19.270 05:38:34 accel.accel_decomp_full -- accel/accel.sh@31 -- # accel_json_cfg=() 00:10:19.270 05:38:34 accel.accel_decomp_full -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:10:19.270 05:38:34 accel.accel_decomp_full -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:10:19.270 05:38:34 accel.accel_decomp_full -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:10:19.270 05:38:34 accel.accel_decomp_full -- accel/accel.sh@36 -- # [[ -n '' ]] 00:10:19.270 05:38:34 accel.accel_decomp_full -- accel/accel.sh@40 -- # local IFS=, 00:10:19.270 05:38:34 accel.accel_decomp_full -- accel/accel.sh@41 -- # jq -r . 00:10:19.270 [2024-07-26 05:38:34.033350] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 
00:10:19.270 [2024-07-26 05:38:34.033407] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1103237 ] 00:10:19.270 [2024-07-26 05:38:34.162681] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:19.528 [2024-07-26 05:38:34.264499] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:19.528 05:38:34 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:10:19.528 05:38:34 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:10:19.528 05:38:34 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:10:19.529 05:38:34 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:10:19.529 05:38:34 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:10:19.529 05:38:34 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:10:19.529 05:38:34 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:10:19.529 05:38:34 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:10:19.529 05:38:34 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:10:19.529 05:38:34 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:10:19.529 05:38:34 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:10:19.529 05:38:34 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:10:19.529 05:38:34 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=0x1 00:10:19.529 05:38:34 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:10:19.529 05:38:34 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:10:19.529 05:38:34 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:10:19.529 05:38:34 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:10:19.529 05:38:34 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 
00:10:19.529 05:38:34 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:10:19.529 05:38:34 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:10:19.529 05:38:34 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:10:19.529 05:38:34 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:10:19.529 05:38:34 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:10:19.529 05:38:34 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:10:19.529 05:38:34 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=decompress 00:10:19.529 05:38:34 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:10:19.529 05:38:34 accel.accel_decomp_full -- accel/accel.sh@23 -- # accel_opc=decompress 00:10:19.529 05:38:34 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:10:19.529 05:38:34 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:10:19.529 05:38:34 accel.accel_decomp_full -- accel/accel.sh@20 -- # val='111250 bytes' 00:10:19.529 05:38:34 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:10:19.529 05:38:34 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:10:19.529 05:38:34 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:10:19.529 05:38:34 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:10:19.529 05:38:34 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:10:19.529 05:38:34 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:10:19.529 05:38:34 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:10:19.529 05:38:34 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=software 00:10:19.529 05:38:34 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:10:19.529 05:38:34 accel.accel_decomp_full -- accel/accel.sh@22 -- # accel_module=software 00:10:19.529 05:38:34 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:10:19.529 05:38:34 accel.accel_decomp_full -- 
accel/accel.sh@19 -- # read -r var val 00:10:19.529 05:38:34 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:10:19.529 05:38:34 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:10:19.529 05:38:34 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:10:19.529 05:38:34 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:10:19.529 05:38:34 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=32 00:10:19.529 05:38:34 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:10:19.529 05:38:34 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:10:19.529 05:38:34 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:10:19.529 05:38:34 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=32 00:10:19.529 05:38:34 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:10:19.529 05:38:34 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:10:19.529 05:38:34 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:10:19.529 05:38:34 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=1 00:10:19.529 05:38:34 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:10:19.529 05:38:34 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:10:19.529 05:38:34 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:10:19.529 05:38:34 accel.accel_decomp_full -- accel/accel.sh@20 -- # val='1 seconds' 00:10:19.529 05:38:34 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:10:19.529 05:38:34 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:10:19.529 05:38:34 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:10:19.529 05:38:34 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=Yes 00:10:19.529 05:38:34 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:10:19.529 05:38:34 accel.accel_decomp_full -- 
accel/accel.sh@19 -- # IFS=: 00:10:19.529 05:38:34 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:10:19.529 05:38:34 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:10:19.529 05:38:34 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:10:19.529 05:38:34 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:10:19.529 05:38:34 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:10:19.529 05:38:34 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:10:19.529 05:38:34 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:10:19.529 05:38:34 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:10:19.529 05:38:34 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:10:20.906 05:38:35 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:10:20.906 05:38:35 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:10:20.906 05:38:35 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:10:20.906 05:38:35 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:10:20.906 05:38:35 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:10:20.906 05:38:35 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:10:20.906 05:38:35 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:10:20.906 05:38:35 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:10:20.906 05:38:35 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:10:20.906 05:38:35 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:10:20.906 05:38:35 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:10:20.906 05:38:35 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:10:20.906 05:38:35 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:10:20.906 05:38:35 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:10:20.906 05:38:35 accel.accel_decomp_full -- accel/accel.sh@19 
-- # IFS=: 00:10:20.906 05:38:35 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:10:20.906 05:38:35 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:10:20.906 05:38:35 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:10:20.906 05:38:35 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:10:20.906 05:38:35 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:10:20.906 05:38:35 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:10:20.906 05:38:35 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:10:20.906 05:38:35 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:10:20.906 05:38:35 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:10:20.906 05:38:35 accel.accel_decomp_full -- accel/accel.sh@27 -- # [[ -n software ]] 00:10:20.906 05:38:35 accel.accel_decomp_full -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:10:20.906 05:38:35 accel.accel_decomp_full -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:10:20.906 00:10:20.906 real 0m1.513s 00:10:20.906 user 0m1.328s 00:10:20.906 sys 0m0.193s 00:10:20.906 05:38:35 accel.accel_decomp_full -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:20.906 05:38:35 accel.accel_decomp_full -- common/autotest_common.sh@10 -- # set +x 00:10:20.906 ************************************ 00:10:20.906 END TEST accel_decomp_full 00:10:20.906 ************************************ 00:10:20.906 05:38:35 accel -- common/autotest_common.sh@1142 -- # return 0 00:10:20.906 05:38:35 accel -- accel/accel.sh@119 -- # run_test accel_decomp_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:10:20.906 05:38:35 accel -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:10:20.906 05:38:35 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:20.906 05:38:35 accel -- common/autotest_common.sh@10 -- # set +x 00:10:20.906 
************************************ 00:10:20.906 START TEST accel_decomp_mcore 00:10:20.906 ************************************ 00:10:20.906 05:38:35 accel.accel_decomp_mcore -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:10:20.906 05:38:35 accel.accel_decomp_mcore -- accel/accel.sh@16 -- # local accel_opc 00:10:20.906 05:38:35 accel.accel_decomp_mcore -- accel/accel.sh@17 -- # local accel_module 00:10:20.906 05:38:35 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:10:20.906 05:38:35 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:10:20.906 05:38:35 accel.accel_decomp_mcore -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:10:20.906 05:38:35 accel.accel_decomp_mcore -- accel/accel.sh@12 -- # build_accel_config 00:10:20.906 05:38:35 accel.accel_decomp_mcore -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:10:20.906 05:38:35 accel.accel_decomp_mcore -- accel/accel.sh@31 -- # accel_json_cfg=() 00:10:20.906 05:38:35 accel.accel_decomp_mcore -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:10:20.906 05:38:35 accel.accel_decomp_mcore -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:10:20.906 05:38:35 accel.accel_decomp_mcore -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:10:20.906 05:38:35 accel.accel_decomp_mcore -- accel/accel.sh@36 -- # [[ -n '' ]] 00:10:20.906 05:38:35 accel.accel_decomp_mcore -- accel/accel.sh@40 -- # local IFS=, 00:10:20.906 05:38:35 accel.accel_decomp_mcore -- accel/accel.sh@41 -- # jq -r . 00:10:20.906 [2024-07-26 05:38:35.624958] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 
00:10:20.907 [2024-07-26 05:38:35.625015] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1103448 ] 00:10:20.907 [2024-07-26 05:38:35.756110] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 4 00:10:21.166 [2024-07-26 05:38:35.861901] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:10:21.166 [2024-07-26 05:38:35.861986] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:10:21.166 [2024-07-26 05:38:35.862063] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 3 00:10:21.166 [2024-07-26 05:38:35.862071] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:21.166 05:38:35 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:10:21.166 05:38:35 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:10:21.166 05:38:35 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:10:21.166 05:38:35 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:10:21.166 05:38:35 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:10:21.166 05:38:35 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:10:21.166 05:38:35 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:10:21.166 05:38:35 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:10:21.166 05:38:35 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:10:21.166 05:38:35 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:10:21.166 05:38:35 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:10:21.166 05:38:35 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:10:21.166 05:38:35 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=0xf 00:10:21.166 05:38:35 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 
00:10:21.166 05:38:35 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:10:21.166 05:38:35 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:10:21.166 05:38:35 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:10:21.166 05:38:35 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:10:21.166 05:38:35 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:10:21.166 05:38:35 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:10:21.166 05:38:35 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:10:21.166 05:38:35 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:10:21.166 05:38:35 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:10:21.166 05:38:35 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:10:21.166 05:38:35 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=decompress 00:10:21.166 05:38:35 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:10:21.166 05:38:35 accel.accel_decomp_mcore -- accel/accel.sh@23 -- # accel_opc=decompress 00:10:21.166 05:38:35 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:10:21.166 05:38:35 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:10:21.166 05:38:35 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val='4096 bytes' 00:10:21.166 05:38:35 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:10:21.166 05:38:35 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:10:21.166 05:38:35 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:10:21.166 05:38:35 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:10:21.166 05:38:35 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:10:21.166 05:38:35 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:10:21.166 05:38:35 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:10:21.166 05:38:35 
accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=software 00:10:21.166 05:38:35 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:10:21.166 05:38:35 accel.accel_decomp_mcore -- accel/accel.sh@22 -- # accel_module=software 00:10:21.166 05:38:35 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:10:21.166 05:38:35 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:10:21.166 05:38:35 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:10:21.166 05:38:35 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:10:21.166 05:38:35 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:10:21.167 05:38:35 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:10:21.167 05:38:35 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=32 00:10:21.167 05:38:35 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:10:21.167 05:38:35 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:10:21.167 05:38:35 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:10:21.167 05:38:35 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=32 00:10:21.167 05:38:35 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:10:21.167 05:38:35 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:10:21.167 05:38:35 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:10:21.167 05:38:35 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=1 00:10:21.167 05:38:35 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:10:21.167 05:38:35 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:10:21.167 05:38:35 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:10:21.167 05:38:35 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val='1 seconds' 00:10:21.167 05:38:35 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 
00:10:21.167 05:38:35 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:10:21.167 05:38:35 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:10:21.167 05:38:35 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=Yes 00:10:21.167 05:38:35 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:10:21.167 05:38:35 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:10:21.167 05:38:35 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:10:21.167 05:38:35 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:10:21.167 05:38:35 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:10:21.167 05:38:35 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:10:21.167 05:38:35 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:10:21.167 05:38:35 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:10:21.167 05:38:35 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:10:21.167 05:38:35 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:10:21.167 05:38:35 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:10:22.547 05:38:37 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:10:22.547 05:38:37 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:10:22.547 05:38:37 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:10:22.547 05:38:37 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:10:22.547 05:38:37 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:10:22.547 05:38:37 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:10:22.547 05:38:37 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:10:22.547 05:38:37 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:10:22.547 05:38:37 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:10:22.547 05:38:37 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case 
"$var" in 00:10:22.547 05:38:37 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:10:22.547 05:38:37 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:10:22.547 05:38:37 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:10:22.547 05:38:37 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:10:22.547 05:38:37 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:10:22.547 05:38:37 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:10:22.547 05:38:37 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:10:22.547 05:38:37 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:10:22.547 05:38:37 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:10:22.547 05:38:37 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:10:22.547 05:38:37 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:10:22.547 05:38:37 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:10:22.547 05:38:37 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:10:22.547 05:38:37 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:10:22.547 05:38:37 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:10:22.547 05:38:37 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:10:22.547 05:38:37 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:10:22.547 05:38:37 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:10:22.547 05:38:37 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:10:22.547 05:38:37 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:10:22.547 05:38:37 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:10:22.547 05:38:37 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:10:22.547 05:38:37 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:10:22.547 05:38:37 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # 
case "$var" in 00:10:22.547 05:38:37 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:10:22.547 05:38:37 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:10:22.547 05:38:37 accel.accel_decomp_mcore -- accel/accel.sh@27 -- # [[ -n software ]] 00:10:22.547 05:38:37 accel.accel_decomp_mcore -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:10:22.547 05:38:37 accel.accel_decomp_mcore -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:10:22.547 00:10:22.547 real 0m1.528s 00:10:22.547 user 0m4.780s 00:10:22.547 sys 0m0.203s 00:10:22.547 05:38:37 accel.accel_decomp_mcore -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:22.547 05:38:37 accel.accel_decomp_mcore -- common/autotest_common.sh@10 -- # set +x 00:10:22.547 ************************************ 00:10:22.547 END TEST accel_decomp_mcore 00:10:22.547 ************************************ 00:10:22.547 05:38:37 accel -- common/autotest_common.sh@1142 -- # return 0 00:10:22.547 05:38:37 accel -- accel/accel.sh@120 -- # run_test accel_decomp_full_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:10:22.547 05:38:37 accel -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:10:22.547 05:38:37 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:22.547 05:38:37 accel -- common/autotest_common.sh@10 -- # set +x 00:10:22.547 ************************************ 00:10:22.547 START TEST accel_decomp_full_mcore 00:10:22.547 ************************************ 00:10:22.547 05:38:37 accel.accel_decomp_full_mcore -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:10:22.547 05:38:37 accel.accel_decomp_full_mcore -- accel/accel.sh@16 -- # local accel_opc 00:10:22.547 05:38:37 accel.accel_decomp_full_mcore -- accel/accel.sh@17 -- # local accel_module 00:10:22.547 05:38:37 
accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:10:22.547 05:38:37 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:10:22.547 05:38:37 accel.accel_decomp_full_mcore -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:10:22.547 05:38:37 accel.accel_decomp_full_mcore -- accel/accel.sh@12 -- # build_accel_config 00:10:22.547 05:38:37 accel.accel_decomp_full_mcore -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:10:22.547 05:38:37 accel.accel_decomp_full_mcore -- accel/accel.sh@31 -- # accel_json_cfg=() 00:10:22.547 05:38:37 accel.accel_decomp_full_mcore -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:10:22.547 05:38:37 accel.accel_decomp_full_mcore -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:10:22.547 05:38:37 accel.accel_decomp_full_mcore -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:10:22.547 05:38:37 accel.accel_decomp_full_mcore -- accel/accel.sh@36 -- # [[ -n '' ]] 00:10:22.547 05:38:37 accel.accel_decomp_full_mcore -- accel/accel.sh@40 -- # local IFS=, 00:10:22.547 05:38:37 accel.accel_decomp_full_mcore -- accel/accel.sh@41 -- # jq -r . 00:10:22.547 [2024-07-26 05:38:37.227728] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 
00:10:22.547 [2024-07-26 05:38:37.227786] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1103653 ] 00:10:22.547 [2024-07-26 05:38:37.358537] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 4 00:10:22.807 [2024-07-26 05:38:37.464312] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:10:22.807 [2024-07-26 05:38:37.464398] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:10:22.807 [2024-07-26 05:38:37.464473] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 3 00:10:22.807 [2024-07-26 05:38:37.464481] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:22.807 05:38:37 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:10:22.807 05:38:37 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:10:22.807 05:38:37 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:10:22.807 05:38:37 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:10:22.807 05:38:37 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:10:22.807 05:38:37 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:10:22.807 05:38:37 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:10:22.807 05:38:37 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:10:22.807 05:38:37 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:10:22.807 05:38:37 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:10:22.807 05:38:37 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:10:22.807 05:38:37 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:10:22.807 05:38:37 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=0xf 00:10:22.807 05:38:37 
accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:10:22.807 05:38:37 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:10:22.807 05:38:37 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:10:22.807 05:38:37 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:10:22.807 05:38:37 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:10:22.807 05:38:37 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:10:22.807 05:38:37 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:10:22.807 05:38:37 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:10:22.807 05:38:37 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:10:22.807 05:38:37 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:10:22.807 05:38:37 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:10:22.807 05:38:37 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=decompress 00:10:22.807 05:38:37 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:10:22.807 05:38:37 accel.accel_decomp_full_mcore -- accel/accel.sh@23 -- # accel_opc=decompress 00:10:22.807 05:38:37 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:10:22.807 05:38:37 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:10:22.808 05:38:37 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val='111250 bytes' 00:10:22.808 05:38:37 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:10:22.808 05:38:37 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:10:22.808 05:38:37 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:10:22.808 05:38:37 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:10:22.808 05:38:37 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:10:22.808 05:38:37 
accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:10:22.808 05:38:37 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:10:22.808 05:38:37 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=software 00:10:22.808 05:38:37 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:10:22.808 05:38:37 accel.accel_decomp_full_mcore -- accel/accel.sh@22 -- # accel_module=software 00:10:22.808 05:38:37 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:10:22.808 05:38:37 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:10:22.808 05:38:37 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:10:22.808 05:38:37 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:10:22.808 05:38:37 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:10:22.808 05:38:37 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:10:22.808 05:38:37 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=32 00:10:22.808 05:38:37 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:10:22.808 05:38:37 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:10:22.808 05:38:37 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:10:22.808 05:38:37 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=32 00:10:22.808 05:38:37 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:10:22.808 05:38:37 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:10:22.808 05:38:37 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:10:22.808 05:38:37 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=1 00:10:22.808 05:38:37 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:10:22.808 05:38:37 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- 
# IFS=: 00:10:22.808 05:38:37 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:10:22.808 05:38:37 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val='1 seconds' 00:10:22.808 05:38:37 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:10:22.808 05:38:37 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:10:22.808 05:38:37 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:10:22.808 05:38:37 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=Yes 00:10:22.808 05:38:37 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:10:22.808 05:38:37 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:10:22.808 05:38:37 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:10:22.808 05:38:37 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:10:22.808 05:38:37 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:10:22.808 05:38:37 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:10:22.808 05:38:37 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:10:22.808 05:38:37 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:10:22.808 05:38:37 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:10:22.808 05:38:37 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:10:22.808 05:38:37 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:10:24.184 05:38:38 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:10:24.184 05:38:38 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:10:24.184 05:38:38 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:10:24.184 05:38:38 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:10:24.184 05:38:38 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:10:24.184 05:38:38 
accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:10:24.184 05:38:38 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:10:24.184 05:38:38 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:10:24.184 05:38:38 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:10:24.185 05:38:38 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:10:24.185 05:38:38 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:10:24.185 05:38:38 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:10:24.185 05:38:38 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:10:24.185 05:38:38 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:10:24.185 05:38:38 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:10:24.185 05:38:38 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:10:24.185 05:38:38 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:10:24.185 05:38:38 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:10:24.185 05:38:38 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:10:24.185 05:38:38 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:10:24.185 05:38:38 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:10:24.185 05:38:38 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:10:24.185 05:38:38 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:10:24.185 05:38:38 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:10:24.185 05:38:38 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:10:24.185 05:38:38 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:10:24.185 05:38:38 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:10:24.185 05:38:38 accel.accel_decomp_full_mcore -- accel/accel.sh@19 
-- # read -r var val 00:10:24.185 05:38:38 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:10:24.185 05:38:38 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:10:24.185 05:38:38 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:10:24.185 05:38:38 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:10:24.185 05:38:38 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:10:24.185 05:38:38 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:10:24.185 05:38:38 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:10:24.185 05:38:38 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:10:24.185 05:38:38 accel.accel_decomp_full_mcore -- accel/accel.sh@27 -- # [[ -n software ]] 00:10:24.185 05:38:38 accel.accel_decomp_full_mcore -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:10:24.185 05:38:38 accel.accel_decomp_full_mcore -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:10:24.185 00:10:24.185 real 0m1.546s 00:10:24.185 user 0m4.839s 00:10:24.185 sys 0m0.213s 00:10:24.185 05:38:38 accel.accel_decomp_full_mcore -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:24.185 05:38:38 accel.accel_decomp_full_mcore -- common/autotest_common.sh@10 -- # set +x 00:10:24.185 ************************************ 00:10:24.185 END TEST accel_decomp_full_mcore 00:10:24.185 ************************************ 00:10:24.185 05:38:38 accel -- common/autotest_common.sh@1142 -- # return 0 00:10:24.185 05:38:38 accel -- accel/accel.sh@121 -- # run_test accel_decomp_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:10:24.185 05:38:38 accel -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:10:24.185 05:38:38 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:24.185 05:38:38 accel -- common/autotest_common.sh@10 -- # set +x 00:10:24.185 
************************************ 00:10:24.185 START TEST accel_decomp_mthread 00:10:24.185 ************************************ 00:10:24.185 05:38:38 accel.accel_decomp_mthread -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:10:24.185 05:38:38 accel.accel_decomp_mthread -- accel/accel.sh@16 -- # local accel_opc 00:10:24.185 05:38:38 accel.accel_decomp_mthread -- accel/accel.sh@17 -- # local accel_module 00:10:24.185 05:38:38 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:10:24.185 05:38:38 accel.accel_decomp_mthread -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:10:24.185 05:38:38 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:10:24.185 05:38:38 accel.accel_decomp_mthread -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:10:24.185 05:38:38 accel.accel_decomp_mthread -- accel/accel.sh@12 -- # build_accel_config 00:10:24.185 05:38:38 accel.accel_decomp_mthread -- accel/accel.sh@31 -- # accel_json_cfg=() 00:10:24.185 05:38:38 accel.accel_decomp_mthread -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:10:24.185 05:38:38 accel.accel_decomp_mthread -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:10:24.185 05:38:38 accel.accel_decomp_mthread -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:10:24.185 05:38:38 accel.accel_decomp_mthread -- accel/accel.sh@36 -- # [[ -n '' ]] 00:10:24.185 05:38:38 accel.accel_decomp_mthread -- accel/accel.sh@40 -- # local IFS=, 00:10:24.185 05:38:38 accel.accel_decomp_mthread -- accel/accel.sh@41 -- # jq -r . 00:10:24.185 [2024-07-26 05:38:38.846754] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 
00:10:24.185 [2024-07-26 05:38:38.846816] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1103922 ] 00:10:24.185 [2024-07-26 05:38:38.976228] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:24.185 [2024-07-26 05:38:39.077294] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:24.444 05:38:39 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:10:24.444 05:38:39 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:10:24.444 05:38:39 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:10:24.444 05:38:39 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:10:24.444 05:38:39 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:10:24.444 05:38:39 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:10:24.444 05:38:39 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:10:24.444 05:38:39 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:10:24.444 05:38:39 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:10:24.444 05:38:39 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:10:24.444 05:38:39 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:10:24.444 05:38:39 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:10:24.444 05:38:39 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=0x1 00:10:24.444 05:38:39 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:10:24.444 05:38:39 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:10:24.444 05:38:39 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:10:24.444 05:38:39 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:10:24.444 05:38:39 
accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:10:24.444 05:38:39 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:10:24.444 05:38:39 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:10:24.444 05:38:39 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:10:24.444 05:38:39 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:10:24.444 05:38:39 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:10:24.444 05:38:39 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:10:24.444 05:38:39 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=decompress 00:10:24.444 05:38:39 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:10:24.444 05:38:39 accel.accel_decomp_mthread -- accel/accel.sh@23 -- # accel_opc=decompress 00:10:24.444 05:38:39 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:10:24.444 05:38:39 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:10:24.444 05:38:39 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val='4096 bytes' 00:10:24.444 05:38:39 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:10:24.444 05:38:39 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:10:24.444 05:38:39 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:10:24.444 05:38:39 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:10:24.444 05:38:39 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:10:24.444 05:38:39 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:10:24.444 05:38:39 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:10:24.444 05:38:39 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=software 00:10:24.444 05:38:39 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:10:24.444 05:38:39 accel.accel_decomp_mthread -- accel/accel.sh@22 -- # 
accel_module=software 00:10:24.444 05:38:39 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:10:24.444 05:38:39 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:10:24.444 05:38:39 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:10:24.444 05:38:39 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:10:24.444 05:38:39 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:10:24.444 05:38:39 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:10:24.444 05:38:39 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=32 00:10:24.444 05:38:39 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:10:24.444 05:38:39 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:10:24.444 05:38:39 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:10:24.444 05:38:39 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=32 00:10:24.444 05:38:39 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:10:24.444 05:38:39 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:10:24.444 05:38:39 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:10:24.444 05:38:39 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=2 00:10:24.444 05:38:39 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:10:24.444 05:38:39 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:10:24.444 05:38:39 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:10:24.444 05:38:39 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val='1 seconds' 00:10:24.444 05:38:39 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:10:24.444 05:38:39 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:10:24.444 05:38:39 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:10:24.444 
05:38:39 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=Yes 00:10:24.444 05:38:39 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:10:24.444 05:38:39 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:10:24.444 05:38:39 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:10:24.444 05:38:39 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:10:24.444 05:38:39 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:10:24.444 05:38:39 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:10:24.444 05:38:39 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:10:24.444 05:38:39 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:10:24.444 05:38:39 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:10:24.444 05:38:39 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:10:24.444 05:38:39 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:10:25.431 05:38:40 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:10:25.431 05:38:40 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:10:25.431 05:38:40 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:10:25.431 05:38:40 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:10:25.431 05:38:40 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:10:25.431 05:38:40 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:10:25.431 05:38:40 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:10:25.431 05:38:40 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:10:25.431 05:38:40 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:10:25.431 05:38:40 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:10:25.431 05:38:40 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:10:25.431 05:38:40 accel.accel_decomp_mthread 
-- accel/accel.sh@19 -- # read -r var val 00:10:25.431 05:38:40 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:10:25.431 05:38:40 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:10:25.431 05:38:40 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:10:25.431 05:38:40 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:10:25.431 05:38:40 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:10:25.431 05:38:40 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:10:25.431 05:38:40 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:10:25.431 05:38:40 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:10:25.431 05:38:40 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:10:25.431 05:38:40 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:10:25.431 05:38:40 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:10:25.431 05:38:40 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:10:25.431 05:38:40 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:10:25.431 05:38:40 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:10:25.431 05:38:40 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:10:25.431 05:38:40 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:10:25.431 05:38:40 accel.accel_decomp_mthread -- accel/accel.sh@27 -- # [[ -n software ]] 00:10:25.431 05:38:40 accel.accel_decomp_mthread -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:10:25.431 05:38:40 accel.accel_decomp_mthread -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:10:25.431 00:10:25.431 real 0m1.505s 00:10:25.431 user 0m1.332s 00:10:25.431 sys 0m0.179s 00:10:25.431 05:38:40 accel.accel_decomp_mthread -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:25.431 05:38:40 accel.accel_decomp_mthread -- common/autotest_common.sh@10 -- # set +x 
00:10:25.431 ************************************ 00:10:25.431 END TEST accel_decomp_mthread 00:10:25.431 ************************************ 00:10:25.690 05:38:40 accel -- common/autotest_common.sh@1142 -- # return 0 00:10:25.690 05:38:40 accel -- accel/accel.sh@122 -- # run_test accel_decomp_full_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:10:25.690 05:38:40 accel -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:10:25.690 05:38:40 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:25.690 05:38:40 accel -- common/autotest_common.sh@10 -- # set +x 00:10:25.690 ************************************ 00:10:25.690 START TEST accel_decomp_full_mthread 00:10:25.690 ************************************ 00:10:25.690 05:38:40 accel.accel_decomp_full_mthread -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:10:25.690 05:38:40 accel.accel_decomp_full_mthread -- accel/accel.sh@16 -- # local accel_opc 00:10:25.690 05:38:40 accel.accel_decomp_full_mthread -- accel/accel.sh@17 -- # local accel_module 00:10:25.690 05:38:40 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:10:25.690 05:38:40 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:10:25.690 05:38:40 accel.accel_decomp_full_mthread -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:10:25.690 05:38:40 accel.accel_decomp_full_mthread -- accel/accel.sh@12 -- # build_accel_config 00:10:25.690 05:38:40 accel.accel_decomp_full_mthread -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:10:25.690 05:38:40 accel.accel_decomp_full_mthread -- 
accel/accel.sh@31 -- # accel_json_cfg=() 00:10:25.690 05:38:40 accel.accel_decomp_full_mthread -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:10:25.690 05:38:40 accel.accel_decomp_full_mthread -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:10:25.690 05:38:40 accel.accel_decomp_full_mthread -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:10:25.690 05:38:40 accel.accel_decomp_full_mthread -- accel/accel.sh@36 -- # [[ -n '' ]] 00:10:25.690 05:38:40 accel.accel_decomp_full_mthread -- accel/accel.sh@40 -- # local IFS=, 00:10:25.690 05:38:40 accel.accel_decomp_full_mthread -- accel/accel.sh@41 -- # jq -r . 00:10:25.690 [2024-07-26 05:38:40.432391] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 00:10:25.690 [2024-07-26 05:38:40.432448] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1104212 ] 00:10:25.691 [2024-07-26 05:38:40.558521] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:25.950 [2024-07-26 05:38:40.657267] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:25.950 05:38:40 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:10:25.950 05:38:40 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:10:25.950 05:38:40 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:10:25.950 05:38:40 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:10:25.950 05:38:40 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:10:25.950 05:38:40 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:10:25.950 05:38:40 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:10:25.950 05:38:40 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:10:25.950 05:38:40 
accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:10:25.950 05:38:40 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:10:25.950 05:38:40 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:10:25.950 05:38:40 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:10:25.950 05:38:40 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=0x1 00:10:25.950 05:38:40 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:10:25.950 05:38:40 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:10:25.950 05:38:40 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:10:25.950 05:38:40 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:10:25.950 05:38:40 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:10:25.950 05:38:40 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:10:25.950 05:38:40 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:10:25.950 05:38:40 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:10:25.950 05:38:40 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:10:25.950 05:38:40 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:10:25.950 05:38:40 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:10:25.950 05:38:40 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=decompress 00:10:25.950 05:38:40 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:10:25.950 05:38:40 accel.accel_decomp_full_mthread -- accel/accel.sh@23 -- # accel_opc=decompress 00:10:25.950 05:38:40 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:10:25.950 05:38:40 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:10:25.950 05:38:40 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val='111250 
bytes' 00:10:25.950 05:38:40 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:10:25.950 05:38:40 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:10:25.950 05:38:40 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:10:25.950 05:38:40 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:10:25.950 05:38:40 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:10:25.950 05:38:40 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:10:25.950 05:38:40 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:10:25.950 05:38:40 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=software 00:10:25.950 05:38:40 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:10:25.950 05:38:40 accel.accel_decomp_full_mthread -- accel/accel.sh@22 -- # accel_module=software 00:10:25.950 05:38:40 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:10:25.950 05:38:40 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:10:25.950 05:38:40 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:10:25.950 05:38:40 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:10:25.950 05:38:40 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:10:25.950 05:38:40 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:10:25.950 05:38:40 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=32 00:10:25.950 05:38:40 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:10:25.950 05:38:40 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:10:25.950 05:38:40 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:10:25.950 05:38:40 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=32 
00:10:25.950 05:38:40 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:10:25.950 05:38:40 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:10:25.950 05:38:40 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:10:25.950 05:38:40 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=2 00:10:25.950 05:38:40 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:10:25.950 05:38:40 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:10:25.950 05:38:40 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:10:25.950 05:38:40 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val='1 seconds' 00:10:25.950 05:38:40 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:10:25.950 05:38:40 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:10:25.950 05:38:40 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:10:25.950 05:38:40 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=Yes 00:10:25.950 05:38:40 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:10:25.950 05:38:40 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:10:25.950 05:38:40 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:10:25.950 05:38:40 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:10:25.950 05:38:40 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:10:25.950 05:38:40 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:10:25.950 05:38:40 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:10:25.950 05:38:40 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:10:25.950 05:38:40 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:10:25.950 05:38:40 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # 
IFS=: 00:10:25.950 05:38:40 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:10:27.327 05:38:41 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:10:27.327 05:38:41 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:10:27.327 05:38:41 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:10:27.327 05:38:41 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:10:27.327 05:38:41 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:10:27.327 05:38:41 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:10:27.327 05:38:41 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:10:27.327 05:38:41 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:10:27.327 05:38:41 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:10:27.327 05:38:41 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:10:27.327 05:38:41 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:10:27.327 05:38:41 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:10:27.327 05:38:41 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:10:27.327 05:38:41 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:10:27.327 05:38:41 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:10:27.327 05:38:41 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:10:27.327 05:38:41 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:10:27.327 05:38:41 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:10:27.327 05:38:41 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:10:27.327 05:38:41 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:10:27.327 05:38:41 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 
00:10:27.327 05:38:41 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:10:27.327 05:38:41 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:10:27.327 05:38:41 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:10:27.327 05:38:41 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:10:27.327 05:38:41 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:10:27.328 05:38:41 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:10:27.328 05:38:41 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:10:27.328 05:38:41 accel.accel_decomp_full_mthread -- accel/accel.sh@27 -- # [[ -n software ]] 00:10:27.328 05:38:41 accel.accel_decomp_full_mthread -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:10:27.328 05:38:41 accel.accel_decomp_full_mthread -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:10:27.328 00:10:27.328 real 0m1.521s 00:10:27.328 user 0m1.327s 00:10:27.328 sys 0m0.202s 00:10:27.328 05:38:41 accel.accel_decomp_full_mthread -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:27.328 05:38:41 accel.accel_decomp_full_mthread -- common/autotest_common.sh@10 -- # set +x 00:10:27.328 ************************************ 00:10:27.328 END TEST accel_decomp_full_mthread 00:10:27.328 ************************************ 00:10:27.328 05:38:41 accel -- common/autotest_common.sh@1142 -- # return 0 00:10:27.328 05:38:41 accel -- accel/accel.sh@124 -- # [[ y == y ]] 00:10:27.328 05:38:41 accel -- accel/accel.sh@125 -- # COMPRESSDEV=1 00:10:27.328 05:38:41 accel -- accel/accel.sh@126 -- # get_expected_opcs 00:10:27.328 05:38:41 accel -- accel/accel.sh@60 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:10:27.328 05:38:41 accel -- accel/accel.sh@62 -- # spdk_tgt_pid=1104401 00:10:27.328 05:38:41 accel -- accel/accel.sh@63 -- # waitforlisten 1104401 00:10:27.328 05:38:41 accel -- common/autotest_common.sh@829 -- 
# '[' -z 1104401 ']' 00:10:27.328 05:38:41 accel -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:10:27.328 05:38:41 accel -- common/autotest_common.sh@834 -- # local max_retries=100 00:10:27.328 05:38:41 accel -- accel/accel.sh@61 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -c /dev/fd/63 00:10:27.328 05:38:41 accel -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:10:27.328 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:10:27.328 05:38:41 accel -- accel/accel.sh@61 -- # build_accel_config 00:10:27.328 05:38:41 accel -- common/autotest_common.sh@838 -- # xtrace_disable 00:10:27.328 05:38:41 accel -- accel/accel.sh@31 -- # accel_json_cfg=() 00:10:27.328 05:38:41 accel -- common/autotest_common.sh@10 -- # set +x 00:10:27.328 05:38:41 accel -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:10:27.328 05:38:41 accel -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:10:27.328 05:38:41 accel -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:10:27.328 05:38:41 accel -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:10:27.328 05:38:41 accel -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:10:27.328 05:38:41 accel -- accel/accel.sh@40 -- # local IFS=, 00:10:27.328 05:38:41 accel -- accel/accel.sh@41 -- # jq -r . 00:10:27.328 [2024-07-26 05:38:42.033583] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 
00:10:27.328 [2024-07-26 05:38:42.033656] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1104401 ] 00:10:27.328 [2024-07-26 05:38:42.161816] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:27.587 [2024-07-26 05:38:42.260946] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:28.152 [2024-07-26 05:38:43.023754] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:10:28.410 05:38:43 accel -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:10:28.410 05:38:43 accel -- common/autotest_common.sh@862 -- # return 0 00:10:28.410 05:38:43 accel -- accel/accel.sh@65 -- # [[ 0 -gt 0 ]] 00:10:28.410 05:38:43 accel -- accel/accel.sh@66 -- # [[ 0 -gt 0 ]] 00:10:28.410 05:38:43 accel -- accel/accel.sh@67 -- # [[ 0 -gt 0 ]] 00:10:28.410 05:38:43 accel -- accel/accel.sh@68 -- # [[ -n 1 ]] 00:10:28.410 05:38:43 accel -- accel/accel.sh@68 -- # check_save_config compressdev_scan_accel_module 00:10:28.410 05:38:43 accel -- accel/accel.sh@56 -- # rpc_cmd save_config 00:10:28.410 05:38:43 accel -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:28.410 05:38:43 accel -- common/autotest_common.sh@10 -- # set +x 00:10:28.410 05:38:43 accel -- accel/accel.sh@56 -- # jq -r '.subsystems[] | select(.subsystem=="accel").config[]' 00:10:28.410 05:38:43 accel -- accel/accel.sh@56 -- # grep compressdev_scan_accel_module 00:10:28.670 05:38:43 accel -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:28.670 "method": "compressdev_scan_accel_module", 00:10:28.670 05:38:43 accel -- accel/accel.sh@70 -- # exp_opcs=($($rpc_py accel_get_opc_assignments | jq -r ". 
| to_entries | map(\"\(.key)=\(.value)\") | .[]")) 00:10:28.670 05:38:43 accel -- accel/accel.sh@70 -- # rpc_cmd accel_get_opc_assignments 00:10:28.670 05:38:43 accel -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:28.670 05:38:43 accel -- accel/accel.sh@70 -- # jq -r '. | to_entries | map("\(.key)=\(.value)") | .[]' 00:10:28.670 05:38:43 accel -- common/autotest_common.sh@10 -- # set +x 00:10:28.670 05:38:43 accel -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:28.670 05:38:43 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:10:28.670 05:38:43 accel -- accel/accel.sh@72 -- # IFS== 00:10:28.670 05:38:43 accel -- accel/accel.sh@72 -- # read -r opc module 00:10:28.670 05:38:43 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:10:28.670 05:38:43 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:10:28.670 05:38:43 accel -- accel/accel.sh@72 -- # IFS== 00:10:28.670 05:38:43 accel -- accel/accel.sh@72 -- # read -r opc module 00:10:28.670 05:38:43 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:10:28.670 05:38:43 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:10:28.670 05:38:43 accel -- accel/accel.sh@72 -- # IFS== 00:10:28.670 05:38:43 accel -- accel/accel.sh@72 -- # read -r opc module 00:10:28.670 05:38:43 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:10:28.670 05:38:43 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:10:28.670 05:38:43 accel -- accel/accel.sh@72 -- # IFS== 00:10:28.670 05:38:43 accel -- accel/accel.sh@72 -- # read -r opc module 00:10:28.670 05:38:43 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:10:28.670 05:38:43 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:10:28.670 05:38:43 accel -- accel/accel.sh@72 -- # IFS== 00:10:28.670 05:38:43 accel -- accel/accel.sh@72 -- # read -r opc module 00:10:28.670 05:38:43 accel -- accel/accel.sh@73 -- # 
expected_opcs["$opc"]=software 00:10:28.670 05:38:43 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:10:28.670 05:38:43 accel -- accel/accel.sh@72 -- # IFS== 00:10:28.670 05:38:43 accel -- accel/accel.sh@72 -- # read -r opc module 00:10:28.670 05:38:43 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:10:28.670 05:38:43 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:10:28.670 05:38:43 accel -- accel/accel.sh@72 -- # IFS== 00:10:28.670 05:38:43 accel -- accel/accel.sh@72 -- # read -r opc module 00:10:28.670 05:38:43 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=dpdk_compressdev 00:10:28.670 05:38:43 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:10:28.670 05:38:43 accel -- accel/accel.sh@72 -- # IFS== 00:10:28.670 05:38:43 accel -- accel/accel.sh@72 -- # read -r opc module 00:10:28.670 05:38:43 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=dpdk_compressdev 00:10:28.670 05:38:43 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:10:28.670 05:38:43 accel -- accel/accel.sh@72 -- # IFS== 00:10:28.670 05:38:43 accel -- accel/accel.sh@72 -- # read -r opc module 00:10:28.670 05:38:43 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:10:28.670 05:38:43 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:10:28.670 05:38:43 accel -- accel/accel.sh@72 -- # IFS== 00:10:28.670 05:38:43 accel -- accel/accel.sh@72 -- # read -r opc module 00:10:28.670 05:38:43 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:10:28.670 05:38:43 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:10:28.670 05:38:43 accel -- accel/accel.sh@72 -- # IFS== 00:10:28.670 05:38:43 accel -- accel/accel.sh@72 -- # read -r opc module 00:10:28.670 05:38:43 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:10:28.670 05:38:43 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:10:28.670 05:38:43 accel -- 
accel/accel.sh@72 -- # IFS== 00:10:28.670 05:38:43 accel -- accel/accel.sh@72 -- # read -r opc module 00:10:28.670 05:38:43 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:10:28.670 05:38:43 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:10:28.670 05:38:43 accel -- accel/accel.sh@72 -- # IFS== 00:10:28.670 05:38:43 accel -- accel/accel.sh@72 -- # read -r opc module 00:10:28.670 05:38:43 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:10:28.670 05:38:43 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:10:28.670 05:38:43 accel -- accel/accel.sh@72 -- # IFS== 00:10:28.670 05:38:43 accel -- accel/accel.sh@72 -- # read -r opc module 00:10:28.670 05:38:43 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:10:28.670 05:38:43 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:10:28.670 05:38:43 accel -- accel/accel.sh@72 -- # IFS== 00:10:28.670 05:38:43 accel -- accel/accel.sh@72 -- # read -r opc module 00:10:28.670 05:38:43 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:10:28.670 05:38:43 accel -- accel/accel.sh@75 -- # killprocess 1104401 00:10:28.670 05:38:43 accel -- common/autotest_common.sh@948 -- # '[' -z 1104401 ']' 00:10:28.670 05:38:43 accel -- common/autotest_common.sh@952 -- # kill -0 1104401 00:10:28.670 05:38:43 accel -- common/autotest_common.sh@953 -- # uname 00:10:28.670 05:38:43 accel -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:10:28.670 05:38:43 accel -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1104401 00:10:28.670 05:38:43 accel -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:10:28.670 05:38:43 accel -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:10:28.670 05:38:43 accel -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1104401' 00:10:28.670 killing process with pid 1104401 00:10:28.670 05:38:43 accel -- common/autotest_common.sh@967 -- # 
kill 1104401 00:10:28.670 05:38:43 accel -- common/autotest_common.sh@972 -- # wait 1104401 00:10:29.238 05:38:43 accel -- accel/accel.sh@76 -- # trap - ERR 00:10:29.238 05:38:43 accel -- accel/accel.sh@127 -- # run_test accel_cdev_comp accel_test -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:10:29.238 05:38:43 accel -- common/autotest_common.sh@1099 -- # '[' 8 -le 1 ']' 00:10:29.238 05:38:43 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:29.238 05:38:43 accel -- common/autotest_common.sh@10 -- # set +x 00:10:29.238 ************************************ 00:10:29.238 START TEST accel_cdev_comp 00:10:29.238 ************************************ 00:10:29.238 05:38:43 accel.accel_cdev_comp -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:10:29.238 05:38:43 accel.accel_cdev_comp -- accel/accel.sh@16 -- # local accel_opc 00:10:29.238 05:38:43 accel.accel_cdev_comp -- accel/accel.sh@17 -- # local accel_module 00:10:29.238 05:38:43 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:10:29.238 05:38:43 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:10:29.238 05:38:43 accel.accel_cdev_comp -- accel/accel.sh@15 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:10:29.238 05:38:43 accel.accel_cdev_comp -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:10:29.238 05:38:43 accel.accel_cdev_comp -- accel/accel.sh@12 -- # build_accel_config 00:10:29.238 05:38:43 accel.accel_cdev_comp -- accel/accel.sh@31 -- # accel_json_cfg=() 00:10:29.238 05:38:43 accel.accel_cdev_comp -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:10:29.238 05:38:43 accel.accel_cdev_comp -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:10:29.238 05:38:43 
accel.accel_cdev_comp -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:10:29.238 05:38:43 accel.accel_cdev_comp -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:10:29.238 05:38:43 accel.accel_cdev_comp -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:10:29.238 05:38:43 accel.accel_cdev_comp -- accel/accel.sh@40 -- # local IFS=, 00:10:29.238 05:38:43 accel.accel_cdev_comp -- accel/accel.sh@41 -- # jq -r . 00:10:29.238 [2024-07-26 05:38:43.954186] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 00:10:29.238 [2024-07-26 05:38:43.954251] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1104604 ] 00:10:29.238 [2024-07-26 05:38:44.070951] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:29.496 [2024-07-26 05:38:44.178874] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:30.063 [2024-07-26 05:38:44.950636] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:10:30.063 [2024-07-26 05:38:44.953284] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x123d080 PMD being used: compress_qat 00:10:30.063 05:38:44 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:10:30.063 05:38:44 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:10:30.063 05:38:44 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:10:30.063 05:38:44 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:10:30.063 05:38:44 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:10:30.063 05:38:44 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:10:30.063 05:38:44 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:10:30.063 [2024-07-26 05:38:44.957416] accel_dpdk_compressdev.c: 690:_set_pmd: 
*NOTICE*: Channel 0x1241e60 PMD being used: compress_qat 00:10:30.063 05:38:44 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:10:30.063 05:38:44 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:10:30.063 05:38:44 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:10:30.063 05:38:44 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:10:30.063 05:38:44 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:10:30.063 05:38:44 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=0x1 00:10:30.063 05:38:44 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:10:30.063 05:38:44 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:10:30.063 05:38:44 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:10:30.063 05:38:44 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:10:30.063 05:38:44 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:10:30.063 05:38:44 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:10:30.063 05:38:44 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:10:30.063 05:38:44 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:10:30.063 05:38:44 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:10:30.063 05:38:44 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:10:30.063 05:38:44 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:10:30.063 05:38:44 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=compress 00:10:30.063 05:38:44 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:10:30.063 05:38:44 accel.accel_cdev_comp -- accel/accel.sh@23 -- # accel_opc=compress 00:10:30.063 05:38:44 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:10:30.063 05:38:44 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:10:30.063 05:38:44 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val='4096 bytes' 00:10:30.063 05:38:44 accel.accel_cdev_comp -- 
accel/accel.sh@21 -- # case "$var" in 00:10:30.063 05:38:44 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:10:30.063 05:38:44 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:10:30.063 05:38:44 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:10:30.063 05:38:44 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:10:30.063 05:38:44 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:10:30.063 05:38:44 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:10:30.063 05:38:44 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:10:30.063 05:38:44 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:10:30.063 05:38:44 accel.accel_cdev_comp -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:10:30.064 05:38:44 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:10:30.064 05:38:44 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:10:30.064 05:38:44 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:10:30.064 05:38:44 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:10:30.064 05:38:44 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:10:30.064 05:38:44 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:10:30.064 05:38:44 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=32 00:10:30.064 05:38:44 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:10:30.064 05:38:44 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:10:30.064 05:38:44 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:10:30.064 05:38:44 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=32 00:10:30.064 05:38:44 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:10:30.064 05:38:44 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:10:30.064 05:38:44 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 
00:10:30.064 05:38:44 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=1 00:10:30.064 05:38:44 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:10:30.064 05:38:44 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:10:30.064 05:38:44 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:10:30.064 05:38:44 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val='1 seconds' 00:10:30.064 05:38:44 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:10:30.064 05:38:44 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:10:30.322 05:38:44 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:10:30.322 05:38:44 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=No 00:10:30.322 05:38:44 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:10:30.322 05:38:44 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:10:30.322 05:38:44 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:10:30.322 05:38:44 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:10:30.322 05:38:44 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:10:30.322 05:38:44 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:10:30.322 05:38:44 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:10:30.322 05:38:44 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:10:30.322 05:38:44 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:10:30.322 05:38:44 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:10:30.322 05:38:44 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:10:31.258 05:38:46 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:10:31.258 05:38:46 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:10:31.258 05:38:46 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:10:31.258 05:38:46 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:10:31.258 05:38:46 accel.accel_cdev_comp -- 
accel/accel.sh@20 -- # val= 00:10:31.258 05:38:46 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:10:31.258 05:38:46 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:10:31.258 05:38:46 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:10:31.258 05:38:46 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:10:31.258 05:38:46 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:10:31.258 05:38:46 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:10:31.258 05:38:46 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:10:31.258 05:38:46 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:10:31.258 05:38:46 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:10:31.258 05:38:46 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:10:31.258 05:38:46 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:10:31.258 05:38:46 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:10:31.258 05:38:46 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:10:31.258 05:38:46 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:10:31.258 05:38:46 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:10:31.258 05:38:46 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:10:31.258 05:38:46 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:10:31.258 05:38:46 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:10:31.258 05:38:46 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:10:31.258 05:38:46 accel.accel_cdev_comp -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:10:31.258 05:38:46 accel.accel_cdev_comp -- accel/accel.sh@27 -- # [[ -n compress ]] 00:10:31.258 05:38:46 accel.accel_cdev_comp -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:10:31.258 00:10:31.258 real 0m2.220s 00:10:31.258 user 0m1.656s 00:10:31.258 sys 0m0.560s 00:10:31.258 05:38:46 
accel.accel_cdev_comp -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:31.258 05:38:46 accel.accel_cdev_comp -- common/autotest_common.sh@10 -- # set +x 00:10:31.258 ************************************ 00:10:31.258 END TEST accel_cdev_comp 00:10:31.258 ************************************ 00:10:31.516 05:38:46 accel -- common/autotest_common.sh@1142 -- # return 0 00:10:31.516 05:38:46 accel -- accel/accel.sh@128 -- # run_test accel_cdev_decomp accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:10:31.516 05:38:46 accel -- common/autotest_common.sh@1099 -- # '[' 9 -le 1 ']' 00:10:31.516 05:38:46 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:31.516 05:38:46 accel -- common/autotest_common.sh@10 -- # set +x 00:10:31.516 ************************************ 00:10:31.516 START TEST accel_cdev_decomp 00:10:31.516 ************************************ 00:10:31.516 05:38:46 accel.accel_cdev_decomp -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:10:31.516 05:38:46 accel.accel_cdev_decomp -- accel/accel.sh@16 -- # local accel_opc 00:10:31.516 05:38:46 accel.accel_cdev_decomp -- accel/accel.sh@17 -- # local accel_module 00:10:31.516 05:38:46 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:10:31.516 05:38:46 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:10:31.516 05:38:46 accel.accel_cdev_decomp -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:10:31.517 05:38:46 accel.accel_cdev_decomp -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:10:31.517 05:38:46 accel.accel_cdev_decomp -- accel/accel.sh@12 -- # build_accel_config 00:10:31.517 05:38:46 
accel.accel_cdev_decomp -- accel/accel.sh@31 -- # accel_json_cfg=() 00:10:31.517 05:38:46 accel.accel_cdev_decomp -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:10:31.517 05:38:46 accel.accel_cdev_decomp -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:10:31.517 05:38:46 accel.accel_cdev_decomp -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:10:31.517 05:38:46 accel.accel_cdev_decomp -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:10:31.517 05:38:46 accel.accel_cdev_decomp -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:10:31.517 05:38:46 accel.accel_cdev_decomp -- accel/accel.sh@40 -- # local IFS=, 00:10:31.517 05:38:46 accel.accel_cdev_decomp -- accel/accel.sh@41 -- # jq -r . 00:10:31.517 [2024-07-26 05:38:46.262649] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 00:10:31.517 [2024-07-26 05:38:46.262719] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1104971 ] 00:10:31.517 [2024-07-26 05:38:46.378417] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:31.775 [2024-07-26 05:38:46.485368] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:32.710 [2024-07-26 05:38:47.254445] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:10:32.710 [2024-07-26 05:38:47.257114] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x2108080 PMD being used: compress_qat 00:10:32.710 05:38:47 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:10:32.710 05:38:47 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:10:32.710 05:38:47 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:10:32.710 05:38:47 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:10:32.710 05:38:47 
accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:10:32.710 05:38:47 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:10:32.710 05:38:47 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:10:32.710 [2024-07-26 05:38:47.261389] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x210ce60 PMD being used: compress_qat 00:10:32.710 05:38:47 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:10:32.710 05:38:47 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:10:32.710 05:38:47 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:10:32.710 05:38:47 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:10:32.710 05:38:47 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:10:32.710 05:38:47 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=0x1 00:10:32.710 05:38:47 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:10:32.710 05:38:47 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:10:32.710 05:38:47 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:10:32.710 05:38:47 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:10:32.710 05:38:47 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:10:32.710 05:38:47 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:10:32.710 05:38:47 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:10:32.710 05:38:47 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:10:32.710 05:38:47 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:10:32.710 05:38:47 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:10:32.710 05:38:47 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:10:32.710 05:38:47 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=decompress 00:10:32.710 05:38:47 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:10:32.710 05:38:47 accel.accel_cdev_decomp -- 
accel/accel.sh@23 -- # accel_opc=decompress 00:10:32.710 05:38:47 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:10:32.710 05:38:47 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:10:32.710 05:38:47 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val='4096 bytes' 00:10:32.710 05:38:47 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:10:32.710 05:38:47 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:10:32.710 05:38:47 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:10:32.710 05:38:47 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:10:32.710 05:38:47 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:10:32.710 05:38:47 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:10:32.710 05:38:47 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:10:32.710 05:38:47 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:10:32.710 05:38:47 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:10:32.710 05:38:47 accel.accel_cdev_decomp -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:10:32.710 05:38:47 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:10:32.710 05:38:47 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:10:32.710 05:38:47 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:10:32.710 05:38:47 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:10:32.710 05:38:47 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:10:32.710 05:38:47 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:10:32.710 05:38:47 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=32 00:10:32.710 05:38:47 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:10:32.710 05:38:47 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:10:32.710 05:38:47 
accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:10:32.710 05:38:47 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=32 00:10:32.710 05:38:47 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:10:32.710 05:38:47 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:10:32.710 05:38:47 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:10:32.710 05:38:47 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=1 00:10:32.710 05:38:47 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:10:32.710 05:38:47 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:10:32.710 05:38:47 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:10:32.710 05:38:47 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val='1 seconds' 00:10:32.710 05:38:47 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:10:32.710 05:38:47 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:10:32.710 05:38:47 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:10:32.710 05:38:47 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=Yes 00:10:32.710 05:38:47 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:10:32.710 05:38:47 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:10:32.710 05:38:47 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:10:32.710 05:38:47 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:10:32.710 05:38:47 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:10:32.710 05:38:47 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:10:32.710 05:38:47 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:10:32.710 05:38:47 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:10:32.710 05:38:47 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:10:32.710 05:38:47 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:10:32.710 05:38:47 
accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:10:33.647 05:38:48 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:10:33.647 05:38:48 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:10:33.647 05:38:48 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:10:33.647 05:38:48 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:10:33.647 05:38:48 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:10:33.647 05:38:48 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:10:33.647 05:38:48 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:10:33.647 05:38:48 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:10:33.647 05:38:48 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:10:33.647 05:38:48 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:10:33.647 05:38:48 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:10:33.647 05:38:48 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:10:33.647 05:38:48 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:10:33.647 05:38:48 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:10:33.647 05:38:48 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:10:33.647 05:38:48 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:10:33.647 05:38:48 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:10:33.647 05:38:48 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:10:33.647 05:38:48 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:10:33.647 05:38:48 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:10:33.647 05:38:48 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:10:33.647 05:38:48 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:10:33.647 05:38:48 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:10:33.647 05:38:48 
accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:10:33.647 05:38:48 accel.accel_cdev_decomp -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:10:33.647 05:38:48 accel.accel_cdev_decomp -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:10:33.647 05:38:48 accel.accel_cdev_decomp -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:10:33.647 00:10:33.647 real 0m2.201s 00:10:33.647 user 0m1.642s 00:10:33.647 sys 0m0.563s 00:10:33.647 05:38:48 accel.accel_cdev_decomp -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:33.647 05:38:48 accel.accel_cdev_decomp -- common/autotest_common.sh@10 -- # set +x 00:10:33.647 ************************************ 00:10:33.647 END TEST accel_cdev_decomp 00:10:33.648 ************************************ 00:10:33.648 05:38:48 accel -- common/autotest_common.sh@1142 -- # return 0 00:10:33.648 05:38:48 accel -- accel/accel.sh@129 -- # run_test accel_cdev_decomp_full accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:10:33.648 05:38:48 accel -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:10:33.648 05:38:48 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:33.648 05:38:48 accel -- common/autotest_common.sh@10 -- # set +x 00:10:33.648 ************************************ 00:10:33.648 START TEST accel_cdev_decomp_full 00:10:33.648 ************************************ 00:10:33.648 05:38:48 accel.accel_cdev_decomp_full -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:10:33.648 05:38:48 accel.accel_cdev_decomp_full -- accel/accel.sh@16 -- # local accel_opc 00:10:33.648 05:38:48 accel.accel_cdev_decomp_full -- accel/accel.sh@17 -- # local accel_module 00:10:33.648 05:38:48 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:10:33.648 05:38:48 accel.accel_cdev_decomp_full -- accel/accel.sh@19 
-- # read -r var val 00:10:33.648 05:38:48 accel.accel_cdev_decomp_full -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:10:33.648 05:38:48 accel.accel_cdev_decomp_full -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:10:33.648 05:38:48 accel.accel_cdev_decomp_full -- accel/accel.sh@12 -- # build_accel_config 00:10:33.648 05:38:48 accel.accel_cdev_decomp_full -- accel/accel.sh@31 -- # accel_json_cfg=() 00:10:33.648 05:38:48 accel.accel_cdev_decomp_full -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:10:33.648 05:38:48 accel.accel_cdev_decomp_full -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:10:33.648 05:38:48 accel.accel_cdev_decomp_full -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:10:33.648 05:38:48 accel.accel_cdev_decomp_full -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:10:33.648 05:38:48 accel.accel_cdev_decomp_full -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:10:33.648 05:38:48 accel.accel_cdev_decomp_full -- accel/accel.sh@40 -- # local IFS=, 00:10:33.648 05:38:48 accel.accel_cdev_decomp_full -- accel/accel.sh@41 -- # jq -r . 00:10:33.648 [2024-07-26 05:38:48.546409] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 
00:10:33.648 [2024-07-26 05:38:48.546473] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1105339 ] 00:10:33.906 [2024-07-26 05:38:48.677700] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:33.906 [2024-07-26 05:38:48.782789] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:34.843 [2024-07-26 05:38:49.568148] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:10:34.843 [2024-07-26 05:38:49.570799] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x12dc080 PMD being used: compress_qat 00:10:34.843 05:38:49 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:10:34.843 05:38:49 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:10:34.843 05:38:49 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:10:34.843 05:38:49 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:10:34.843 05:38:49 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:10:34.843 [2024-07-26 05:38:49.574338] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x12dbce0 PMD being used: compress_qat 00:10:34.843 05:38:49 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:10:34.843 05:38:49 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:10:34.843 05:38:49 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:10:34.843 05:38:49 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:10:34.843 05:38:49 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:10:34.843 05:38:49 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:10:34.843 05:38:49 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:10:34.843 05:38:49 
accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=0x1 00:10:34.843 05:38:49 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:10:34.843 05:38:49 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:10:34.843 05:38:49 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:10:34.843 05:38:49 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:10:34.843 05:38:49 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:10:34.843 05:38:49 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:10:34.843 05:38:49 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:10:34.843 05:38:49 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:10:34.843 05:38:49 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:10:34.843 05:38:49 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:10:34.843 05:38:49 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:10:34.843 05:38:49 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=decompress 00:10:34.843 05:38:49 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:10:34.843 05:38:49 accel.accel_cdev_decomp_full -- accel/accel.sh@23 -- # accel_opc=decompress 00:10:34.843 05:38:49 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:10:34.843 05:38:49 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:10:34.843 05:38:49 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val='111250 bytes' 00:10:34.843 05:38:49 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:10:34.843 05:38:49 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:10:34.843 05:38:49 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:10:34.843 05:38:49 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:10:34.843 05:38:49 accel.accel_cdev_decomp_full -- 
accel/accel.sh@21 -- # case "$var" in 00:10:34.843 05:38:49 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:10:34.843 05:38:49 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:10:34.843 05:38:49 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:10:34.843 05:38:49 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:10:34.843 05:38:49 accel.accel_cdev_decomp_full -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:10:34.843 05:38:49 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:10:34.843 05:38:49 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:10:34.843 05:38:49 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:10:34.843 05:38:49 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:10:34.843 05:38:49 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:10:34.843 05:38:49 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:10:34.843 05:38:49 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=32 00:10:34.843 05:38:49 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:10:34.843 05:38:49 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:10:34.843 05:38:49 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:10:34.843 05:38:49 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=32 00:10:34.843 05:38:49 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:10:34.844 05:38:49 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:10:34.844 05:38:49 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:10:34.844 05:38:49 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=1 00:10:34.844 05:38:49 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:10:34.844 05:38:49 
accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:10:34.844 05:38:49 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:10:34.844 05:38:49 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val='1 seconds' 00:10:34.844 05:38:49 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:10:34.844 05:38:49 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:10:34.844 05:38:49 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:10:34.844 05:38:49 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=Yes 00:10:34.844 05:38:49 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:10:34.844 05:38:49 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:10:34.844 05:38:49 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:10:34.844 05:38:49 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:10:34.844 05:38:49 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:10:34.844 05:38:49 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:10:34.844 05:38:49 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:10:34.844 05:38:49 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:10:34.844 05:38:49 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:10:34.844 05:38:49 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:10:34.844 05:38:49 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:10:36.221 05:38:50 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:10:36.221 05:38:50 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:10:36.221 05:38:50 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:10:36.221 05:38:50 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:10:36.221 05:38:50 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 
00:10:36.221 05:38:50 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:10:36.221 05:38:50 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:10:36.221 05:38:50 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:10:36.221 05:38:50 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:10:36.221 05:38:50 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:10:36.221 05:38:50 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:10:36.221 05:38:50 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:10:36.221 05:38:50 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:10:36.221 05:38:50 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:10:36.221 05:38:50 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:10:36.221 05:38:50 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:10:36.221 05:38:50 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:10:36.221 05:38:50 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:10:36.221 05:38:50 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:10:36.221 05:38:50 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:10:36.221 05:38:50 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:10:36.221 05:38:50 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:10:36.221 05:38:50 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:10:36.221 05:38:50 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:10:36.221 05:38:50 accel.accel_cdev_decomp_full -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:10:36.221 05:38:50 accel.accel_cdev_decomp_full -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:10:36.221 05:38:50 accel.accel_cdev_decomp_full -- accel/accel.sh@27 -- # [[ dpdk_compressdev == 
\d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:10:36.221 00:10:36.221 real 0m2.246s 00:10:36.221 user 0m1.638s 00:10:36.221 sys 0m0.609s 00:10:36.221 05:38:50 accel.accel_cdev_decomp_full -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:36.221 05:38:50 accel.accel_cdev_decomp_full -- common/autotest_common.sh@10 -- # set +x 00:10:36.221 ************************************ 00:10:36.221 END TEST accel_cdev_decomp_full 00:10:36.221 ************************************ 00:10:36.221 05:38:50 accel -- common/autotest_common.sh@1142 -- # return 0 00:10:36.221 05:38:50 accel -- accel/accel.sh@130 -- # run_test accel_cdev_decomp_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:10:36.221 05:38:50 accel -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:10:36.221 05:38:50 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:36.221 05:38:50 accel -- common/autotest_common.sh@10 -- # set +x 00:10:36.221 ************************************ 00:10:36.221 START TEST accel_cdev_decomp_mcore 00:10:36.221 ************************************ 00:10:36.221 05:38:50 accel.accel_cdev_decomp_mcore -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:10:36.221 05:38:50 accel.accel_cdev_decomp_mcore -- accel/accel.sh@16 -- # local accel_opc 00:10:36.221 05:38:50 accel.accel_cdev_decomp_mcore -- accel/accel.sh@17 -- # local accel_module 00:10:36.221 05:38:50 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:10:36.221 05:38:50 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:10:36.221 05:38:50 accel.accel_cdev_decomp_mcore -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:10:36.221 05:38:50 accel.accel_cdev_decomp_mcore -- accel/accel.sh@12 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:10:36.221 05:38:50 accel.accel_cdev_decomp_mcore -- accel/accel.sh@12 -- # build_accel_config 00:10:36.221 05:38:50 accel.accel_cdev_decomp_mcore -- accel/accel.sh@31 -- # accel_json_cfg=() 00:10:36.221 05:38:50 accel.accel_cdev_decomp_mcore -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:10:36.221 05:38:50 accel.accel_cdev_decomp_mcore -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:10:36.221 05:38:50 accel.accel_cdev_decomp_mcore -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:10:36.221 05:38:50 accel.accel_cdev_decomp_mcore -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:10:36.221 05:38:50 accel.accel_cdev_decomp_mcore -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:10:36.221 05:38:50 accel.accel_cdev_decomp_mcore -- accel/accel.sh@40 -- # local IFS=, 00:10:36.221 05:38:50 accel.accel_cdev_decomp_mcore -- accel/accel.sh@41 -- # jq -r . 00:10:36.221 [2024-07-26 05:38:50.874891] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 
00:10:36.221 [2024-07-26 05:38:50.874951] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1105566 ] 00:10:36.221 [2024-07-26 05:38:51.005738] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 4 00:10:36.221 [2024-07-26 05:38:51.107286] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:10:36.221 [2024-07-26 05:38:51.107371] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:10:36.221 [2024-07-26 05:38:51.107447] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 3 00:10:36.221 [2024-07-26 05:38:51.107453] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:37.158 [2024-07-26 05:38:51.865437] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:10:37.158 [2024-07-26 05:38:51.868065] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x2349720 PMD being used: compress_qat 00:10:37.158 05:38:51 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:10:37.158 05:38:51 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:10:37.158 05:38:51 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:10:37.158 05:38:51 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:10:37.158 05:38:51 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:10:37.158 05:38:51 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:10:37.158 05:38:51 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:10:37.158 05:38:51 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:10:37.158 05:38:51 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:10:37.158 05:38:51 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:10:37.158 
05:38:51 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:10:37.158 05:38:51 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:10:37.158 [2024-07-26 05:38:51.873711] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7fba2c19b8b0 PMD being used: compress_qat 00:10:37.158 05:38:51 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=0xf 00:10:37.158 05:38:51 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:10:37.158 05:38:51 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:10:37.158 [2024-07-26 05:38:51.874413] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7fba2419b8b0 PMD being used: compress_qat 00:10:37.158 05:38:51 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:10:37.158 05:38:51 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:10:37.158 05:38:51 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:10:37.158 05:38:51 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:10:37.158 05:38:51 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:10:37.158 [2024-07-26 05:38:51.875522] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x234e9f0 PMD being used: compress_qat 00:10:37.158 05:38:51 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:10:37.158 [2024-07-26 05:38:51.875773] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7fba1c19b8b0 PMD being used: compress_qat 00:10:37.158 05:38:51 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:10:37.158 05:38:51 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:10:37.158 05:38:51 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:10:37.158 05:38:51 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=decompress 00:10:37.158 05:38:51 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:10:37.158 
05:38:51 accel.accel_cdev_decomp_mcore -- accel/accel.sh@23 -- # accel_opc=decompress 00:10:37.158 05:38:51 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:10:37.158 05:38:51 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:10:37.158 05:38:51 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val='4096 bytes' 00:10:37.158 05:38:51 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:10:37.158 05:38:51 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:10:37.158 05:38:51 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:10:37.158 05:38:51 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:10:37.158 05:38:51 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:10:37.158 05:38:51 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:10:37.158 05:38:51 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:10:37.158 05:38:51 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:10:37.158 05:38:51 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:10:37.158 05:38:51 accel.accel_cdev_decomp_mcore -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:10:37.158 05:38:51 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:10:37.158 05:38:51 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:10:37.158 05:38:51 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:10:37.158 05:38:51 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:10:37.158 05:38:51 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:10:37.158 05:38:51 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:10:37.158 05:38:51 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=32 00:10:37.158 05:38:51 
accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:10:37.158 05:38:51 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:10:37.158 05:38:51 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:10:37.158 05:38:51 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=32 00:10:37.158 05:38:51 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:10:37.158 05:38:51 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:10:37.158 05:38:51 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:10:37.158 05:38:51 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=1 00:10:37.158 05:38:51 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:10:37.158 05:38:51 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:10:37.158 05:38:51 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:10:37.158 05:38:51 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val='1 seconds' 00:10:37.158 05:38:51 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:10:37.158 05:38:51 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:10:37.158 05:38:51 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:10:37.158 05:38:51 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=Yes 00:10:37.158 05:38:51 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:10:37.158 05:38:51 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:10:37.159 05:38:51 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:10:37.159 05:38:51 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:10:37.159 05:38:51 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:10:37.159 05:38:51 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:10:37.159 05:38:51 accel.accel_cdev_decomp_mcore -- 
accel/accel.sh@19 -- # read -r var val 00:10:37.159 05:38:51 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:10:37.159 05:38:51 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:10:37.159 05:38:51 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:10:37.159 05:38:51 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:10:38.536 05:38:53 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:10:38.536 05:38:53 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:10:38.536 05:38:53 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:10:38.536 05:38:53 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:10:38.536 05:38:53 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:10:38.536 05:38:53 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:10:38.536 05:38:53 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:10:38.536 05:38:53 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:10:38.536 05:38:53 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:10:38.536 05:38:53 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:10:38.536 05:38:53 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:10:38.536 05:38:53 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:10:38.536 05:38:53 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:10:38.536 05:38:53 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:10:38.536 05:38:53 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:10:38.536 05:38:53 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:10:38.536 05:38:53 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:10:38.536 05:38:53 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:10:38.536 
05:38:53 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:10:38.536 05:38:53 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:10:38.536 05:38:53 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:10:38.536 05:38:53 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:10:38.536 05:38:53 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:10:38.536 05:38:53 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:10:38.536 05:38:53 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:10:38.536 05:38:53 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:10:38.536 05:38:53 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:10:38.536 05:38:53 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:10:38.536 05:38:53 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:10:38.536 05:38:53 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:10:38.536 05:38:53 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:10:38.536 05:38:53 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:10:38.536 05:38:53 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:10:38.536 05:38:53 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:10:38.536 05:38:53 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:10:38.536 05:38:53 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:10:38.536 05:38:53 accel.accel_cdev_decomp_mcore -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:10:38.536 05:38:53 accel.accel_cdev_decomp_mcore -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:10:38.536 05:38:53 accel.accel_cdev_decomp_mcore -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:10:38.536 00:10:38.536 real 0m2.235s 00:10:38.536 user 0m7.220s 00:10:38.536 
sys 0m0.596s 00:10:38.536 05:38:53 accel.accel_cdev_decomp_mcore -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:38.536 05:38:53 accel.accel_cdev_decomp_mcore -- common/autotest_common.sh@10 -- # set +x 00:10:38.536 ************************************ 00:10:38.536 END TEST accel_cdev_decomp_mcore 00:10:38.536 ************************************ 00:10:38.536 05:38:53 accel -- common/autotest_common.sh@1142 -- # return 0 00:10:38.536 05:38:53 accel -- accel/accel.sh@131 -- # run_test accel_cdev_decomp_full_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:10:38.536 05:38:53 accel -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:10:38.536 05:38:53 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:38.536 05:38:53 accel -- common/autotest_common.sh@10 -- # set +x 00:10:38.536 ************************************ 00:10:38.536 START TEST accel_cdev_decomp_full_mcore 00:10:38.536 ************************************ 00:10:38.536 05:38:53 accel.accel_cdev_decomp_full_mcore -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:10:38.536 05:38:53 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@16 -- # local accel_opc 00:10:38.536 05:38:53 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@17 -- # local accel_module 00:10:38.536 05:38:53 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:10:38.536 05:38:53 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:10:38.536 05:38:53 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:10:38.536 05:38:53 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 
-w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:10:38.536 05:38:53 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@12 -- # build_accel_config 00:10:38.536 05:38:53 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@31 -- # accel_json_cfg=() 00:10:38.536 05:38:53 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:10:38.536 05:38:53 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:10:38.536 05:38:53 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:10:38.536 05:38:53 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:10:38.537 05:38:53 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:10:38.537 05:38:53 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@40 -- # local IFS=, 00:10:38.537 05:38:53 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@41 -- # jq -r . 00:10:38.537 [2024-07-26 05:38:53.181507] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 
00:10:38.537 [2024-07-26 05:38:53.181565] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1105911 ] 00:10:38.537 [2024-07-26 05:38:53.310177] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 4 00:10:38.537 [2024-07-26 05:38:53.415696] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:10:38.537 [2024-07-26 05:38:53.415783] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:10:38.537 [2024-07-26 05:38:53.415862] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 3 00:10:38.537 [2024-07-26 05:38:53.415870] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:39.473 [2024-07-26 05:38:54.178220] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:10:39.473 [2024-07-26 05:38:54.180819] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0xbdb720 PMD being used: compress_qat 00:10:39.473 05:38:54 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:10:39.473 05:38:54 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:10:39.473 05:38:54 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:10:39.473 05:38:54 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:10:39.473 05:38:54 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:10:39.473 05:38:54 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:10:39.473 05:38:54 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:10:39.473 05:38:54 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:10:39.473 05:38:54 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:10:39.473 05:38:54 accel.accel_cdev_decomp_full_mcore -- 
accel/accel.sh@21 -- # case "$var" in 00:10:39.473 05:38:54 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:10:39.473 05:38:54 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:10:39.473 [2024-07-26 05:38:54.185615] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f6b2c19b8b0 PMD being used: compress_qat 00:10:39.473 05:38:54 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=0xf 00:10:39.473 05:38:54 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:10:39.473 05:38:54 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:10:39.473 [2024-07-26 05:38:54.186279] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f6b2419b8b0 PMD being used: compress_qat 00:10:39.473 05:38:54 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:10:39.473 05:38:54 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:10:39.473 05:38:54 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:10:39.473 05:38:54 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:10:39.473 05:38:54 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:10:39.473 [2024-07-26 05:38:54.187416] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0xbdea30 PMD being used: compress_qat 00:10:39.473 05:38:54 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:10:39.473 [2024-07-26 05:38:54.187642] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f6b1c19b8b0 PMD being used: compress_qat 00:10:39.473 05:38:54 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:10:39.473 05:38:54 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:10:39.473 05:38:54 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:10:39.473 05:38:54 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 
-- # val=decompress 00:10:39.473 05:38:54 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:10:39.473 05:38:54 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@23 -- # accel_opc=decompress 00:10:39.473 05:38:54 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:10:39.473 05:38:54 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:10:39.473 05:38:54 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val='111250 bytes' 00:10:39.473 05:38:54 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:10:39.473 05:38:54 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:10:39.473 05:38:54 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:10:39.473 05:38:54 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:10:39.473 05:38:54 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:10:39.473 05:38:54 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:10:39.473 05:38:54 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:10:39.473 05:38:54 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:10:39.473 05:38:54 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:10:39.473 05:38:54 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:10:39.474 05:38:54 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:10:39.474 05:38:54 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:10:39.474 05:38:54 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:10:39.474 05:38:54 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:10:39.474 05:38:54 accel.accel_cdev_decomp_full_mcore -- 
accel/accel.sh@19 -- # IFS=: 00:10:39.474 05:38:54 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:10:39.474 05:38:54 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=32 00:10:39.474 05:38:54 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:10:39.474 05:38:54 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:10:39.474 05:38:54 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:10:39.474 05:38:54 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=32 00:10:39.474 05:38:54 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:10:39.474 05:38:54 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:10:39.474 05:38:54 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:10:39.474 05:38:54 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=1 00:10:39.474 05:38:54 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:10:39.474 05:38:54 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:10:39.474 05:38:54 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:10:39.474 05:38:54 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val='1 seconds' 00:10:39.474 05:38:54 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:10:39.474 05:38:54 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:10:39.474 05:38:54 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:10:39.474 05:38:54 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=Yes 00:10:39.474 05:38:54 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:10:39.474 05:38:54 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:10:39.474 05:38:54 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- 
# read -r var val 00:10:39.474 05:38:54 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:10:39.474 05:38:54 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:10:39.474 05:38:54 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:10:39.474 05:38:54 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:10:39.474 05:38:54 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:10:39.474 05:38:54 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:10:39.474 05:38:54 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:10:39.474 05:38:54 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:10:40.855 05:38:55 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:10:40.855 05:38:55 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:10:40.855 05:38:55 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:10:40.855 05:38:55 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:10:40.855 05:38:55 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:10:40.855 05:38:55 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:10:40.855 05:38:55 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:10:40.855 05:38:55 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:10:40.855 05:38:55 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:10:40.855 05:38:55 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:10:40.855 05:38:55 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:10:40.855 05:38:55 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:10:40.855 05:38:55 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:10:40.855 05:38:55 
accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:10:40.855 05:38:55 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:10:40.855 05:38:55 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:10:40.855 05:38:55 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:10:40.855 05:38:55 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:10:40.855 05:38:55 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:10:40.855 05:38:55 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:10:40.855 05:38:55 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:10:40.855 05:38:55 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:10:40.855 05:38:55 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:10:40.855 05:38:55 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:10:40.855 05:38:55 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:10:40.855 05:38:55 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:10:40.855 05:38:55 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:10:40.855 05:38:55 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:10:40.855 05:38:55 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:10:40.855 05:38:55 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:10:40.855 05:38:55 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:10:40.855 05:38:55 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:10:40.855 05:38:55 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:10:40.855 05:38:55 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:10:40.855 05:38:55 
accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:10:40.855 05:38:55 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:10:40.855 05:38:55 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:10:40.855 05:38:55 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:10:40.855 05:38:55 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:10:40.855 00:10:40.855 real 0m2.233s 00:10:40.855 user 0m7.245s 00:10:40.855 sys 0m0.564s 00:10:40.855 05:38:55 accel.accel_cdev_decomp_full_mcore -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:40.855 05:38:55 accel.accel_cdev_decomp_full_mcore -- common/autotest_common.sh@10 -- # set +x 00:10:40.855 ************************************ 00:10:40.855 END TEST accel_cdev_decomp_full_mcore 00:10:40.855 ************************************ 00:10:40.855 05:38:55 accel -- common/autotest_common.sh@1142 -- # return 0 00:10:40.855 05:38:55 accel -- accel/accel.sh@132 -- # run_test accel_cdev_decomp_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:10:40.855 05:38:55 accel -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:10:40.855 05:38:55 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:40.855 05:38:55 accel -- common/autotest_common.sh@10 -- # set +x 00:10:40.855 ************************************ 00:10:40.855 START TEST accel_cdev_decomp_mthread 00:10:40.855 ************************************ 00:10:40.855 05:38:55 accel.accel_cdev_decomp_mthread -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:10:40.855 05:38:55 accel.accel_cdev_decomp_mthread -- accel/accel.sh@16 -- # local accel_opc 00:10:40.855 05:38:55 accel.accel_cdev_decomp_mthread -- 
accel/accel.sh@17 -- # local accel_module 00:10:40.855 05:38:55 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:10:40.855 05:38:55 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:10:40.855 05:38:55 accel.accel_cdev_decomp_mthread -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:10:40.855 05:38:55 accel.accel_cdev_decomp_mthread -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:10:40.855 05:38:55 accel.accel_cdev_decomp_mthread -- accel/accel.sh@12 -- # build_accel_config 00:10:40.855 05:38:55 accel.accel_cdev_decomp_mthread -- accel/accel.sh@31 -- # accel_json_cfg=() 00:10:40.855 05:38:55 accel.accel_cdev_decomp_mthread -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:10:40.855 05:38:55 accel.accel_cdev_decomp_mthread -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:10:40.855 05:38:55 accel.accel_cdev_decomp_mthread -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:10:40.855 05:38:55 accel.accel_cdev_decomp_mthread -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:10:40.855 05:38:55 accel.accel_cdev_decomp_mthread -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:10:40.855 05:38:55 accel.accel_cdev_decomp_mthread -- accel/accel.sh@40 -- # local IFS=, 00:10:40.855 05:38:55 accel.accel_cdev_decomp_mthread -- accel/accel.sh@41 -- # jq -r . 00:10:40.855 [2024-07-26 05:38:55.493673] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 
00:10:40.855 [2024-07-26 05:38:55.493733] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1106278 ] 00:10:40.855 [2024-07-26 05:38:55.623765] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:40.855 [2024-07-26 05:38:55.724309] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:41.795 [2024-07-26 05:38:56.485455] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:10:41.795 [2024-07-26 05:38:56.488182] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1dff080 PMD being used: compress_qat 00:10:41.795 05:38:56 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:10:41.795 05:38:56 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:10:41.795 05:38:56 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:10:41.795 05:38:56 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:10:41.795 05:38:56 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:10:41.795 05:38:56 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:10:41.795 05:38:56 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:10:41.795 05:38:56 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:10:41.796 05:38:56 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:10:41.796 [2024-07-26 05:38:56.493167] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1e042a0 PMD being used: compress_qat 00:10:41.796 05:38:56 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:10:41.796 05:38:56 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:10:41.796 05:38:56 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r 
var val 00:10:41.796 05:38:56 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=0x1 00:10:41.796 05:38:56 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:10:41.796 05:38:56 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:10:41.796 05:38:56 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:10:41.796 05:38:56 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:10:41.796 05:38:56 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:10:41.796 05:38:56 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:10:41.796 05:38:56 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:10:41.796 05:38:56 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:10:41.796 [2024-07-26 05:38:56.495769] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1f270f0 PMD being used: compress_qat 00:10:41.796 05:38:56 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:10:41.796 05:38:56 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:10:41.796 05:38:56 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:10:41.796 05:38:56 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=decompress 00:10:41.796 05:38:56 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:10:41.796 05:38:56 accel.accel_cdev_decomp_mthread -- accel/accel.sh@23 -- # accel_opc=decompress 00:10:41.796 05:38:56 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:10:41.796 05:38:56 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:10:41.796 05:38:56 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val='4096 bytes' 00:10:41.796 05:38:56 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:10:41.796 05:38:56 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:10:41.796 
05:38:56 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:10:41.796 05:38:56 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:10:41.796 05:38:56 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:10:41.796 05:38:56 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:10:41.796 05:38:56 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:10:41.796 05:38:56 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:10:41.796 05:38:56 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:10:41.796 05:38:56 accel.accel_cdev_decomp_mthread -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:10:41.796 05:38:56 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:10:41.796 05:38:56 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:10:41.796 05:38:56 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:10:41.796 05:38:56 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:10:41.796 05:38:56 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:10:41.796 05:38:56 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:10:41.796 05:38:56 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=32 00:10:41.796 05:38:56 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:10:41.796 05:38:56 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:10:41.796 05:38:56 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:10:41.796 05:38:56 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=32 00:10:41.796 05:38:56 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:10:41.796 05:38:56 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 
00:10:41.796 05:38:56 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:10:41.796 05:38:56 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=2 00:10:41.796 05:38:56 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:10:41.796 05:38:56 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:10:41.796 05:38:56 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:10:41.796 05:38:56 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val='1 seconds' 00:10:41.796 05:38:56 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:10:41.796 05:38:56 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:10:41.796 05:38:56 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:10:41.796 05:38:56 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=Yes 00:10:41.796 05:38:56 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:10:41.796 05:38:56 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:10:41.796 05:38:56 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:10:41.796 05:38:56 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:10:41.796 05:38:56 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:10:41.796 05:38:56 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:10:41.796 05:38:56 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:10:41.796 05:38:56 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:10:41.796 05:38:56 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:10:41.796 05:38:56 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:10:41.796 05:38:56 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:10:43.176 05:38:57 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- 
# val= 00:10:43.176 05:38:57 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:10:43.176 05:38:57 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:10:43.176 05:38:57 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:10:43.176 05:38:57 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:10:43.176 05:38:57 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:10:43.176 05:38:57 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:10:43.176 05:38:57 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:10:43.176 05:38:57 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:10:43.176 05:38:57 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:10:43.176 05:38:57 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:10:43.176 05:38:57 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:10:43.176 05:38:57 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:10:43.176 05:38:57 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:10:43.176 05:38:57 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:10:43.176 05:38:57 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:10:43.176 05:38:57 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:10:43.176 05:38:57 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:10:43.176 05:38:57 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:10:43.176 05:38:57 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:10:43.176 05:38:57 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:10:43.176 05:38:57 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:10:43.176 05:38:57 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 
00:10:43.176 05:38:57 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:10:43.176 05:38:57 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:10:43.176 05:38:57 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:10:43.176 05:38:57 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:10:43.176 05:38:57 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:10:43.176 05:38:57 accel.accel_cdev_decomp_mthread -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:10:43.176 05:38:57 accel.accel_cdev_decomp_mthread -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:10:43.176 05:38:57 accel.accel_cdev_decomp_mthread -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:10:43.176 00:10:43.176 real 0m2.206s 00:10:43.176 user 0m1.633s 00:10:43.176 sys 0m0.577s 00:10:43.176 05:38:57 accel.accel_cdev_decomp_mthread -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:43.176 05:38:57 accel.accel_cdev_decomp_mthread -- common/autotest_common.sh@10 -- # set +x 00:10:43.176 ************************************ 00:10:43.176 END TEST accel_cdev_decomp_mthread 00:10:43.176 ************************************ 00:10:43.176 05:38:57 accel -- common/autotest_common.sh@1142 -- # return 0 00:10:43.176 05:38:57 accel -- accel/accel.sh@133 -- # run_test accel_cdev_decomp_full_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:10:43.176 05:38:57 accel -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:10:43.176 05:38:57 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:43.176 05:38:57 accel -- common/autotest_common.sh@10 -- # set +x 00:10:43.176 ************************************ 00:10:43.176 START TEST accel_cdev_decomp_full_mthread 00:10:43.176 ************************************ 00:10:43.176 05:38:57 accel.accel_cdev_decomp_full_mthread -- 
common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:10:43.176 05:38:57 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@16 -- # local accel_opc 00:10:43.176 05:38:57 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@17 -- # local accel_module 00:10:43.176 05:38:57 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:10:43.176 05:38:57 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:10:43.176 05:38:57 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:10:43.176 05:38:57 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:10:43.176 05:38:57 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@12 -- # build_accel_config 00:10:43.176 05:38:57 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@31 -- # accel_json_cfg=() 00:10:43.176 05:38:57 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:10:43.176 05:38:57 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:10:43.176 05:38:57 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:10:43.176 05:38:57 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:10:43.176 05:38:57 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:10:43.176 05:38:57 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@40 -- # local IFS=, 00:10:43.176 05:38:57 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@41 -- # jq -r . 
00:10:43.176 [2024-07-26 05:38:57.781208] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 00:10:43.176 [2024-07-26 05:38:57.781268] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1106576 ] 00:10:43.176 [2024-07-26 05:38:57.911824] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:43.176 [2024-07-26 05:38:58.011962] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:44.116 [2024-07-26 05:38:58.787070] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:10:44.116 [2024-07-26 05:38:58.789808] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x2538080 PMD being used: compress_qat 00:10:44.116 05:38:58 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:10:44.116 05:38:58 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:10:44.116 05:38:58 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:10:44.116 05:38:58 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:10:44.116 05:38:58 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:10:44.116 05:38:58 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:10:44.116 05:38:58 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:10:44.116 05:38:58 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:10:44.116 05:38:58 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:10:44.116 [2024-07-26 05:38:58.794235] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x253b3b0 PMD being used: compress_qat 00:10:44.116 05:38:58 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 
00:10:44.116 05:38:58 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:10:44.116 05:38:58 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:10:44.116 05:38:58 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=0x1 00:10:44.116 05:38:58 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:10:44.116 05:38:58 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:10:44.116 05:38:58 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:10:44.116 05:38:58 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:10:44.116 05:38:58 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:10:44.116 05:38:58 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:10:44.116 05:38:58 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:10:44.116 05:38:58 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:10:44.116 [2024-07-26 05:38:58.797212] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x265fcc0 PMD being used: compress_qat 00:10:44.116 05:38:58 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:10:44.116 05:38:58 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:10:44.116 05:38:58 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:10:44.116 05:38:58 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=decompress 00:10:44.116 05:38:58 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:10:44.116 05:38:58 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@23 -- # accel_opc=decompress 00:10:44.116 05:38:58 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:10:44.116 05:38:58 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:10:44.116 05:38:58 
accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val='111250 bytes' 00:10:44.116 05:38:58 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:10:44.116 05:38:58 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:10:44.116 05:38:58 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:10:44.116 05:38:58 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:10:44.116 05:38:58 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:10:44.116 05:38:58 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:10:44.116 05:38:58 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:10:44.116 05:38:58 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:10:44.116 05:38:58 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:10:44.116 05:38:58 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:10:44.116 05:38:58 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:10:44.116 05:38:58 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:10:44.116 05:38:58 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:10:44.116 05:38:58 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:10:44.116 05:38:58 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:10:44.116 05:38:58 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:10:44.116 05:38:58 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=32 00:10:44.116 05:38:58 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:10:44.116 05:38:58 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 
00:10:44.116 05:38:58 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val
00:10:44.116 05:38:58 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=32
00:10:44.116 05:38:58 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in
00:10:44.116 05:38:58 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=:
00:10:44.116 05:38:58 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val
00:10:44.116 05:38:58 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=2
00:10:44.116 05:38:58 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in
00:10:44.116 05:38:58 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=:
00:10:44.116 05:38:58 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val
00:10:44.116 05:38:58 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val='1 seconds'
00:10:44.116 05:38:58 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in
00:10:44.116 05:38:58 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=:
00:10:44.116 05:38:58 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val
00:10:44.116 05:38:58 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=Yes
00:10:44.116 05:38:58 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in
00:10:44.116 05:38:58 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=:
00:10:44.116 05:38:58 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val
00:10:44.116 05:38:58 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=
00:10:44.116 05:38:58 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in
00:10:44.116 05:38:58 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=:
00:10:44.116 05:38:58 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val
00:10:44.116 05:38:58 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=
00:10:44.116 05:38:58 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in
00:10:44.116 05:38:58 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=:
00:10:44.116 05:38:58 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val
00:10:45.497 05:38:59 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=
00:10:45.497 05:38:59 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in
00:10:45.497 05:38:59 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=:
00:10:45.497 05:38:59 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val
00:10:45.497 05:38:59 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=
00:10:45.497 05:38:59 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in
00:10:45.497 05:38:59 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=:
00:10:45.497 05:38:59 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val
00:10:45.497 05:38:59 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=
00:10:45.497 05:38:59 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in
00:10:45.497 05:38:59 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=:
00:10:45.497 05:38:59 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val
00:10:45.497 05:38:59 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=
00:10:45.497 05:38:59 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in
00:10:45.497 05:38:59 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=:
00:10:45.497 05:38:59 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val
00:10:45.497 05:38:59 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=
00:10:45.497 05:38:59 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in
00:10:45.497 05:38:59 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=:
00:10:45.497 05:38:59 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val
00:10:45.497 05:38:59 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=
00:10:45.497 05:38:59 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in
00:10:45.497 05:38:59 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=:
00:10:45.497 05:38:59 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val
00:10:45.497 05:38:59 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=
00:10:45.497 05:38:59 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in
00:10:45.497 05:38:59 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=:
00:10:45.497 05:38:59 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val
00:10:45.497 05:38:59 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]]
00:10:45.497 05:38:59 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@27 -- # [[ -n decompress ]]
00:10:45.497 05:38:59 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]]
00:10:45.497 
00:10:45.497 real	0m2.237s
00:10:45.497 user	0m1.637s
00:10:45.497 sys	0m0.600s
00:10:45.497 05:38:59 accel.accel_cdev_decomp_full_mthread -- common/autotest_common.sh@1124 -- # xtrace_disable
00:10:45.497 05:38:59 accel.accel_cdev_decomp_full_mthread -- common/autotest_common.sh@10 -- # set +x
00:10:45.497 ************************************
00:10:45.497 END TEST accel_cdev_decomp_full_mthread
00:10:45.497 ************************************
00:10:45.497 05:39:00 accel -- common/autotest_common.sh@1142 -- # return 0
00:10:45.497 05:39:00 accel -- accel/accel.sh@134 -- # unset COMPRESSDEV
00:10:45.497 05:39:00 accel -- accel/accel.sh@137 -- # run_test accel_dif_functional_tests /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/dif/dif -c /dev/fd/62
00:10:45.497 05:39:00 accel -- accel/accel.sh@137 -- # build_accel_config
00:10:45.497 05:39:00 accel -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']'
00:10:45.497 05:39:00 accel -- common/autotest_common.sh@1105 -- # xtrace_disable
00:10:45.498 05:39:00 accel -- accel/accel.sh@31 -- # accel_json_cfg=()
00:10:45.498 05:39:00 accel -- common/autotest_common.sh@10 -- # set +x
00:10:45.498 05:39:00 accel -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]]
00:10:45.498 05:39:00 accel -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]]
00:10:45.498 05:39:00 accel -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]]
00:10:45.498 05:39:00 accel -- accel/accel.sh@36 -- # [[ -n '' ]]
00:10:45.498 05:39:00 accel -- accel/accel.sh@40 -- # local IFS=,
00:10:45.498 05:39:00 accel -- accel/accel.sh@41 -- # jq -r .
00:10:45.498 ************************************
00:10:45.498 START TEST accel_dif_functional_tests
00:10:45.498 ************************************
00:10:45.498 05:39:00 accel.accel_dif_functional_tests -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/dif/dif -c /dev/fd/62
00:10:45.498 [2024-07-26 05:39:00.115282] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization...
00:10:45.498 [2024-07-26 05:39:00.115343] [ DPDK EAL parameters: DIF --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1106866 ]
00:10:45.498 [2024-07-26 05:39:00.244069] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3
00:10:45.498 [2024-07-26 05:39:00.353934] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1
00:10:45.498 [2024-07-26 05:39:00.354019] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2
00:10:45.498 [2024-07-26 05:39:00.354027] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:10:45.757 
00:10:45.757 
00:10:45.757      CUnit - A unit testing framework for C - Version 2.1-3
00:10:45.757      http://cunit.sourceforge.net/
00:10:45.757 
00:10:45.757 
00:10:45.757 Suite: accel_dif
00:10:45.757   Test: verify: DIF generated, GUARD check ...passed
00:10:45.758   Test: verify: DIF generated, APPTAG check ...passed
00:10:45.758   Test: verify: DIF generated, REFTAG check ...passed
00:10:45.758   Test: verify: DIF not generated, GUARD check ...[2024-07-26 05:39:00.443090] dif.c: 861:_dif_verify: *ERROR*: Failed to compare Guard: LBA=10, Expected=5a5a, Actual=7867
00:10:45.758 passed
00:10:45.758   Test: verify: DIF not generated, APPTAG check ...[2024-07-26 05:39:00.443163] dif.c: 876:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=10, Expected=14, Actual=5a5a
00:10:45.758 passed
00:10:45.758   Test: verify: DIF not generated, REFTAG check ...[2024-07-26 05:39:00.443199] dif.c: 811:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=5a5a5a5a
00:10:45.758 passed
00:10:45.758   Test: verify: APPTAG correct, APPTAG check ...passed
00:10:45.758   Test: verify: APPTAG incorrect, APPTAG check ...[2024-07-26 05:39:00.443276] dif.c: 876:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=30, Expected=28, Actual=14
00:10:45.758 passed
00:10:45.758   Test: verify: APPTAG incorrect, no APPTAG check ...passed
00:10:45.758   Test: verify: REFTAG incorrect, REFTAG ignore ...passed
00:10:45.758   Test: verify: REFTAG_INIT correct, REFTAG check ...passed
00:10:45.758   Test: verify: REFTAG_INIT incorrect, REFTAG check ...[2024-07-26 05:39:00.443450] dif.c: 811:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=10
00:10:45.758 passed
00:10:45.758   Test: verify copy: DIF generated, GUARD check ...passed
00:10:45.758   Test: verify copy: DIF generated, APPTAG check ...passed
00:10:45.758   Test: verify copy: DIF generated, REFTAG check ...passed
00:10:45.758   Test: verify copy: DIF not generated, GUARD check ...[2024-07-26 05:39:00.443623] dif.c: 861:_dif_verify: *ERROR*: Failed to compare Guard: LBA=10, Expected=5a5a, Actual=7867
00:10:45.758 passed
00:10:45.758   Test: verify copy: DIF not generated, APPTAG check ...[2024-07-26 05:39:00.443671] dif.c: 876:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=10, Expected=14, Actual=5a5a
00:10:45.758 passed
00:10:45.758   Test: verify copy: DIF not generated, REFTAG check ...[2024-07-26 05:39:00.443718] dif.c: 811:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=5a5a5a5a
00:10:45.758 passed
00:10:45.758   Test: generate copy: DIF generated, GUARD check ...passed
00:10:45.758   Test: generate copy: DIF generated, APTTAG check ...passed
00:10:45.758   Test: generate copy: DIF generated, REFTAG check ...passed
00:10:45.758   Test: generate copy: DIF generated, no GUARD check flag set ...passed
00:10:45.758   Test: generate copy: DIF generated, no APPTAG check flag set ...passed
00:10:45.758   Test: generate copy: DIF generated, no REFTAG check flag set ...passed
00:10:45.758   Test: generate copy: iovecs-len validate ...[2024-07-26 05:39:00.443976] dif.c:1225:spdk_dif_generate_copy: *ERROR*: Size of bounce_iovs arrays are not valid or misaligned with block_size.
00:10:45.758 passed
00:10:45.758   Test: generate copy: buffer alignment validate ...passed
00:10:45.758 
00:10:45.758 Run Summary:    Type  Total    Ran Passed Failed Inactive
00:10:45.758               suites      1      1    n/a      0        0
00:10:45.758                tests     26     26     26      0        0
00:10:45.758              asserts    115    115    115      0      n/a
00:10:45.758 
00:10:45.758 Elapsed time =    0.003 seconds
00:10:45.758 
00:10:45.758 real	0m0.579s
00:10:45.758 user	0m0.790s
00:10:45.758 sys	0m0.217s
00:10:45.758 05:39:00 accel.accel_dif_functional_tests -- common/autotest_common.sh@1124 -- # xtrace_disable
00:10:45.758 05:39:00 accel.accel_dif_functional_tests -- common/autotest_common.sh@10 -- # set +x
00:10:45.758 ************************************
00:10:45.758 END TEST accel_dif_functional_tests
00:10:45.758 ************************************
00:10:46.017 05:39:00 accel -- common/autotest_common.sh@1142 -- # return 0
00:10:46.017 
00:10:46.017 real	0m53.558s
00:10:46.017 user	1m1.598s
00:10:46.017 sys	0m12.020s
00:10:46.017 05:39:00 accel -- common/autotest_common.sh@1124 -- # xtrace_disable
00:10:46.017 05:39:00 accel -- common/autotest_common.sh@10 -- # set +x
00:10:46.017 ************************************
00:10:46.017 END TEST accel
00:10:46.017 ************************************
00:10:46.017 05:39:00 -- common/autotest_common.sh@1142 -- # return 0
00:10:46.017 05:39:00 -- spdk/autotest.sh@184 -- # run_test accel_rpc /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/accel_rpc.sh
00:10:46.017 05:39:00 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']'
00:10:46.017 05:39:00 -- common/autotest_common.sh@1105 -- # xtrace_disable
00:10:46.017 05:39:00 -- common/autotest_common.sh@10 -- # set +x
00:10:46.017 ************************************
00:10:46.017 START TEST accel_rpc
00:10:46.017 ************************************
00:10:46.017 05:39:00 accel_rpc -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/accel_rpc.sh
00:10:46.017 * Looking for test storage...
00:10:46.017 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel
00:10:46.017 05:39:00 accel_rpc -- accel/accel_rpc.sh@11 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR
00:10:46.017 05:39:00 accel_rpc -- accel/accel_rpc.sh@14 -- # spdk_tgt_pid=1107138
00:10:46.017 05:39:00 accel_rpc -- accel/accel_rpc.sh@15 -- # waitforlisten 1107138
00:10:46.017 05:39:00 accel_rpc -- accel/accel_rpc.sh@13 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --wait-for-rpc
00:10:46.017 05:39:00 accel_rpc -- common/autotest_common.sh@829 -- # '[' -z 1107138 ']'
00:10:46.017 05:39:00 accel_rpc -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock
00:10:46.017 05:39:00 accel_rpc -- common/autotest_common.sh@834 -- # local max_retries=100
00:10:46.017 05:39:00 accel_rpc -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...'
00:10:46.017 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...
00:10:46.017 05:39:00 accel_rpc -- common/autotest_common.sh@838 -- # xtrace_disable
00:10:46.017 05:39:00 accel_rpc -- common/autotest_common.sh@10 -- # set +x
00:10:46.276 [2024-07-26 05:39:00.936646] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization...
00:10:46.276 [2024-07-26 05:39:00.936723] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1107138 ]
00:10:46.276 [2024-07-26 05:39:01.079758] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:10:46.536 [2024-07-26 05:39:01.215138] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:10:47.103 05:39:01 accel_rpc -- common/autotest_common.sh@858 -- # (( i == 0 ))
00:10:47.103 05:39:01 accel_rpc -- common/autotest_common.sh@862 -- # return 0
00:10:47.103 05:39:01 accel_rpc -- accel/accel_rpc.sh@45 -- # [[ y == y ]]
00:10:47.103 05:39:01 accel_rpc -- accel/accel_rpc.sh@45 -- # [[ 0 -gt 0 ]]
00:10:47.103 05:39:01 accel_rpc -- accel/accel_rpc.sh@49 -- # [[ y == y ]]
00:10:47.103 05:39:01 accel_rpc -- accel/accel_rpc.sh@49 -- # [[ 0 -gt 0 ]]
00:10:47.103 05:39:01 accel_rpc -- accel/accel_rpc.sh@53 -- # run_test accel_assign_opcode accel_assign_opcode_test_suite
00:10:47.103 05:39:01 accel_rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']'
00:10:47.103 05:39:01 accel_rpc -- common/autotest_common.sh@1105 -- # xtrace_disable
00:10:47.103 05:39:01 accel_rpc -- common/autotest_common.sh@10 -- # set +x
00:10:47.103 ************************************
00:10:47.103 START TEST accel_assign_opcode
00:10:47.103 ************************************
00:10:47.103 05:39:01 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@1123 -- # accel_assign_opcode_test_suite
00:10:47.103 05:39:01 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@38 -- # rpc_cmd accel_assign_opc -o copy -m incorrect
00:10:47.103 05:39:01 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@559 -- # xtrace_disable
00:10:47.103 05:39:01 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@10 -- # set +x
00:10:47.103 [2024-07-26 05:39:01.981574] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation copy will be assigned to module incorrect
00:10:47.104 05:39:01 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:10:47.104 05:39:01 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@40 -- # rpc_cmd accel_assign_opc -o copy -m software
00:10:47.104 05:39:01 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@559 -- # xtrace_disable
00:10:47.104 05:39:01 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@10 -- # set +x
00:10:47.104 [2024-07-26 05:39:01.989589] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation copy will be assigned to module software
00:10:47.104 05:39:01 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:10:47.104 05:39:01 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@41 -- # rpc_cmd framework_start_init
00:10:47.104 05:39:01 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@559 -- # xtrace_disable
00:10:47.104 05:39:01 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@10 -- # set +x
00:10:47.363 05:39:02 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:10:47.363 05:39:02 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@42 -- # rpc_cmd accel_get_opc_assignments
00:10:47.363 05:39:02 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@559 -- # xtrace_disable
00:10:47.363 05:39:02 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@42 -- # jq -r .copy
00:10:47.363 05:39:02 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@10 -- # set +x
00:10:47.363 05:39:02 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@42 -- # grep software
00:10:47.363 05:39:02 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:10:47.363 software
00:10:47.363 
00:10:47.363 real	0m0.293s
00:10:47.363 user	0m0.049s
00:10:47.363 sys	0m0.015s
00:10:47.363 05:39:02 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@1124 -- # xtrace_disable
00:10:47.363 05:39:02 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@10 -- # set +x
00:10:47.363 ************************************
00:10:47.363 END TEST accel_assign_opcode
00:10:47.363 ************************************
00:10:47.622 05:39:02 accel_rpc -- common/autotest_common.sh@1142 -- # return 0
00:10:47.622 05:39:02 accel_rpc -- accel/accel_rpc.sh@55 -- # killprocess 1107138
00:10:47.622 05:39:02 accel_rpc -- common/autotest_common.sh@948 -- # '[' -z 1107138 ']'
00:10:47.622 05:39:02 accel_rpc -- common/autotest_common.sh@952 -- # kill -0 1107138
00:10:47.622 05:39:02 accel_rpc -- common/autotest_common.sh@953 -- # uname
00:10:47.622 05:39:02 accel_rpc -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']'
00:10:47.622 05:39:02 accel_rpc -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1107138
00:10:47.622 05:39:02 accel_rpc -- common/autotest_common.sh@954 -- # process_name=reactor_0
00:10:47.622 05:39:02 accel_rpc -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']'
00:10:47.622 05:39:02 accel_rpc -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1107138'
00:10:47.622 killing process with pid 1107138
00:10:47.622 05:39:02 accel_rpc -- common/autotest_common.sh@967 -- # kill 1107138
00:10:47.622 05:39:02 accel_rpc -- common/autotest_common.sh@972 -- # wait 1107138
00:10:47.881 
00:10:47.881 real	0m1.994s
00:10:47.881 user	0m2.126s
00:10:47.881 sys	0m0.612s
00:10:47.881 05:39:02 accel_rpc -- common/autotest_common.sh@1124 -- # xtrace_disable
00:10:47.881 05:39:02 accel_rpc -- common/autotest_common.sh@10 -- # set +x
00:10:47.881 ************************************
00:10:47.881 END TEST accel_rpc
00:10:47.881 ************************************
00:10:48.140 05:39:02 -- common/autotest_common.sh@1142 -- # return 0
00:10:48.140 05:39:02 -- spdk/autotest.sh@185 -- # run_test app_cmdline /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/cmdline.sh
00:10:48.140 05:39:02 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']'
00:10:48.140 05:39:02 -- common/autotest_common.sh@1105 -- # xtrace_disable
00:10:48.140 05:39:02 -- common/autotest_common.sh@10 -- # set +x
00:10:48.140 ************************************
00:10:48.140 START TEST app_cmdline
00:10:48.140 ************************************
00:10:48.140 05:39:02 app_cmdline -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/cmdline.sh
00:10:48.140 * Looking for test storage...
00:10:48.140 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app
00:10:48.140 05:39:02 app_cmdline -- app/cmdline.sh@14 -- # trap 'killprocess $spdk_tgt_pid' EXIT
00:10:48.140 05:39:02 app_cmdline -- app/cmdline.sh@16 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --rpcs-allowed spdk_get_version,rpc_get_methods
00:10:48.140 05:39:02 app_cmdline -- app/cmdline.sh@17 -- # spdk_tgt_pid=1107468
00:10:48.140 05:39:02 app_cmdline -- app/cmdline.sh@18 -- # waitforlisten 1107468
00:10:48.140 05:39:02 app_cmdline -- common/autotest_common.sh@829 -- # '[' -z 1107468 ']'
00:10:48.140 05:39:02 app_cmdline -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock
00:10:48.140 05:39:02 app_cmdline -- common/autotest_common.sh@834 -- # local max_retries=100
00:10:48.140 05:39:02 app_cmdline -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...'
00:10:48.140 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...
00:10:48.140 05:39:02 app_cmdline -- common/autotest_common.sh@838 -- # xtrace_disable
00:10:48.140 05:39:02 app_cmdline -- common/autotest_common.sh@10 -- # set +x
00:10:48.140 [2024-07-26 05:39:02.988597] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization...
00:10:48.140 [2024-07-26 05:39:02.988679] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1107468 ]
00:10:48.399 [2024-07-26 05:39:03.118256] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:10:48.399 [2024-07-26 05:39:03.222860] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:10:49.338 05:39:03 app_cmdline -- common/autotest_common.sh@858 -- # (( i == 0 ))
00:10:49.338 05:39:03 app_cmdline -- common/autotest_common.sh@862 -- # return 0
00:10:49.338 05:39:03 app_cmdline -- app/cmdline.sh@20 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py spdk_get_version
00:10:49.338 {
00:10:49.338   "version": "SPDK v24.09-pre git sha1 f6e944e96",
00:10:49.338   "fields": {
00:10:49.338     "major": 24,
00:10:49.338     "minor": 9,
00:10:49.338     "patch": 0,
00:10:49.338     "suffix": "-pre",
00:10:49.338     "commit": "f6e944e96"
00:10:49.338   }
00:10:49.338 }
00:10:49.338 05:39:04 app_cmdline -- app/cmdline.sh@22 -- # expected_methods=()
00:10:49.338 05:39:04 app_cmdline -- app/cmdline.sh@23 -- # expected_methods+=("rpc_get_methods")
00:10:49.338 05:39:04 app_cmdline -- app/cmdline.sh@24 -- # expected_methods+=("spdk_get_version")
00:10:49.338 05:39:04 app_cmdline -- app/cmdline.sh@26 -- # methods=($(rpc_cmd rpc_get_methods | jq -r ".[]" | sort))
00:10:49.338 05:39:04 app_cmdline -- app/cmdline.sh@26 -- # rpc_cmd rpc_get_methods
00:10:49.338 05:39:04 app_cmdline -- common/autotest_common.sh@559 -- # xtrace_disable
00:10:49.338 05:39:04 app_cmdline -- common/autotest_common.sh@10 -- # set +x
00:10:49.338 05:39:04 app_cmdline -- app/cmdline.sh@26 -- # jq -r '.[]'
00:10:49.338 05:39:04 app_cmdline -- app/cmdline.sh@26 -- # sort
00:10:49.338 05:39:04 app_cmdline -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:10:49.338 05:39:04 app_cmdline -- app/cmdline.sh@27 -- # (( 2 == 2 ))
00:10:49.338 05:39:04 app_cmdline -- app/cmdline.sh@28 -- # [[ rpc_get_methods spdk_get_version == \r\p\c\_\g\e\t\_\m\e\t\h\o\d\s\ \s\p\d\k\_\g\e\t\_\v\e\r\s\i\o\n ]]
00:10:49.338 05:39:04 app_cmdline -- app/cmdline.sh@30 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats
00:10:49.338 05:39:04 app_cmdline -- common/autotest_common.sh@648 -- # local es=0
00:10:49.338 05:39:04 app_cmdline -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats
00:10:49.338 05:39:04 app_cmdline -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
00:10:49.338 05:39:04 app_cmdline -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in
00:10:49.338 05:39:04 app_cmdline -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
00:10:49.338 05:39:04 app_cmdline -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in
00:10:49.338 05:39:04 app_cmdline -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
00:10:49.338 05:39:04 app_cmdline -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in
00:10:49.338 05:39:04 app_cmdline -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
00:10:49.338 05:39:04 app_cmdline -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]]
00:10:49.338 05:39:04 app_cmdline -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats
00:10:49.597 request:
00:10:49.597 {
00:10:49.597   "method": "env_dpdk_get_mem_stats",
00:10:49.597   "req_id": 1
00:10:49.597 }
00:10:49.597 Got JSON-RPC error response
00:10:49.597 response:
00:10:49.597 {
  "code": -32601,
00:10:49.598   "message": "Method not found"
00:10:49.598 }
00:10:49.598 05:39:04 app_cmdline -- common/autotest_common.sh@651 -- # es=1
00:10:49.598 05:39:04 app_cmdline -- common/autotest_common.sh@659 -- # (( es > 128 ))
00:10:49.598 05:39:04 app_cmdline -- common/autotest_common.sh@670 -- # [[ -n '' ]]
00:10:49.598 05:39:04 app_cmdline -- common/autotest_common.sh@675 -- # (( !es == 0 ))
00:10:49.598 05:39:04 app_cmdline -- app/cmdline.sh@1 -- # killprocess 1107468
00:10:49.598 05:39:04 app_cmdline -- common/autotest_common.sh@948 -- # '[' -z 1107468 ']'
00:10:49.598 05:39:04 app_cmdline -- common/autotest_common.sh@952 -- # kill -0 1107468
00:10:49.598 05:39:04 app_cmdline -- common/autotest_common.sh@953 -- # uname
00:10:49.598 05:39:04 app_cmdline -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']'
00:10:49.598 05:39:04 app_cmdline -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1107468
00:10:49.598 05:39:04 app_cmdline -- common/autotest_common.sh@954 -- # process_name=reactor_0
00:10:49.598 05:39:04 app_cmdline -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']'
00:10:49.598 05:39:04 app_cmdline -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1107468'
00:10:49.598 killing process with pid 1107468
00:10:49.598 05:39:04 app_cmdline -- common/autotest_common.sh@967 -- # kill 1107468
00:10:49.598 05:39:04 app_cmdline -- common/autotest_common.sh@972 -- # wait 1107468
00:10:50.166 
00:10:50.166 real	0m2.054s
00:10:50.166 user	0m2.452s
00:10:50.166 sys	0m0.626s
00:10:50.166 05:39:04 app_cmdline -- common/autotest_common.sh@1124 -- # xtrace_disable
00:10:50.166 05:39:04 app_cmdline -- common/autotest_common.sh@10 -- # set +x
00:10:50.167 ************************************
00:10:50.167 END TEST app_cmdline
00:10:50.167 ************************************
00:10:50.167 05:39:04 -- common/autotest_common.sh@1142 -- # return 0
00:10:50.167 05:39:04 -- spdk/autotest.sh@186 -- # run_test version /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/version.sh
00:10:50.167 05:39:04 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']'
00:10:50.167 05:39:04 -- common/autotest_common.sh@1105 -- # xtrace_disable
00:10:50.167 05:39:04 -- common/autotest_common.sh@10 -- # set +x
00:10:50.167 ************************************
00:10:50.167 START TEST version
00:10:50.167 ************************************
00:10:50.167 05:39:04 version -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/version.sh
00:10:50.167 * Looking for test storage...
00:10:50.167 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app
00:10:50.167 05:39:05 version -- app/version.sh@17 -- # get_header_version major
00:10:50.167 05:39:05 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_MAJOR[[:space:]]+' /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/version.h
00:10:50.167 05:39:05 version -- app/version.sh@14 -- # cut -f2
00:10:50.167 05:39:05 version -- app/version.sh@14 -- # tr -d '"'
00:10:50.167 05:39:05 version -- app/version.sh@17 -- # major=24
00:10:50.167 05:39:05 version -- app/version.sh@18 -- # get_header_version minor
00:10:50.167 05:39:05 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_MINOR[[:space:]]+' /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/version.h
00:10:50.167 05:39:05 version -- app/version.sh@14 -- # cut -f2
00:10:50.167 05:39:05 version -- app/version.sh@14 -- # tr -d '"'
00:10:50.167 05:39:05 version -- app/version.sh@18 -- # minor=9
00:10:50.167 05:39:05 version -- app/version.sh@19 -- # get_header_version patch
00:10:50.427 05:39:05 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_PATCH[[:space:]]+' /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/version.h
00:10:50.427 05:39:05 version -- app/version.sh@14 -- # cut -f2
00:10:50.427 05:39:05 version -- app/version.sh@14 -- # tr -d '"'
00:10:50.427 05:39:05 version -- app/version.sh@19 -- # patch=0
00:10:50.427 05:39:05 version -- app/version.sh@20 -- # get_header_version suffix
00:10:50.427 05:39:05 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_SUFFIX[[:space:]]+' /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/version.h
00:10:50.427 05:39:05 version -- app/version.sh@14 -- # cut -f2
00:10:50.427 05:39:05 version -- app/version.sh@14 -- # tr -d '"'
00:10:50.427 05:39:05 version -- app/version.sh@20 -- # suffix=-pre
00:10:50.427 05:39:05 version -- app/version.sh@22 -- # version=24.9
00:10:50.427 05:39:05 version -- app/version.sh@25 -- # (( patch != 0 ))
00:10:50.427 05:39:05 version -- app/version.sh@28 -- # version=24.9rc0
00:10:50.427 05:39:05 version -- app/version.sh@30 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python
00:10:50.427 05:39:05 version -- app/version.sh@30 -- # python3 -c 'import spdk; print(spdk.__version__)'
00:10:50.427 05:39:05 version -- app/version.sh@30 -- # py_version=24.9rc0
00:10:50.427 05:39:05 version -- app/version.sh@31 -- # [[ 24.9rc0 == \2\4\.\9\r\c\0 ]]
00:10:50.427 
00:10:50.427 real	0m0.194s
00:10:50.427 user	0m0.098s
00:10:50.427 sys	0m0.144s
00:10:50.427 05:39:05 version -- common/autotest_common.sh@1124 -- # xtrace_disable
00:10:50.427 05:39:05 version -- common/autotest_common.sh@10 -- # set +x
00:10:50.427 ************************************
00:10:50.427 END TEST version
00:10:50.427 ************************************
00:10:50.427 05:39:05 -- common/autotest_common.sh@1142 -- # return 0
00:10:50.427 05:39:05 -- spdk/autotest.sh@188 -- # '[' 1 -eq 1 ']'
00:10:50.427 05:39:05 -- spdk/autotest.sh@189 -- # run_test blockdev_general /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh
00:10:50.427 05:39:05 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']'
00:10:50.427 05:39:05 -- common/autotest_common.sh@1105 -- # xtrace_disable
00:10:50.427 05:39:05 -- common/autotest_common.sh@10 -- # set +x
00:10:50.427 ************************************
00:10:50.427 START TEST blockdev_general
00:10:50.427 ************************************
00:10:50.427 05:39:05 blockdev_general -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh
00:10:50.427 * Looking for test storage...
00:10:50.427 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev
00:10:50.427 05:39:05 blockdev_general -- bdev/blockdev.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh
00:10:50.427 05:39:05 blockdev_general -- bdev/nbd_common.sh@6 -- # set -e
00:10:50.427 05:39:05 blockdev_general -- bdev/blockdev.sh@12 -- # rpc_py=rpc_cmd
00:10:50.427 05:39:05 blockdev_general -- bdev/blockdev.sh@13 -- # conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json
00:10:50.427 05:39:05 blockdev_general -- bdev/blockdev.sh@14 -- # nonenclosed_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json
00:10:50.427 05:39:05 blockdev_general -- bdev/blockdev.sh@15 -- # nonarray_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json
00:10:50.427 05:39:05 blockdev_general -- bdev/blockdev.sh@17 -- # export RPC_PIPE_TIMEOUT=30
00:10:50.427 05:39:05 blockdev_general -- bdev/blockdev.sh@17 -- # RPC_PIPE_TIMEOUT=30
00:10:50.427 05:39:05 blockdev_general -- bdev/blockdev.sh@20 -- # :
00:10:50.427 05:39:05 blockdev_general -- bdev/blockdev.sh@669 -- # QOS_DEV_1=Malloc_0
00:10:50.427 05:39:05 blockdev_general -- bdev/blockdev.sh@670 -- # QOS_DEV_2=Null_1
00:10:50.427 05:39:05 blockdev_general -- bdev/blockdev.sh@671 -- # QOS_RUN_TIME=5
00:10:50.427 05:39:05 blockdev_general -- bdev/blockdev.sh@673 -- # uname -s 00:10:50.427 05:39:05 blockdev_general -- bdev/blockdev.sh@673 -- # '[' Linux = Linux ']' 00:10:50.427 05:39:05 blockdev_general -- bdev/blockdev.sh@675 -- # PRE_RESERVED_MEM=0 00:10:50.427 05:39:05 blockdev_general -- bdev/blockdev.sh@681 -- # test_type=bdev 00:10:50.427 05:39:05 blockdev_general -- bdev/blockdev.sh@682 -- # crypto_device= 00:10:50.427 05:39:05 blockdev_general -- bdev/blockdev.sh@683 -- # dek= 00:10:50.427 05:39:05 blockdev_general -- bdev/blockdev.sh@684 -- # env_ctx= 00:10:50.427 05:39:05 blockdev_general -- bdev/blockdev.sh@685 -- # wait_for_rpc= 00:10:50.427 05:39:05 blockdev_general -- bdev/blockdev.sh@686 -- # '[' -n '' ']' 00:10:50.427 05:39:05 blockdev_general -- bdev/blockdev.sh@689 -- # [[ bdev == bdev ]] 00:10:50.427 05:39:05 blockdev_general -- bdev/blockdev.sh@690 -- # wait_for_rpc=--wait-for-rpc 00:10:50.427 05:39:05 blockdev_general -- bdev/blockdev.sh@692 -- # start_spdk_tgt 00:10:50.427 05:39:05 blockdev_general -- bdev/blockdev.sh@47 -- # spdk_tgt_pid=1108176 00:10:50.427 05:39:05 blockdev_general -- bdev/blockdev.sh@48 -- # trap 'killprocess "$spdk_tgt_pid"; exit 1' SIGINT SIGTERM EXIT 00:10:50.427 05:39:05 blockdev_general -- bdev/blockdev.sh@46 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt '' --wait-for-rpc 00:10:50.427 05:39:05 blockdev_general -- bdev/blockdev.sh@49 -- # waitforlisten 1108176 00:10:50.686 05:39:05 blockdev_general -- common/autotest_common.sh@829 -- # '[' -z 1108176 ']' 00:10:50.686 05:39:05 blockdev_general -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:10:50.686 05:39:05 blockdev_general -- common/autotest_common.sh@834 -- # local max_retries=100 00:10:50.686 05:39:05 blockdev_general -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 
00:10:50.686 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:10:50.686 05:39:05 blockdev_general -- common/autotest_common.sh@838 -- # xtrace_disable 00:10:50.686 05:39:05 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:10:50.686 [2024-07-26 05:39:05.404478] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 00:10:50.686 [2024-07-26 05:39:05.404557] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1108176 ] 00:10:50.686 [2024-07-26 05:39:05.532616] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:50.945 [2024-07-26 05:39:05.635549] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:51.511 05:39:06 blockdev_general -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:10:51.511 05:39:06 blockdev_general -- common/autotest_common.sh@862 -- # return 0 00:10:51.511 05:39:06 blockdev_general -- bdev/blockdev.sh@693 -- # case "$test_type" in 00:10:51.511 05:39:06 blockdev_general -- bdev/blockdev.sh@695 -- # setup_bdev_conf 00:10:51.511 05:39:06 blockdev_general -- bdev/blockdev.sh@53 -- # rpc_cmd 00:10:51.511 05:39:06 blockdev_general -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:51.511 05:39:06 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:10:51.769 [2024-07-26 05:39:06.564074] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:10:51.769 [2024-07-26 05:39:06.564132] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:10:51.769 00:10:51.769 [2024-07-26 05:39:06.572064] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:10:51.769 [2024-07-26 05:39:06.572100] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to 
find bdev with name: Malloc2 00:10:51.769 00:10:51.769 Malloc0 00:10:51.769 Malloc1 00:10:51.769 Malloc2 00:10:51.769 Malloc3 00:10:51.769 Malloc4 00:10:51.769 Malloc5 00:10:51.769 Malloc6 00:10:52.027 Malloc7 00:10:52.027 Malloc8 00:10:52.027 Malloc9 00:10:52.027 [2024-07-26 05:39:06.709741] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:10:52.027 [2024-07-26 05:39:06.709792] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:10:52.027 [2024-07-26 05:39:06.709820] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1710170 00:10:52.027 [2024-07-26 05:39:06.709838] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:10:52.027 [2024-07-26 05:39:06.711230] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:10:52.027 [2024-07-26 05:39:06.711262] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:10:52.027 TestPT 00:10:52.027 05:39:06 blockdev_general -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:52.027 05:39:06 blockdev_general -- bdev/blockdev.sh@76 -- # dd if=/dev/zero of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile bs=2048 count=5000 00:10:52.027 5000+0 records in 00:10:52.027 5000+0 records out 00:10:52.027 10240000 bytes (10 MB, 9.8 MiB) copied, 0.0269732 s, 380 MB/s 00:10:52.027 05:39:06 blockdev_general -- bdev/blockdev.sh@77 -- # rpc_cmd bdev_aio_create /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile AIO0 2048 00:10:52.027 05:39:06 blockdev_general -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:52.027 05:39:06 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:10:52.027 AIO0 00:10:52.027 05:39:06 blockdev_general -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:52.027 05:39:06 blockdev_general -- bdev/blockdev.sh@736 -- # rpc_cmd bdev_wait_for_examine 00:10:52.027 05:39:06 blockdev_general -- 
common/autotest_common.sh@559 -- # xtrace_disable 00:10:52.027 05:39:06 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:10:52.027 05:39:06 blockdev_general -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:52.027 05:39:06 blockdev_general -- bdev/blockdev.sh@739 -- # cat 00:10:52.027 05:39:06 blockdev_general -- bdev/blockdev.sh@739 -- # rpc_cmd save_subsystem_config -n accel 00:10:52.027 05:39:06 blockdev_general -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:52.027 05:39:06 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:10:52.027 05:39:06 blockdev_general -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:52.027 05:39:06 blockdev_general -- bdev/blockdev.sh@739 -- # rpc_cmd save_subsystem_config -n bdev 00:10:52.027 05:39:06 blockdev_general -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:52.027 05:39:06 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:10:52.027 05:39:06 blockdev_general -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:52.027 05:39:06 blockdev_general -- bdev/blockdev.sh@739 -- # rpc_cmd save_subsystem_config -n iobuf 00:10:52.027 05:39:06 blockdev_general -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:52.027 05:39:06 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:10:52.027 05:39:06 blockdev_general -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:52.027 05:39:06 blockdev_general -- bdev/blockdev.sh@747 -- # mapfile -t bdevs 00:10:52.027 05:39:06 blockdev_general -- bdev/blockdev.sh@747 -- # rpc_cmd bdev_get_bdevs 00:10:52.027 05:39:06 blockdev_general -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:52.027 05:39:06 blockdev_general -- bdev/blockdev.sh@747 -- # jq -r '.[] | select(.claimed == false)' 00:10:52.027 05:39:06 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:10:52.287 05:39:07 blockdev_general -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:52.287 05:39:07 blockdev_general 
-- bdev/blockdev.sh@748 -- # mapfile -t bdevs_name 00:10:52.287 05:39:07 blockdev_general -- bdev/blockdev.sh@748 -- # jq -r .name 00:10:52.289 05:39:07 blockdev_general -- bdev/blockdev.sh@748 -- # printf '%s\n' '{' ' "name": "Malloc0",' ' "aliases": [' ' "c8475de8-e34b-4b81-b660-8290134c4e77"' ' ],' ' "product_name": "Malloc disk",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "c8475de8-e34b-4b81-b660-8290134c4e77",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 20000,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {}' '}' '{' ' "name": "Malloc1p0",' ' "aliases": [' ' "87f592cb-cf24-5d45-a81a-491c1ec8c3bb"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "87f592cb-cf24-5d45-a81a-491c1ec8c3bb",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' 
"compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc1p1",' ' "aliases": [' ' "fac21681-e151-5569-9087-ab68dbd94c96"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "fac21681-e151-5569-9087-ab68dbd94c96",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p0",' ' "aliases": [' ' "c274856f-58c9-5bc4-a782-fb664d8cb0f6"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "c274856f-58c9-5bc4-a782-fb664d8cb0f6",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' 
"abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc2p1",' ' "aliases": [' ' "d94755ba-326b-5eec-b39f-d25eb163d59c"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "d94755ba-326b-5eec-b39f-d25eb163d59c",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 8192' ' }' ' }' '}' '{' ' "name": "Malloc2p2",' ' "aliases": [' ' "e7a89fb4-0564-58e2-bf85-7d5809fdd2b2"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "e7a89fb4-0564-58e2-bf85-7d5809fdd2b2",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": 
false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 16384' ' }' ' }' '}' '{' ' "name": "Malloc2p3",' ' "aliases": [' ' "761a5cb1-f5d6-5263-a54a-6079f514564f"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "761a5cb1-f5d6-5263-a54a-6079f514564f",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 24576' ' }' ' }' '}' '{' ' "name": "Malloc2p4",' ' "aliases": [' ' "bb826987-bba1-546d-bb87-189a4ad5235d"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "bb826987-bba1-546d-bb87-189a4ad5235d",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' 
' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p5",' ' "aliases": [' ' "d3fa22eb-649a-522d-840e-1b7867c79a16"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "d3fa22eb-649a-522d-840e-1b7867c79a16",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 40960' ' }' ' }' '}' '{' ' "name": "Malloc2p6",' ' "aliases": [' ' "df551536-4f54-596a-a698-60552d6d1424"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "df551536-4f54-596a-a698-60552d6d1424",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' 
"base_bdev": "Malloc2",' ' "offset_blocks": 49152' ' }' ' }' '}' '{' ' "name": "Malloc2p7",' ' "aliases": [' ' "39deb65a-b77c-5eb8-8a0f-39e3c0e08532"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "39deb65a-b77c-5eb8-8a0f-39e3c0e08532",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 57344' ' }' ' }' '}' '{' ' "name": "TestPT",' ' "aliases": [' ' "2c103ac2-74ca-5737-913b-e1196f4f45aa"' ' ],' ' "product_name": "passthru",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "2c103ac2-74ca-5737-913b-e1196f4f45aa",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' 
' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "passthru": {' ' "name": "TestPT",' ' "base_bdev_name": "Malloc3"' ' }' ' }' '}' '{' ' "name": "raid0",' ' "aliases": [' ' "118a9b42-bab2-4784-bba4-c933a9194ae7"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "118a9b42-bab2-4784-bba4-c933a9194ae7",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "118a9b42-bab2-4784-bba4-c933a9194ae7",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "raid0",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc4",' ' "uuid": "048ec738-d116-49ab-b221-6310c0a9a37f",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc5",' ' "uuid": "c896e5f6-1868-43b8-81e8-ef121b01063b",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' 
'}' '{' ' "name": "concat0",' ' "aliases": [' ' "4da6e7d3-2fc7-45ae-92c1-d87cb5020bb0"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "4da6e7d3-2fc7-45ae-92c1-d87cb5020bb0",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "4da6e7d3-2fc7-45ae-92c1-d87cb5020bb0",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "concat",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc6",' ' "uuid": "648893a9-ec8a-482c-89dc-9db791def4bb",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc7",' ' "uuid": "e76582b2-4c5b-4f07-994e-f872ca8f536e",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "raid1",' ' "aliases": [' ' "6a8e405c-fba3-431c-8043-2b09bbc51be0"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 65536,' ' 
"uuid": "6a8e405c-fba3-431c-8043-2b09bbc51be0",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "flush": false,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "6a8e405c-fba3-431c-8043-2b09bbc51be0",' ' "strip_size_kb": 0,' ' "state": "online",' ' "raid_level": "raid1",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc8",' ' "uuid": "f7508fd8-ce27-48fb-8445-3355eb0fb4a2",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc9",' ' "uuid": "d4831adb-d523-467f-813d-05619e0a9b46",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "AIO0",' ' "aliases": [' ' "7781c77e-3ca2-46bf-819d-9341c4c8a87b"' ' ],' ' "product_name": "AIO disk",' ' "block_size": 2048,' ' "num_blocks": 5000,' ' "uuid": "7781c77e-3ca2-46bf-819d-9341c4c8a87b",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' 
' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "aio": {' ' "filename": "/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile",' ' "block_size_override": true,' ' "readonly": false,' ' "fallocate": false' ' }' ' }' '}' 00:10:52.289 05:39:07 blockdev_general -- bdev/blockdev.sh@749 -- # bdev_list=("${bdevs_name[@]}") 00:10:52.289 05:39:07 blockdev_general -- bdev/blockdev.sh@751 -- # hello_world_bdev=Malloc0 00:10:52.289 05:39:07 blockdev_general -- bdev/blockdev.sh@752 -- # trap - SIGINT SIGTERM EXIT 00:10:52.289 05:39:07 blockdev_general -- bdev/blockdev.sh@753 -- # killprocess 1108176 00:10:52.289 05:39:07 blockdev_general -- common/autotest_common.sh@948 -- # '[' -z 1108176 ']' 00:10:52.289 05:39:07 blockdev_general -- common/autotest_common.sh@952 -- # kill -0 1108176 00:10:52.289 05:39:07 blockdev_general -- common/autotest_common.sh@953 -- # uname 00:10:52.289 05:39:07 blockdev_general -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:10:52.289 05:39:07 blockdev_general -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1108176 00:10:52.548 05:39:07 blockdev_general -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:10:52.548 05:39:07 blockdev_general -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:10:52.548 05:39:07 blockdev_general -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1108176' 00:10:52.548 killing process with pid 1108176 00:10:52.548 05:39:07 blockdev_general -- 
common/autotest_common.sh@967 -- # kill 1108176 00:10:52.548 05:39:07 blockdev_general -- common/autotest_common.sh@972 -- # wait 1108176 00:10:53.115 05:39:07 blockdev_general -- bdev/blockdev.sh@757 -- # trap cleanup SIGINT SIGTERM EXIT 00:10:53.115 05:39:07 blockdev_general -- bdev/blockdev.sh@759 -- # run_test bdev_hello_world /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b Malloc0 '' 00:10:53.115 05:39:07 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:10:53.115 05:39:07 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:53.115 05:39:07 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:10:53.115 ************************************ 00:10:53.115 START TEST bdev_hello_world 00:10:53.115 ************************************ 00:10:53.115 05:39:07 blockdev_general.bdev_hello_world -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b Malloc0 '' 00:10:53.115 [2024-07-26 05:39:07.797882] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 
00:10:53.115 [2024-07-26 05:39:07.797923] [ DPDK EAL parameters: hello_bdev --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1108682 ] 00:10:53.115 [2024-07-26 05:39:07.909854] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:53.115 [2024-07-26 05:39:08.010769] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:53.374 [2024-07-26 05:39:08.167920] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:10:53.374 [2024-07-26 05:39:08.167978] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:10:53.374 [2024-07-26 05:39:08.167997] vbdev_passthru.c: 736:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:10:53.374 [2024-07-26 05:39:08.175911] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:10:53.374 [2024-07-26 05:39:08.175942] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:10:53.374 [2024-07-26 05:39:08.183925] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:10:53.374 [2024-07-26 05:39:08.183954] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:10:53.374 [2024-07-26 05:39:08.261079] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:10:53.374 [2024-07-26 05:39:08.261130] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:10:53.374 [2024-07-26 05:39:08.261155] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xd3ed90 00:10:53.374 [2024-07-26 05:39:08.261174] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:10:53.374 [2024-07-26 05:39:08.262650] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: 
pt_bdev registered 00:10:53.374 [2024-07-26 05:39:08.262697] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:10:53.633 [2024-07-26 05:39:08.416071] hello_bdev.c: 222:hello_start: *NOTICE*: Successfully started the application 00:10:53.633 [2024-07-26 05:39:08.416136] hello_bdev.c: 231:hello_start: *NOTICE*: Opening the bdev Malloc0 00:10:53.633 [2024-07-26 05:39:08.416201] hello_bdev.c: 244:hello_start: *NOTICE*: Opening io channel 00:10:53.633 [2024-07-26 05:39:08.416286] hello_bdev.c: 138:hello_write: *NOTICE*: Writing to the bdev 00:10:53.633 [2024-07-26 05:39:08.416368] hello_bdev.c: 117:write_complete: *NOTICE*: bdev io write completed successfully 00:10:53.633 [2024-07-26 05:39:08.416408] hello_bdev.c: 84:hello_read: *NOTICE*: Reading io 00:10:53.633 [2024-07-26 05:39:08.416487] hello_bdev.c: 65:read_complete: *NOTICE*: Read string from bdev : Hello World! 00:10:53.633 00:10:53.633 [2024-07-26 05:39:08.416539] hello_bdev.c: 74:read_complete: *NOTICE*: Stopping app 00:10:53.892 00:10:53.892 real 0m0.996s 00:10:53.892 user 0m0.662s 00:10:53.892 sys 0m0.289s 00:10:53.892 05:39:08 blockdev_general.bdev_hello_world -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:53.892 05:39:08 blockdev_general.bdev_hello_world -- common/autotest_common.sh@10 -- # set +x 00:10:53.892 ************************************ 00:10:53.892 END TEST bdev_hello_world 00:10:53.892 ************************************ 00:10:54.150 05:39:08 blockdev_general -- common/autotest_common.sh@1142 -- # return 0 00:10:54.150 05:39:08 blockdev_general -- bdev/blockdev.sh@760 -- # run_test bdev_bounds bdev_bounds '' 00:10:54.150 05:39:08 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:10:54.150 05:39:08 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:54.150 05:39:08 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:10:54.150 ************************************ 00:10:54.150 START 
TEST bdev_bounds 00:10:54.150 ************************************ 00:10:54.150 05:39:08 blockdev_general.bdev_bounds -- common/autotest_common.sh@1123 -- # bdev_bounds '' 00:10:54.150 05:39:08 blockdev_general.bdev_bounds -- bdev/blockdev.sh@289 -- # bdevio_pid=1108880 00:10:54.150 05:39:08 blockdev_general.bdev_bounds -- bdev/blockdev.sh@290 -- # trap 'cleanup; killprocess $bdevio_pid; exit 1' SIGINT SIGTERM EXIT 00:10:54.150 05:39:08 blockdev_general.bdev_bounds -- bdev/blockdev.sh@288 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -w -s 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:10:54.150 05:39:08 blockdev_general.bdev_bounds -- bdev/blockdev.sh@291 -- # echo 'Process bdevio pid: 1108880' 00:10:54.150 Process bdevio pid: 1108880 00:10:54.150 05:39:08 blockdev_general.bdev_bounds -- bdev/blockdev.sh@292 -- # waitforlisten 1108880 00:10:54.150 05:39:08 blockdev_general.bdev_bounds -- common/autotest_common.sh@829 -- # '[' -z 1108880 ']' 00:10:54.150 05:39:08 blockdev_general.bdev_bounds -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:10:54.150 05:39:08 blockdev_general.bdev_bounds -- common/autotest_common.sh@834 -- # local max_retries=100 00:10:54.150 05:39:08 blockdev_general.bdev_bounds -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:10:54.150 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:10:54.150 05:39:08 blockdev_general.bdev_bounds -- common/autotest_common.sh@838 -- # xtrace_disable 00:10:54.150 05:39:08 blockdev_general.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:10:54.150 [2024-07-26 05:39:08.898791] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 
00:10:54.150 [2024-07-26 05:39:08.898844] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1108880 ] 00:10:54.150 [2024-07-26 05:39:09.012282] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:10:54.408 [2024-07-26 05:39:09.119503] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:10:54.408 [2024-07-26 05:39:09.119589] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:10:54.408 [2024-07-26 05:39:09.119593] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:54.408 [2024-07-26 05:39:09.271880] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:10:54.408 [2024-07-26 05:39:09.271946] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:10:54.408 [2024-07-26 05:39:09.271966] vbdev_passthru.c: 736:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:10:54.408 [2024-07-26 05:39:09.279892] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:10:54.408 [2024-07-26 05:39:09.279924] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:10:54.408 [2024-07-26 05:39:09.287906] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:10:54.408 [2024-07-26 05:39:09.287934] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:10:54.667 [2024-07-26 05:39:09.365354] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:10:54.667 [2024-07-26 05:39:09.365413] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:10:54.667 [2024-07-26 05:39:09.365435] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xd9dfa0 
00:10:54.667 [2024-07-26 05:39:09.365453] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:10:54.667 [2024-07-26 05:39:09.366963] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:10:54.667 [2024-07-26 05:39:09.366998] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:10:55.235 05:39:09 blockdev_general.bdev_bounds -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:10:55.235 05:39:09 blockdev_general.bdev_bounds -- common/autotest_common.sh@862 -- # return 0 00:10:55.235 05:39:09 blockdev_general.bdev_bounds -- bdev/blockdev.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:10:55.235 I/O targets: 00:10:55.235 Malloc0: 65536 blocks of 512 bytes (32 MiB) 00:10:55.235 Malloc1p0: 32768 blocks of 512 bytes (16 MiB) 00:10:55.235 Malloc1p1: 32768 blocks of 512 bytes (16 MiB) 00:10:55.235 Malloc2p0: 8192 blocks of 512 bytes (4 MiB) 00:10:55.235 Malloc2p1: 8192 blocks of 512 bytes (4 MiB) 00:10:55.235 Malloc2p2: 8192 blocks of 512 bytes (4 MiB) 00:10:55.235 Malloc2p3: 8192 blocks of 512 bytes (4 MiB) 00:10:55.235 Malloc2p4: 8192 blocks of 512 bytes (4 MiB) 00:10:55.235 Malloc2p5: 8192 blocks of 512 bytes (4 MiB) 00:10:55.235 Malloc2p6: 8192 blocks of 512 bytes (4 MiB) 00:10:55.235 Malloc2p7: 8192 blocks of 512 bytes (4 MiB) 00:10:55.235 TestPT: 65536 blocks of 512 bytes (32 MiB) 00:10:55.235 raid0: 131072 blocks of 512 bytes (64 MiB) 00:10:55.235 concat0: 131072 blocks of 512 bytes (64 MiB) 00:10:55.235 raid1: 65536 blocks of 512 bytes (32 MiB) 00:10:55.235 AIO0: 5000 blocks of 2048 bytes (10 MiB) 00:10:55.235 00:10:55.235 00:10:55.235 CUnit - A unit testing framework for C - Version 2.1-3 00:10:55.235 http://cunit.sourceforge.net/ 00:10:55.235 00:10:55.235 00:10:55.235 Suite: bdevio tests on: AIO0 00:10:55.235 Test: blockdev write read block ...passed 00:10:55.235 Test: blockdev write zeroes read block ...passed 00:10:55.235 
Test: blockdev write zeroes read no split ...passed 00:10:55.235 Test: blockdev write zeroes read split ...passed 00:10:55.235 Test: blockdev write zeroes read split partial ...passed 00:10:55.235 Test: blockdev reset ...passed 00:10:55.235 Test: blockdev write read 8 blocks ...passed 00:10:55.235 Test: blockdev write read size > 128k ...passed 00:10:55.235 Test: blockdev write read invalid size ...passed 00:10:55.235 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:10:55.235 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:10:55.235 Test: blockdev write read max offset ...passed 00:10:55.235 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:10:55.235 Test: blockdev writev readv 8 blocks ...passed 00:10:55.235 Test: blockdev writev readv 30 x 1block ...passed 00:10:55.235 Test: blockdev writev readv block ...passed 00:10:55.235 Test: blockdev writev readv size > 128k ...passed 00:10:55.235 Test: blockdev writev readv size > 128k in two iovs ...passed 00:10:55.235 Test: blockdev comparev and writev ...passed 00:10:55.235 Test: blockdev nvme passthru rw ...passed 00:10:55.236 Test: blockdev nvme passthru vendor specific ...passed 00:10:55.236 Test: blockdev nvme admin passthru ...passed 00:10:55.236 Test: blockdev copy ...passed 00:10:55.236 Suite: bdevio tests on: raid1 00:10:55.236 Test: blockdev write read block ...passed 00:10:55.236 Test: blockdev write zeroes read block ...passed 00:10:55.236 Test: blockdev write zeroes read no split ...passed 00:10:55.236 Test: blockdev write zeroes read split ...passed 00:10:55.236 Test: blockdev write zeroes read split partial ...passed 00:10:55.236 Test: blockdev reset ...passed 00:10:55.236 Test: blockdev write read 8 blocks ...passed 00:10:55.236 Test: blockdev write read size > 128k ...passed 00:10:55.236 Test: blockdev write read invalid size ...passed 00:10:55.236 Test: blockdev write read offset + nbytes == size of blockdev ...passed 
00:10:55.236 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:10:55.236 Test: blockdev write read max offset ...passed 00:10:55.236 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:10:55.236 Test: blockdev writev readv 8 blocks ...passed 00:10:55.236 Test: blockdev writev readv 30 x 1block ...passed 00:10:55.236 Test: blockdev writev readv block ...passed 00:10:55.236 Test: blockdev writev readv size > 128k ...passed 00:10:55.236 Test: blockdev writev readv size > 128k in two iovs ...passed 00:10:55.236 Test: blockdev comparev and writev ...passed 00:10:55.236 Test: blockdev nvme passthru rw ...passed 00:10:55.236 Test: blockdev nvme passthru vendor specific ...passed 00:10:55.236 Test: blockdev nvme admin passthru ...passed 00:10:55.236 Test: blockdev copy ...passed 00:10:55.236 Suite: bdevio tests on: concat0 00:10:55.236 Test: blockdev write read block ...passed 00:10:55.236 Test: blockdev write zeroes read block ...passed 00:10:55.236 Test: blockdev write zeroes read no split ...passed 00:10:55.236 Test: blockdev write zeroes read split ...passed 00:10:55.236 Test: blockdev write zeroes read split partial ...passed 00:10:55.236 Test: blockdev reset ...passed 00:10:55.236 Test: blockdev write read 8 blocks ...passed 00:10:55.236 Test: blockdev write read size > 128k ...passed 00:10:55.236 Test: blockdev write read invalid size ...passed 00:10:55.236 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:10:55.236 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:10:55.236 Test: blockdev write read max offset ...passed 00:10:55.236 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:10:55.236 Test: blockdev writev readv 8 blocks ...passed 00:10:55.236 Test: blockdev writev readv 30 x 1block ...passed 00:10:55.236 Test: blockdev writev readv block ...passed 00:10:55.236 Test: blockdev writev readv size > 128k ...passed 00:10:55.236 Test: 
blockdev writev readv size > 128k in two iovs ...passed 00:10:55.236 Test: blockdev comparev and writev ...passed 00:10:55.236 Test: blockdev nvme passthru rw ...passed 00:10:55.236 Test: blockdev nvme passthru vendor specific ...passed 00:10:55.236 Test: blockdev nvme admin passthru ...passed 00:10:55.236 Test: blockdev copy ...passed 00:10:55.236 Suite: bdevio tests on: raid0 00:10:55.236 Test: blockdev write read block ...passed 00:10:55.236 Test: blockdev write zeroes read block ...passed 00:10:55.236 Test: blockdev write zeroes read no split ...passed 00:10:55.236 Test: blockdev write zeroes read split ...passed 00:10:55.236 Test: blockdev write zeroes read split partial ...passed 00:10:55.236 Test: blockdev reset ...passed 00:10:55.236 Test: blockdev write read 8 blocks ...passed 00:10:55.236 Test: blockdev write read size > 128k ...passed 00:10:55.236 Test: blockdev write read invalid size ...passed 00:10:55.236 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:10:55.236 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:10:55.236 Test: blockdev write read max offset ...passed 00:10:55.236 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:10:55.236 Test: blockdev writev readv 8 blocks ...passed 00:10:55.236 Test: blockdev writev readv 30 x 1block ...passed 00:10:55.236 Test: blockdev writev readv block ...passed 00:10:55.236 Test: blockdev writev readv size > 128k ...passed 00:10:55.236 Test: blockdev writev readv size > 128k in two iovs ...passed 00:10:55.236 Test: blockdev comparev and writev ...passed 00:10:55.236 Test: blockdev nvme passthru rw ...passed 00:10:55.236 Test: blockdev nvme passthru vendor specific ...passed 00:10:55.236 Test: blockdev nvme admin passthru ...passed 00:10:55.236 Test: blockdev copy ...passed 00:10:55.236 Suite: bdevio tests on: TestPT 00:10:55.236 Test: blockdev write read block ...passed 00:10:55.236 Test: blockdev write zeroes read block ...passed 
00:10:55.236 Test: blockdev write zeroes read no split ...passed 00:10:55.236 Test: blockdev write zeroes read split ...passed 00:10:55.236 Test: blockdev write zeroes read split partial ...passed 00:10:55.236 Test: blockdev reset ...passed 00:10:55.236 Test: blockdev write read 8 blocks ...passed 00:10:55.236 Test: blockdev write read size > 128k ...passed 00:10:55.236 Test: blockdev write read invalid size ...passed 00:10:55.236 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:10:55.236 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:10:55.236 Test: blockdev write read max offset ...passed 00:10:55.236 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:10:55.236 Test: blockdev writev readv 8 blocks ...passed 00:10:55.236 Test: blockdev writev readv 30 x 1block ...passed 00:10:55.236 Test: blockdev writev readv block ...passed 00:10:55.236 Test: blockdev writev readv size > 128k ...passed 00:10:55.236 Test: blockdev writev readv size > 128k in two iovs ...passed 00:10:55.236 Test: blockdev comparev and writev ...passed 00:10:55.236 Test: blockdev nvme passthru rw ...passed 00:10:55.236 Test: blockdev nvme passthru vendor specific ...passed 00:10:55.236 Test: blockdev nvme admin passthru ...passed 00:10:55.236 Test: blockdev copy ...passed 00:10:55.236 Suite: bdevio tests on: Malloc2p7 00:10:55.236 Test: blockdev write read block ...passed 00:10:55.236 Test: blockdev write zeroes read block ...passed 00:10:55.236 Test: blockdev write zeroes read no split ...passed 00:10:55.236 Test: blockdev write zeroes read split ...passed 00:10:55.236 Test: blockdev write zeroes read split partial ...passed 00:10:55.236 Test: blockdev reset ...passed 00:10:55.236 Test: blockdev write read 8 blocks ...passed 00:10:55.236 Test: blockdev write read size > 128k ...passed 00:10:55.236 Test: blockdev write read invalid size ...passed 00:10:55.236 Test: blockdev write read offset + nbytes == size of blockdev 
...passed 00:10:55.236 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:10:55.236 Test: blockdev write read max offset ...passed 00:10:55.236 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:10:55.236 Test: blockdev writev readv 8 blocks ...passed 00:10:55.236 Test: blockdev writev readv 30 x 1block ...passed 00:10:55.236 Test: blockdev writev readv block ...passed 00:10:55.236 Test: blockdev writev readv size > 128k ...passed 00:10:55.236 Test: blockdev writev readv size > 128k in two iovs ...passed 00:10:55.236 Test: blockdev comparev and writev ...passed 00:10:55.236 Test: blockdev nvme passthru rw ...passed 00:10:55.236 Test: blockdev nvme passthru vendor specific ...passed 00:10:55.236 Test: blockdev nvme admin passthru ...passed 00:10:55.236 Test: blockdev copy ...passed 00:10:55.236 Suite: bdevio tests on: Malloc2p6 00:10:55.236 Test: blockdev write read block ...passed 00:10:55.236 Test: blockdev write zeroes read block ...passed 00:10:55.236 Test: blockdev write zeroes read no split ...passed 00:10:55.236 Test: blockdev write zeroes read split ...passed 00:10:55.236 Test: blockdev write zeroes read split partial ...passed 00:10:55.236 Test: blockdev reset ...passed 00:10:55.236 Test: blockdev write read 8 blocks ...passed 00:10:55.236 Test: blockdev write read size > 128k ...passed 00:10:55.236 Test: blockdev write read invalid size ...passed 00:10:55.236 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:10:55.236 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:10:55.236 Test: blockdev write read max offset ...passed 00:10:55.236 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:10:55.236 Test: blockdev writev readv 8 blocks ...passed 00:10:55.236 Test: blockdev writev readv 30 x 1block ...passed 00:10:55.236 Test: blockdev writev readv block ...passed 00:10:55.236 Test: blockdev writev readv size > 128k ...passed 00:10:55.236 
Test: blockdev writev readv size > 128k in two iovs ...passed 00:10:55.236 Test: blockdev comparev and writev ...passed 00:10:55.236 Test: blockdev nvme passthru rw ...passed 00:10:55.236 Test: blockdev nvme passthru vendor specific ...passed 00:10:55.236 Test: blockdev nvme admin passthru ...passed 00:10:55.236 Test: blockdev copy ...passed 00:10:55.236 Suite: bdevio tests on: Malloc2p5 00:10:55.236 Test: blockdev write read block ...passed 00:10:55.236 Test: blockdev write zeroes read block ...passed 00:10:55.236 Test: blockdev write zeroes read no split ...passed 00:10:55.236 Test: blockdev write zeroes read split ...passed 00:10:55.236 Test: blockdev write zeroes read split partial ...passed 00:10:55.236 Test: blockdev reset ...passed 00:10:55.236 Test: blockdev write read 8 blocks ...passed 00:10:55.236 Test: blockdev write read size > 128k ...passed 00:10:55.236 Test: blockdev write read invalid size ...passed 00:10:55.236 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:10:55.236 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:10:55.236 Test: blockdev write read max offset ...passed 00:10:55.236 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:10:55.236 Test: blockdev writev readv 8 blocks ...passed 00:10:55.236 Test: blockdev writev readv 30 x 1block ...passed 00:10:55.236 Test: blockdev writev readv block ...passed 00:10:55.236 Test: blockdev writev readv size > 128k ...passed 00:10:55.236 Test: blockdev writev readv size > 128k in two iovs ...passed 00:10:55.236 Test: blockdev comparev and writev ...passed 00:10:55.236 Test: blockdev nvme passthru rw ...passed 00:10:55.236 Test: blockdev nvme passthru vendor specific ...passed 00:10:55.236 Test: blockdev nvme admin passthru ...passed 00:10:55.236 Test: blockdev copy ...passed 00:10:55.236 Suite: bdevio tests on: Malloc2p4 00:10:55.236 Test: blockdev write read block ...passed 00:10:55.236 Test: blockdev write zeroes read block 
...passed 00:10:55.236 Test: blockdev write zeroes read no split ...passed 00:10:55.237 Test: blockdev write zeroes read split ...passed 00:10:55.237 Test: blockdev write zeroes read split partial ...passed 00:10:55.237 Test: blockdev reset ...passed 00:10:55.237 Test: blockdev write read 8 blocks ...passed 00:10:55.237 Test: blockdev write read size > 128k ...passed 00:10:55.237 Test: blockdev write read invalid size ...passed 00:10:55.237 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:10:55.237 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:10:55.237 Test: blockdev write read max offset ...passed 00:10:55.237 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:10:55.237 Test: blockdev writev readv 8 blocks ...passed 00:10:55.237 Test: blockdev writev readv 30 x 1block ...passed 00:10:55.237 Test: blockdev writev readv block ...passed 00:10:55.237 Test: blockdev writev readv size > 128k ...passed 00:10:55.237 Test: blockdev writev readv size > 128k in two iovs ...passed 00:10:55.237 Test: blockdev comparev and writev ...passed 00:10:55.237 Test: blockdev nvme passthru rw ...passed 00:10:55.237 Test: blockdev nvme passthru vendor specific ...passed 00:10:55.237 Test: blockdev nvme admin passthru ...passed 00:10:55.237 Test: blockdev copy ...passed 00:10:55.237 Suite: bdevio tests on: Malloc2p3 00:10:55.237 Test: blockdev write read block ...passed 00:10:55.237 Test: blockdev write zeroes read block ...passed 00:10:55.237 Test: blockdev write zeroes read no split ...passed 00:10:55.237 Test: blockdev write zeroes read split ...passed 00:10:55.237 Test: blockdev write zeroes read split partial ...passed 00:10:55.237 Test: blockdev reset ...passed 00:10:55.237 Test: blockdev write read 8 blocks ...passed 00:10:55.237 Test: blockdev write read size > 128k ...passed 00:10:55.237 Test: blockdev write read invalid size ...passed 00:10:55.237 Test: blockdev write read offset + nbytes == size of 
blockdev ...passed 00:10:55.237 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:10:55.237 Test: blockdev write read max offset ...passed 00:10:55.237 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:10:55.237 Test: blockdev writev readv 8 blocks ...passed 00:10:55.237 Test: blockdev writev readv 30 x 1block ...passed 00:10:55.237 Test: blockdev writev readv block ...passed 00:10:55.237 Test: blockdev writev readv size > 128k ...passed 00:10:55.237 Test: blockdev writev readv size > 128k in two iovs ...passed 00:10:55.237 Test: blockdev comparev and writev ...passed 00:10:55.237 Test: blockdev nvme passthru rw ...passed 00:10:55.237 Test: blockdev nvme passthru vendor specific ...passed 00:10:55.237 Test: blockdev nvme admin passthru ...passed 00:10:55.237 Test: blockdev copy ...passed 00:10:55.237 Suite: bdevio tests on: Malloc2p2 00:10:55.237 Test: blockdev write read block ...passed 00:10:55.237 Test: blockdev write zeroes read block ...passed 00:10:55.237 Test: blockdev write zeroes read no split ...passed 00:10:55.237 Test: blockdev write zeroes read split ...passed 00:10:55.237 Test: blockdev write zeroes read split partial ...passed 00:10:55.237 Test: blockdev reset ...passed 00:10:55.237 Test: blockdev write read 8 blocks ...passed 00:10:55.237 Test: blockdev write read size > 128k ...passed 00:10:55.237 Test: blockdev write read invalid size ...passed 00:10:55.237 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:10:55.237 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:10:55.237 Test: blockdev write read max offset ...passed 00:10:55.237 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:10:55.237 Test: blockdev writev readv 8 blocks ...passed 00:10:55.237 Test: blockdev writev readv 30 x 1block ...passed 00:10:55.237 Test: blockdev writev readv block ...passed 00:10:55.237 Test: blockdev writev readv size > 128k ...passed 
00:10:55.237 Test: blockdev writev readv size > 128k in two iovs ...passed 00:10:55.237 Test: blockdev comparev and writev ...passed 00:10:55.237 Test: blockdev nvme passthru rw ...passed 00:10:55.237 Test: blockdev nvme passthru vendor specific ...passed 00:10:55.237 Test: blockdev nvme admin passthru ...passed 00:10:55.237 Test: blockdev copy ...passed 00:10:55.237 Suite: bdevio tests on: Malloc2p1 00:10:55.237 Test: blockdev write read block ...passed 00:10:55.237 Test: blockdev write zeroes read block ...passed 00:10:55.237 Test: blockdev write zeroes read no split ...passed 00:10:55.237 Test: blockdev write zeroes read split ...passed 00:10:55.237 Test: blockdev write zeroes read split partial ...passed 00:10:55.237 Test: blockdev reset ...passed 00:10:55.237 Test: blockdev write read 8 blocks ...passed 00:10:55.237 Test: blockdev write read size > 128k ...passed 00:10:55.237 Test: blockdev write read invalid size ...passed 00:10:55.237 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:10:55.237 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:10:55.237 Test: blockdev write read max offset ...passed 00:10:55.237 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:10:55.237 Test: blockdev writev readv 8 blocks ...passed 00:10:55.237 Test: blockdev writev readv 30 x 1block ...passed 00:10:55.237 Test: blockdev writev readv block ...passed 00:10:55.237 Test: blockdev writev readv size > 128k ...passed 00:10:55.237 Test: blockdev writev readv size > 128k in two iovs ...passed 00:10:55.237 Test: blockdev comparev and writev ...passed 00:10:55.237 Test: blockdev nvme passthru rw ...passed 00:10:55.237 Test: blockdev nvme passthru vendor specific ...passed 00:10:55.237 Test: blockdev nvme admin passthru ...passed 00:10:55.237 Test: blockdev copy ...passed 00:10:55.237 Suite: bdevio tests on: Malloc2p0 00:10:55.237 Test: blockdev write read block ...passed 00:10:55.237 Test: blockdev write 
zeroes read block ...passed 00:10:55.237 Test: blockdev write zeroes read no split ...passed 00:10:55.237 Test: blockdev write zeroes read split ...passed 00:10:55.237 Test: blockdev write zeroes read split partial ...passed 00:10:55.237 Test: blockdev reset ...passed 00:10:55.237 Test: blockdev write read 8 blocks ...passed 00:10:55.237 Test: blockdev write read size > 128k ...passed 00:10:55.237 Test: blockdev write read invalid size ...passed 00:10:55.237 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:10:55.237 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:10:55.237 Test: blockdev write read max offset ...passed 00:10:55.237 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:10:55.237 Test: blockdev writev readv 8 blocks ...passed 00:10:55.237 Test: blockdev writev readv 30 x 1block ...passed 00:10:55.237 Test: blockdev writev readv block ...passed 00:10:55.237 Test: blockdev writev readv size > 128k ...passed 00:10:55.237 Test: blockdev writev readv size > 128k in two iovs ...passed 00:10:55.237 Test: blockdev comparev and writev ...passed 00:10:55.237 Test: blockdev nvme passthru rw ...passed 00:10:55.237 Test: blockdev nvme passthru vendor specific ...passed 00:10:55.237 Test: blockdev nvme admin passthru ...passed 00:10:55.237 Test: blockdev copy ...passed 00:10:55.237 Suite: bdevio tests on: Malloc1p1 00:10:55.237 Test: blockdev write read block ...passed 00:10:55.237 Test: blockdev write zeroes read block ...passed 00:10:55.237 Test: blockdev write zeroes read no split ...passed 00:10:55.497 Test: blockdev write zeroes read split ...passed 00:10:55.497 Test: blockdev write zeroes read split partial ...passed 00:10:55.497 Test: blockdev reset ...passed 00:10:55.497 Test: blockdev write read 8 blocks ...passed 00:10:55.497 Test: blockdev write read size > 128k ...passed 00:10:55.497 Test: blockdev write read invalid size ...passed 00:10:55.497 Test: blockdev write read offset + 
nbytes == size of blockdev ...passed 00:10:55.497 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:10:55.497 Test: blockdev write read max offset ...passed 00:10:55.497 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:10:55.497 Test: blockdev writev readv 8 blocks ...passed 00:10:55.497 Test: blockdev writev readv 30 x 1block ...passed 00:10:55.497 Test: blockdev writev readv block ...passed 00:10:55.497 Test: blockdev writev readv size > 128k ...passed 00:10:55.497 Test: blockdev writev readv size > 128k in two iovs ...passed 00:10:55.497 Test: blockdev comparev and writev ...passed 00:10:55.497 Test: blockdev nvme passthru rw ...passed 00:10:55.497 Test: blockdev nvme passthru vendor specific ...passed 00:10:55.497 Test: blockdev nvme admin passthru ...passed 00:10:55.497 Test: blockdev copy ...passed 00:10:55.497 Suite: bdevio tests on: Malloc1p0 00:10:55.497 Test: blockdev write read block ...passed 00:10:55.497 Test: blockdev write zeroes read block ...passed 00:10:55.497 Test: blockdev write zeroes read no split ...passed 00:10:55.497 Test: blockdev write zeroes read split ...passed 00:10:55.497 Test: blockdev write zeroes read split partial ...passed 00:10:55.497 Test: blockdev reset ...passed 00:10:55.497 Test: blockdev write read 8 blocks ...passed 00:10:55.497 Test: blockdev write read size > 128k ...passed 00:10:55.497 Test: blockdev write read invalid size ...passed 00:10:55.497 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:10:55.497 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:10:55.497 Test: blockdev write read max offset ...passed 00:10:55.497 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:10:55.497 Test: blockdev writev readv 8 blocks ...passed 00:10:55.497 Test: blockdev writev readv 30 x 1block ...passed 00:10:55.497 Test: blockdev writev readv block ...passed 00:10:55.497 Test: blockdev writev readv size > 
128k ...passed 00:10:55.497 Test: blockdev writev readv size > 128k in two iovs ...passed 00:10:55.497 Test: blockdev comparev and writev ...passed 00:10:55.497 Test: blockdev nvme passthru rw ...passed 00:10:55.497 Test: blockdev nvme passthru vendor specific ...passed 00:10:55.497 Test: blockdev nvme admin passthru ...passed 00:10:55.497 Test: blockdev copy ...passed 00:10:55.497 Suite: bdevio tests on: Malloc0 00:10:55.497 Test: blockdev write read block ...passed 00:10:55.497 Test: blockdev write zeroes read block ...passed 00:10:55.497 Test: blockdev write zeroes read no split ...passed 00:10:55.497 Test: blockdev write zeroes read split ...passed 00:10:55.497 Test: blockdev write zeroes read split partial ...passed 00:10:55.497 Test: blockdev reset ...passed 00:10:55.497 Test: blockdev write read 8 blocks ...passed 00:10:55.497 Test: blockdev write read size > 128k ...passed 00:10:55.497 Test: blockdev write read invalid size ...passed 00:10:55.497 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:10:55.497 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:10:55.497 Test: blockdev write read max offset ...passed 00:10:55.497 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:10:55.497 Test: blockdev writev readv 8 blocks ...passed 00:10:55.497 Test: blockdev writev readv 30 x 1block ...passed 00:10:55.497 Test: blockdev writev readv block ...passed 00:10:55.497 Test: blockdev writev readv size > 128k ...passed 00:10:55.497 Test: blockdev writev readv size > 128k in two iovs ...passed 00:10:55.497 Test: blockdev comparev and writev ...passed 00:10:55.497 Test: blockdev nvme passthru rw ...passed 00:10:55.497 Test: blockdev nvme passthru vendor specific ...passed 00:10:55.497 Test: blockdev nvme admin passthru ...passed 00:10:55.497 Test: blockdev copy ...passed 00:10:55.497 00:10:55.497 Run Summary: Type Total Ran Passed Failed Inactive 00:10:55.497 suites 16 16 n/a 0 0 00:10:55.497 
tests 368 368 368 0 0 00:10:55.497 asserts 2224 2224 2224 0 n/a 00:10:55.497 00:10:55.497 Elapsed time = 0.506 seconds 00:10:55.497 0 00:10:55.498 05:39:10 blockdev_general.bdev_bounds -- bdev/blockdev.sh@294 -- # killprocess 1108880 00:10:55.498 05:39:10 blockdev_general.bdev_bounds -- common/autotest_common.sh@948 -- # '[' -z 1108880 ']' 00:10:55.498 05:39:10 blockdev_general.bdev_bounds -- common/autotest_common.sh@952 -- # kill -0 1108880 00:10:55.498 05:39:10 blockdev_general.bdev_bounds -- common/autotest_common.sh@953 -- # uname 00:10:55.498 05:39:10 blockdev_general.bdev_bounds -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:10:55.498 05:39:10 blockdev_general.bdev_bounds -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1108880 00:10:55.498 05:39:10 blockdev_general.bdev_bounds -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:10:55.498 05:39:10 blockdev_general.bdev_bounds -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:10:55.498 05:39:10 blockdev_general.bdev_bounds -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1108880' 00:10:55.498 killing process with pid 1108880 00:10:55.498 05:39:10 blockdev_general.bdev_bounds -- common/autotest_common.sh@967 -- # kill 1108880 00:10:55.498 05:39:10 blockdev_general.bdev_bounds -- common/autotest_common.sh@972 -- # wait 1108880 00:10:55.757 05:39:10 blockdev_general.bdev_bounds -- bdev/blockdev.sh@295 -- # trap - SIGINT SIGTERM EXIT 00:10:55.757 00:10:55.757 real 0m1.718s 00:10:55.757 user 0m4.337s 00:10:55.757 sys 0m0.481s 00:10:55.757 05:39:10 blockdev_general.bdev_bounds -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:55.757 05:39:10 blockdev_general.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:10:55.757 ************************************ 00:10:55.757 END TEST bdev_bounds 00:10:55.757 ************************************ 00:10:55.757 05:39:10 blockdev_general -- common/autotest_common.sh@1142 -- # return 
0 00:10:55.757 05:39:10 blockdev_general -- bdev/blockdev.sh@761 -- # run_test bdev_nbd nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' '' 00:10:55.757 05:39:10 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:10:55.757 05:39:10 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:55.757 05:39:10 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:10:55.757 ************************************ 00:10:55.757 START TEST bdev_nbd 00:10:55.757 ************************************ 00:10:55.757 05:39:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@1123 -- # nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' '' 00:10:55.757 05:39:10 blockdev_general.bdev_nbd -- bdev/blockdev.sh@299 -- # uname -s 00:10:55.757 05:39:10 blockdev_general.bdev_nbd -- bdev/blockdev.sh@299 -- # [[ Linux == Linux ]] 00:10:55.757 05:39:10 blockdev_general.bdev_nbd -- bdev/blockdev.sh@301 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:10:55.757 05:39:10 blockdev_general.bdev_nbd -- bdev/blockdev.sh@302 -- # local conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:10:55.757 05:39:10 blockdev_general.bdev_nbd -- bdev/blockdev.sh@303 -- # bdev_all=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:10:55.757 05:39:10 blockdev_general.bdev_nbd -- bdev/blockdev.sh@303 -- # local bdev_all 00:10:55.757 05:39:10 blockdev_general.bdev_nbd -- bdev/blockdev.sh@304 -- # local bdev_num=16 00:10:55.757 05:39:10 blockdev_general.bdev_nbd -- 
bdev/blockdev.sh@308 -- # [[ -e /sys/module/nbd ]] 00:10:55.757 05:39:10 blockdev_general.bdev_nbd -- bdev/blockdev.sh@310 -- # nbd_all=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:10:55.757 05:39:10 blockdev_general.bdev_nbd -- bdev/blockdev.sh@310 -- # local nbd_all 00:10:55.757 05:39:10 blockdev_general.bdev_nbd -- bdev/blockdev.sh@311 -- # bdev_num=16 00:10:55.757 05:39:10 blockdev_general.bdev_nbd -- bdev/blockdev.sh@313 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:10:55.757 05:39:10 blockdev_general.bdev_nbd -- bdev/blockdev.sh@313 -- # local nbd_list 00:10:56.016 05:39:10 blockdev_general.bdev_nbd -- bdev/blockdev.sh@314 -- # bdev_list=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:10:56.016 05:39:10 blockdev_general.bdev_nbd -- bdev/blockdev.sh@314 -- # local bdev_list 00:10:56.017 05:39:10 blockdev_general.bdev_nbd -- bdev/blockdev.sh@317 -- # nbd_pid=1109089 00:10:56.017 05:39:10 blockdev_general.bdev_nbd -- bdev/blockdev.sh@318 -- # trap 'cleanup; killprocess $nbd_pid' SIGINT SIGTERM EXIT 00:10:56.017 05:39:10 blockdev_general.bdev_nbd -- bdev/blockdev.sh@316 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-nbd.sock -i 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:10:56.017 05:39:10 blockdev_general.bdev_nbd -- bdev/blockdev.sh@319 -- # waitforlisten 1109089 /var/tmp/spdk-nbd.sock 00:10:56.017 05:39:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@829 -- # '[' -z 1109089 ']' 00:10:56.017 05:39:10 
blockdev_general.bdev_nbd -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:10:56.017 05:39:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@834 -- # local max_retries=100 00:10:56.017 05:39:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:10:56.017 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:10:56.017 05:39:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@838 -- # xtrace_disable 00:10:56.017 05:39:10 blockdev_general.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:10:56.017 [2024-07-26 05:39:10.725783] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 00:10:56.017 [2024-07-26 05:39:10.725852] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:10:56.017 [2024-07-26 05:39:10.859016] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:56.275 [2024-07-26 05:39:10.960916] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:56.275 [2024-07-26 05:39:11.111595] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:10:56.275 [2024-07-26 05:39:11.111656] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:10:56.275 [2024-07-26 05:39:11.111677] vbdev_passthru.c: 736:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:10:56.275 [2024-07-26 05:39:11.119604] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:10:56.275 [2024-07-26 05:39:11.119646] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:10:56.275 [2024-07-26 05:39:11.127617] 
bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:10:56.275 [2024-07-26 05:39:11.127655] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:10:56.568 [2024-07-26 05:39:11.200762] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:10:56.568 [2024-07-26 05:39:11.200813] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:10:56.568 [2024-07-26 05:39:11.200837] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x12d6a90 00:10:56.568 [2024-07-26 05:39:11.200855] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:10:56.568 [2024-07-26 05:39:11.202325] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:10:56.568 [2024-07-26 05:39:11.202361] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:10:56.861 05:39:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:10:56.861 05:39:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@862 -- # return 0 00:10:56.861 05:39:11 blockdev_general.bdev_nbd -- bdev/blockdev.sh@321 -- # nbd_rpc_start_stop_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' 00:10:56.861 05:39:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@113 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:10:56.861 05:39:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@114 -- # bdev_list=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:10:56.861 05:39:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@114 -- # local bdev_list 00:10:56.861 05:39:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@116 -- # nbd_start_disks_without_nbd_idx 
/var/tmp/spdk-nbd.sock 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' 00:10:56.861 05:39:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@22 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:10:56.861 05:39:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@23 -- # bdev_list=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:10:56.861 05:39:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@23 -- # local bdev_list 00:10:56.861 05:39:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@24 -- # local i 00:10:56.861 05:39:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@25 -- # local nbd_device 00:10:56.861 05:39:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i = 0 )) 00:10:56.861 05:39:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:10:56.861 05:39:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 00:10:57.119 05:39:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd0 00:10:57.119 05:39:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd0 00:10:57.119 05:39:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd0 00:10:57.119 05:39:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:10:57.119 05:39:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:10:57.119 05:39:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:10:57.119 05:39:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:10:57.119 05:39:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 
00:10:57.119 05:39:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:10:57.119 05:39:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:10:57.119 05:39:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:10:57.119 05:39:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:10:57.119 1+0 records in 00:10:57.119 1+0 records out 00:10:57.119 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000270534 s, 15.1 MB/s 00:10:57.119 05:39:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:57.119 05:39:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:10:57.120 05:39:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:57.120 05:39:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:10:57.120 05:39:11 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:10:57.120 05:39:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:10:57.120 05:39:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:10:57.120 05:39:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1p0 00:10:57.378 05:39:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd1 00:10:57.378 05:39:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd1 00:10:57.378 05:39:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd1 00:10:57.378 05:39:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local 
nbd_name=nbd1 00:10:57.378 05:39:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:10:57.378 05:39:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:10:57.378 05:39:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:10:57.378 05:39:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:10:57.378 05:39:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:10:57.378 05:39:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:10:57.378 05:39:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:10:57.378 05:39:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:10:57.378 1+0 records in 00:10:57.378 1+0 records out 00:10:57.378 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000272288 s, 15.0 MB/s 00:10:57.378 05:39:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:57.378 05:39:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:10:57.378 05:39:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:57.378 05:39:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:10:57.378 05:39:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:10:57.378 05:39:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:10:57.378 05:39:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:10:57.378 05:39:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-nbd.sock nbd_start_disk Malloc1p1 00:10:57.636 05:39:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd2 00:10:57.636 05:39:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd2 00:10:57.636 05:39:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd2 00:10:57.636 05:39:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd2 00:10:57.636 05:39:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:10:57.636 05:39:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:10:57.636 05:39:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:10:57.636 05:39:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd2 /proc/partitions 00:10:57.636 05:39:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:10:57.636 05:39:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:10:57.636 05:39:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:10:57.636 05:39:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd2 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:10:57.636 1+0 records in 00:10:57.636 1+0 records out 00:10:57.636 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000345354 s, 11.9 MB/s 00:10:57.636 05:39:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:57.636 05:39:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:10:57.636 05:39:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:57.636 05:39:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 
00:10:57.636 05:39:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:10:57.636 05:39:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:10:57.636 05:39:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:10:57.636 05:39:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p0 00:10:57.894 05:39:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd3 00:10:57.894 05:39:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd3 00:10:57.894 05:39:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd3 00:10:57.894 05:39:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd3 00:10:57.894 05:39:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:10:57.894 05:39:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:10:57.894 05:39:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:10:57.894 05:39:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd3 /proc/partitions 00:10:57.894 05:39:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:10:57.894 05:39:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:10:57.894 05:39:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:10:57.894 05:39:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd3 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:10:57.894 1+0 records in 00:10:57.894 1+0 records out 00:10:57.894 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00037 s, 11.1 MB/s 00:10:57.894 05:39:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:57.894 05:39:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:10:57.894 05:39:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:57.894 05:39:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:10:57.894 05:39:12 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:10:57.894 05:39:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:10:57.894 05:39:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:10:57.894 05:39:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p1 00:10:58.153 05:39:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd4 00:10:58.153 05:39:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd4 00:10:58.411 05:39:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd4 00:10:58.411 05:39:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd4 00:10:58.411 05:39:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:10:58.411 05:39:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:10:58.411 05:39:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:10:58.411 05:39:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd4 /proc/partitions 00:10:58.411 05:39:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:10:58.411 05:39:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:10:58.411 05:39:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 
00:10:58.411 05:39:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd4 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:10:58.411 1+0 records in 00:10:58.411 1+0 records out 00:10:58.411 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000423867 s, 9.7 MB/s 00:10:58.411 05:39:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:58.411 05:39:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:10:58.411 05:39:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:58.411 05:39:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:10:58.411 05:39:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:10:58.411 05:39:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:10:58.411 05:39:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:10:58.411 05:39:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p2 00:10:58.669 05:39:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd5 00:10:58.669 05:39:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd5 00:10:58.669 05:39:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd5 00:10:58.669 05:39:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd5 00:10:58.669 05:39:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:10:58.669 05:39:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:10:58.669 05:39:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i 
<= 20 )) 00:10:58.669 05:39:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd5 /proc/partitions 00:10:58.669 05:39:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:10:58.669 05:39:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:10:58.669 05:39:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:10:58.669 05:39:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd5 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:10:58.669 1+0 records in 00:10:58.669 1+0 records out 00:10:58.669 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000348456 s, 11.8 MB/s 00:10:58.669 05:39:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:58.669 05:39:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:10:58.670 05:39:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:58.670 05:39:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:10:58.670 05:39:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:10:58.670 05:39:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:10:58.670 05:39:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:10:58.670 05:39:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p3 00:10:58.928 05:39:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd6 00:10:58.928 05:39:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd6 00:10:58.928 05:39:13 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@30 -- # waitfornbd nbd6 00:10:58.928 05:39:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd6 00:10:58.928 05:39:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:10:58.928 05:39:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:10:58.928 05:39:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:10:58.928 05:39:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd6 /proc/partitions 00:10:58.928 05:39:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:10:58.928 05:39:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:10:58.928 05:39:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:10:58.928 05:39:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd6 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:10:58.928 1+0 records in 00:10:58.928 1+0 records out 00:10:58.928 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000386718 s, 10.6 MB/s 00:10:58.928 05:39:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:58.928 05:39:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:10:58.928 05:39:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:58.928 05:39:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:10:58.928 05:39:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:10:58.928 05:39:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:10:58.928 05:39:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:10:58.928 
05:39:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p4 00:10:59.186 05:39:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd7 00:10:59.186 05:39:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd7 00:10:59.186 05:39:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd7 00:10:59.186 05:39:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd7 00:10:59.186 05:39:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:10:59.186 05:39:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:10:59.186 05:39:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:10:59.186 05:39:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd7 /proc/partitions 00:10:59.186 05:39:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:10:59.186 05:39:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:10:59.186 05:39:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:10:59.186 05:39:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd7 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:10:59.186 1+0 records in 00:10:59.186 1+0 records out 00:10:59.186 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000469962 s, 8.7 MB/s 00:10:59.186 05:39:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:59.186 05:39:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:10:59.186 05:39:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:59.186 05:39:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:10:59.186 05:39:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:10:59.186 05:39:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:10:59.186 05:39:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:10:59.186 05:39:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p5 00:10:59.444 05:39:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd8 00:10:59.444 05:39:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd8 00:10:59.444 05:39:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd8 00:10:59.444 05:39:14 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd8 00:10:59.444 05:39:14 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:10:59.444 05:39:14 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:10:59.444 05:39:14 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:10:59.444 05:39:14 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd8 /proc/partitions 00:10:59.444 05:39:14 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:10:59.444 05:39:14 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:10:59.444 05:39:14 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:10:59.444 05:39:14 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd8 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:10:59.444 1+0 records in 00:10:59.444 1+0 records out 
00:10:59.444 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000513512 s, 8.0 MB/s 00:10:59.444 05:39:14 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:59.444 05:39:14 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:10:59.445 05:39:14 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:59.445 05:39:14 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:10:59.445 05:39:14 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:10:59.445 05:39:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:10:59.445 05:39:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:10:59.445 05:39:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p6 00:10:59.702 05:39:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd9 00:10:59.702 05:39:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd9 00:10:59.702 05:39:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd9 00:10:59.702 05:39:14 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd9 00:10:59.702 05:39:14 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:10:59.702 05:39:14 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:10:59.702 05:39:14 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:10:59.702 05:39:14 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd9 /proc/partitions 00:10:59.702 05:39:14 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:10:59.702 05:39:14 
blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:10:59.702 05:39:14 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:10:59.702 05:39:14 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd9 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:10:59.702 1+0 records in 00:10:59.702 1+0 records out 00:10:59.702 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000449903 s, 9.1 MB/s 00:10:59.702 05:39:14 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:59.702 05:39:14 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:10:59.702 05:39:14 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:59.702 05:39:14 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:10:59.702 05:39:14 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:10:59.702 05:39:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:10:59.702 05:39:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:10:59.702 05:39:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p7 00:10:59.960 05:39:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd10 00:10:59.960 05:39:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd10 00:10:59.960 05:39:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd10 00:10:59.960 05:39:14 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd10 00:10:59.960 05:39:14 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 
00:10:59.960 05:39:14 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:10:59.960 05:39:14 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:10:59.960 05:39:14 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd10 /proc/partitions 00:10:59.960 05:39:14 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:10:59.960 05:39:14 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:10:59.960 05:39:14 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:10:59.960 05:39:14 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd10 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:10:59.960 1+0 records in 00:10:59.960 1+0 records out 00:10:59.960 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000588322 s, 7.0 MB/s 00:10:59.960 05:39:14 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:59.960 05:39:14 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:10:59.960 05:39:14 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:59.960 05:39:14 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:10:59.960 05:39:14 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:10:59.960 05:39:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:10:59.960 05:39:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:10:59.960 05:39:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk TestPT 00:11:00.218 05:39:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 
-- # nbd_device=/dev/nbd11 00:11:00.218 05:39:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd11 00:11:00.218 05:39:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd11 00:11:00.218 05:39:15 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd11 00:11:00.218 05:39:15 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:11:00.218 05:39:15 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:11:00.218 05:39:15 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:11:00.218 05:39:15 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd11 /proc/partitions 00:11:00.218 05:39:15 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:11:00.218 05:39:15 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:11:00.218 05:39:15 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:11:00.218 05:39:15 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd11 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:11:00.218 1+0 records in 00:11:00.219 1+0 records out 00:11:00.219 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000569823 s, 7.2 MB/s 00:11:00.219 05:39:15 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:11:00.219 05:39:15 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:11:00.219 05:39:15 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:11:00.219 05:39:15 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:11:00.219 05:39:15 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:11:00.219 
05:39:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:11:00.219 05:39:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:11:00.219 05:39:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk raid0 00:11:00.476 05:39:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd12 00:11:00.477 05:39:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd12 00:11:00.477 05:39:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd12 00:11:00.477 05:39:15 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd12 00:11:00.477 05:39:15 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:11:00.477 05:39:15 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:11:00.477 05:39:15 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:11:00.477 05:39:15 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd12 /proc/partitions 00:11:00.477 05:39:15 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:11:00.477 05:39:15 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:11:00.477 05:39:15 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:11:00.477 05:39:15 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd12 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:11:00.477 1+0 records in 00:11:00.477 1+0 records out 00:11:00.477 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000648264 s, 6.3 MB/s 00:11:00.477 05:39:15 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:11:00.477 05:39:15 blockdev_general.bdev_nbd 
-- common/autotest_common.sh@884 -- # size=4096 00:11:00.477 05:39:15 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:11:00.477 05:39:15 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:11:00.477 05:39:15 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:11:00.477 05:39:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:11:00.477 05:39:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:11:00.477 05:39:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk concat0 00:11:00.736 05:39:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd13 00:11:00.736 05:39:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd13 00:11:00.736 05:39:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd13 00:11:00.736 05:39:15 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd13 00:11:00.736 05:39:15 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:11:00.736 05:39:15 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:11:00.736 05:39:15 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:11:00.736 05:39:15 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd13 /proc/partitions 00:11:00.736 05:39:15 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:11:00.736 05:39:15 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:11:00.736 05:39:15 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:11:00.736 05:39:15 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd13 
of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:11:00.736 1+0 records in 00:11:00.736 1+0 records out 00:11:00.736 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000724005 s, 5.7 MB/s 00:11:00.736 05:39:15 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:11:00.736 05:39:15 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:11:00.736 05:39:15 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:11:00.736 05:39:15 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:11:00.736 05:39:15 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:11:00.736 05:39:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:11:00.736 05:39:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:11:00.736 05:39:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk raid1 00:11:00.994 05:39:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd14 00:11:00.994 05:39:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd14 00:11:00.994 05:39:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd14 00:11:00.994 05:39:15 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd14 00:11:00.994 05:39:15 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:11:00.994 05:39:15 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:11:00.994 05:39:15 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:11:00.994 05:39:15 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q 
-w nbd14 /proc/partitions 00:11:00.994 05:39:15 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:11:00.994 05:39:15 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:11:00.994 05:39:15 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:11:00.994 05:39:15 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd14 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:11:00.994 1+0 records in 00:11:00.994 1+0 records out 00:11:00.994 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000602777 s, 6.8 MB/s 00:11:00.994 05:39:15 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:11:01.252 05:39:15 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:11:01.252 05:39:15 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:11:01.252 05:39:15 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:11:01.252 05:39:15 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:11:01.252 05:39:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:11:01.252 05:39:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:11:01.252 05:39:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk AIO0 00:11:01.510 05:39:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd15 00:11:01.510 05:39:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd15 00:11:01.510 05:39:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd15 00:11:01.510 05:39:16 blockdev_general.bdev_nbd -- 
common/autotest_common.sh@866 -- # local nbd_name=nbd15 00:11:01.510 05:39:16 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:11:01.510 05:39:16 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:11:01.510 05:39:16 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:11:01.510 05:39:16 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd15 /proc/partitions 00:11:01.510 05:39:16 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:11:01.510 05:39:16 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:11:01.510 05:39:16 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:11:01.510 05:39:16 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd15 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:11:01.510 1+0 records in 00:11:01.510 1+0 records out 00:11:01.510 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000817734 s, 5.0 MB/s 00:11:01.510 05:39:16 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:11:01.510 05:39:16 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:11:01.510 05:39:16 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:11:01.510 05:39:16 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:11:01.510 05:39:16 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:11:01.510 05:39:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:11:01.510 05:39:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:11:01.510 05:39:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@118 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:11:01.510 05:39:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@118 -- # nbd_disks_json='[ 00:11:01.510 { 00:11:01.510 "nbd_device": "/dev/nbd0", 00:11:01.510 "bdev_name": "Malloc0" 00:11:01.510 }, 00:11:01.510 { 00:11:01.510 "nbd_device": "/dev/nbd1", 00:11:01.510 "bdev_name": "Malloc1p0" 00:11:01.510 }, 00:11:01.510 { 00:11:01.510 "nbd_device": "/dev/nbd2", 00:11:01.510 "bdev_name": "Malloc1p1" 00:11:01.510 }, 00:11:01.510 { 00:11:01.510 "nbd_device": "/dev/nbd3", 00:11:01.510 "bdev_name": "Malloc2p0" 00:11:01.510 }, 00:11:01.510 { 00:11:01.510 "nbd_device": "/dev/nbd4", 00:11:01.510 "bdev_name": "Malloc2p1" 00:11:01.510 }, 00:11:01.510 { 00:11:01.510 "nbd_device": "/dev/nbd5", 00:11:01.510 "bdev_name": "Malloc2p2" 00:11:01.510 }, 00:11:01.510 { 00:11:01.510 "nbd_device": "/dev/nbd6", 00:11:01.510 "bdev_name": "Malloc2p3" 00:11:01.510 }, 00:11:01.510 { 00:11:01.510 "nbd_device": "/dev/nbd7", 00:11:01.510 "bdev_name": "Malloc2p4" 00:11:01.510 }, 00:11:01.510 { 00:11:01.510 "nbd_device": "/dev/nbd8", 00:11:01.510 "bdev_name": "Malloc2p5" 00:11:01.510 }, 00:11:01.510 { 00:11:01.510 "nbd_device": "/dev/nbd9", 00:11:01.510 "bdev_name": "Malloc2p6" 00:11:01.510 }, 00:11:01.510 { 00:11:01.510 "nbd_device": "/dev/nbd10", 00:11:01.510 "bdev_name": "Malloc2p7" 00:11:01.510 }, 00:11:01.510 { 00:11:01.510 "nbd_device": "/dev/nbd11", 00:11:01.510 "bdev_name": "TestPT" 00:11:01.510 }, 00:11:01.510 { 00:11:01.510 "nbd_device": "/dev/nbd12", 00:11:01.510 "bdev_name": "raid0" 00:11:01.510 }, 00:11:01.510 { 00:11:01.510 "nbd_device": "/dev/nbd13", 00:11:01.510 "bdev_name": "concat0" 00:11:01.510 }, 00:11:01.510 { 00:11:01.510 "nbd_device": "/dev/nbd14", 00:11:01.510 "bdev_name": "raid1" 00:11:01.510 }, 00:11:01.510 { 00:11:01.510 "nbd_device": "/dev/nbd15", 00:11:01.510 "bdev_name": "AIO0" 00:11:01.510 } 00:11:01.510 ]' 00:11:01.510 05:39:16 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@119 -- # nbd_disks_name=($(echo "${nbd_disks_json}" | jq -r '.[] | .nbd_device')) 00:11:01.510 05:39:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@119 -- # echo '[ 00:11:01.510 { 00:11:01.510 "nbd_device": "/dev/nbd0", 00:11:01.510 "bdev_name": "Malloc0" 00:11:01.510 }, 00:11:01.510 { 00:11:01.510 "nbd_device": "/dev/nbd1", 00:11:01.510 "bdev_name": "Malloc1p0" 00:11:01.510 }, 00:11:01.510 { 00:11:01.510 "nbd_device": "/dev/nbd2", 00:11:01.510 "bdev_name": "Malloc1p1" 00:11:01.510 }, 00:11:01.510 { 00:11:01.510 "nbd_device": "/dev/nbd3", 00:11:01.510 "bdev_name": "Malloc2p0" 00:11:01.510 }, 00:11:01.510 { 00:11:01.511 "nbd_device": "/dev/nbd4", 00:11:01.511 "bdev_name": "Malloc2p1" 00:11:01.511 }, 00:11:01.511 { 00:11:01.511 "nbd_device": "/dev/nbd5", 00:11:01.511 "bdev_name": "Malloc2p2" 00:11:01.511 }, 00:11:01.511 { 00:11:01.511 "nbd_device": "/dev/nbd6", 00:11:01.511 "bdev_name": "Malloc2p3" 00:11:01.511 }, 00:11:01.511 { 00:11:01.511 "nbd_device": "/dev/nbd7", 00:11:01.511 "bdev_name": "Malloc2p4" 00:11:01.511 }, 00:11:01.511 { 00:11:01.511 "nbd_device": "/dev/nbd8", 00:11:01.511 "bdev_name": "Malloc2p5" 00:11:01.511 }, 00:11:01.511 { 00:11:01.511 "nbd_device": "/dev/nbd9", 00:11:01.511 "bdev_name": "Malloc2p6" 00:11:01.511 }, 00:11:01.511 { 00:11:01.511 "nbd_device": "/dev/nbd10", 00:11:01.511 "bdev_name": "Malloc2p7" 00:11:01.511 }, 00:11:01.511 { 00:11:01.511 "nbd_device": "/dev/nbd11", 00:11:01.511 "bdev_name": "TestPT" 00:11:01.511 }, 00:11:01.511 { 00:11:01.511 "nbd_device": "/dev/nbd12", 00:11:01.511 "bdev_name": "raid0" 00:11:01.511 }, 00:11:01.511 { 00:11:01.511 "nbd_device": "/dev/nbd13", 00:11:01.511 "bdev_name": "concat0" 00:11:01.511 }, 00:11:01.511 { 00:11:01.511 "nbd_device": "/dev/nbd14", 00:11:01.511 "bdev_name": "raid1" 00:11:01.511 }, 00:11:01.511 { 00:11:01.511 "nbd_device": "/dev/nbd15", 00:11:01.511 "bdev_name": "AIO0" 00:11:01.511 } 00:11:01.511 ]' 00:11:01.511 05:39:16 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@119 -- # jq -r '.[] | .nbd_device' 00:11:01.511 05:39:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@120 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15' 00:11:01.511 05:39:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:11:01.511 05:39:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15') 00:11:01.511 05:39:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:11:01.511 05:39:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:11:01.511 05:39:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:11:01.511 05:39:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:11:01.769 05:39:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:11:01.769 05:39:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:11:01.769 05:39:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:11:01.770 05:39:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:11:01.770 05:39:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:11:01.770 05:39:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:11:01.770 05:39:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:11:01.770 05:39:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:11:01.770 05:39:16 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:11:01.770 05:39:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:11:02.033 05:39:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:11:02.033 05:39:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:11:02.033 05:39:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:11:02.033 05:39:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:11:02.033 05:39:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:11:02.033 05:39:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:11:02.033 05:39:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:11:02.033 05:39:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:11:02.033 05:39:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:11:02.033 05:39:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd2 00:11:02.292 05:39:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd2 00:11:02.292 05:39:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd2 00:11:02.292 05:39:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd2 00:11:02.292 05:39:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:11:02.292 05:39:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:11:02.292 05:39:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd2 /proc/partitions 00:11:02.292 05:39:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 
00:11:02.292 05:39:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:11:02.292 05:39:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:11:02.292 05:39:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd3 00:11:02.551 05:39:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd3 00:11:02.551 05:39:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd3 00:11:02.551 05:39:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd3 00:11:02.551 05:39:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:11:02.551 05:39:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:11:02.551 05:39:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd3 /proc/partitions 00:11:02.551 05:39:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:11:02.551 05:39:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:11:02.551 05:39:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:11:02.551 05:39:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd4 00:11:02.809 05:39:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd4 00:11:02.809 05:39:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd4 00:11:02.809 05:39:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd4 00:11:02.809 05:39:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:11:02.809 05:39:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:11:02.809 05:39:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w 
nbd4 /proc/partitions 00:11:02.809 05:39:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:11:02.809 05:39:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:11:02.809 05:39:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:11:02.809 05:39:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd5 00:11:03.067 05:39:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd5 00:11:03.067 05:39:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd5 00:11:03.067 05:39:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd5 00:11:03.067 05:39:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:11:03.067 05:39:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:11:03.067 05:39:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd5 /proc/partitions 00:11:03.067 05:39:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:11:03.067 05:39:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:11:03.067 05:39:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:11:03.067 05:39:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd6 00:11:03.331 05:39:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd6 00:11:03.331 05:39:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd6 00:11:03.331 05:39:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd6 00:11:03.331 05:39:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:11:03.331 05:39:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 
-- # (( i <= 20 )) 00:11:03.331 05:39:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd6 /proc/partitions 00:11:03.331 05:39:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:11:03.331 05:39:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:11:03.331 05:39:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:11:03.331 05:39:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd7 00:11:03.590 05:39:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd7 00:11:03.590 05:39:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd7 00:11:03.590 05:39:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd7 00:11:03.590 05:39:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:11:03.590 05:39:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:11:03.590 05:39:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd7 /proc/partitions 00:11:03.590 05:39:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:11:03.590 05:39:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:11:03.590 05:39:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:11:03.590 05:39:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd8 00:11:03.848 05:39:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd8 00:11:03.848 05:39:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd8 00:11:03.848 05:39:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd8 00:11:03.848 05:39:18 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:11:03.848 05:39:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:11:03.848 05:39:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd8 /proc/partitions 00:11:03.848 05:39:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:11:03.848 05:39:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:11:03.848 05:39:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:11:03.848 05:39:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd9 00:11:03.848 05:39:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd9 00:11:03.848 05:39:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd9 00:11:03.849 05:39:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd9 00:11:03.849 05:39:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:11:03.849 05:39:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:11:03.849 05:39:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd9 /proc/partitions 00:11:03.849 05:39:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:11:03.849 05:39:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:11:03.849 05:39:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:11:03.849 05:39:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd10 00:11:04.107 05:39:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd10 00:11:04.107 05:39:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd10 00:11:04.107 05:39:18 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd10 00:11:04.107 05:39:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:11:04.107 05:39:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:11:04.107 05:39:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd10 /proc/partitions 00:11:04.107 05:39:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:11:04.107 05:39:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:11:04.107 05:39:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:11:04.107 05:39:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd11 00:11:04.366 05:39:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd11 00:11:04.366 05:39:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd11 00:11:04.366 05:39:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd11 00:11:04.366 05:39:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:11:04.366 05:39:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:11:04.366 05:39:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd11 /proc/partitions 00:11:04.366 05:39:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:11:04.366 05:39:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:11:04.366 05:39:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:11:04.366 05:39:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd12 00:11:04.625 05:39:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd12 
00:11:04.625 05:39:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd12 00:11:04.625 05:39:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd12 00:11:04.625 05:39:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:11:04.625 05:39:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:11:04.625 05:39:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd12 /proc/partitions 00:11:04.625 05:39:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:11:04.625 05:39:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:11:04.625 05:39:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:11:04.626 05:39:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd13 00:11:04.885 05:39:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd13 00:11:04.885 05:39:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd13 00:11:04.885 05:39:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd13 00:11:04.885 05:39:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:11:04.885 05:39:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:11:04.885 05:39:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd13 /proc/partitions 00:11:04.885 05:39:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:11:04.885 05:39:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:11:04.885 05:39:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:11:04.885 05:39:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock 
nbd_stop_disk /dev/nbd14 00:11:04.885 05:39:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd14 00:11:04.885 05:39:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd14 00:11:04.885 05:39:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd14 00:11:04.885 05:39:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:11:04.885 05:39:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:11:04.885 05:39:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd14 /proc/partitions 00:11:04.885 05:39:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:11:04.885 05:39:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:11:04.885 05:39:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:11:04.885 05:39:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd15 00:11:05.145 05:39:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd15 00:11:05.145 05:39:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd15 00:11:05.145 05:39:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd15 00:11:05.145 05:39:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:11:05.145 05:39:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:11:05.145 05:39:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd15 /proc/partitions 00:11:05.145 05:39:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:11:05.145 05:39:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:11:05.145 05:39:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@122 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:11:05.145 05:39:19 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:11:05.145 05:39:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:11:05.404 05:39:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:11:05.404 05:39:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:11:05.404 05:39:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:11:05.404 05:39:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:11:05.405 05:39:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:11:05.405 05:39:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:11:05.405 05:39:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:11:05.405 05:39:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:11:05.405 05:39:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:11:05.405 05:39:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@122 -- # count=0 00:11:05.405 05:39:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@123 -- # '[' 0 -ne 0 ']' 00:11:05.405 05:39:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@127 -- # return 0 00:11:05.405 05:39:20 blockdev_general.bdev_nbd -- bdev/blockdev.sh@322 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' 00:11:05.405 05:39:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:11:05.405 05:39:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@91 -- # 
bdev_list=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:11:05.405 05:39:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@91 -- # local bdev_list 00:11:05.405 05:39:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:11:05.405 05:39:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@92 -- # local nbd_list 00:11:05.405 05:39:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' 00:11:05.405 05:39:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:11:05.405 05:39:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:11:05.405 05:39:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@10 -- # local bdev_list 00:11:05.405 05:39:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:11:05.405 05:39:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@11 -- # local nbd_list 00:11:05.405 05:39:20 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@12 -- # local i 00:11:05.405 05:39:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:11:05.405 05:39:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:11:05.405 05:39:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:11:05.664 /dev/nbd0 00:11:05.664 05:39:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:11:05.664 05:39:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:11:05.664 05:39:20 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:11:05.664 05:39:20 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:11:05.664 05:39:20 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:11:05.664 05:39:20 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:11:05.664 05:39:20 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:11:05.664 05:39:20 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:11:05.664 05:39:20 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:11:05.664 05:39:20 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:11:05.664 05:39:20 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:11:05.664 1+0 records in 00:11:05.664 1+0 records out 00:11:05.664 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000186785 s, 21.9 MB/s 00:11:05.664 05:39:20 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:11:05.664 05:39:20 blockdev_general.bdev_nbd -- 
common/autotest_common.sh@884 -- # size=4096 00:11:05.664 05:39:20 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:11:05.664 05:39:20 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:11:05.664 05:39:20 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:11:05.664 05:39:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:11:05.664 05:39:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:11:05.664 05:39:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1p0 /dev/nbd1 00:11:05.924 /dev/nbd1 00:11:05.924 05:39:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:11:05.924 05:39:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:11:05.924 05:39:20 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:11:05.924 05:39:20 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:11:05.924 05:39:20 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:11:05.924 05:39:20 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:11:05.924 05:39:20 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:11:05.924 05:39:20 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:11:05.924 05:39:20 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:11:05.924 05:39:20 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:11:05.924 05:39:20 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 
iflag=direct 00:11:05.924 1+0 records in 00:11:05.924 1+0 records out 00:11:05.924 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00028708 s, 14.3 MB/s 00:11:05.924 05:39:20 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:11:05.924 05:39:20 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:11:05.924 05:39:20 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:11:05.924 05:39:20 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:11:05.924 05:39:20 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:11:05.924 05:39:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:11:05.924 05:39:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:11:05.924 05:39:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1p1 /dev/nbd10 00:11:06.183 /dev/nbd10 00:11:06.183 05:39:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd10 00:11:06.183 05:39:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd10 00:11:06.183 05:39:21 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd10 00:11:06.183 05:39:21 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:11:06.183 05:39:21 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:11:06.183 05:39:21 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:11:06.183 05:39:21 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd10 /proc/partitions 00:11:06.183 05:39:21 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:11:06.183 05:39:21 
blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:11:06.183 05:39:21 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:11:06.183 05:39:21 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd10 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:11:06.442 1+0 records in 00:11:06.442 1+0 records out 00:11:06.442 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000317147 s, 12.9 MB/s 00:11:06.442 05:39:21 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:11:06.442 05:39:21 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:11:06.442 05:39:21 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:11:06.442 05:39:21 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:11:06.442 05:39:21 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:11:06.442 05:39:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:11:06.442 05:39:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:11:06.442 05:39:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p0 /dev/nbd11 00:11:06.442 /dev/nbd11 00:11:06.701 05:39:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd11 00:11:06.701 05:39:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd11 00:11:06.701 05:39:21 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd11 00:11:06.701 05:39:21 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:11:06.701 05:39:21 blockdev_general.bdev_nbd -- 
common/autotest_common.sh@869 -- # (( i = 1 )) 00:11:06.701 05:39:21 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:11:06.701 05:39:21 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd11 /proc/partitions 00:11:06.701 05:39:21 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:11:06.701 05:39:21 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:11:06.701 05:39:21 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:11:06.701 05:39:21 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd11 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:11:06.701 1+0 records in 00:11:06.701 1+0 records out 00:11:06.701 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000324735 s, 12.6 MB/s 00:11:06.701 05:39:21 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:11:06.701 05:39:21 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:11:06.701 05:39:21 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:11:06.701 05:39:21 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:11:06.701 05:39:21 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:11:06.701 05:39:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:11:06.701 05:39:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:11:06.701 05:39:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p1 /dev/nbd12 00:11:06.960 /dev/nbd12 00:11:06.960 05:39:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # 
basename /dev/nbd12 00:11:06.960 05:39:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd12 00:11:06.960 05:39:21 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd12 00:11:06.960 05:39:21 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:11:06.960 05:39:21 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:11:06.960 05:39:21 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:11:06.960 05:39:21 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd12 /proc/partitions 00:11:06.960 05:39:21 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:11:06.960 05:39:21 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:11:06.960 05:39:21 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:11:06.960 05:39:21 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd12 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:11:06.960 1+0 records in 00:11:06.960 1+0 records out 00:11:06.961 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000376239 s, 10.9 MB/s 00:11:06.961 05:39:21 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:11:06.961 05:39:21 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:11:06.961 05:39:21 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:11:06.961 05:39:21 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:11:06.961 05:39:21 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:11:06.961 05:39:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:11:06.961 05:39:21 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:11:06.961 05:39:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p2 /dev/nbd13 00:11:07.219 /dev/nbd13 00:11:07.219 05:39:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd13 00:11:07.219 05:39:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd13 00:11:07.219 05:39:21 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd13 00:11:07.219 05:39:21 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:11:07.219 05:39:21 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:11:07.219 05:39:21 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:11:07.219 05:39:21 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd13 /proc/partitions 00:11:07.219 05:39:21 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:11:07.219 05:39:21 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:11:07.219 05:39:21 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:11:07.219 05:39:21 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd13 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:11:07.219 1+0 records in 00:11:07.219 1+0 records out 00:11:07.220 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000372911 s, 11.0 MB/s 00:11:07.220 05:39:21 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:11:07.220 05:39:21 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:11:07.220 05:39:21 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:11:07.220 05:39:21 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:11:07.220 05:39:21 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:11:07.220 05:39:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:11:07.220 05:39:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:11:07.220 05:39:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p3 /dev/nbd14 00:11:07.479 /dev/nbd14 00:11:07.479 05:39:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd14 00:11:07.479 05:39:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd14 00:11:07.479 05:39:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd14 00:11:07.479 05:39:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:11:07.479 05:39:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:11:07.479 05:39:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:11:07.479 05:39:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd14 /proc/partitions 00:11:07.479 05:39:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:11:07.479 05:39:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:11:07.479 05:39:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:11:07.479 05:39:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd14 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:11:07.479 1+0 records in 00:11:07.479 1+0 records out 00:11:07.479 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000383441 s, 
10.7 MB/s 00:11:07.479 05:39:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:11:07.479 05:39:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:11:07.479 05:39:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:11:07.479 05:39:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:11:07.479 05:39:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:11:07.479 05:39:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:11:07.479 05:39:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:11:07.479 05:39:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p4 /dev/nbd15 00:11:07.739 /dev/nbd15 00:11:07.739 05:39:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd15 00:11:07.739 05:39:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd15 00:11:07.739 05:39:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd15 00:11:07.739 05:39:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:11:07.739 05:39:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:11:07.739 05:39:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:11:07.739 05:39:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd15 /proc/partitions 00:11:07.739 05:39:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:11:07.739 05:39:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:11:07.739 05:39:22 blockdev_general.bdev_nbd -- 
common/autotest_common.sh@882 -- # (( i <= 20 )) 00:11:07.739 05:39:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd15 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:11:07.739 1+0 records in 00:11:07.739 1+0 records out 00:11:07.739 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000368962 s, 11.1 MB/s 00:11:07.739 05:39:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:11:07.739 05:39:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:11:07.739 05:39:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:11:07.739 05:39:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:11:07.739 05:39:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:11:07.739 05:39:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:11:07.739 05:39:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:11:07.739 05:39:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p5 /dev/nbd2 00:11:07.998 /dev/nbd2 00:11:07.998 05:39:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd2 00:11:07.998 05:39:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd2 00:11:07.998 05:39:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd2 00:11:07.998 05:39:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:11:07.998 05:39:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:11:07.998 05:39:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 
00:11:07.998 05:39:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd2 /proc/partitions 00:11:07.998 05:39:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:11:07.998 05:39:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:11:07.998 05:39:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:11:07.998 05:39:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd2 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:11:07.998 1+0 records in 00:11:07.998 1+0 records out 00:11:07.998 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000434562 s, 9.4 MB/s 00:11:07.998 05:39:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:11:07.998 05:39:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:11:07.998 05:39:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:11:07.998 05:39:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:11:07.998 05:39:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:11:07.998 05:39:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:11:07.998 05:39:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:11:07.998 05:39:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p6 /dev/nbd3 00:11:08.256 /dev/nbd3 00:11:08.256 05:39:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd3 00:11:08.256 05:39:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd3 00:11:08.256 05:39:22 
blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd3 00:11:08.256 05:39:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:11:08.256 05:39:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:11:08.256 05:39:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:11:08.256 05:39:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd3 /proc/partitions 00:11:08.256 05:39:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:11:08.257 05:39:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:11:08.257 05:39:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:11:08.257 05:39:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd3 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:11:08.257 1+0 records in 00:11:08.257 1+0 records out 00:11:08.257 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000627951 s, 6.5 MB/s 00:11:08.257 05:39:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:11:08.257 05:39:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:11:08.257 05:39:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:11:08.257 05:39:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:11:08.257 05:39:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:11:08.257 05:39:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:11:08.257 05:39:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:11:08.257 05:39:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p7 /dev/nbd4 00:11:08.515 /dev/nbd4 00:11:08.515 05:39:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd4 00:11:08.515 05:39:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd4 00:11:08.515 05:39:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd4 00:11:08.515 05:39:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:11:08.515 05:39:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:11:08.515 05:39:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:11:08.515 05:39:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd4 /proc/partitions 00:11:08.515 05:39:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:11:08.515 05:39:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:11:08.515 05:39:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:11:08.515 05:39:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd4 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:11:08.515 1+0 records in 00:11:08.515 1+0 records out 00:11:08.515 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000621146 s, 6.6 MB/s 00:11:08.515 05:39:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:11:08.515 05:39:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:11:08.515 05:39:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:11:08.515 05:39:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 
00:11:08.515 05:39:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:11:08.515 05:39:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:11:08.515 05:39:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:11:08.515 05:39:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk TestPT /dev/nbd5 00:11:08.773 /dev/nbd5 00:11:08.773 05:39:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd5 00:11:08.773 05:39:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd5 00:11:08.773 05:39:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd5 00:11:08.773 05:39:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:11:08.773 05:39:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:11:08.773 05:39:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:11:08.773 05:39:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd5 /proc/partitions 00:11:08.773 05:39:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:11:08.773 05:39:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:11:08.773 05:39:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:11:08.773 05:39:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd5 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:11:08.773 1+0 records in 00:11:08.773 1+0 records out 00:11:08.773 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000711568 s, 5.8 MB/s 00:11:08.773 05:39:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 
00:11:08.773 05:39:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:11:08.773 05:39:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:11:08.773 05:39:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:11:08.773 05:39:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:11:08.773 05:39:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:11:08.773 05:39:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:11:08.773 05:39:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk raid0 /dev/nbd6 00:11:09.032 /dev/nbd6 00:11:09.032 05:39:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd6 00:11:09.032 05:39:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd6 00:11:09.032 05:39:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd6 00:11:09.032 05:39:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:11:09.032 05:39:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:11:09.032 05:39:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:11:09.032 05:39:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd6 /proc/partitions 00:11:09.032 05:39:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:11:09.032 05:39:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:11:09.032 05:39:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:11:09.032 05:39:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd6 
of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:11:09.032 1+0 records in 00:11:09.032 1+0 records out 00:11:09.032 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000699195 s, 5.9 MB/s 00:11:09.032 05:39:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:11:09.032 05:39:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:11:09.032 05:39:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:11:09.032 05:39:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:11:09.032 05:39:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:11:09.032 05:39:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:11:09.032 05:39:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:11:09.032 05:39:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk concat0 /dev/nbd7 00:11:09.290 /dev/nbd7 00:11:09.290 05:39:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd7 00:11:09.290 05:39:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd7 00:11:09.290 05:39:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd7 00:11:09.290 05:39:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:11:09.290 05:39:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:11:09.290 05:39:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:11:09.290 05:39:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd7 /proc/partitions 00:11:09.290 05:39:24 
blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:11:09.290 05:39:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:11:09.290 05:39:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:11:09.290 05:39:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd7 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:11:09.290 1+0 records in 00:11:09.290 1+0 records out 00:11:09.290 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000559272 s, 7.3 MB/s 00:11:09.290 05:39:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:11:09.290 05:39:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:11:09.290 05:39:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:11:09.290 05:39:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:11:09.290 05:39:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:11:09.290 05:39:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:11:09.290 05:39:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:11:09.290 05:39:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk raid1 /dev/nbd8 00:11:09.549 /dev/nbd8 00:11:09.549 05:39:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd8 00:11:09.549 05:39:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd8 00:11:09.549 05:39:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd8 00:11:09.549 05:39:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # 
local i 00:11:09.549 05:39:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:11:09.549 05:39:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:11:09.549 05:39:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd8 /proc/partitions 00:11:09.549 05:39:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:11:09.549 05:39:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:11:09.549 05:39:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:11:09.549 05:39:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd8 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:11:09.549 1+0 records in 00:11:09.549 1+0 records out 00:11:09.549 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000609157 s, 6.7 MB/s 00:11:09.549 05:39:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:11:09.549 05:39:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:11:09.549 05:39:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:11:09.549 05:39:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:11:09.549 05:39:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:11:09.549 05:39:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:11:09.549 05:39:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:11:09.549 05:39:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk AIO0 /dev/nbd9 00:11:09.807 /dev/nbd9 00:11:09.807 05:39:24 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd9 00:11:09.807 05:39:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd9 00:11:09.807 05:39:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd9 00:11:09.807 05:39:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:11:09.807 05:39:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:11:09.807 05:39:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:11:09.807 05:39:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd9 /proc/partitions 00:11:09.807 05:39:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:11:09.807 05:39:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:11:09.807 05:39:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:11:09.807 05:39:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd9 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:11:09.807 1+0 records in 00:11:09.807 1+0 records out 00:11:09.807 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000622852 s, 6.6 MB/s 00:11:09.807 05:39:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:11:09.807 05:39:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:11:09.807 05:39:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:11:09.807 05:39:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:11:09.807 05:39:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:11:09.807 05:39:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 
-- # (( i++ )) 00:11:09.807 05:39:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:11:09.807 05:39:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:11:09.808 05:39:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:11:09.808 05:39:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:11:10.066 05:39:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:11:10.066 { 00:11:10.066 "nbd_device": "/dev/nbd0", 00:11:10.066 "bdev_name": "Malloc0" 00:11:10.066 }, 00:11:10.066 { 00:11:10.066 "nbd_device": "/dev/nbd1", 00:11:10.066 "bdev_name": "Malloc1p0" 00:11:10.066 }, 00:11:10.066 { 00:11:10.066 "nbd_device": "/dev/nbd10", 00:11:10.066 "bdev_name": "Malloc1p1" 00:11:10.066 }, 00:11:10.066 { 00:11:10.066 "nbd_device": "/dev/nbd11", 00:11:10.066 "bdev_name": "Malloc2p0" 00:11:10.066 }, 00:11:10.066 { 00:11:10.066 "nbd_device": "/dev/nbd12", 00:11:10.066 "bdev_name": "Malloc2p1" 00:11:10.067 }, 00:11:10.067 { 00:11:10.067 "nbd_device": "/dev/nbd13", 00:11:10.067 "bdev_name": "Malloc2p2" 00:11:10.067 }, 00:11:10.067 { 00:11:10.067 "nbd_device": "/dev/nbd14", 00:11:10.067 "bdev_name": "Malloc2p3" 00:11:10.067 }, 00:11:10.067 { 00:11:10.067 "nbd_device": "/dev/nbd15", 00:11:10.067 "bdev_name": "Malloc2p4" 00:11:10.067 }, 00:11:10.067 { 00:11:10.067 "nbd_device": "/dev/nbd2", 00:11:10.067 "bdev_name": "Malloc2p5" 00:11:10.067 }, 00:11:10.067 { 00:11:10.067 "nbd_device": "/dev/nbd3", 00:11:10.067 "bdev_name": "Malloc2p6" 00:11:10.067 }, 00:11:10.067 { 00:11:10.067 "nbd_device": "/dev/nbd4", 00:11:10.067 "bdev_name": "Malloc2p7" 00:11:10.067 }, 00:11:10.067 { 00:11:10.067 "nbd_device": "/dev/nbd5", 00:11:10.067 "bdev_name": "TestPT" 00:11:10.067 }, 00:11:10.067 { 00:11:10.067 "nbd_device": "/dev/nbd6", 00:11:10.067 
"bdev_name": "raid0" 00:11:10.067 }, 00:11:10.067 { 00:11:10.067 "nbd_device": "/dev/nbd7", 00:11:10.067 "bdev_name": "concat0" 00:11:10.067 }, 00:11:10.067 { 00:11:10.067 "nbd_device": "/dev/nbd8", 00:11:10.067 "bdev_name": "raid1" 00:11:10.067 }, 00:11:10.067 { 00:11:10.067 "nbd_device": "/dev/nbd9", 00:11:10.067 "bdev_name": "AIO0" 00:11:10.067 } 00:11:10.067 ]' 00:11:10.067 05:39:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[ 00:11:10.067 { 00:11:10.067 "nbd_device": "/dev/nbd0", 00:11:10.067 "bdev_name": "Malloc0" 00:11:10.067 }, 00:11:10.067 { 00:11:10.067 "nbd_device": "/dev/nbd1", 00:11:10.067 "bdev_name": "Malloc1p0" 00:11:10.067 }, 00:11:10.067 { 00:11:10.067 "nbd_device": "/dev/nbd10", 00:11:10.067 "bdev_name": "Malloc1p1" 00:11:10.067 }, 00:11:10.067 { 00:11:10.067 "nbd_device": "/dev/nbd11", 00:11:10.067 "bdev_name": "Malloc2p0" 00:11:10.067 }, 00:11:10.067 { 00:11:10.067 "nbd_device": "/dev/nbd12", 00:11:10.067 "bdev_name": "Malloc2p1" 00:11:10.067 }, 00:11:10.067 { 00:11:10.067 "nbd_device": "/dev/nbd13", 00:11:10.067 "bdev_name": "Malloc2p2" 00:11:10.067 }, 00:11:10.067 { 00:11:10.067 "nbd_device": "/dev/nbd14", 00:11:10.067 "bdev_name": "Malloc2p3" 00:11:10.067 }, 00:11:10.067 { 00:11:10.067 "nbd_device": "/dev/nbd15", 00:11:10.067 "bdev_name": "Malloc2p4" 00:11:10.067 }, 00:11:10.067 { 00:11:10.067 "nbd_device": "/dev/nbd2", 00:11:10.067 "bdev_name": "Malloc2p5" 00:11:10.067 }, 00:11:10.067 { 00:11:10.067 "nbd_device": "/dev/nbd3", 00:11:10.067 "bdev_name": "Malloc2p6" 00:11:10.067 }, 00:11:10.067 { 00:11:10.067 "nbd_device": "/dev/nbd4", 00:11:10.067 "bdev_name": "Malloc2p7" 00:11:10.067 }, 00:11:10.067 { 00:11:10.067 "nbd_device": "/dev/nbd5", 00:11:10.067 "bdev_name": "TestPT" 00:11:10.067 }, 00:11:10.067 { 00:11:10.067 "nbd_device": "/dev/nbd6", 00:11:10.067 "bdev_name": "raid0" 00:11:10.067 }, 00:11:10.067 { 00:11:10.067 "nbd_device": "/dev/nbd7", 00:11:10.067 "bdev_name": "concat0" 00:11:10.067 }, 00:11:10.067 { 
00:11:10.067 "nbd_device": "/dev/nbd8", 00:11:10.067 "bdev_name": "raid1" 00:11:10.067 }, 00:11:10.067 { 00:11:10.067 "nbd_device": "/dev/nbd9", 00:11:10.067 "bdev_name": "AIO0" 00:11:10.067 } 00:11:10.067 ]' 00:11:10.067 05:39:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:11:10.067 05:39:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:11:10.067 /dev/nbd1 00:11:10.067 /dev/nbd10 00:11:10.067 /dev/nbd11 00:11:10.067 /dev/nbd12 00:11:10.067 /dev/nbd13 00:11:10.067 /dev/nbd14 00:11:10.067 /dev/nbd15 00:11:10.067 /dev/nbd2 00:11:10.067 /dev/nbd3 00:11:10.067 /dev/nbd4 00:11:10.067 /dev/nbd5 00:11:10.067 /dev/nbd6 00:11:10.067 /dev/nbd7 00:11:10.067 /dev/nbd8 00:11:10.067 /dev/nbd9' 00:11:10.067 05:39:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:11:10.067 /dev/nbd1 00:11:10.067 /dev/nbd10 00:11:10.067 /dev/nbd11 00:11:10.067 /dev/nbd12 00:11:10.067 /dev/nbd13 00:11:10.067 /dev/nbd14 00:11:10.067 /dev/nbd15 00:11:10.067 /dev/nbd2 00:11:10.067 /dev/nbd3 00:11:10.067 /dev/nbd4 00:11:10.067 /dev/nbd5 00:11:10.067 /dev/nbd6 00:11:10.067 /dev/nbd7 00:11:10.067 /dev/nbd8 00:11:10.067 /dev/nbd9' 00:11:10.067 05:39:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:11:10.067 05:39:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=16 00:11:10.067 05:39:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 16 00:11:10.067 05:39:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@95 -- # count=16 00:11:10.067 05:39:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@96 -- # '[' 16 -ne 16 ']' 00:11:10.067 05:39:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' write 00:11:10.067 05:39:24 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:11:10.067 05:39:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:11:10.067 05:39:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=write 00:11:10.067 05:39:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:11:10.067 05:39:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:11:10.067 05:39:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest bs=4096 count=256 00:11:10.067 256+0 records in 00:11:10.067 256+0 records out 00:11:10.067 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0115032 s, 91.2 MB/s 00:11:10.067 05:39:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:11:10.067 05:39:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:11:10.354 256+0 records in 00:11:10.354 256+0 records out 00:11:10.354 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.181911 s, 5.8 MB/s 00:11:10.354 05:39:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:11:10.354 05:39:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:11:10.629 256+0 records in 00:11:10.629 256+0 records out 00:11:10.629 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.174315 s, 6.0 MB/s 00:11:10.629 05:39:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 
00:11:10.629 05:39:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd10 bs=4096 count=256 oflag=direct 00:11:10.629 256+0 records in 00:11:10.629 256+0 records out 00:11:10.629 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.176413 s, 5.9 MB/s 00:11:10.629 05:39:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:11:10.629 05:39:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd11 bs=4096 count=256 oflag=direct 00:11:10.887 256+0 records in 00:11:10.887 256+0 records out 00:11:10.887 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.18422 s, 5.7 MB/s 00:11:10.887 05:39:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:11:10.887 05:39:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd12 bs=4096 count=256 oflag=direct 00:11:11.144 256+0 records in 00:11:11.144 256+0 records out 00:11:11.144 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.121633 s, 8.6 MB/s 00:11:11.144 05:39:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:11:11.144 05:39:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd13 bs=4096 count=256 oflag=direct 00:11:11.144 256+0 records in 00:11:11.144 256+0 records out 00:11:11.144 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.136615 s, 7.7 MB/s 00:11:11.144 05:39:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:11:11.144 05:39:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd14 bs=4096 count=256 oflag=direct 00:11:11.403 256+0 records in 00:11:11.403 256+0 
records out 00:11:11.403 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.153592 s, 6.8 MB/s 00:11:11.403 05:39:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:11:11.403 05:39:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd15 bs=4096 count=256 oflag=direct 00:11:11.403 256+0 records in 00:11:11.403 256+0 records out 00:11:11.403 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.183542 s, 5.7 MB/s 00:11:11.403 05:39:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:11:11.403 05:39:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd2 bs=4096 count=256 oflag=direct 00:11:11.660 256+0 records in 00:11:11.660 256+0 records out 00:11:11.660 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.183096 s, 5.7 MB/s 00:11:11.660 05:39:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:11:11.660 05:39:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd3 bs=4096 count=256 oflag=direct 00:11:11.918 256+0 records in 00:11:11.918 256+0 records out 00:11:11.918 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.184124 s, 5.7 MB/s 00:11:11.918 05:39:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:11:11.918 05:39:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd4 bs=4096 count=256 oflag=direct 00:11:12.176 256+0 records in 00:11:12.176 256+0 records out 00:11:12.176 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.183835 s, 5.7 MB/s 00:11:12.176 05:39:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:11:12.176 05:39:26 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd5 bs=4096 count=256 oflag=direct 00:11:12.176 256+0 records in 00:11:12.176 256+0 records out 00:11:12.176 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.174566 s, 6.0 MB/s 00:11:12.176 05:39:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:11:12.176 05:39:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd6 bs=4096 count=256 oflag=direct 00:11:12.434 256+0 records in 00:11:12.434 256+0 records out 00:11:12.434 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.110191 s, 9.5 MB/s 00:11:12.434 05:39:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:11:12.434 05:39:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd7 bs=4096 count=256 oflag=direct 00:11:12.434 256+0 records in 00:11:12.434 256+0 records out 00:11:12.434 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.185129 s, 5.7 MB/s 00:11:12.692 05:39:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:11:12.692 05:39:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd8 bs=4096 count=256 oflag=direct 00:11:12.692 256+0 records in 00:11:12.692 256+0 records out 00:11:12.692 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.102627 s, 10.2 MB/s 00:11:12.692 05:39:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:11:12.692 05:39:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd9 bs=4096 count=256 oflag=direct 00:11:12.951 256+0 records in 00:11:12.951 256+0 records out 00:11:12.951 1048576 bytes (1.0 MB, 1.0 
MiB) copied, 0.15113 s, 6.9 MB/s 00:11:12.951 05:39:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' verify 00:11:12.951 05:39:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:11:12.951 05:39:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:11:12.951 05:39:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=verify 00:11:12.951 05:39:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:11:12.951 05:39:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:11:12.951 05:39:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:11:12.951 05:39:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:11:12.951 05:39:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd0 00:11:12.951 05:39:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:11:12.951 05:39:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd1 00:11:12.951 05:39:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:11:12.951 05:39:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd10 00:11:12.951 
05:39:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:11:12.951 05:39:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd11 00:11:12.951 05:39:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:11:12.951 05:39:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd12 00:11:12.951 05:39:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:11:12.951 05:39:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd13 00:11:12.951 05:39:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:11:12.951 05:39:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd14 00:11:12.951 05:39:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:11:12.951 05:39:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd15 00:11:12.951 05:39:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:11:12.951 05:39:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd2 00:11:12.951 05:39:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:11:12.951 05:39:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd3 00:11:12.951 05:39:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for 
i in "${nbd_list[@]}" 00:11:12.951 05:39:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd4 00:11:12.951 05:39:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:11:12.951 05:39:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd5 00:11:12.951 05:39:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:11:12.951 05:39:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd6 00:11:12.951 05:39:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:11:12.951 05:39:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd7 00:11:12.951 05:39:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:11:12.951 05:39:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd8 00:11:12.951 05:39:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:11:12.951 05:39:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd9 00:11:12.951 05:39:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:11:12.951 05:39:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 
/dev/nbd9' 00:11:12.951 05:39:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:11:12.951 05:39:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:11:12.951 05:39:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:11:12.951 05:39:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:11:12.951 05:39:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:11:12.951 05:39:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:11:13.210 05:39:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:11:13.210 05:39:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:11:13.210 05:39:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:11:13.210 05:39:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:11:13.210 05:39:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:11:13.210 05:39:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:11:13.210 05:39:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:11:13.210 05:39:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:11:13.210 05:39:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:11:13.210 05:39:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:11:13.468 05:39:28 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:11:13.468 05:39:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:11:13.468 05:39:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:11:13.468 05:39:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:11:13.468 05:39:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:11:13.468 05:39:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:11:13.468 05:39:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:11:13.468 05:39:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:11:13.468 05:39:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:11:13.468 05:39:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd10 00:11:13.727 05:39:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd10 00:11:13.727 05:39:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd10 00:11:13.727 05:39:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd10 00:11:13.727 05:39:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:11:13.727 05:39:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:11:13.727 05:39:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd10 /proc/partitions 00:11:13.727 05:39:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:11:13.727 05:39:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:11:13.727 05:39:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:11:13.727 05:39:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd11 00:11:13.985 05:39:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd11 00:11:13.985 05:39:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd11 00:11:13.986 05:39:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd11 00:11:13.986 05:39:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:11:13.986 05:39:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:11:13.986 05:39:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd11 /proc/partitions 00:11:13.986 05:39:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:11:13.986 05:39:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:11:13.986 05:39:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:11:13.986 05:39:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd12 00:11:14.244 05:39:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd12 00:11:14.244 05:39:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd12 00:11:14.244 05:39:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd12 00:11:14.244 05:39:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:11:14.244 05:39:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:11:14.244 05:39:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd12 /proc/partitions 00:11:14.244 05:39:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:11:14.244 05:39:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:11:14.244 05:39:29 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:11:14.244 05:39:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd13 00:11:14.502 05:39:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd13 00:11:14.502 05:39:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd13 00:11:14.502 05:39:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd13 00:11:14.502 05:39:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:11:14.502 05:39:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:11:14.502 05:39:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd13 /proc/partitions 00:11:14.502 05:39:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:11:14.502 05:39:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:11:14.502 05:39:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:11:14.502 05:39:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd14 00:11:14.760 05:39:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd14 00:11:14.760 05:39:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd14 00:11:14.760 05:39:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd14 00:11:14.760 05:39:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:11:14.760 05:39:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:11:14.760 05:39:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd14 /proc/partitions 00:11:14.760 05:39:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:11:14.760 05:39:29 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:11:14.760 05:39:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:11:14.760 05:39:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd15 00:11:15.019 05:39:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd15 00:11:15.019 05:39:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd15 00:11:15.019 05:39:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd15 00:11:15.019 05:39:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:11:15.019 05:39:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:11:15.019 05:39:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd15 /proc/partitions 00:11:15.019 05:39:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:11:15.019 05:39:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:11:15.019 05:39:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:11:15.019 05:39:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd2 00:11:15.277 05:39:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd2 00:11:15.277 05:39:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd2 00:11:15.277 05:39:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd2 00:11:15.277 05:39:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:11:15.277 05:39:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:11:15.277 05:39:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd2 
/proc/partitions 00:11:15.277 05:39:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:11:15.277 05:39:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:11:15.277 05:39:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:11:15.277 05:39:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd3 00:11:15.536 05:39:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd3 00:11:15.536 05:39:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd3 00:11:15.536 05:39:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd3 00:11:15.536 05:39:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:11:15.536 05:39:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:11:15.536 05:39:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd3 /proc/partitions 00:11:15.536 05:39:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:11:15.536 05:39:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:11:15.536 05:39:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:11:15.536 05:39:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd4 00:11:15.795 05:39:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd4 00:11:15.795 05:39:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd4 00:11:15.795 05:39:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd4 00:11:15.795 05:39:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:11:15.795 05:39:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # 
(( i <= 20 )) 00:11:15.795 05:39:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd4 /proc/partitions 00:11:15.795 05:39:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:11:15.795 05:39:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:11:15.795 05:39:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:11:15.795 05:39:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd5 00:11:16.053 05:39:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd5 00:11:16.053 05:39:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd5 00:11:16.053 05:39:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd5 00:11:16.053 05:39:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:11:16.053 05:39:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:11:16.053 05:39:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd5 /proc/partitions 00:11:16.053 05:39:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:11:16.053 05:39:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:11:16.053 05:39:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:11:16.053 05:39:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd6 00:11:16.053 05:39:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd6 00:11:16.312 05:39:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd6 00:11:16.312 05:39:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd6 00:11:16.312 05:39:30 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:11:16.312 05:39:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:11:16.312 05:39:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd6 /proc/partitions 00:11:16.312 05:39:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:11:16.312 05:39:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:11:16.312 05:39:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:11:16.312 05:39:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd7 00:11:16.570 05:39:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd7 00:11:16.570 05:39:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd7 00:11:16.570 05:39:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd7 00:11:16.570 05:39:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:11:16.570 05:39:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:11:16.570 05:39:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd7 /proc/partitions 00:11:16.570 05:39:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:11:16.570 05:39:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:11:16.570 05:39:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:11:16.570 05:39:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd8 00:11:16.828 05:39:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd8 00:11:16.828 05:39:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd8 00:11:16.828 05:39:31 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd8 00:11:16.828 05:39:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:11:16.828 05:39:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:11:16.828 05:39:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd8 /proc/partitions 00:11:16.828 05:39:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:11:16.828 05:39:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:11:16.828 05:39:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:11:16.828 05:39:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd9 00:11:17.087 05:39:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd9 00:11:17.087 05:39:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd9 00:11:17.087 05:39:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd9 00:11:17.087 05:39:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:11:17.087 05:39:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:11:17.087 05:39:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd9 /proc/partitions 00:11:17.087 05:39:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:11:17.087 05:39:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:11:17.087 05:39:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:11:17.087 05:39:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:11:17.087 05:39:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock 
nbd_get_disks 00:11:17.345 05:39:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:11:17.345 05:39:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:11:17.345 05:39:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:11:17.345 05:39:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:11:17.345 05:39:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:11:17.345 05:39:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:11:17.345 05:39:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:11:17.345 05:39:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:11:17.345 05:39:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:11:17.345 05:39:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@104 -- # count=0 00:11:17.345 05:39:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:11:17.345 05:39:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@109 -- # return 0 00:11:17.345 05:39:32 blockdev_general.bdev_nbd -- bdev/blockdev.sh@323 -- # nbd_with_lvol_verify /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' 00:11:17.345 05:39:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@131 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:11:17.345 05:39:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@132 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:11:17.345 05:39:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@132 -- # local nbd_list 00:11:17.345 05:39:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@133 -- # local mkfs_ret 00:11:17.345 
05:39:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@135 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create -b malloc_lvol_verify 16 512 00:11:17.345 malloc_lvol_verify 00:11:17.345 05:39:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@136 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create_lvstore malloc_lvol_verify lvs 00:11:17.603 ca699aaf-5bfd-4fa9-b4b6-62a6e4553793 00:11:17.603 05:39:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@137 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create lvol 4 -l lvs 00:11:17.861 ede7e396-695f-47e7-afdd-92c32a179c61 00:11:17.861 05:39:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@138 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk lvs/lvol /dev/nbd0 00:11:17.861 /dev/nbd0 00:11:18.119 05:39:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@140 -- # mkfs.ext4 /dev/nbd0 00:11:18.119 mke2fs 1.46.5 (30-Dec-2021) 00:11:18.119 Discarding device blocks: 0/4096 done 00:11:18.119 Creating filesystem with 4096 1k blocks and 1024 inodes 00:11:18.119 00:11:18.119 Allocating group tables: 0/1 done 00:11:18.119 Writing inode tables: 0/1 done 00:11:18.119 Creating journal (1024 blocks): done 00:11:18.119 Writing superblocks and filesystem accounting information: 0/1 done 00:11:18.119 00:11:18.119 05:39:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@141 -- # mkfs_ret=0 00:11:18.119 05:39:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@142 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock /dev/nbd0 00:11:18.119 05:39:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:11:18.119 05:39:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:11:18.119 05:39:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # 
local nbd_list 00:11:18.119 05:39:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:11:18.119 05:39:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:11:18.119 05:39:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:11:18.378 05:39:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:11:18.378 05:39:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:11:18.378 05:39:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:11:18.378 05:39:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:11:18.378 05:39:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:11:18.378 05:39:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:11:18.378 05:39:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@39 -- # sleep 0.1 00:11:18.378 05:39:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i++ )) 00:11:18.378 05:39:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:11:18.378 05:39:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:11:18.378 05:39:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:11:18.378 05:39:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:11:18.378 05:39:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@143 -- # '[' 0 -ne 0 ']' 00:11:18.378 05:39:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@147 -- # return 0 00:11:18.378 05:39:33 blockdev_general.bdev_nbd -- bdev/blockdev.sh@325 -- # killprocess 1109089 00:11:18.378 05:39:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@948 -- # '[' -z 1109089 ']' 00:11:18.378 05:39:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@952 
-- # kill -0 1109089 00:11:18.378 05:39:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@953 -- # uname 00:11:18.378 05:39:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:11:18.378 05:39:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1109089 00:11:18.378 05:39:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:11:18.378 05:39:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:11:18.378 05:39:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1109089' 00:11:18.378 killing process with pid 1109089 00:11:18.378 05:39:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@967 -- # kill 1109089 00:11:18.378 05:39:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@972 -- # wait 1109089 00:11:18.946 05:39:33 blockdev_general.bdev_nbd -- bdev/blockdev.sh@326 -- # trap - SIGINT SIGTERM EXIT 00:11:18.946 00:11:18.946 real 0m22.904s 00:11:18.946 user 0m27.676s 00:11:18.946 sys 0m13.403s 00:11:18.946 05:39:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@1124 -- # xtrace_disable 00:11:18.946 05:39:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:11:18.946 ************************************ 00:11:18.946 END TEST bdev_nbd 00:11:18.946 ************************************ 00:11:18.946 05:39:33 blockdev_general -- common/autotest_common.sh@1142 -- # return 0 00:11:18.946 05:39:33 blockdev_general -- bdev/blockdev.sh@762 -- # [[ y == y ]] 00:11:18.946 05:39:33 blockdev_general -- bdev/blockdev.sh@763 -- # '[' bdev = nvme ']' 00:11:18.946 05:39:33 blockdev_general -- bdev/blockdev.sh@763 -- # '[' bdev = gpt ']' 00:11:18.946 05:39:33 blockdev_general -- bdev/blockdev.sh@767 -- # run_test bdev_fio fio_test_suite '' 00:11:18.946 05:39:33 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 
00:11:18.946 05:39:33 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:11:18.946 05:39:33 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:11:18.946 ************************************ 00:11:18.946 START TEST bdev_fio 00:11:18.946 ************************************ 00:11:18.946 05:39:33 blockdev_general.bdev_fio -- common/autotest_common.sh@1123 -- # fio_test_suite '' 00:11:18.946 05:39:33 blockdev_general.bdev_fio -- bdev/blockdev.sh@330 -- # local env_context 00:11:18.946 05:39:33 blockdev_general.bdev_fio -- bdev/blockdev.sh@334 -- # pushd /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:11:18.946 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev /var/jenkins/workspace/crypto-phy-autotest/spdk 00:11:18.946 05:39:33 blockdev_general.bdev_fio -- bdev/blockdev.sh@335 -- # trap 'rm -f ./*.state; popd; exit 1' SIGINT SIGTERM EXIT 00:11:18.946 05:39:33 blockdev_general.bdev_fio -- bdev/blockdev.sh@338 -- # echo '' 00:11:18.946 05:39:33 blockdev_general.bdev_fio -- bdev/blockdev.sh@338 -- # sed s/--env-context=// 00:11:18.946 05:39:33 blockdev_general.bdev_fio -- bdev/blockdev.sh@338 -- # env_context= 00:11:18.946 05:39:33 blockdev_general.bdev_fio -- bdev/blockdev.sh@339 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio verify AIO '' 00:11:18.946 05:39:33 blockdev_general.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:11:18.946 05:39:33 blockdev_general.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=verify 00:11:18.946 05:39:33 blockdev_general.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type=AIO 00:11:18.946 05:39:33 blockdev_general.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:11:18.946 05:39:33 blockdev_general.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:11:18.946 05:39:33 
blockdev_general.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:11:18.946 05:39:33 blockdev_general.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z verify ']' 00:11:18.946 05:39:33 blockdev_general.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:11:18.946 05:39:33 blockdev_general.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:11:18.946 05:39:33 blockdev_general.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:11:18.946 05:39:33 blockdev_general.bdev_fio -- common/autotest_common.sh@1313 -- # '[' verify == verify ']' 00:11:18.946 05:39:33 blockdev_general.bdev_fio -- common/autotest_common.sh@1314 -- # cat 00:11:18.946 05:39:33 blockdev_general.bdev_fio -- common/autotest_common.sh@1323 -- # '[' AIO == AIO ']' 00:11:18.947 05:39:33 blockdev_general.bdev_fio -- common/autotest_common.sh@1324 -- # /usr/src/fio/fio --version 00:11:18.947 05:39:33 blockdev_general.bdev_fio -- common/autotest_common.sh@1324 -- # [[ fio-3.35 == *\f\i\o\-\3* ]] 00:11:18.947 05:39:33 blockdev_general.bdev_fio -- common/autotest_common.sh@1325 -- # echo serialize_overlap=1 00:11:18.947 05:39:33 blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:11:18.947 05:39:33 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_Malloc0]' 00:11:18.947 05:39:33 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=Malloc0 00:11:18.947 05:39:33 blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:11:18.947 05:39:33 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_Malloc1p0]' 00:11:18.947 05:39:33 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=Malloc1p0 00:11:18.947 05:39:33 blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:11:18.947 05:39:33 
blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_Malloc1p1]' 00:11:18.947 05:39:33 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=Malloc1p1 00:11:18.947 05:39:33 blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:11:18.947 05:39:33 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_Malloc2p0]' 00:11:18.947 05:39:33 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=Malloc2p0 00:11:18.947 05:39:33 blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:11:18.947 05:39:33 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_Malloc2p1]' 00:11:18.947 05:39:33 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=Malloc2p1 00:11:18.947 05:39:33 blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:11:18.947 05:39:33 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_Malloc2p2]' 00:11:18.947 05:39:33 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=Malloc2p2 00:11:18.947 05:39:33 blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:11:18.947 05:39:33 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_Malloc2p3]' 00:11:18.947 05:39:33 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=Malloc2p3 00:11:18.947 05:39:33 blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:11:18.947 05:39:33 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_Malloc2p4]' 00:11:18.947 05:39:33 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=Malloc2p4 00:11:18.947 05:39:33 blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:11:18.947 05:39:33 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_Malloc2p5]' 00:11:18.947 05:39:33 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo 
filename=Malloc2p5 00:11:18.947 05:39:33 blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:11:18.947 05:39:33 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_Malloc2p6]' 00:11:18.947 05:39:33 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=Malloc2p6 00:11:18.947 05:39:33 blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:11:18.947 05:39:33 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_Malloc2p7]' 00:11:18.947 05:39:33 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=Malloc2p7 00:11:18.947 05:39:33 blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:11:18.947 05:39:33 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_TestPT]' 00:11:18.947 05:39:33 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=TestPT 00:11:18.947 05:39:33 blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:11:18.947 05:39:33 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_raid0]' 00:11:18.947 05:39:33 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=raid0 00:11:18.947 05:39:33 blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:11:18.947 05:39:33 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_concat0]' 00:11:18.947 05:39:33 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=concat0 00:11:18.947 05:39:33 blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:11:18.947 05:39:33 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_raid1]' 00:11:18.947 05:39:33 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=raid1 00:11:18.947 05:39:33 blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:11:18.947 05:39:33 blockdev_general.bdev_fio -- 
bdev/blockdev.sh@341 -- # echo '[job_AIO0]' 00:11:18.947 05:39:33 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=AIO0 00:11:18.947 05:39:33 blockdev_general.bdev_fio -- bdev/blockdev.sh@346 -- # local 'fio_params=--ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json' 00:11:18.947 05:39:33 blockdev_general.bdev_fio -- bdev/blockdev.sh@348 -- # run_test bdev_fio_rw_verify fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:11:18.947 05:39:33 blockdev_general.bdev_fio -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:11:18.947 05:39:33 blockdev_general.bdev_fio -- common/autotest_common.sh@1105 -- # xtrace_disable 00:11:18.947 05:39:33 blockdev_general.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:11:18.947 ************************************ 00:11:18.947 START TEST bdev_fio_rw_verify 00:11:18.947 ************************************ 00:11:18.947 05:39:33 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1123 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:11:18.947 05:39:33 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:11:18.947 05:39:33 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:11:18.947 05:39:33 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:11:18.947 05:39:33 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # local sanitizers 00:11:18.947 05:39:33 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:11:18.947 05:39:33 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # shift 00:11:18.947 05:39:33 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1343 -- # local asan_lib= 00:11:18.947 05:39:33 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:11:18.947 05:39:33 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:11:18.947 05:39:33 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libasan 00:11:18.947 05:39:33 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:11:18.947 05:39:33 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:11:18.947 05:39:33 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:11:18.947 05:39:33 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in 
"${sanitizers[@]}" 00:11:18.947 05:39:33 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:11:18.947 05:39:33 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:11:18.947 05:39:33 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:11:19.205 05:39:33 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:11:19.205 05:39:33 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:11:19.205 05:39:33 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:11:19.205 05:39:33 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:11:19.463 job_Malloc0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:11:19.463 job_Malloc1p0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:11:19.463 job_Malloc1p1: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:11:19.463 job_Malloc2p0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:11:19.463 job_Malloc2p1: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:11:19.463 job_Malloc2p2: 
(g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:11:19.463 job_Malloc2p3: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:11:19.463 job_Malloc2p4: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:11:19.463 job_Malloc2p5: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:11:19.463 job_Malloc2p6: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:11:19.463 job_Malloc2p7: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:11:19.463 job_TestPT: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:11:19.463 job_raid0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:11:19.463 job_concat0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:11:19.463 job_raid1: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:11:19.463 job_AIO0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:11:19.463 fio-3.35 00:11:19.463 Starting 16 threads 00:11:31.656 00:11:31.656 job_Malloc0: (groupid=0, jobs=16): err= 0: pid=1112926: Fri Jul 26 05:39:45 2024 00:11:31.656 read: IOPS=89.2k, BW=349MiB/s (365MB/s)(3486MiB/10001msec) 00:11:31.656 slat (usec): min=2, max=876, avg=36.73, stdev=15.61 00:11:31.656 clat (usec): min=10, max=1427, avg=293.23, stdev=141.74 00:11:31.656 lat (usec): min=17, max=1466, avg=329.97, stdev=150.94 00:11:31.656 clat percentiles (usec): 00:11:31.656 | 50.000th=[ 285], 99.000th=[ 603], 99.900th=[ 709], 99.990th=[ 816], 00:11:31.656 | 99.999th=[ 
889] 00:11:31.656 write: IOPS=138k, BW=541MiB/s (567MB/s)(5342MiB/9882msec); 0 zone resets 00:11:31.656 slat (usec): min=7, max=1356, avg=49.96, stdev=16.43 00:11:31.656 clat (usec): min=7, max=4489, avg=347.73, stdev=165.70 00:11:31.656 lat (usec): min=19, max=4536, avg=397.68, stdev=174.82 00:11:31.656 clat percentiles (usec): 00:11:31.656 | 50.000th=[ 334], 99.000th=[ 783], 99.900th=[ 955], 99.990th=[ 1057], 00:11:31.656 | 99.999th=[ 1401] 00:11:31.657 bw ( KiB/s): min=447792, max=730663, per=99.11%, avg=548606.89, stdev=4628.57, samples=304 00:11:31.657 iops : min=111950, max=182663, avg=137151.63, stdev=1157.11, samples=304 00:11:31.657 lat (usec) : 10=0.01%, 20=0.01%, 50=0.80%, 100=5.41%, 250=29.93% 00:11:31.657 lat (usec) : 500=48.90%, 750=14.12%, 1000=0.81% 00:11:31.657 lat (msec) : 2=0.02%, 4=0.01%, 10=0.01% 00:11:31.657 cpu : usr=99.21%, sys=0.38%, ctx=693, majf=0, minf=2669 00:11:31.657 IO depths : 1=12.4%, 2=24.8%, 4=50.2%, 8=12.5%, 16=0.0%, 32=0.0%, >=64=0.0% 00:11:31.657 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:11:31.657 complete : 0=0.0%, 4=89.0%, 8=11.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:11:31.657 issued rwts: total=892375,1367497,0,0 short=0,0,0,0 dropped=0,0,0,0 00:11:31.657 latency : target=0, window=0, percentile=100.00%, depth=8 00:11:31.657 00:11:31.657 Run status group 0 (all jobs): 00:11:31.657 READ: bw=349MiB/s (365MB/s), 349MiB/s-349MiB/s (365MB/s-365MB/s), io=3486MiB (3655MB), run=10001-10001msec 00:11:31.657 WRITE: bw=541MiB/s (567MB/s), 541MiB/s-541MiB/s (567MB/s-567MB/s), io=5342MiB (5601MB), run=9882-9882msec 00:11:31.657 00:11:31.657 real 0m12.059s 00:11:31.657 user 2m45.507s 00:11:31.657 sys 0m1.497s 00:11:31.657 05:39:45 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:11:31.657 05:39:45 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@10 -- # set +x 00:11:31.657 ************************************ 00:11:31.657 
END TEST bdev_fio_rw_verify 00:11:31.657 ************************************ 00:11:31.657 05:39:45 blockdev_general.bdev_fio -- common/autotest_common.sh@1142 -- # return 0 00:11:31.657 05:39:45 blockdev_general.bdev_fio -- bdev/blockdev.sh@349 -- # rm -f 00:11:31.657 05:39:45 blockdev_general.bdev_fio -- bdev/blockdev.sh@350 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:11:31.657 05:39:45 blockdev_general.bdev_fio -- bdev/blockdev.sh@353 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio trim '' '' 00:11:31.657 05:39:45 blockdev_general.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:11:31.657 05:39:45 blockdev_general.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=trim 00:11:31.657 05:39:45 blockdev_general.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type= 00:11:31.657 05:39:45 blockdev_general.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:11:31.657 05:39:45 blockdev_general.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:11:31.657 05:39:45 blockdev_general.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:11:31.657 05:39:45 blockdev_general.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z trim ']' 00:11:31.657 05:39:45 blockdev_general.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:11:31.657 05:39:45 blockdev_general.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:11:31.657 05:39:45 blockdev_general.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:11:31.657 05:39:45 blockdev_general.bdev_fio -- common/autotest_common.sh@1313 -- # '[' trim == verify ']' 00:11:31.657 05:39:45 blockdev_general.bdev_fio -- common/autotest_common.sh@1328 -- # 
'[' trim == trim ']' 00:11:31.657 05:39:45 blockdev_general.bdev_fio -- common/autotest_common.sh@1329 -- # echo rw=trimwrite 00:11:31.657 05:39:45 blockdev_general.bdev_fio -- bdev/blockdev.sh@354 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:11:31.658 05:39:45 blockdev_general.bdev_fio -- bdev/blockdev.sh@354 -- # printf '%s\n' '{' ' "name": "Malloc0",' ' "aliases": [' ' "c8475de8-e34b-4b81-b660-8290134c4e77"' ' ],' ' "product_name": "Malloc disk",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "c8475de8-e34b-4b81-b660-8290134c4e77",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 20000,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {}' '}' '{' ' "name": "Malloc1p0",' ' "aliases": [' ' "87f592cb-cf24-5d45-a81a-491c1ec8c3bb"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "87f592cb-cf24-5d45-a81a-491c1ec8c3bb",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' 
"nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc1p1",' ' "aliases": [' ' "fac21681-e151-5569-9087-ab68dbd94c96"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "fac21681-e151-5569-9087-ab68dbd94c96",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p0",' ' "aliases": [' ' "c274856f-58c9-5bc4-a782-fb664d8cb0f6"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "c274856f-58c9-5bc4-a782-fb664d8cb0f6",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' 
"zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc2p1",' ' "aliases": [' ' "d94755ba-326b-5eec-b39f-d25eb163d59c"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "d94755ba-326b-5eec-b39f-d25eb163d59c",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 8192' ' }' ' }' '}' '{' ' "name": "Malloc2p2",' ' "aliases": [' ' "e7a89fb4-0564-58e2-bf85-7d5809fdd2b2"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "e7a89fb4-0564-58e2-bf85-7d5809fdd2b2",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' 
"zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 16384' ' }' ' }' '}' '{' ' "name": "Malloc2p3",' ' "aliases": [' ' "761a5cb1-f5d6-5263-a54a-6079f514564f"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "761a5cb1-f5d6-5263-a54a-6079f514564f",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 24576' ' }' ' }' '}' '{' ' "name": "Malloc2p4",' ' "aliases": [' ' "bb826987-bba1-546d-bb87-189a4ad5235d"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "bb826987-bba1-546d-bb87-189a4ad5235d",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": 
false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p5",' ' "aliases": [' ' "d3fa22eb-649a-522d-840e-1b7867c79a16"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "d3fa22eb-649a-522d-840e-1b7867c79a16",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 40960' ' }' ' }' '}' '{' ' "name": "Malloc2p6",' ' "aliases": [' ' "df551536-4f54-596a-a698-60552d6d1424"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "df551536-4f54-596a-a698-60552d6d1424",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' 
"compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 49152' ' }' ' }' '}' '{' ' "name": "Malloc2p7",' ' "aliases": [' ' "39deb65a-b77c-5eb8-8a0f-39e3c0e08532"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "39deb65a-b77c-5eb8-8a0f-39e3c0e08532",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 57344' ' }' ' }' '}' '{' ' "name": "TestPT",' ' "aliases": [' ' "2c103ac2-74ca-5737-913b-e1196f4f45aa"' ' ],' ' "product_name": "passthru",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "2c103ac2-74ca-5737-913b-e1196f4f45aa",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' 
"seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "passthru": {' ' "name": "TestPT",' ' "base_bdev_name": "Malloc3"' ' }' ' }' '}' '{' ' "name": "raid0",' ' "aliases": [' ' "118a9b42-bab2-4784-bba4-c933a9194ae7"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "118a9b42-bab2-4784-bba4-c933a9194ae7",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "118a9b42-bab2-4784-bba4-c933a9194ae7",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "raid0",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc4",' ' "uuid": "048ec738-d116-49ab-b221-6310c0a9a37f",' ' "is_configured": true,' ' "data_offset": 0,' ' 
"data_size": 65536' ' },' ' {' ' "name": "Malloc5",' ' "uuid": "c896e5f6-1868-43b8-81e8-ef121b01063b",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "concat0",' ' "aliases": [' ' "4da6e7d3-2fc7-45ae-92c1-d87cb5020bb0"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "4da6e7d3-2fc7-45ae-92c1-d87cb5020bb0",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "4da6e7d3-2fc7-45ae-92c1-d87cb5020bb0",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "concat",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc6",' ' "uuid": "648893a9-ec8a-482c-89dc-9db791def4bb",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc7",' ' "uuid": "e76582b2-4c5b-4f07-994e-f872ca8f536e",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' 
]' ' }' ' }' '}' '{' ' "name": "raid1",' ' "aliases": [' ' "6a8e405c-fba3-431c-8043-2b09bbc51be0"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "6a8e405c-fba3-431c-8043-2b09bbc51be0",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "flush": false,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "6a8e405c-fba3-431c-8043-2b09bbc51be0",' ' "strip_size_kb": 0,' ' "state": "online",' ' "raid_level": "raid1",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc8",' ' "uuid": "f7508fd8-ce27-48fb-8445-3355eb0fb4a2",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc9",' ' "uuid": "d4831adb-d523-467f-813d-05619e0a9b46",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "AIO0",' ' "aliases": [' ' "7781c77e-3ca2-46bf-819d-9341c4c8a87b"' ' ],' ' "product_name": "AIO disk",' ' "block_size": 2048,' ' "num_blocks": 5000,' ' 
"uuid": "7781c77e-3ca2-46bf-819d-9341c4c8a87b",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "aio": {' ' "filename": "/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile",' ' "block_size_override": true,' ' "readonly": false,' ' "fallocate": false' ' }' ' }' '}' 00:11:31.658 05:39:45 blockdev_general.bdev_fio -- bdev/blockdev.sh@354 -- # [[ -n Malloc0 00:11:31.658 Malloc1p0 00:11:31.658 Malloc1p1 00:11:31.658 Malloc2p0 00:11:31.658 Malloc2p1 00:11:31.658 Malloc2p2 00:11:31.658 Malloc2p3 00:11:31.658 Malloc2p4 00:11:31.658 Malloc2p5 00:11:31.658 Malloc2p6 00:11:31.658 Malloc2p7 00:11:31.658 TestPT 00:11:31.658 raid0 00:11:31.658 concat0 ]] 00:11:31.658 05:39:45 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:11:31.660 05:39:45 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # printf '%s\n' '{' ' "name": "Malloc0",' ' "aliases": [' ' "c8475de8-e34b-4b81-b660-8290134c4e77"' ' ],' ' "product_name": "Malloc disk",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "c8475de8-e34b-4b81-b660-8290134c4e77",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 20000,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": 
true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {}' '}' '{' ' "name": "Malloc1p0",' ' "aliases": [' ' "87f592cb-cf24-5d45-a81a-491c1ec8c3bb"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "87f592cb-cf24-5d45-a81a-491c1ec8c3bb",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc1p1",' ' "aliases": [' ' "fac21681-e151-5569-9087-ab68dbd94c96"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "fac21681-e151-5569-9087-ab68dbd94c96",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' 
' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p0",' ' "aliases": [' ' "c274856f-58c9-5bc4-a782-fb664d8cb0f6"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "c274856f-58c9-5bc4-a782-fb664d8cb0f6",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc2p1",' ' "aliases": [' ' "d94755ba-326b-5eec-b39f-d25eb163d59c"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "d94755ba-326b-5eec-b39f-d25eb163d59c",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' 
' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 8192' ' }' ' }' '}' '{' ' "name": "Malloc2p2",' ' "aliases": [' ' "e7a89fb4-0564-58e2-bf85-7d5809fdd2b2"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "e7a89fb4-0564-58e2-bf85-7d5809fdd2b2",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 16384' ' }' ' }' '}' '{' ' "name": "Malloc2p3",' ' "aliases": [' ' "761a5cb1-f5d6-5263-a54a-6079f514564f"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "761a5cb1-f5d6-5263-a54a-6079f514564f",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' 
"nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 24576' ' }' ' }' '}' '{' ' "name": "Malloc2p4",' ' "aliases": [' ' "bb826987-bba1-546d-bb87-189a4ad5235d"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "bb826987-bba1-546d-bb87-189a4ad5235d",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p5",' ' "aliases": [' ' "d3fa22eb-649a-522d-840e-1b7867c79a16"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "d3fa22eb-649a-522d-840e-1b7867c79a16",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' 
"nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 40960' ' }' ' }' '}' '{' ' "name": "Malloc2p6",' ' "aliases": [' ' "df551536-4f54-596a-a698-60552d6d1424"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "df551536-4f54-596a-a698-60552d6d1424",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 49152' ' }' ' }' '}' '{' ' "name": "Malloc2p7",' ' "aliases": [' ' "39deb65a-b77c-5eb8-8a0f-39e3c0e08532"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "39deb65a-b77c-5eb8-8a0f-39e3c0e08532",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' 
"zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 57344' ' }' ' }' '}' '{' ' "name": "TestPT",' ' "aliases": [' ' "2c103ac2-74ca-5737-913b-e1196f4f45aa"' ' ],' ' "product_name": "passthru",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "2c103ac2-74ca-5737-913b-e1196f4f45aa",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "passthru": {' ' "name": "TestPT",' ' "base_bdev_name": "Malloc3"' ' }' ' }' '}' '{' ' "name": "raid0",' ' "aliases": [' ' "118a9b42-bab2-4784-bba4-c933a9194ae7"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "118a9b42-bab2-4784-bba4-c933a9194ae7",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": 
true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "118a9b42-bab2-4784-bba4-c933a9194ae7",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "raid0",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc4",' ' "uuid": "048ec738-d116-49ab-b221-6310c0a9a37f",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc5",' ' "uuid": "c896e5f6-1868-43b8-81e8-ef121b01063b",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "concat0",' ' "aliases": [' ' "4da6e7d3-2fc7-45ae-92c1-d87cb5020bb0"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "4da6e7d3-2fc7-45ae-92c1-d87cb5020bb0",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' 
"zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "4da6e7d3-2fc7-45ae-92c1-d87cb5020bb0",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "concat",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc6",' ' "uuid": "648893a9-ec8a-482c-89dc-9db791def4bb",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc7",' ' "uuid": "e76582b2-4c5b-4f07-994e-f872ca8f536e",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "raid1",' ' "aliases": [' ' "6a8e405c-fba3-431c-8043-2b09bbc51be0"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "6a8e405c-fba3-431c-8043-2b09bbc51be0",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "flush": false,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": 
false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "6a8e405c-fba3-431c-8043-2b09bbc51be0",' ' "strip_size_kb": 0,' ' "state": "online",' ' "raid_level": "raid1",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc8",' ' "uuid": "f7508fd8-ce27-48fb-8445-3355eb0fb4a2",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc9",' ' "uuid": "d4831adb-d523-467f-813d-05619e0a9b46",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "AIO0",' ' "aliases": [' ' "7781c77e-3ca2-46bf-819d-9341c4c8a87b"' ' ],' ' "product_name": "AIO disk",' ' "block_size": 2048,' ' "num_blocks": 5000,' ' "uuid": "7781c77e-3ca2-46bf-819d-9341c4c8a87b",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "aio": {' ' "filename": "/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile",' ' "block_size_override": 
true,' ' "readonly": false,' ' "fallocate": false' ' }' ' }' '}' 00:11:31.660 05:39:45 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:11:31.660 05:39:45 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_Malloc0]' 00:11:31.660 05:39:45 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=Malloc0 00:11:31.660 05:39:45 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:11:31.660 05:39:45 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_Malloc1p0]' 00:11:31.660 05:39:45 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=Malloc1p0 00:11:31.660 05:39:45 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:11:31.660 05:39:45 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_Malloc1p1]' 00:11:31.660 05:39:45 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=Malloc1p1 00:11:31.660 05:39:45 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:11:31.660 05:39:45 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_Malloc2p0]' 00:11:31.660 05:39:45 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=Malloc2p0 00:11:31.660 05:39:45 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:11:31.660 05:39:45 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_Malloc2p1]' 00:11:31.660 05:39:45 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=Malloc2p1 00:11:31.660 05:39:45 blockdev_general.bdev_fio 
-- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:11:31.660 05:39:45 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_Malloc2p2]' 00:11:31.660 05:39:45 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=Malloc2p2 00:11:31.660 05:39:45 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:11:31.660 05:39:45 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_Malloc2p3]' 00:11:31.660 05:39:45 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=Malloc2p3 00:11:31.660 05:39:45 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:11:31.660 05:39:45 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_Malloc2p4]' 00:11:31.660 05:39:45 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=Malloc2p4 00:11:31.660 05:39:45 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:11:31.660 05:39:45 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_Malloc2p5]' 00:11:31.660 05:39:45 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=Malloc2p5 00:11:31.660 05:39:45 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:11:31.660 05:39:45 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_Malloc2p6]' 00:11:31.660 05:39:45 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=Malloc2p6 00:11:31.660 05:39:45 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap 
== true) | .name') 00:11:31.660 05:39:45 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_Malloc2p7]' 00:11:31.660 05:39:45 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=Malloc2p7 00:11:31.660 05:39:45 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:11:31.660 05:39:45 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_TestPT]' 00:11:31.660 05:39:45 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=TestPT 00:11:31.660 05:39:45 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:11:31.660 05:39:45 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_raid0]' 00:11:31.660 05:39:45 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=raid0 00:11:31.660 05:39:45 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:11:31.660 05:39:45 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_concat0]' 00:11:31.660 05:39:45 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=concat0 00:11:31.660 05:39:45 blockdev_general.bdev_fio -- bdev/blockdev.sh@366 -- # run_test bdev_fio_trim fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:11:31.660 05:39:45 blockdev_general.bdev_fio -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:11:31.660 05:39:45 blockdev_general.bdev_fio -- common/autotest_common.sh@1105 -- # xtrace_disable 00:11:31.660 05:39:45 
blockdev_general.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:11:31.660 ************************************ 00:11:31.660 START TEST bdev_fio_trim 00:11:31.660 ************************************ 00:11:31.660 05:39:46 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1123 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:11:31.660 05:39:46 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:11:31.660 05:39:46 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:11:31.660 05:39:46 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:11:31.660 05:39:46 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # local sanitizers 00:11:31.660 05:39:46 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:11:31.660 05:39:46 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # shift 00:11:31.660 05:39:46 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1343 -- # local asan_lib= 00:11:31.660 05:39:46 blockdev_general.bdev_fio.bdev_fio_trim -- 
common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:11:31.660 05:39:46 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:11:31.660 05:39:46 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libasan 00:11:31.660 05:39:46 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:11:31.660 05:39:46 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:11:31.660 05:39:46 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:11:31.660 05:39:46 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:11:31.660 05:39:46 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:11:31.660 05:39:46 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:11:31.660 05:39:46 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:11:31.660 05:39:46 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:11:31.660 05:39:46 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:11:31.660 05:39:46 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:11:31.660 05:39:46 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 
--spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:11:31.660 job_Malloc0: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:11:31.660 job_Malloc1p0: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:11:31.660 job_Malloc1p1: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:11:31.660 job_Malloc2p0: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:11:31.660 job_Malloc2p1: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:11:31.660 job_Malloc2p2: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:11:31.660 job_Malloc2p3: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:11:31.660 job_Malloc2p4: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:11:31.660 job_Malloc2p5: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:11:31.660 job_Malloc2p6: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:11:31.660 job_Malloc2p7: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:11:31.660 job_TestPT: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:11:31.660 job_raid0: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:11:31.660 job_concat0: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, 
ioengine=spdk_bdev, iodepth=8 00:11:31.660 fio-3.35 00:11:31.661 Starting 14 threads 00:11:43.914 00:11:43.914 job_Malloc0: (groupid=0, jobs=14): err= 0: pid=1114634: Fri Jul 26 05:39:57 2024 00:11:43.914 write: IOPS=122k, BW=477MiB/s (501MB/s)(4775MiB/10002msec); 0 zone resets 00:11:43.914 slat (usec): min=3, max=3830, avg=40.75, stdev=11.60 00:11:43.914 clat (usec): min=26, max=4233, avg=286.02, stdev=99.54 00:11:43.914 lat (usec): min=35, max=4275, avg=326.77, stdev=104.08 00:11:43.914 clat percentiles (usec): 00:11:43.914 | 50.000th=[ 277], 99.000th=[ 523], 99.900th=[ 619], 99.990th=[ 685], 00:11:43.914 | 99.999th=[ 1450] 00:11:43.914 bw ( KiB/s): min=445712, max=700619, per=100.00%, avg=490564.68, stdev=4126.46, samples=266 00:11:43.914 iops : min=111430, max=175152, avg=122640.89, stdev=1031.58, samples=266 00:11:43.914 trim: IOPS=122k, BW=477MiB/s (501MB/s)(4775MiB/10002msec); 0 zone resets 00:11:43.914 slat (usec): min=5, max=210, avg=27.67, stdev= 7.50 00:11:43.914 clat (usec): min=6, max=4275, avg=322.47, stdev=108.30 00:11:43.914 lat (usec): min=21, max=4302, avg=350.15, stdev=111.65 00:11:43.914 clat percentiles (usec): 00:11:43.914 | 50.000th=[ 314], 99.000th=[ 578], 99.900th=[ 685], 99.990th=[ 750], 00:11:43.914 | 99.999th=[ 979] 00:11:43.914 bw ( KiB/s): min=445720, max=700619, per=100.00%, avg=490565.11, stdev=4126.50, samples=266 00:11:43.914 iops : min=111430, max=175152, avg=122641.00, stdev=1031.60, samples=266 00:11:43.914 lat (usec) : 10=0.01%, 20=0.01%, 50=0.05%, 100=0.80%, 250=33.47% 00:11:43.914 lat (usec) : 500=62.24%, 750=3.43%, 1000=0.01% 00:11:43.914 lat (msec) : 2=0.01%, 4=0.01%, 10=0.01% 00:11:43.914 cpu : usr=99.56%, sys=0.00%, ctx=503, majf=0, minf=1031 00:11:43.914 IO depths : 1=12.5%, 2=25.0%, 4=50.0%, 8=12.5%, 16=0.0%, 32=0.0%, >=64=0.0% 00:11:43.914 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:11:43.914 complete : 0=0.0%, 4=88.9%, 8=11.1%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:11:43.914 issued 
rwts: total=0,1222523,1222527,0 short=0,0,0,0 dropped=0,0,0,0 00:11:43.914 latency : target=0, window=0, percentile=100.00%, depth=8 00:11:43.914 00:11:43.914 Run status group 0 (all jobs): 00:11:43.914 WRITE: bw=477MiB/s (501MB/s), 477MiB/s-477MiB/s (501MB/s-501MB/s), io=4775MiB (5007MB), run=10002-10002msec 00:11:43.914 TRIM: bw=477MiB/s (501MB/s), 477MiB/s-477MiB/s (501MB/s-501MB/s), io=4775MiB (5007MB), run=10002-10002msec 00:11:43.914 00:11:43.914 real 0m11.456s 00:11:43.914 user 2m25.635s 00:11:43.914 sys 0m0.751s 00:11:43.914 05:39:57 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1124 -- # xtrace_disable 00:11:43.914 05:39:57 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@10 -- # set +x 00:11:43.914 ************************************ 00:11:43.914 END TEST bdev_fio_trim 00:11:43.914 ************************************ 00:11:43.914 05:39:57 blockdev_general.bdev_fio -- common/autotest_common.sh@1142 -- # return 0 00:11:43.914 05:39:57 blockdev_general.bdev_fio -- bdev/blockdev.sh@367 -- # rm -f 00:11:43.914 05:39:57 blockdev_general.bdev_fio -- bdev/blockdev.sh@368 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:11:43.914 05:39:57 blockdev_general.bdev_fio -- bdev/blockdev.sh@369 -- # popd 00:11:43.914 /var/jenkins/workspace/crypto-phy-autotest/spdk 00:11:43.914 05:39:57 blockdev_general.bdev_fio -- bdev/blockdev.sh@370 -- # trap - SIGINT SIGTERM EXIT 00:11:43.914 00:11:43.914 real 0m23.868s 00:11:43.914 user 5m11.336s 00:11:43.914 sys 0m2.438s 00:11:43.914 05:39:57 blockdev_general.bdev_fio -- common/autotest_common.sh@1124 -- # xtrace_disable 00:11:43.914 05:39:57 blockdev_general.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:11:43.914 ************************************ 00:11:43.914 END TEST bdev_fio 00:11:43.914 ************************************ 00:11:43.914 05:39:57 blockdev_general -- common/autotest_common.sh@1142 -- # return 0 00:11:43.914 05:39:57 
blockdev_general -- bdev/blockdev.sh@774 -- # trap cleanup SIGINT SIGTERM EXIT 00:11:43.914 05:39:57 blockdev_general -- bdev/blockdev.sh@776 -- # run_test bdev_verify /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:11:43.914 05:39:57 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 16 -le 1 ']' 00:11:43.914 05:39:57 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:11:43.914 05:39:57 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:11:43.914 ************************************ 00:11:43.914 START TEST bdev_verify 00:11:43.914 ************************************ 00:11:43.914 05:39:57 blockdev_general.bdev_verify -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:11:43.914 [2024-07-26 05:39:57.667779] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 
00:11:43.914 [2024-07-26 05:39:57.667844] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1116079 ]
00:11:43.914 [2024-07-26 05:39:57.799300] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2
00:11:43.914 [2024-07-26 05:39:57.907467] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1
00:11:43.914 [2024-07-26 05:39:57.907473] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:11:43.914 [2024-07-26 05:39:58.074767] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3
00:11:43.914 [2024-07-26 05:39:58.074831] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3
00:11:43.914 [2024-07-26 05:39:58.074847] vbdev_passthru.c: 736:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:11:43.914 [2024-07-26 05:39:58.082777] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1
00:11:43.914 [2024-07-26 05:39:58.082807] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1
00:11:43.914 [2024-07-26 05:39:58.090790] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2
00:11:43.914 [2024-07-26 05:39:58.090817] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2
00:11:43.914 [2024-07-26 05:39:58.168226] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3
00:11:43.914 [2024-07-26 05:39:58.168280] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened
00:11:43.914 [2024-07-26 05:39:58.168300] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1a597e0
00:11:43.915 [2024-07-26 05:39:58.168312] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed
00:11:43.915 [2024-07-26 05:39:58.170024] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered
00:11:43.915 [2024-07-26 05:39:58.170059] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT
00:11:43.915 Running I/O for 5 seconds...
00:11:49.184 Latency(us)
(all jobs: workload verify, queue depth 128, IO size 4096; core mask and verification LBA start/length shown per row)
Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
Malloc0 (core 0x1, LBA 0x0/0x1000) : 5.13 1023.13 4.00 0.00 0.00 124814.28 548.51 289954.06
Malloc0 (core 0x2, LBA 0x1000/0x1000) : 5.11 1001.13 3.91 0.00 0.00 127554.58 573.44 461373.44
Malloc1p0 (core 0x1, LBA 0x0/0x800) : 5.13 523.83 2.05 0.00 0.00 242842.66 3647.22 277188.79
Malloc1p0 (core 0x2, LBA 0x800/0x800) : 5.12 525.37 2.05 0.00 0.00 242180.02 3704.21 260776.29
Malloc1p1 (core 0x1, LBA 0x0/0x800) : 5.13 523.61 2.05 0.00 0.00 242096.01 3590.23 269894.34
Malloc1p1 (core 0x2, LBA 0x800/0x800) : 5.12 525.15 2.05 0.00 0.00 241384.02 3618.73 257129.07
Malloc2p0 (core 0x1, LBA 0x0/0x200) : 5.14 523.39 2.04 0.00 0.00 241381.89 3462.01 266247.12
Malloc2p0 (core 0x2, LBA 0x200/0x200) : 5.12 524.91 2.05 0.00 0.00 240680.75 3490.50 251658.24
Malloc2p1 (core 0x1, LBA 0x0/0x200) : 5.14 523.18 2.04 0.00 0.00 240692.38 3533.25 262599.90
Malloc2p1 (core 0x2, LBA 0x200/0x200) : 5.29 532.65 2.08 0.00 0.00 236515.44 3504.75 248011.02
Malloc2p2 (core 0x1, LBA 0x0/0x200) : 5.14 522.96 2.04 0.00 0.00 239967.28 3462.01 258952.68
Malloc2p2 (core 0x2, LBA 0x200/0x200) : 5.29 532.35 2.08 0.00 0.00 235852.04 3490.50 244363.80
Malloc2p3 (core 0x1, LBA 0x0/0x200) : 5.29 532.71 2.08 0.00 0.00 234876.33 3476.26 253481.85
Malloc2p3 (core 0x2, LBA 0x200/0x200) : 5.29 532.05 2.08 0.00 0.00 235166.26 3533.25 238892.97
Malloc2p4 (core 0x1, LBA 0x0/0x200) : 5.29 532.41 2.08 0.00 0.00 234234.61 3419.27 251658.24
Malloc2p4 (core 0x2, LBA 0x200/0x200) : 5.30 531.76 2.08 0.00 0.00 234516.55 3419.27 235245.75
Malloc2p5 (core 0x1, LBA 0x0/0x200) : 5.29 532.12 2.08 0.00 0.00 233615.60 3462.01 249834.63
Malloc2p5 (core 0x2, LBA 0x200/0x200) : 5.30 531.48 2.08 0.00 0.00 233884.38 3462.01 233422.14
Malloc2p6 (core 0x1, LBA 0x0/0x200) : 5.29 531.83 2.08 0.00 0.00 233006.10 3533.25 246187.41
Malloc2p6 (core 0x2, LBA 0x200/0x200) : 5.30 531.05 2.07 0.00 0.00 233328.45 3533.25 231598.53
Malloc2p7 (core 0x1, LBA 0x0/0x200) : 5.30 531.55 2.08 0.00 0.00 232386.97 3504.75 238892.97
Malloc2p7 (core 0x2, LBA 0x200/0x200) : 5.31 530.69 2.07 0.00 0.00 232738.54 3490.50 226127.69
TestPT (core 0x1, LBA 0x0/0x1000) : 5.30 510.24 1.99 0.00 0.00 239979.19 23365.01 240716.58
TestPT (core 0x2, LBA 0x1000/0x1000) : 5.32 507.23 1.98 0.00 0.00 242139.94 9061.06 311837.38
raid0 (core 0x1, LBA 0x0/0x2000) : 5.30 531.01 2.07 0.00 0.00 230615.26 3547.49 208803.39
raid0 (core 0x2, LBA 0x2000/0x2000) : 5.31 530.19 2.07 0.00 0.00 231016.03 3590.23 193302.71
concat0 (core 0x1, LBA 0x0/0x2000) : 5.31 530.65 2.07 0.00 0.00 229990.46 3590.23 202420.76
concat0 (core 0x2, LBA 0x2000/0x2000) : 5.31 529.90 2.07 0.00 0.00 230352.27 3604.48 185096.46
raid1 (core 0x1, LBA 0x0/0x1000) : 5.31 530.25 2.07 0.00 0.00 229288.64 4245.59 194214.51
raid1 (core 0x2, LBA 0x1000/0x1000) : 5.32 529.70 2.07 0.00 0.00 229607.36 4302.58 184184.65
AIO0 (core 0x1, LBA 0x0/0x4e2) : 5.31 530.03 2.07 0.00 0.00 228595.24 1731.01 193302.71
AIO0 (core 0x2, LBA 0x4e2/0x4e2) : 5.32 529.55 2.07 0.00 0.00 228834.87 1752.38 191479.10
===================================================================================================================
Total : 17858.07 69.76 0.00 0.00 223212.92 548.51 461373.44
00:11:49.443 real 0m6.643s
00:11:49.443 user 0m12.251s
00:11:49.443 sys 0m0.406s
00:11:49.443 05:40:04 blockdev_general.bdev_verify -- common/autotest_common.sh@1124 -- # xtrace_disable
00:11:49.443 05:40:04 blockdev_general.bdev_verify -- common/autotest_common.sh@10 -- # set +x
00:11:49.443 ************************************
00:11:49.443 END TEST bdev_verify
00:11:49.443 ************************************
00:11:49.443 05:40:04 blockdev_general -- common/autotest_common.sh@1142 -- # return 0
00:11:49.443 05:40:04 blockdev_general -- bdev/blockdev.sh@777 -- # run_test bdev_verify_big_io /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 ''
00:11:49.443 05:40:04 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 16 -le 1 ']'
00:11:49.443 05:40:04 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable
00:11:49.443 05:40:04 blockdev_general -- common/autotest_common.sh@10 -- # set +x
00:11:49.443 ************************************
00:11:49.443 START TEST bdev_verify_big_io
00:11:49.443 ************************************
00:11:49.443 05:40:04 blockdev_general.bdev_verify_big_io -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 ''
00:11:49.702 [2024-07-26 05:40:04.394210] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization...
00:11:49.702 [2024-07-26 05:40:04.394272] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1116976 ]
00:11:49.702 [2024-07-26 05:40:04.521959] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2
00:11:49.961 [2024-07-26 05:40:04.625401] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1
00:11:49.961 [2024-07-26 05:40:04.625407] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:11:49.961 [2024-07-26 05:40:04.783272] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3
00:11:49.961 [2024-07-26 05:40:04.783330] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3
00:11:49.961 [2024-07-26 05:40:04.783344] vbdev_passthru.c: 736:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:11:49.961 [2024-07-26 05:40:04.791285] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1
00:11:49.961 [2024-07-26 05:40:04.791312] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1
00:11:49.961 [2024-07-26 05:40:04.799299] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2
00:11:49.961 [2024-07-26 05:40:04.799324] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2
00:11:50.220 [2024-07-26 05:40:04.876587] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3
00:11:50.220 [2024-07-26 05:40:04.876646] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened
00:11:50.220 [2024-07-26 05:40:04.876667] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1b5f7e0
00:11:50.220 [2024-07-26 05:40:04.876680] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed
00:11:50.220 [2024-07-26 05:40:04.878336] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered
00:11:50.220 [2024-07-26 05:40:04.878368] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT
00:11:50.220 [2024-07-26 05:40:05.082746] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p0 simultaneously (32). Queue depth is limited to 32
00:11:50.220 [2024-07-26 05:40:05.084148] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p0 simultaneously (32). Queue depth is limited to 32
00:11:50.220 [2024-07-26 05:40:05.086205] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p1 simultaneously (32). Queue depth is limited to 32
00:11:50.220 [2024-07-26 05:40:05.087697] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p1 simultaneously (32). Queue depth is limited to 32
00:11:50.220 [2024-07-26 05:40:05.089565] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p2 simultaneously (32). Queue depth is limited to 32
00:11:50.220 [2024-07-26 05:40:05.090623] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p2 simultaneously (32). Queue depth is limited to 32
00:11:50.220 [2024-07-26 05:40:05.092281] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p3 simultaneously (32). Queue depth is limited to 32
00:11:50.220 [2024-07-26 05:40:05.093927] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p3 simultaneously (32). Queue depth is limited to 32
00:11:50.220 [2024-07-26 05:40:05.095007] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p4 simultaneously (32). Queue depth is limited to 32
00:11:50.220 [2024-07-26 05:40:05.096629] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p4 simultaneously (32). Queue depth is limited to 32
00:11:50.220 [2024-07-26 05:40:05.097716] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p5 simultaneously (32). Queue depth is limited to 32
00:11:50.220 [2024-07-26 05:40:05.099273] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p5 simultaneously (32). Queue depth is limited to 32
00:11:50.220 [2024-07-26 05:40:05.100160] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p6 simultaneously (32). Queue depth is limited to 32
00:11:50.220 [2024-07-26 05:40:05.101563] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p6 simultaneously (32). Queue depth is limited to 32
00:11:50.220 [2024-07-26 05:40:05.102459] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p7 simultaneously (32). Queue depth is limited to 32
00:11:50.220 [2024-07-26 05:40:05.103868] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p7 simultaneously (32). Queue depth is limited to 32
00:11:50.479 [2024-07-26 05:40:05.127827] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev AIO0 simultaneously (78). Queue depth is limited to 78
00:11:50.479 [2024-07-26 05:40:05.129840] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev AIO0 simultaneously (78). Queue depth is limited to 78
00:11:50.479 Running I/O for 5 seconds...
00:11:58.590 Latency(us)
(all jobs: workload verify, IO size 65536; queue depth 128, except Malloc2p0-Malloc2p7 limited to 32 and AIO0 limited to 78; core mask and verification LBA start/length shown per row)
Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
Malloc0 (core 0x1, LBA 0x0/0x100) : 6.02 170.16 10.63 0.00 0.00 737149.96 876.19 1998677.04
Malloc0 (core 0x2, LBA 0x100/0x100) : 5.98 149.93 9.37 0.00 0.00 837690.64 901.12 2275865.82
Malloc1p0 (core 0x1, LBA 0x0/0x80) : 6.89 34.82 2.18 0.00 0.00 3303398.34 1481.68 5602131.26
Malloc1p0 (core 0x2, LBA 0x80/0x80) : 6.28 87.27 5.45 0.00 0.00 1353083.23 2407.74 2698943.44
Malloc1p1 (core 0x1, LBA 0x0/0x80) : 6.89 34.81 2.18 0.00 0.00 3192780.93 1481.68 5397886.89
Malloc1p1 (core 0x2, LBA 0x80/0x80) : 6.73 35.68 2.23 0.00 0.00 3138184.61 1517.30 5397886.89
Malloc2p0 (core 0x1, LBA 0x0/0x20) : 6.22 23.14 1.45 0.00 0.00 1212292.22 644.67 2159154.75
Malloc2p0 (core 0x2, LBA 0x20/0x20) : 6.20 23.23 1.45 0.00 0.00 1213882.16 633.99 1984088.15
Malloc2p1 (core 0x1, LBA 0x0/0x20) : 6.22 23.14 1.45 0.00 0.00 1201317.73 633.99 2129976.99
Malloc2p1 (core 0x2, LBA 0x20/0x20) : 6.20 23.22 1.45 0.00 0.00 1203262.76 651.80 1954910.39
Malloc2p2 (core 0x1, LBA 0x0/0x20) : 6.22 23.13 1.45 0.00 0.00 1189635.67 626.87 2100799.22
Malloc2p2 (core 0x2, LBA 0x20/0x20) : 6.20 23.22 1.45 0.00 0.00 1192387.61 655.36 1925732.62
Malloc2p3 (core 0x1, LBA 0x0/0x20) : 6.23 23.13 1.45 0.00 0.00 1179039.15 648.24 2071621.45
Malloc2p3 (core 0x2, LBA 0x20/0x20) : 6.28 25.47 1.59 0.00 0.00 1093110.13 648.24 1911143.74
Malloc2p4 (core 0x1, LBA 0x0/0x20) : 6.33 25.27 1.58 0.00 0.00 1081945.43 808.51 2042443.69
Malloc2p4 (core 0x2, LBA 0x20/0x20) : 6.28 25.47 1.59 0.00 0.00 1083806.60 648.24 1881965.97
Malloc2p5 (core 0x1, LBA 0x0/0x20) : 6.33 25.26 1.58 0.00 0.00 1071334.95 637.55 2027854.80
Malloc2p5 (core 0x2, LBA 0x20/0x20) : 6.28 25.46 1.59 0.00 0.00 1073851.76 826.32 1852788.20
Malloc2p6 (core 0x1, LBA 0x0/0x20) : 6.33 25.26 1.58 0.00 0.00 1061092.70 651.80 1998677.04
Malloc2p6 (core 0x2, LBA 0x20/0x20) : 6.29 25.46 1.59 0.00 0.00 1063708.04 658.92 1830904.88
Malloc2p7 (core 0x1, LBA 0x0/0x20) : 6.34 25.25 1.58 0.00 0.00 1050544.87 623.30 1969499.27
Malloc2p7 (core 0x2, LBA 0x20/0x20) : 6.29 25.45 1.59 0.00 0.00 1054292.09 666.05 1801727.11
TestPT (core 0x1, LBA 0x0/0x100) : 6.95 36.81 2.30 0.00 0.00 2738950.22 1495.93 4901864.85
TestPT (core 0x2, LBA 0x100/0x100) : 6.76 33.41 2.09 0.00 0.00 3055020.53 103489.89 3501332.03
raid0 (core 0x1, LBA 0x0/0x200) : 6.90 41.76 2.61 0.00 0.00 2363183.72 1609.91 4697620.48
raid0 (core 0x2, LBA 0x200/0x200) : 6.84 39.79 2.49 0.00 0.00 2484686.49 1609.91 4755976.01
concat0 (core 0x1, LBA 0x0/0x200) : 6.90 46.39 2.90 0.00 0.00 2107689.79 1595.66 4522553.88
concat0 (core 0x2, LBA 0x200/0x200) : 6.84 46.81 2.93 0.00 0.00 2086151.89 1617.03 4580909.41
raid1 (core 0x1, LBA 0x0/0x100) : 6.90 74.93 4.68 0.00 0.00 1262412.94 2023.07 4318309.51
raid1 (core 0x2, LBA 0x100/0x100) : 6.77 66.92 4.18 0.00 0.00 1432222.01 2051.56 4405842.81
AIO0 (core 0x1, LBA 0x0/0x4e) : 6.92 54.95 3.43 0.00 0.00 1019182.79 423.85 2888598.93
AIO0 (core 0x2, LBA 0x4e/0x4e) : 6.87 59.94 3.75 0.00 0.00 952524.06 744.40 2859421.16
===================================================================================================================
Total : 1404.93 87.81 0.00 0.00 1484605.85 423.85 5602131.26
00:11:58.590 real 0m8.274s
00:11:58.590 user 0m15.514s
00:11:58.590 sys 0m0.411s
00:11:58.590 05:40:12 blockdev_general.bdev_verify_big_io -- common/autotest_common.sh@1124 -- # xtrace_disable
00:11:58.590 05:40:12 blockdev_general.bdev_verify_big_io -- common/autotest_common.sh@10 -- # set +x
00:11:58.590 ************************************
00:11:58.590 END TEST bdev_verify_big_io
00:11:58.590 ************************************
00:11:58.590 05:40:12 blockdev_general -- common/autotest_common.sh@1142 -- # return 0
00:11:58.590 05:40:12 blockdev_general -- bdev/blockdev.sh@778 -- # run_test bdev_write_zeroes /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:11:58.591 05:40:12 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']'
00:11:58.591 05:40:12 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable
00:11:58.591 05:40:12 blockdev_general -- common/autotest_common.sh@10 -- # set +x
00:11:58.591 ************************************
00:11:58.591 START TEST bdev_write_zeroes
00:11:58.591 ************************************
00:11:58.591 05:40:12 blockdev_general.bdev_write_zeroes -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:11:58.591 [2024-07-26 05:40:12.743544] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization...
00:11:58.591 [2024-07-26 05:40:12.743604] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1118047 ]
00:11:58.591 [2024-07-26 05:40:12.869610] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:11:58.591 [2024-07-26 05:40:12.970409] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:11:58.591 [2024-07-26 05:40:13.128347] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3
00:11:58.591 [2024-07-26 05:40:13.128404] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3
00:11:58.591 [2024-07-26 05:40:13.128419] vbdev_passthru.c: 736:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:11:58.591 [2024-07-26 05:40:13.136352] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1
00:11:58.591 [2024-07-26 05:40:13.136383] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1
00:11:58.591 [2024-07-26 05:40:13.144364] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2
00:11:58.591 [2024-07-26 05:40:13.144390] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2
00:11:58.591 [2024-07-26 05:40:13.221609] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3
00:11:58.591 [2024-07-26 05:40:13.221705] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened
00:11:58.591 [2024-07-26 05:40:13.221726] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1128670
00:11:58.591 [2024-07-26 05:40:13.221739] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed
00:11:58.591 [2024-07-26 05:40:13.223340] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered
00:11:58.591 [2024-07-26 05:40:13.223371] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT
00:11:58.591 Running I/O for 1 seconds...
00:11:59.966 Latency(us)
(all jobs: workload write_zeroes, queue depth 128, IO size 4096, core mask 0x1)
Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
Malloc0 : 1.03 4957.02 19.36 0.00 0.00 25810.67 683.85 43310.75
Malloc1p0 : 1.03 4949.83 19.34 0.00 0.00 25802.36 933.18 42398.94
Malloc1p1 : 1.04 4942.67 19.31 0.00 0.00 25778.99 911.81 41487.14
Malloc2p0 : 1.04 4935.50 19.28 0.00 0.00 25759.38 901.12 40575.33
Malloc2p1 : 1.04 4928.35 19.25 0.00 0.00 25736.79 911.81 39663.53
Malloc2p2 : 1.04 4921.27 19.22 0.00 0.00 25715.48 911.81 38751.72
Malloc2p3 : 1.04 4914.14 19.20 0.00 0.00 25695.87 904.68 37839.92
Malloc2p4 : 1.04 4907.11 19.17 0.00 0.00 25675.18 911.81 36928.11
Malloc2p5 : 1.06 4949.77 19.34 0.00 0.00 25408.59 911.81 36244.26
Malloc2p6 : 1.06 4942.75 19.31 0.00 0.00 25387.29 904.68 35332.45
Malloc2p7 : 1.06 4935.78 19.28 0.00 0.00 25365.88 911.81 34420.65
TestPT : 1.06 4928.88 19.25 0.00 0.00 25341.56 954.55 33508.84
raid0 : 1.07 4920.90 19.22 0.00 0.00 25313.06 1631.28 31685.23
concat0 : 1.07 4913.08 19.19 0.00 0.00 25257.63 1624.15 30089.57
raid1 : 1.07 4903.28 19.15 0.00 0.00 25197.94 2578.70 27468.13
AIO0 : 1.07 4897.33 19.13 0.00 0.00 25105.79 1047.15 27126.21
===================================================================================================================
Total : 78847.65 308.00 0.00 0.00 25519.25 683.85 43310.75
00:12:00.225 real 0m2.268s
00:12:00.225 user 0m1.823s
00:12:00.225 sys 0m0.362s
00:12:00.225 05:40:14 blockdev_general.bdev_write_zeroes -- common/autotest_common.sh@1124 -- # xtrace_disable
00:12:00.225 05:40:14 blockdev_general.bdev_write_zeroes -- common/autotest_common.sh@10 -- # set +x
00:12:00.225 ************************************
00:12:00.225 END TEST bdev_write_zeroes
00:12:00.225 ************************************
00:12:00.225 05:40:14 blockdev_general -- common/autotest_common.sh@1142 -- # return 0
00:12:00.225 05:40:14 blockdev_general -- bdev/blockdev.sh@781 -- # run_test bdev_json_nonenclosed /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:12:00.225 05:40:14 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']'
00:12:00.225 05:40:14 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable
00:12:00.225 05:40:14 blockdev_general -- common/autotest_common.sh@10 -- # set +x
00:12:00.225 ************************************
00:12:00.225 START TEST bdev_json_nonenclosed
00:12:00.225 ************************************
00:12:00.225 05:40:15 blockdev_general.bdev_json_nonenclosed -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:12:00.225 [2024-07-26 05:40:15.087447] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization...
00:12:00.225 [2024-07-26 05:40:15.087507] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1118408 ]
00:12:00.483 [2024-07-26 05:40:15.214416] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:12:00.483 [2024-07-26 05:40:15.314981] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:12:00.483 [2024-07-26 05:40:15.315052] json_config.c: 608:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: not enclosed in {}.
00:12:00.483 [2024-07-26 05:40:15.315069] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address:
00:12:00.483 [2024-07-26 05:40:15.315082] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero
00:12:00.741 real 0m0.387s
00:12:00.741 user 0m0.228s
00:12:00.741 sys 0m0.156s
00:12:00.741 05:40:15 blockdev_general.bdev_json_nonenclosed -- common/autotest_common.sh@1123 -- # es=234
00:12:00.741 05:40:15 blockdev_general.bdev_json_nonenclosed -- common/autotest_common.sh@1124 -- # xtrace_disable
00:12:00.741 05:40:15 blockdev_general.bdev_json_nonenclosed -- common/autotest_common.sh@10 -- # set +x
00:12:00.741 ************************************
00:12:00.741 END TEST bdev_json_nonenclosed
00:12:00.741 ************************************
00:12:00.741 05:40:15 blockdev_general -- common/autotest_common.sh@1142 -- # return 234
00:12:00.741 05:40:15 blockdev_general -- bdev/blockdev.sh@781 -- # true
00:12:00.741 05:40:15 blockdev_general -- bdev/blockdev.sh@784 -- # run_test bdev_json_nonarray /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:12:00.741 05:40:15 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']'
00:12:00.741 05:40:15 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable
00:12:00.741 05:40:15 blockdev_general -- common/autotest_common.sh@10 -- # set +x
00:12:00.741 ************************************
00:12:00.741 START TEST bdev_json_nonarray
00:12:00.741 ************************************
00:12:00.741 05:40:15 blockdev_general.bdev_json_nonarray -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:12:00.741 [2024-07-26 05:40:15.526041] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization...
00:12:00.741 [2024-07-26 05:40:15.526086] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1118442 ]
00:12:00.741 [2024-07-26 05:40:15.635494] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:12:01.000 [2024-07-26 05:40:15.735064] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:12:01.000 [2024-07-26 05:40:15.735144] json_config.c: 614:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: 'subsystems' should be an array.
00:12:01.000 [2024-07-26 05:40:15.735162] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address:
00:12:01.000 [2024-07-26 05:40:15.735174] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero
00:12:01.000 real 0m0.348s
00:12:01.000 user 0m0.218s
00:12:01.000 sys 0m0.129s
00:12:01.000 05:40:15 blockdev_general.bdev_json_nonarray -- common/autotest_common.sh@1123 -- # es=234
00:12:01.000 05:40:15 blockdev_general.bdev_json_nonarray -- common/autotest_common.sh@1124 -- # xtrace_disable
00:12:01.000 05:40:15 blockdev_general.bdev_json_nonarray -- common/autotest_common.sh@10 -- # set +x
00:12:01.000 ************************************
00:12:01.000 END TEST bdev_json_nonarray
00:12:01.000 ************************************
00:12:01.000 05:40:15 blockdev_general -- common/autotest_common.sh@1142 -- # return 234
00:12:01.000 05:40:15 blockdev_general -- bdev/blockdev.sh@784 -- # true
00:12:01.000 05:40:15 blockdev_general -- bdev/blockdev.sh@786 -- # [[ bdev == bdev ]]
00:12:01.000 05:40:15 blockdev_general -- bdev/blockdev.sh@787 -- # run_test bdev_qos qos_test_suite ''
00:12:01.000 05:40:15 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']'
00:12:01.000 05:40:15 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable
00:12:01.000 05:40:15 blockdev_general -- common/autotest_common.sh@10 -- # set +x
00:12:01.258 ************************************
00:12:01.258 START TEST bdev_qos
00:12:01.258 ************************************
00:12:01.258 05:40:15 blockdev_general.bdev_qos -- common/autotest_common.sh@1123 -- # qos_test_suite ''
00:12:01.258 05:40:15 blockdev_general.bdev_qos -- bdev/blockdev.sh@445 -- # QOS_PID=1118495
00:12:01.258 05:40:15 blockdev_general.bdev_qos -- bdev/blockdev.sh@446 -- # echo 'Process qos testing pid: 1118495'
00:12:01.258 Process qos testing pid: 1118495
00:12:01.258 05:40:15 blockdev_general.bdev_qos -- bdev/blockdev.sh@447 -- # trap 'cleanup; killprocess $QOS_PID; exit 1' SIGINT SIGTERM EXIT
00:12:01.258 05:40:15 blockdev_general.bdev_qos -- bdev/blockdev.sh@448 -- # waitforlisten 1118495
00:12:01.258 05:40:15 blockdev_general.bdev_qos -- common/autotest_common.sh@829 -- # '[' -z 1118495 ']'
00:12:01.258 05:40:15 blockdev_general.bdev_qos -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock
00:12:01.258 05:40:15 blockdev_general.bdev_qos -- common/autotest_common.sh@834 -- # local max_retries=100
00:12:01.258 05:40:15 blockdev_general.bdev_qos -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...'
Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...
00:12:01.258 05:40:15 blockdev_general.bdev_qos -- bdev/blockdev.sh@444 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x2 -q 256 -o 4096 -w randread -t 60 '' 00:12:01.258 05:40:15 blockdev_general.bdev_qos -- common/autotest_common.sh@838 -- # xtrace_disable 00:12:01.258 05:40:15 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:12:01.258 [2024-07-26 05:40:15.978291] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 00:12:01.258 [2024-07-26 05:40:15.978356] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1118495 ] 00:12:01.258 [2024-07-26 05:40:16.099016] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:01.516 [2024-07-26 05:40:16.205391] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:12:02.083 05:40:16 blockdev_general.bdev_qos -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:12:02.083 05:40:16 blockdev_general.bdev_qos -- common/autotest_common.sh@862 -- # return 0 00:12:02.083 05:40:16 blockdev_general.bdev_qos -- bdev/blockdev.sh@450 -- # rpc_cmd bdev_malloc_create -b Malloc_0 128 512 00:12:02.083 05:40:16 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:12:02.083 05:40:16 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:12:02.083 Malloc_0 00:12:02.083 05:40:16 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:12:02.083 05:40:16 blockdev_general.bdev_qos -- bdev/blockdev.sh@451 -- # waitforbdev Malloc_0 00:12:02.083 05:40:16 blockdev_general.bdev_qos -- common/autotest_common.sh@897 -- # local bdev_name=Malloc_0 00:12:02.083 05:40:16 blockdev_general.bdev_qos -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:12:02.083 05:40:16 
blockdev_general.bdev_qos -- common/autotest_common.sh@899 -- # local i 00:12:02.083 05:40:16 blockdev_general.bdev_qos -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:12:02.083 05:40:16 blockdev_general.bdev_qos -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:12:02.083 05:40:16 blockdev_general.bdev_qos -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine 00:12:02.083 05:40:16 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:12:02.083 05:40:16 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:12:02.083 05:40:16 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:12:02.083 05:40:16 blockdev_general.bdev_qos -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b Malloc_0 -t 2000 00:12:02.083 05:40:16 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:12:02.083 05:40:16 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:12:02.083 [ 00:12:02.083 { 00:12:02.083 "name": "Malloc_0", 00:12:02.083 "aliases": [ 00:12:02.083 "3d3f453c-12ab-4a2c-a5e4-df87abf0746b" 00:12:02.083 ], 00:12:02.083 "product_name": "Malloc disk", 00:12:02.083 "block_size": 512, 00:12:02.083 "num_blocks": 262144, 00:12:02.083 "uuid": "3d3f453c-12ab-4a2c-a5e4-df87abf0746b", 00:12:02.083 "assigned_rate_limits": { 00:12:02.083 "rw_ios_per_sec": 0, 00:12:02.083 "rw_mbytes_per_sec": 0, 00:12:02.083 "r_mbytes_per_sec": 0, 00:12:02.083 "w_mbytes_per_sec": 0 00:12:02.083 }, 00:12:02.083 "claimed": false, 00:12:02.083 "zoned": false, 00:12:02.083 "supported_io_types": { 00:12:02.083 "read": true, 00:12:02.083 "write": true, 00:12:02.083 "unmap": true, 00:12:02.083 "flush": true, 00:12:02.083 "reset": true, 00:12:02.083 "nvme_admin": false, 00:12:02.083 "nvme_io": false, 00:12:02.083 "nvme_io_md": false, 00:12:02.083 "write_zeroes": true, 00:12:02.083 "zcopy": true, 00:12:02.083 "get_zone_info": false, 00:12:02.083 "zone_management": 
false, 00:12:02.083 "zone_append": false, 00:12:02.083 "compare": false, 00:12:02.083 "compare_and_write": false, 00:12:02.083 "abort": true, 00:12:02.083 "seek_hole": false, 00:12:02.083 "seek_data": false, 00:12:02.083 "copy": true, 00:12:02.083 "nvme_iov_md": false 00:12:02.083 }, 00:12:02.083 "memory_domains": [ 00:12:02.083 { 00:12:02.083 "dma_device_id": "system", 00:12:02.083 "dma_device_type": 1 00:12:02.083 }, 00:12:02.083 { 00:12:02.083 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:02.083 "dma_device_type": 2 00:12:02.083 } 00:12:02.083 ], 00:12:02.083 "driver_specific": {} 00:12:02.083 } 00:12:02.083 ] 00:12:02.083 05:40:16 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:12:02.083 05:40:16 blockdev_general.bdev_qos -- common/autotest_common.sh@905 -- # return 0 00:12:02.083 05:40:16 blockdev_general.bdev_qos -- bdev/blockdev.sh@452 -- # rpc_cmd bdev_null_create Null_1 128 512 00:12:02.083 05:40:16 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:12:02.083 05:40:16 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:12:02.083 Null_1 00:12:02.083 05:40:16 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:12:02.083 05:40:16 blockdev_general.bdev_qos -- bdev/blockdev.sh@453 -- # waitforbdev Null_1 00:12:02.083 05:40:16 blockdev_general.bdev_qos -- common/autotest_common.sh@897 -- # local bdev_name=Null_1 00:12:02.083 05:40:16 blockdev_general.bdev_qos -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:12:02.083 05:40:16 blockdev_general.bdev_qos -- common/autotest_common.sh@899 -- # local i 00:12:02.083 05:40:16 blockdev_general.bdev_qos -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:12:02.083 05:40:16 blockdev_general.bdev_qos -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:12:02.083 05:40:16 blockdev_general.bdev_qos -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine 00:12:02.083 05:40:16 
blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:12:02.083 05:40:16 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:12:02.083 05:40:16 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:12:02.083 05:40:16 blockdev_general.bdev_qos -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b Null_1 -t 2000 00:12:02.083 05:40:16 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:12:02.083 05:40:16 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:12:02.341 [ 00:12:02.341 { 00:12:02.341 "name": "Null_1", 00:12:02.341 "aliases": [ 00:12:02.341 "e6904c05-73fb-44ca-9933-a4fde16830bf" 00:12:02.341 ], 00:12:02.341 "product_name": "Null disk", 00:12:02.341 "block_size": 512, 00:12:02.341 "num_blocks": 262144, 00:12:02.341 "uuid": "e6904c05-73fb-44ca-9933-a4fde16830bf", 00:12:02.341 "assigned_rate_limits": { 00:12:02.341 "rw_ios_per_sec": 0, 00:12:02.341 "rw_mbytes_per_sec": 0, 00:12:02.341 "r_mbytes_per_sec": 0, 00:12:02.341 "w_mbytes_per_sec": 0 00:12:02.341 }, 00:12:02.341 "claimed": false, 00:12:02.341 "zoned": false, 00:12:02.341 "supported_io_types": { 00:12:02.341 "read": true, 00:12:02.341 "write": true, 00:12:02.341 "unmap": false, 00:12:02.341 "flush": false, 00:12:02.341 "reset": true, 00:12:02.341 "nvme_admin": false, 00:12:02.341 "nvme_io": false, 00:12:02.341 "nvme_io_md": false, 00:12:02.341 "write_zeroes": true, 00:12:02.341 "zcopy": false, 00:12:02.341 "get_zone_info": false, 00:12:02.341 "zone_management": false, 00:12:02.341 "zone_append": false, 00:12:02.341 "compare": false, 00:12:02.341 "compare_and_write": false, 00:12:02.341 "abort": true, 00:12:02.341 "seek_hole": false, 00:12:02.341 "seek_data": false, 00:12:02.341 "copy": false, 00:12:02.341 "nvme_iov_md": false 00:12:02.341 }, 00:12:02.341 "driver_specific": {} 00:12:02.341 } 00:12:02.341 ] 00:12:02.341 05:40:17 blockdev_general.bdev_qos -- 
common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:12:02.341 05:40:17 blockdev_general.bdev_qos -- common/autotest_common.sh@905 -- # return 0 00:12:02.341 05:40:17 blockdev_general.bdev_qos -- bdev/blockdev.sh@456 -- # qos_function_test 00:12:02.341 05:40:17 blockdev_general.bdev_qos -- bdev/blockdev.sh@409 -- # local qos_lower_iops_limit=1000 00:12:02.341 05:40:17 blockdev_general.bdev_qos -- bdev/blockdev.sh@455 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:12:02.341 05:40:17 blockdev_general.bdev_qos -- bdev/blockdev.sh@410 -- # local qos_lower_bw_limit=2 00:12:02.341 05:40:17 blockdev_general.bdev_qos -- bdev/blockdev.sh@411 -- # local io_result=0 00:12:02.341 05:40:17 blockdev_general.bdev_qos -- bdev/blockdev.sh@412 -- # local iops_limit=0 00:12:02.341 05:40:17 blockdev_general.bdev_qos -- bdev/blockdev.sh@413 -- # local bw_limit=0 00:12:02.341 05:40:17 blockdev_general.bdev_qos -- bdev/blockdev.sh@415 -- # get_io_result IOPS Malloc_0 00:12:02.341 05:40:17 blockdev_general.bdev_qos -- bdev/blockdev.sh@374 -- # local limit_type=IOPS 00:12:02.341 05:40:17 blockdev_general.bdev_qos -- bdev/blockdev.sh@375 -- # local qos_dev=Malloc_0 00:12:02.341 05:40:17 blockdev_general.bdev_qos -- bdev/blockdev.sh@376 -- # local iostat_result 00:12:02.341 05:40:17 blockdev_general.bdev_qos -- bdev/blockdev.sh@377 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/iostat.py -d -i 1 -t 5 00:12:02.341 05:40:17 blockdev_general.bdev_qos -- bdev/blockdev.sh@377 -- # grep Malloc_0 00:12:02.341 05:40:17 blockdev_general.bdev_qos -- bdev/blockdev.sh@377 -- # tail -1 00:12:02.341 Running I/O for 60 seconds... 
00:12:07.651 05:40:22 blockdev_general.bdev_qos -- bdev/blockdev.sh@377 -- # iostat_result='Malloc_0 62715.34 250861.37 0.00 0.00 251904.00 0.00 0.00 ' 00:12:07.651 05:40:22 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # '[' IOPS = IOPS ']' 00:12:07.651 05:40:22 blockdev_general.bdev_qos -- bdev/blockdev.sh@379 -- # awk '{print $2}' 00:12:07.651 05:40:22 blockdev_general.bdev_qos -- bdev/blockdev.sh@379 -- # iostat_result=62715.34 00:12:07.651 05:40:22 blockdev_general.bdev_qos -- bdev/blockdev.sh@384 -- # echo 62715 00:12:07.651 05:40:22 blockdev_general.bdev_qos -- bdev/blockdev.sh@415 -- # io_result=62715 00:12:07.651 05:40:22 blockdev_general.bdev_qos -- bdev/blockdev.sh@417 -- # iops_limit=15000 00:12:07.651 05:40:22 blockdev_general.bdev_qos -- bdev/blockdev.sh@418 -- # '[' 15000 -gt 1000 ']' 00:12:07.651 05:40:22 blockdev_general.bdev_qos -- bdev/blockdev.sh@421 -- # rpc_cmd bdev_set_qos_limit --rw_ios_per_sec 15000 Malloc_0 00:12:07.651 05:40:22 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:12:07.651 05:40:22 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:12:07.651 05:40:22 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:12:07.651 05:40:22 blockdev_general.bdev_qos -- bdev/blockdev.sh@422 -- # run_test bdev_qos_iops run_qos_test 15000 IOPS Malloc_0 00:12:07.651 05:40:22 blockdev_general.bdev_qos -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:12:07.652 05:40:22 blockdev_general.bdev_qos -- common/autotest_common.sh@1105 -- # xtrace_disable 00:12:07.652 05:40:22 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:12:07.652 ************************************ 00:12:07.652 START TEST bdev_qos_iops 00:12:07.652 ************************************ 00:12:07.652 05:40:22 blockdev_general.bdev_qos.bdev_qos_iops -- common/autotest_common.sh@1123 -- # run_qos_test 15000 IOPS Malloc_0 00:12:07.652 05:40:22 
blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@388 -- # local qos_limit=15000 00:12:07.652 05:40:22 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@389 -- # local qos_result=0 00:12:07.652 05:40:22 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@391 -- # get_io_result IOPS Malloc_0 00:12:07.652 05:40:22 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@374 -- # local limit_type=IOPS 00:12:07.652 05:40:22 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@375 -- # local qos_dev=Malloc_0 00:12:07.652 05:40:22 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@376 -- # local iostat_result 00:12:07.652 05:40:22 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@377 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/iostat.py -d -i 1 -t 5 00:12:07.652 05:40:22 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@377 -- # grep Malloc_0 00:12:07.652 05:40:22 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@377 -- # tail -1 00:12:12.911 05:40:27 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@377 -- # iostat_result='Malloc_0 15001.72 60006.88 0.00 0.00 60900.00 0.00 0.00 ' 00:12:12.911 05:40:27 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@378 -- # '[' IOPS = IOPS ']' 00:12:12.911 05:40:27 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@379 -- # awk '{print $2}' 00:12:12.911 05:40:27 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@379 -- # iostat_result=15001.72 00:12:12.911 05:40:27 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@384 -- # echo 15001 00:12:12.911 05:40:27 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@391 -- # qos_result=15001 00:12:12.911 05:40:27 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@392 -- # '[' IOPS = BANDWIDTH ']' 00:12:12.911 05:40:27 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@395 -- # lower_limit=13500 00:12:12.911 05:40:27 
blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@396 -- # upper_limit=16500 00:12:12.911 05:40:27 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@399 -- # '[' 15001 -lt 13500 ']' 00:12:12.911 05:40:27 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@399 -- # '[' 15001 -gt 16500 ']' 00:12:12.911 00:12:12.911 real 0m5.236s 00:12:12.911 user 0m0.119s 00:12:12.911 sys 0m0.043s 00:12:12.911 05:40:27 blockdev_general.bdev_qos.bdev_qos_iops -- common/autotest_common.sh@1124 -- # xtrace_disable 00:12:12.911 05:40:27 blockdev_general.bdev_qos.bdev_qos_iops -- common/autotest_common.sh@10 -- # set +x 00:12:12.911 ************************************ 00:12:12.911 END TEST bdev_qos_iops 00:12:12.911 ************************************ 00:12:12.911 05:40:27 blockdev_general.bdev_qos -- common/autotest_common.sh@1142 -- # return 0 00:12:12.911 05:40:27 blockdev_general.bdev_qos -- bdev/blockdev.sh@426 -- # get_io_result BANDWIDTH Null_1 00:12:12.911 05:40:27 blockdev_general.bdev_qos -- bdev/blockdev.sh@374 -- # local limit_type=BANDWIDTH 00:12:12.911 05:40:27 blockdev_general.bdev_qos -- bdev/blockdev.sh@375 -- # local qos_dev=Null_1 00:12:12.911 05:40:27 blockdev_general.bdev_qos -- bdev/blockdev.sh@376 -- # local iostat_result 00:12:12.911 05:40:27 blockdev_general.bdev_qos -- bdev/blockdev.sh@377 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/iostat.py -d -i 1 -t 5 00:12:12.911 05:40:27 blockdev_general.bdev_qos -- bdev/blockdev.sh@377 -- # grep Null_1 00:12:12.911 05:40:27 blockdev_general.bdev_qos -- bdev/blockdev.sh@377 -- # tail -1 00:12:18.174 05:40:32 blockdev_general.bdev_qos -- bdev/blockdev.sh@377 -- # iostat_result='Null_1 20408.56 81634.24 0.00 0.00 82944.00 0.00 0.00 ' 00:12:18.174 05:40:32 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # '[' BANDWIDTH = IOPS ']' 00:12:18.174 05:40:32 blockdev_general.bdev_qos -- bdev/blockdev.sh@380 -- # '[' BANDWIDTH = BANDWIDTH ']' 00:12:18.174 05:40:32 
blockdev_general.bdev_qos -- bdev/blockdev.sh@381 -- # awk '{print $6}' 00:12:18.174 05:40:32 blockdev_general.bdev_qos -- bdev/blockdev.sh@381 -- # iostat_result=82944.00 00:12:18.174 05:40:32 blockdev_general.bdev_qos -- bdev/blockdev.sh@384 -- # echo 82944 00:12:18.174 05:40:32 blockdev_general.bdev_qos -- bdev/blockdev.sh@426 -- # bw_limit=82944 00:12:18.174 05:40:32 blockdev_general.bdev_qos -- bdev/blockdev.sh@427 -- # bw_limit=8 00:12:18.174 05:40:32 blockdev_general.bdev_qos -- bdev/blockdev.sh@428 -- # '[' 8 -lt 2 ']' 00:12:18.174 05:40:32 blockdev_general.bdev_qos -- bdev/blockdev.sh@431 -- # rpc_cmd bdev_set_qos_limit --rw_mbytes_per_sec 8 Null_1 00:12:18.174 05:40:32 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:12:18.174 05:40:32 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:12:18.174 05:40:32 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:12:18.174 05:40:32 blockdev_general.bdev_qos -- bdev/blockdev.sh@432 -- # run_test bdev_qos_bw run_qos_test 8 BANDWIDTH Null_1 00:12:18.174 05:40:32 blockdev_general.bdev_qos -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:12:18.174 05:40:32 blockdev_general.bdev_qos -- common/autotest_common.sh@1105 -- # xtrace_disable 00:12:18.174 05:40:32 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:12:18.174 ************************************ 00:12:18.174 START TEST bdev_qos_bw 00:12:18.174 ************************************ 00:12:18.174 05:40:32 blockdev_general.bdev_qos.bdev_qos_bw -- common/autotest_common.sh@1123 -- # run_qos_test 8 BANDWIDTH Null_1 00:12:18.174 05:40:32 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@388 -- # local qos_limit=8 00:12:18.174 05:40:32 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@389 -- # local qos_result=0 00:12:18.174 05:40:32 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@391 -- # get_io_result BANDWIDTH Null_1 00:12:18.174 
05:40:32 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@374 -- # local limit_type=BANDWIDTH 00:12:18.174 05:40:32 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@375 -- # local qos_dev=Null_1 00:12:18.174 05:40:32 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@376 -- # local iostat_result 00:12:18.174 05:40:32 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@377 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/iostat.py -d -i 1 -t 5 00:12:18.174 05:40:32 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@377 -- # grep Null_1 00:12:18.174 05:40:32 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@377 -- # tail -1 00:12:23.439 05:40:38 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@377 -- # iostat_result='Null_1 2048.77 8195.10 0.00 0.00 8332.00 0.00 0.00 ' 00:12:23.439 05:40:38 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@378 -- # '[' BANDWIDTH = IOPS ']' 00:12:23.439 05:40:38 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@380 -- # '[' BANDWIDTH = BANDWIDTH ']' 00:12:23.439 05:40:38 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@381 -- # awk '{print $6}' 00:12:23.439 05:40:38 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@381 -- # iostat_result=8332.00 00:12:23.439 05:40:38 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@384 -- # echo 8332 00:12:23.439 05:40:38 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@391 -- # qos_result=8332 00:12:23.439 05:40:38 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@392 -- # '[' BANDWIDTH = BANDWIDTH ']' 00:12:23.439 05:40:38 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@393 -- # qos_limit=8192 00:12:23.439 05:40:38 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@395 -- # lower_limit=7372 00:12:23.439 05:40:38 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@396 -- # upper_limit=9011 00:12:23.439 05:40:38 blockdev_general.bdev_qos.bdev_qos_bw -- 
bdev/blockdev.sh@399 -- # '[' 8332 -lt 7372 ']' 00:12:23.439 05:40:38 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@399 -- # '[' 8332 -gt 9011 ']' 00:12:23.439 00:12:23.439 real 0m5.359s 00:12:23.439 user 0m0.206s 00:12:23.439 sys 0m0.068s 00:12:23.439 05:40:38 blockdev_general.bdev_qos.bdev_qos_bw -- common/autotest_common.sh@1124 -- # xtrace_disable 00:12:23.439 05:40:38 blockdev_general.bdev_qos.bdev_qos_bw -- common/autotest_common.sh@10 -- # set +x 00:12:23.439 ************************************ 00:12:23.439 END TEST bdev_qos_bw 00:12:23.439 ************************************ 00:12:23.439 05:40:38 blockdev_general.bdev_qos -- common/autotest_common.sh@1142 -- # return 0 00:12:23.439 05:40:38 blockdev_general.bdev_qos -- bdev/blockdev.sh@435 -- # rpc_cmd bdev_set_qos_limit --r_mbytes_per_sec 2 Malloc_0 00:12:23.439 05:40:38 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:12:23.439 05:40:38 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:12:23.439 05:40:38 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:12:23.439 05:40:38 blockdev_general.bdev_qos -- bdev/blockdev.sh@436 -- # run_test bdev_qos_ro_bw run_qos_test 2 BANDWIDTH Malloc_0 00:12:23.439 05:40:38 blockdev_general.bdev_qos -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:12:23.439 05:40:38 blockdev_general.bdev_qos -- common/autotest_common.sh@1105 -- # xtrace_disable 00:12:23.439 05:40:38 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:12:23.439 ************************************ 00:12:23.439 START TEST bdev_qos_ro_bw 00:12:23.439 ************************************ 00:12:23.439 05:40:38 blockdev_general.bdev_qos.bdev_qos_ro_bw -- common/autotest_common.sh@1123 -- # run_qos_test 2 BANDWIDTH Malloc_0 00:12:23.439 05:40:38 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@388 -- # local qos_limit=2 00:12:23.439 05:40:38 
blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@389 -- # local qos_result=0 00:12:23.439 05:40:38 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@391 -- # get_io_result BANDWIDTH Malloc_0 00:12:23.439 05:40:38 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@374 -- # local limit_type=BANDWIDTH 00:12:23.439 05:40:38 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@375 -- # local qos_dev=Malloc_0 00:12:23.439 05:40:38 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@376 -- # local iostat_result 00:12:23.439 05:40:38 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@377 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/iostat.py -d -i 1 -t 5 00:12:23.439 05:40:38 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@377 -- # grep Malloc_0 00:12:23.439 05:40:38 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@377 -- # tail -1 00:12:28.708 05:40:43 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@377 -- # iostat_result='Malloc_0 511.69 2046.75 0.00 0.00 2060.00 0.00 0.00 ' 00:12:28.708 05:40:43 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@378 -- # '[' BANDWIDTH = IOPS ']' 00:12:28.708 05:40:43 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@380 -- # '[' BANDWIDTH = BANDWIDTH ']' 00:12:28.708 05:40:43 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@381 -- # awk '{print $6}' 00:12:28.708 05:40:43 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@381 -- # iostat_result=2060.00 00:12:28.708 05:40:43 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@384 -- # echo 2060 00:12:28.708 05:40:43 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@391 -- # qos_result=2060 00:12:28.708 05:40:43 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@392 -- # '[' BANDWIDTH = BANDWIDTH ']' 00:12:28.708 05:40:43 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@393 -- # qos_limit=2048 
00:12:28.708 05:40:43 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@395 -- # lower_limit=1843 00:12:28.708 05:40:43 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@396 -- # upper_limit=2252 00:12:28.708 05:40:43 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@399 -- # '[' 2060 -lt 1843 ']' 00:12:28.708 05:40:43 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@399 -- # '[' 2060 -gt 2252 ']' 00:12:28.708 00:12:28.708 real 0m5.193s 00:12:28.708 user 0m0.107s 00:12:28.708 sys 0m0.056s 00:12:28.708 05:40:43 blockdev_general.bdev_qos.bdev_qos_ro_bw -- common/autotest_common.sh@1124 -- # xtrace_disable 00:12:28.708 05:40:43 blockdev_general.bdev_qos.bdev_qos_ro_bw -- common/autotest_common.sh@10 -- # set +x 00:12:28.708 ************************************ 00:12:28.708 END TEST bdev_qos_ro_bw 00:12:28.708 ************************************ 00:12:28.708 05:40:43 blockdev_general.bdev_qos -- common/autotest_common.sh@1142 -- # return 0 00:12:28.708 05:40:43 blockdev_general.bdev_qos -- bdev/blockdev.sh@458 -- # rpc_cmd bdev_malloc_delete Malloc_0 00:12:28.708 05:40:43 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:12:28.708 05:40:43 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:12:29.275 05:40:44 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:12:29.275 05:40:44 blockdev_general.bdev_qos -- bdev/blockdev.sh@459 -- # rpc_cmd bdev_null_delete Null_1 00:12:29.275 05:40:44 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:12:29.275 05:40:44 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:12:29.533 00:12:29.533 Latency(us) 00:12:29.533 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:12:29.533 Job: Malloc_0 (Core Mask 0x2, workload: randread, depth: 256, IO size: 4096) 00:12:29.533 Malloc_0 : 26.84 20874.45 81.54 0.00 0.00 12147.82 2008.82 503316.48 
00:12:29.533 Job: Null_1 (Core Mask 0x2, workload: randread, depth: 256, IO size: 4096) 00:12:29.533 Null_1 : 26.99 20602.29 80.48 0.00 0.00 12391.75 776.46 149536.06 00:12:29.533 =================================================================================================================== 00:12:29.533 Total : 41476.74 162.02 0.00 0.00 12269.32 776.46 503316.48 00:12:29.533 0 00:12:29.534 05:40:44 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:12:29.534 05:40:44 blockdev_general.bdev_qos -- bdev/blockdev.sh@460 -- # killprocess 1118495 00:12:29.534 05:40:44 blockdev_general.bdev_qos -- common/autotest_common.sh@948 -- # '[' -z 1118495 ']' 00:12:29.534 05:40:44 blockdev_general.bdev_qos -- common/autotest_common.sh@952 -- # kill -0 1118495 00:12:29.534 05:40:44 blockdev_general.bdev_qos -- common/autotest_common.sh@953 -- # uname 00:12:29.534 05:40:44 blockdev_general.bdev_qos -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:12:29.534 05:40:44 blockdev_general.bdev_qos -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1118495 00:12:29.534 05:40:44 blockdev_general.bdev_qos -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:12:29.534 05:40:44 blockdev_general.bdev_qos -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:12:29.534 05:40:44 blockdev_general.bdev_qos -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1118495' 00:12:29.534 killing process with pid 1118495 00:12:29.534 05:40:44 blockdev_general.bdev_qos -- common/autotest_common.sh@967 -- # kill 1118495 00:12:29.534 Received shutdown signal, test time was about 27.050207 seconds 00:12:29.534 00:12:29.534 Latency(us) 00:12:29.534 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:12:29.534 =================================================================================================================== 00:12:29.534 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:12:29.534 
05:40:44 blockdev_general.bdev_qos -- common/autotest_common.sh@972 -- # wait 1118495 00:12:29.793 05:40:44 blockdev_general.bdev_qos -- bdev/blockdev.sh@461 -- # trap - SIGINT SIGTERM EXIT 00:12:29.793 00:12:29.793 real 0m28.542s 00:12:29.793 user 0m29.350s 00:12:29.793 sys 0m0.886s 00:12:29.793 05:40:44 blockdev_general.bdev_qos -- common/autotest_common.sh@1124 -- # xtrace_disable 00:12:29.793 05:40:44 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:12:29.793 ************************************ 00:12:29.793 END TEST bdev_qos 00:12:29.793 ************************************ 00:12:29.793 05:40:44 blockdev_general -- common/autotest_common.sh@1142 -- # return 0 00:12:29.793 05:40:44 blockdev_general -- bdev/blockdev.sh@788 -- # run_test bdev_qd_sampling qd_sampling_test_suite '' 00:12:29.793 05:40:44 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:12:29.793 05:40:44 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:12:29.793 05:40:44 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:12:29.793 ************************************ 00:12:29.793 START TEST bdev_qd_sampling 00:12:29.793 ************************************ 00:12:29.793 05:40:44 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@1123 -- # qd_sampling_test_suite '' 00:12:29.793 05:40:44 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@537 -- # QD_DEV=Malloc_QD 00:12:29.793 05:40:44 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@540 -- # QD_PID=1122304 00:12:29.793 05:40:44 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@541 -- # echo 'Process bdev QD sampling period testing pid: 1122304' 00:12:29.793 Process bdev QD sampling period testing pid: 1122304 00:12:29.793 05:40:44 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@539 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x3 -q 256 -o 4096 -w randread -t 5 -C '' 00:12:29.793 05:40:44 
blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@542 -- # trap 'cleanup; killprocess $QD_PID; exit 1' SIGINT SIGTERM EXIT 00:12:29.793 05:40:44 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@543 -- # waitforlisten 1122304 00:12:29.793 05:40:44 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@829 -- # '[' -z 1122304 ']' 00:12:29.793 05:40:44 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:12:29.793 05:40:44 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@834 -- # local max_retries=100 00:12:29.793 05:40:44 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:12:29.793 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:12:29.793 05:40:44 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@838 -- # xtrace_disable 00:12:29.793 05:40:44 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:12:29.793 [2024-07-26 05:40:44.605475] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 
00:12:29.793 [2024-07-26 05:40:44.605540] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1122304 ] 00:12:30.052 [2024-07-26 05:40:44.725228] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:12:30.052 [2024-07-26 05:40:44.829366] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:12:30.052 [2024-07-26 05:40:44.829372] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:30.987 05:40:45 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:12:30.987 05:40:45 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@862 -- # return 0 00:12:30.987 05:40:45 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@545 -- # rpc_cmd bdev_malloc_create -b Malloc_QD 128 512 00:12:30.987 05:40:45 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@559 -- # xtrace_disable 00:12:30.987 05:40:45 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:12:30.987 Malloc_QD 00:12:30.987 05:40:45 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:12:30.987 05:40:45 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@546 -- # waitforbdev Malloc_QD 00:12:30.987 05:40:45 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@897 -- # local bdev_name=Malloc_QD 00:12:30.987 05:40:45 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:12:30.987 05:40:45 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@899 -- # local i 00:12:30.987 05:40:45 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:12:30.987 05:40:45 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:12:30.987 05:40:45 
blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine 00:12:30.987 05:40:45 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@559 -- # xtrace_disable 00:12:30.987 05:40:45 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:12:30.987 05:40:45 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:12:30.987 05:40:45 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b Malloc_QD -t 2000 00:12:30.987 05:40:45 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@559 -- # xtrace_disable 00:12:30.987 05:40:45 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:12:30.987 [ 00:12:30.987 { 00:12:30.987 "name": "Malloc_QD", 00:12:30.987 "aliases": [ 00:12:30.987 "d1534929-92b6-4285-bc53-715f6d3ec86f" 00:12:30.987 ], 00:12:30.987 "product_name": "Malloc disk", 00:12:30.987 "block_size": 512, 00:12:30.987 "num_blocks": 262144, 00:12:30.987 "uuid": "d1534929-92b6-4285-bc53-715f6d3ec86f", 00:12:30.987 "assigned_rate_limits": { 00:12:30.987 "rw_ios_per_sec": 0, 00:12:30.987 "rw_mbytes_per_sec": 0, 00:12:30.987 "r_mbytes_per_sec": 0, 00:12:30.987 "w_mbytes_per_sec": 0 00:12:30.987 }, 00:12:30.987 "claimed": false, 00:12:30.987 "zoned": false, 00:12:30.987 "supported_io_types": { 00:12:30.987 "read": true, 00:12:30.987 "write": true, 00:12:30.987 "unmap": true, 00:12:30.987 "flush": true, 00:12:30.987 "reset": true, 00:12:30.987 "nvme_admin": false, 00:12:30.987 "nvme_io": false, 00:12:30.987 "nvme_io_md": false, 00:12:30.987 "write_zeroes": true, 00:12:30.987 "zcopy": true, 00:12:30.987 "get_zone_info": false, 00:12:30.987 "zone_management": false, 00:12:30.987 "zone_append": false, 00:12:30.987 "compare": false, 00:12:30.987 "compare_and_write": false, 00:12:30.987 "abort": true, 00:12:30.987 "seek_hole": false, 00:12:30.987 "seek_data": false, 00:12:30.987 "copy": true, 
00:12:30.987 "nvme_iov_md": false 00:12:30.987 }, 00:12:30.987 "memory_domains": [ 00:12:30.987 { 00:12:30.987 "dma_device_id": "system", 00:12:30.987 "dma_device_type": 1 00:12:30.987 }, 00:12:30.987 { 00:12:30.987 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:30.987 "dma_device_type": 2 00:12:30.987 } 00:12:30.987 ], 00:12:30.987 "driver_specific": {} 00:12:30.987 } 00:12:30.987 ] 00:12:30.987 05:40:45 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:12:30.987 05:40:45 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@905 -- # return 0 00:12:30.987 05:40:45 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@549 -- # sleep 2 00:12:30.987 05:40:45 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@548 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:12:30.987 Running I/O for 5 seconds... 00:12:32.891 05:40:47 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@550 -- # qd_sampling_function_test Malloc_QD 00:12:32.891 05:40:47 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@518 -- # local bdev_name=Malloc_QD 00:12:32.891 05:40:47 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@519 -- # local sampling_period=10 00:12:32.891 05:40:47 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@520 -- # local iostats 00:12:32.891 05:40:47 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@522 -- # rpc_cmd bdev_set_qd_sampling_period Malloc_QD 10 00:12:32.891 05:40:47 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@559 -- # xtrace_disable 00:12:32.891 05:40:47 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:12:32.891 05:40:47 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:12:32.891 05:40:47 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@524 -- # rpc_cmd bdev_get_iostat -b Malloc_QD 00:12:32.891 05:40:47 blockdev_general.bdev_qd_sampling -- 
common/autotest_common.sh@559 -- # xtrace_disable 00:12:32.891 05:40:47 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:12:32.891 05:40:47 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:12:32.891 05:40:47 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@524 -- # iostats='{ 00:12:32.891 "tick_rate": 2300000000, 00:12:32.891 "ticks": 7475800281816674, 00:12:32.891 "bdevs": [ 00:12:32.891 { 00:12:32.891 "name": "Malloc_QD", 00:12:32.891 "bytes_read": 762360320, 00:12:32.891 "num_read_ops": 186116, 00:12:32.891 "bytes_written": 0, 00:12:32.891 "num_write_ops": 0, 00:12:32.891 "bytes_unmapped": 0, 00:12:32.891 "num_unmap_ops": 0, 00:12:32.891 "bytes_copied": 0, 00:12:32.891 "num_copy_ops": 0, 00:12:32.891 "read_latency_ticks": 2247319598200, 00:12:32.891 "max_read_latency_ticks": 14528590, 00:12:32.891 "min_read_latency_ticks": 251944, 00:12:32.891 "write_latency_ticks": 0, 00:12:32.891 "max_write_latency_ticks": 0, 00:12:32.891 "min_write_latency_ticks": 0, 00:12:32.891 "unmap_latency_ticks": 0, 00:12:32.891 "max_unmap_latency_ticks": 0, 00:12:32.891 "min_unmap_latency_ticks": 0, 00:12:32.891 "copy_latency_ticks": 0, 00:12:32.891 "max_copy_latency_ticks": 0, 00:12:32.891 "min_copy_latency_ticks": 0, 00:12:32.891 "io_error": {}, 00:12:32.891 "queue_depth_polling_period": 10, 00:12:32.891 "queue_depth": 512, 00:12:32.891 "io_time": 30, 00:12:32.891 "weighted_io_time": 15360 00:12:32.891 } 00:12:32.891 ] 00:12:32.891 }' 00:12:32.891 05:40:47 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@526 -- # jq -r '.bdevs[0].queue_depth_polling_period' 00:12:32.891 05:40:47 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@526 -- # qd_sampling_period=10 00:12:32.891 05:40:47 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@528 -- # '[' 10 == null ']' 00:12:32.891 05:40:47 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@528 -- # '[' 10 -ne 10 ']' 00:12:32.891 05:40:47 
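The bdev_get_iostat dump above carries enough data to reconstruct the sampled queue depth: dividing `weighted_io_time` by `io_time` yields the time-averaged depth, which here matches the instantaneous `queue_depth` field since bdevperf (-q 256, two jobs) kept a constant backlog. A sketch over a trimmed copy of the JSON printed in the log (the interpretation of the two counters is our reading of the output, not stated by the log itself):

```python
import json

# Trimmed bdev_get_iostat output for Malloc_QD, as printed in the log above.
iostat = json.loads("""{
  "bdevs": [
    {
      "name": "Malloc_QD",
      "read_latency_ticks": 2247319598200,
      "io_time": 30,
      "weighted_io_time": 15360,
      "queue_depth_polling_period": 10,
      "queue_depth": 512
    }
  ]
}""")

bdev = iostat["bdevs"][0]
# weighted_io_time accumulates depth * elapsed time, io_time accumulates
# elapsed time with IO outstanding, so their ratio is the average depth.
derived_qd = bdev["weighted_io_time"] // bdev["io_time"]
print(derived_qd)  # 512, agreeing with the reported queue_depth
```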
blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@552 -- # rpc_cmd bdev_malloc_delete Malloc_QD 00:12:32.891 05:40:47 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@559 -- # xtrace_disable 00:12:32.891 05:40:47 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:12:32.891 00:12:32.891 Latency(us) 00:12:32.891 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:12:32.891 Job: Malloc_QD (Core Mask 0x1, workload: randread, depth: 256, IO size: 4096) 00:12:32.891 Malloc_QD : 1.98 48510.57 189.49 0.00 0.00 5264.06 1438.94 5556.31 00:12:32.891 Job: Malloc_QD (Core Mask 0x2, workload: randread, depth: 256, IO size: 4096) 00:12:32.891 Malloc_QD : 1.98 48870.04 190.90 0.00 0.00 5225.94 940.30 6325.65 00:12:32.891 =================================================================================================================== 00:12:32.891 Total : 97380.61 380.39 0.00 0.00 5244.92 940.30 6325.65 00:12:32.891 0 00:12:32.891 05:40:47 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:12:32.891 05:40:47 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@553 -- # killprocess 1122304 00:12:32.891 05:40:47 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@948 -- # '[' -z 1122304 ']' 00:12:32.891 05:40:47 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@952 -- # kill -0 1122304 00:12:32.891 05:40:47 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@953 -- # uname 00:12:32.891 05:40:47 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:12:32.891 05:40:47 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1122304 00:12:32.891 05:40:47 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:12:32.891 05:40:47 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 
00:12:32.891 05:40:47 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1122304' 00:12:32.891 killing process with pid 1122304 00:12:32.891 05:40:47 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@967 -- # kill 1122304 00:12:32.891 Received shutdown signal, test time was about 2.060478 seconds 00:12:32.891 00:12:32.891 Latency(us) 00:12:32.891 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:12:32.891 =================================================================================================================== 00:12:32.891 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:12:33.150 05:40:47 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@972 -- # wait 1122304 00:12:33.150 05:40:48 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@554 -- # trap - SIGINT SIGTERM EXIT 00:12:33.150 00:12:33.150 real 0m3.483s 00:12:33.150 user 0m6.852s 00:12:33.150 sys 0m0.424s 00:12:33.150 05:40:48 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@1124 -- # xtrace_disable 00:12:33.150 05:40:48 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:12:33.150 ************************************ 00:12:33.150 END TEST bdev_qd_sampling 00:12:33.150 ************************************ 00:12:33.409 05:40:48 blockdev_general -- common/autotest_common.sh@1142 -- # return 0 00:12:33.409 05:40:48 blockdev_general -- bdev/blockdev.sh@789 -- # run_test bdev_error error_test_suite '' 00:12:33.409 05:40:48 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:12:33.409 05:40:48 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:12:33.409 05:40:48 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:12:33.409 ************************************ 00:12:33.409 START TEST bdev_error 00:12:33.409 ************************************ 00:12:33.409 05:40:48 blockdev_general.bdev_error -- 
common/autotest_common.sh@1123 -- # error_test_suite '' 00:12:33.409 05:40:48 blockdev_general.bdev_error -- bdev/blockdev.sh@465 -- # DEV_1=Dev_1 00:12:33.409 05:40:48 blockdev_general.bdev_error -- bdev/blockdev.sh@466 -- # DEV_2=Dev_2 00:12:33.409 05:40:48 blockdev_general.bdev_error -- bdev/blockdev.sh@467 -- # ERR_DEV=EE_Dev_1 00:12:33.409 05:40:48 blockdev_general.bdev_error -- bdev/blockdev.sh@471 -- # ERR_PID=1122808 00:12:33.409 05:40:48 blockdev_general.bdev_error -- bdev/blockdev.sh@472 -- # echo 'Process error testing pid: 1122808' 00:12:33.409 Process error testing pid: 1122808 00:12:33.409 05:40:48 blockdev_general.bdev_error -- bdev/blockdev.sh@470 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x2 -q 16 -o 4096 -w randread -t 5 -f '' 00:12:33.409 05:40:48 blockdev_general.bdev_error -- bdev/blockdev.sh@473 -- # waitforlisten 1122808 00:12:33.409 05:40:48 blockdev_general.bdev_error -- common/autotest_common.sh@829 -- # '[' -z 1122808 ']' 00:12:33.409 05:40:48 blockdev_general.bdev_error -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:12:33.409 05:40:48 blockdev_general.bdev_error -- common/autotest_common.sh@834 -- # local max_retries=100 00:12:33.409 05:40:48 blockdev_general.bdev_error -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:12:33.409 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:12:33.409 05:40:48 blockdev_general.bdev_error -- common/autotest_common.sh@838 -- # xtrace_disable 00:12:33.409 05:40:48 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:12:33.409 [2024-07-26 05:40:48.182262] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 
00:12:33.410 [2024-07-26 05:40:48.182333] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1122808 ] 00:12:33.410 [2024-07-26 05:40:48.304255] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:33.670 [2024-07-26 05:40:48.403328] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:12:34.298 05:40:49 blockdev_general.bdev_error -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:12:34.298 05:40:49 blockdev_general.bdev_error -- common/autotest_common.sh@862 -- # return 0 00:12:34.298 05:40:49 blockdev_general.bdev_error -- bdev/blockdev.sh@475 -- # rpc_cmd bdev_malloc_create -b Dev_1 128 512 00:12:34.298 05:40:49 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:12:34.298 05:40:49 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:12:34.298 Dev_1 00:12:34.298 05:40:49 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:12:34.298 05:40:49 blockdev_general.bdev_error -- bdev/blockdev.sh@476 -- # waitforbdev Dev_1 00:12:34.298 05:40:49 blockdev_general.bdev_error -- common/autotest_common.sh@897 -- # local bdev_name=Dev_1 00:12:34.298 05:40:49 blockdev_general.bdev_error -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:12:34.298 05:40:49 blockdev_general.bdev_error -- common/autotest_common.sh@899 -- # local i 00:12:34.298 05:40:49 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:12:34.298 05:40:49 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:12:34.298 05:40:49 blockdev_general.bdev_error -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine 00:12:34.298 05:40:49 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:12:34.298 
05:40:49 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:12:34.298 05:40:49 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:12:34.298 05:40:49 blockdev_general.bdev_error -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b Dev_1 -t 2000 00:12:34.298 05:40:49 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:12:34.298 05:40:49 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:12:34.298 [ 00:12:34.298 { 00:12:34.298 "name": "Dev_1", 00:12:34.298 "aliases": [ 00:12:34.298 "7bd2f982-9663-4dc4-847d-21a357e83e74" 00:12:34.298 ], 00:12:34.298 "product_name": "Malloc disk", 00:12:34.298 "block_size": 512, 00:12:34.298 "num_blocks": 262144, 00:12:34.298 "uuid": "7bd2f982-9663-4dc4-847d-21a357e83e74", 00:12:34.298 "assigned_rate_limits": { 00:12:34.298 "rw_ios_per_sec": 0, 00:12:34.298 "rw_mbytes_per_sec": 0, 00:12:34.298 "r_mbytes_per_sec": 0, 00:12:34.298 "w_mbytes_per_sec": 0 00:12:34.298 }, 00:12:34.298 "claimed": false, 00:12:34.298 "zoned": false, 00:12:34.298 "supported_io_types": { 00:12:34.298 "read": true, 00:12:34.298 "write": true, 00:12:34.298 "unmap": true, 00:12:34.298 "flush": true, 00:12:34.298 "reset": true, 00:12:34.298 "nvme_admin": false, 00:12:34.298 "nvme_io": false, 00:12:34.298 "nvme_io_md": false, 00:12:34.298 "write_zeroes": true, 00:12:34.298 "zcopy": true, 00:12:34.298 "get_zone_info": false, 00:12:34.298 "zone_management": false, 00:12:34.298 "zone_append": false, 00:12:34.298 "compare": false, 00:12:34.298 "compare_and_write": false, 00:12:34.298 "abort": true, 00:12:34.298 "seek_hole": false, 00:12:34.298 "seek_data": false, 00:12:34.298 "copy": true, 00:12:34.298 "nvme_iov_md": false 00:12:34.298 }, 00:12:34.298 "memory_domains": [ 00:12:34.298 { 00:12:34.298 "dma_device_id": "system", 00:12:34.298 "dma_device_type": 1 00:12:34.298 }, 00:12:34.298 { 00:12:34.298 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 
00:12:34.298 "dma_device_type": 2 00:12:34.298 } 00:12:34.298 ], 00:12:34.298 "driver_specific": {} 00:12:34.298 } 00:12:34.298 ] 00:12:34.298 05:40:49 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:12:34.298 05:40:49 blockdev_general.bdev_error -- common/autotest_common.sh@905 -- # return 0 00:12:34.298 05:40:49 blockdev_general.bdev_error -- bdev/blockdev.sh@477 -- # rpc_cmd bdev_error_create Dev_1 00:12:34.298 05:40:49 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:12:34.298 05:40:49 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:12:34.298 true 00:12:34.298 05:40:49 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:12:34.298 05:40:49 blockdev_general.bdev_error -- bdev/blockdev.sh@478 -- # rpc_cmd bdev_malloc_create -b Dev_2 128 512 00:12:34.298 05:40:49 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:12:34.298 05:40:49 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:12:34.557 Dev_2 00:12:34.557 05:40:49 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:12:34.557 05:40:49 blockdev_general.bdev_error -- bdev/blockdev.sh@479 -- # waitforbdev Dev_2 00:12:34.557 05:40:49 blockdev_general.bdev_error -- common/autotest_common.sh@897 -- # local bdev_name=Dev_2 00:12:34.557 05:40:49 blockdev_general.bdev_error -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:12:34.557 05:40:49 blockdev_general.bdev_error -- common/autotest_common.sh@899 -- # local i 00:12:34.557 05:40:49 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:12:34.557 05:40:49 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:12:34.557 05:40:49 blockdev_general.bdev_error -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine 00:12:34.557 05:40:49 blockdev_general.bdev_error -- 
common/autotest_common.sh@559 -- # xtrace_disable 00:12:34.557 05:40:49 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:12:34.557 05:40:49 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:12:34.557 05:40:49 blockdev_general.bdev_error -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b Dev_2 -t 2000 00:12:34.557 05:40:49 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:12:34.557 05:40:49 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:12:34.557 [ 00:12:34.557 { 00:12:34.557 "name": "Dev_2", 00:12:34.557 "aliases": [ 00:12:34.557 "4237fbdd-e4a8-43b1-9262-05dfc2aaf7b4" 00:12:34.557 ], 00:12:34.557 "product_name": "Malloc disk", 00:12:34.557 "block_size": 512, 00:12:34.557 "num_blocks": 262144, 00:12:34.557 "uuid": "4237fbdd-e4a8-43b1-9262-05dfc2aaf7b4", 00:12:34.557 "assigned_rate_limits": { 00:12:34.557 "rw_ios_per_sec": 0, 00:12:34.557 "rw_mbytes_per_sec": 0, 00:12:34.557 "r_mbytes_per_sec": 0, 00:12:34.557 "w_mbytes_per_sec": 0 00:12:34.557 }, 00:12:34.557 "claimed": false, 00:12:34.557 "zoned": false, 00:12:34.557 "supported_io_types": { 00:12:34.557 "read": true, 00:12:34.557 "write": true, 00:12:34.557 "unmap": true, 00:12:34.557 "flush": true, 00:12:34.557 "reset": true, 00:12:34.557 "nvme_admin": false, 00:12:34.557 "nvme_io": false, 00:12:34.557 "nvme_io_md": false, 00:12:34.557 "write_zeroes": true, 00:12:34.557 "zcopy": true, 00:12:34.557 "get_zone_info": false, 00:12:34.557 "zone_management": false, 00:12:34.557 "zone_append": false, 00:12:34.557 "compare": false, 00:12:34.557 "compare_and_write": false, 00:12:34.557 "abort": true, 00:12:34.557 "seek_hole": false, 00:12:34.557 "seek_data": false, 00:12:34.557 "copy": true, 00:12:34.557 "nvme_iov_md": false 00:12:34.557 }, 00:12:34.557 "memory_domains": [ 00:12:34.557 { 00:12:34.557 "dma_device_id": "system", 00:12:34.557 "dma_device_type": 1 00:12:34.557 }, 00:12:34.557 { 
00:12:34.557 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:34.557 "dma_device_type": 2 00:12:34.557 } 00:12:34.557 ], 00:12:34.557 "driver_specific": {} 00:12:34.557 } 00:12:34.557 ] 00:12:34.557 05:40:49 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:12:34.557 05:40:49 blockdev_general.bdev_error -- common/autotest_common.sh@905 -- # return 0 00:12:34.557 05:40:49 blockdev_general.bdev_error -- bdev/blockdev.sh@480 -- # rpc_cmd bdev_error_inject_error EE_Dev_1 all failure -n 5 00:12:34.557 05:40:49 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:12:34.557 05:40:49 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:12:34.557 05:40:49 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:12:34.557 05:40:49 blockdev_general.bdev_error -- bdev/blockdev.sh@483 -- # sleep 1 00:12:34.557 05:40:49 blockdev_general.bdev_error -- bdev/blockdev.sh@482 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -t 1 perform_tests 00:12:34.557 Running I/O for 5 seconds... 00:12:35.494 05:40:50 blockdev_general.bdev_error -- bdev/blockdev.sh@486 -- # kill -0 1122808 00:12:35.494 05:40:50 blockdev_general.bdev_error -- bdev/blockdev.sh@487 -- # echo 'Process is existed as continue on error is set. Pid: 1122808' 00:12:35.494 Process is existed as continue on error is set. 
Pid: 1122808 00:12:35.494 05:40:50 blockdev_general.bdev_error -- bdev/blockdev.sh@494 -- # rpc_cmd bdev_error_delete EE_Dev_1 00:12:35.494 05:40:50 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:12:35.494 05:40:50 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:12:35.494 05:40:50 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:12:35.494 05:40:50 blockdev_general.bdev_error -- bdev/blockdev.sh@495 -- # rpc_cmd bdev_malloc_delete Dev_1 00:12:35.494 05:40:50 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:12:35.494 05:40:50 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:12:35.494 05:40:50 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:12:35.494 05:40:50 blockdev_general.bdev_error -- bdev/blockdev.sh@496 -- # sleep 5 00:12:35.494 Timeout while waiting for response: 00:12:35.494 00:12:35.494 00:12:39.684 00:12:39.684 Latency(us) 00:12:39.684 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:12:39.684 Job: EE_Dev_1 (Core Mask 0x2, workload: randread, depth: 16, IO size: 4096) 00:12:39.684 EE_Dev_1 : 0.89 37071.37 144.81 5.60 0.00 427.99 133.57 961.67 00:12:39.684 Job: Dev_2 (Core Mask 0x2, workload: randread, depth: 16, IO size: 4096) 00:12:39.684 Dev_2 : 5.00 80602.18 314.85 0.00 0.00 195.01 67.67 19261.89 00:12:39.684 =================================================================================================================== 00:12:39.684 Total : 117673.55 459.66 5.60 0.00 212.70 67.67 19261.89 00:12:40.620 05:40:55 blockdev_general.bdev_error -- bdev/blockdev.sh@498 -- # killprocess 1122808 00:12:40.620 05:40:55 blockdev_general.bdev_error -- common/autotest_common.sh@948 -- # '[' -z 1122808 ']' 00:12:40.620 05:40:55 blockdev_general.bdev_error -- common/autotest_common.sh@952 -- # kill -0 1122808 00:12:40.620 05:40:55 
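The Total row of the error-injection table above is consistent with weighting each job's average latency by its IO count (runtime times IOPS): EE_Dev_1 failed out after roughly 0.89 s of injected errors while Dev_2 ran the full 5 s, so the blended average sits close to Dev_2's. Reconstructing from the rounded per-job figures (variable names are ours):

```python
# Per-job figures copied from the error-injection run above:
# (name, runtime s, IOPS, average latency us), as printed, i.e. rounded.
jobs = [
    ("EE_Dev_1", 0.89, 37071.37, 427.99),
    ("Dev_2",    5.00, 80602.18, 195.01),
]

total_ios = sum(runtime * iops for _, runtime, iops, _ in jobs)
weighted_avg_us = sum(runtime * iops * lat
                      for _, runtime, iops, lat in jobs) / total_ios
print(round(weighted_avg_us, 2))  # within rounding of the 212.70 us Total
```

The residual difference from 212.70 us comes from the table printing runtimes and IOPS to two decimal places.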
blockdev_general.bdev_error -- common/autotest_common.sh@953 -- # uname 00:12:40.620 05:40:55 blockdev_general.bdev_error -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:12:40.620 05:40:55 blockdev_general.bdev_error -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1122808 00:12:40.620 05:40:55 blockdev_general.bdev_error -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:12:40.620 05:40:55 blockdev_general.bdev_error -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:12:40.620 05:40:55 blockdev_general.bdev_error -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1122808' 00:12:40.620 killing process with pid 1122808 00:12:40.620 05:40:55 blockdev_general.bdev_error -- common/autotest_common.sh@967 -- # kill 1122808 00:12:40.620 Received shutdown signal, test time was about 5.000000 seconds 00:12:40.620 00:12:40.620 Latency(us) 00:12:40.620 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:12:40.620 =================================================================================================================== 00:12:40.620 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:12:40.620 05:40:55 blockdev_general.bdev_error -- common/autotest_common.sh@972 -- # wait 1122808 00:12:40.878 05:40:55 blockdev_general.bdev_error -- bdev/blockdev.sh@502 -- # ERR_PID=1123834 00:12:40.878 05:40:55 blockdev_general.bdev_error -- bdev/blockdev.sh@503 -- # echo 'Process error testing pid: 1123834' 00:12:40.878 Process error testing pid: 1123834 00:12:40.878 05:40:55 blockdev_general.bdev_error -- bdev/blockdev.sh@501 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x2 -q 16 -o 4096 -w randread -t 5 '' 00:12:40.878 05:40:55 blockdev_general.bdev_error -- bdev/blockdev.sh@504 -- # waitforlisten 1123834 00:12:40.878 05:40:55 blockdev_general.bdev_error -- common/autotest_common.sh@829 -- # '[' -z 1123834 ']' 00:12:40.879 05:40:55 
blockdev_general.bdev_error -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:12:40.879 05:40:55 blockdev_general.bdev_error -- common/autotest_common.sh@834 -- # local max_retries=100 00:12:40.879 05:40:55 blockdev_general.bdev_error -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:12:40.879 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:12:40.879 05:40:55 blockdev_general.bdev_error -- common/autotest_common.sh@838 -- # xtrace_disable 00:12:40.879 05:40:55 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:12:40.879 [2024-07-26 05:40:55.697587] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 00:12:40.879 [2024-07-26 05:40:55.697672] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1123834 ] 00:12:41.137 [2024-07-26 05:40:55.817053] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:41.137 [2024-07-26 05:40:55.917539] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:12:42.073 05:40:56 blockdev_general.bdev_error -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:12:42.073 05:40:56 blockdev_general.bdev_error -- common/autotest_common.sh@862 -- # return 0 00:12:42.073 05:40:56 blockdev_general.bdev_error -- bdev/blockdev.sh@506 -- # rpc_cmd bdev_malloc_create -b Dev_1 128 512 00:12:42.073 05:40:56 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:12:42.073 05:40:56 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:12:42.073 Dev_1 00:12:42.073 05:40:56 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:12:42.073 05:40:56 
blockdev_general.bdev_error -- bdev/blockdev.sh@507 -- # waitforbdev Dev_1 00:12:42.073 05:40:56 blockdev_general.bdev_error -- common/autotest_common.sh@897 -- # local bdev_name=Dev_1 00:12:42.073 05:40:56 blockdev_general.bdev_error -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:12:42.073 05:40:56 blockdev_general.bdev_error -- common/autotest_common.sh@899 -- # local i 00:12:42.073 05:40:56 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:12:42.073 05:40:56 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:12:42.073 05:40:56 blockdev_general.bdev_error -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine 00:12:42.073 05:40:56 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:12:42.073 05:40:56 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:12:42.073 05:40:56 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:12:42.073 05:40:56 blockdev_general.bdev_error -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b Dev_1 -t 2000 00:12:42.073 05:40:56 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:12:42.073 05:40:56 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:12:42.073 [ 00:12:42.073 { 00:12:42.073 "name": "Dev_1", 00:12:42.073 "aliases": [ 00:12:42.073 "db20439b-90bf-47e7-aa18-13e495c2938b" 00:12:42.073 ], 00:12:42.073 "product_name": "Malloc disk", 00:12:42.073 "block_size": 512, 00:12:42.073 "num_blocks": 262144, 00:12:42.073 "uuid": "db20439b-90bf-47e7-aa18-13e495c2938b", 00:12:42.073 "assigned_rate_limits": { 00:12:42.073 "rw_ios_per_sec": 0, 00:12:42.073 "rw_mbytes_per_sec": 0, 00:12:42.073 "r_mbytes_per_sec": 0, 00:12:42.073 "w_mbytes_per_sec": 0 00:12:42.073 }, 00:12:42.073 "claimed": false, 00:12:42.073 "zoned": false, 00:12:42.073 "supported_io_types": { 00:12:42.073 "read": true, 00:12:42.073 
"write": true, 00:12:42.073 "unmap": true, 00:12:42.073 "flush": true, 00:12:42.073 "reset": true, 00:12:42.073 "nvme_admin": false, 00:12:42.073 "nvme_io": false, 00:12:42.073 "nvme_io_md": false, 00:12:42.073 "write_zeroes": true, 00:12:42.073 "zcopy": true, 00:12:42.073 "get_zone_info": false, 00:12:42.073 "zone_management": false, 00:12:42.073 "zone_append": false, 00:12:42.073 "compare": false, 00:12:42.073 "compare_and_write": false, 00:12:42.073 "abort": true, 00:12:42.073 "seek_hole": false, 00:12:42.073 "seek_data": false, 00:12:42.073 "copy": true, 00:12:42.073 "nvme_iov_md": false 00:12:42.073 }, 00:12:42.073 "memory_domains": [ 00:12:42.073 { 00:12:42.073 "dma_device_id": "system", 00:12:42.073 "dma_device_type": 1 00:12:42.073 }, 00:12:42.073 { 00:12:42.073 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:42.073 "dma_device_type": 2 00:12:42.073 } 00:12:42.073 ], 00:12:42.073 "driver_specific": {} 00:12:42.073 } 00:12:42.073 ] 00:12:42.073 05:40:56 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:12:42.073 05:40:56 blockdev_general.bdev_error -- common/autotest_common.sh@905 -- # return 0 00:12:42.073 05:40:56 blockdev_general.bdev_error -- bdev/blockdev.sh@508 -- # rpc_cmd bdev_error_create Dev_1 00:12:42.073 05:40:56 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:12:42.073 05:40:56 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:12:42.074 true 00:12:42.074 05:40:56 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:12:42.074 05:40:56 blockdev_general.bdev_error -- bdev/blockdev.sh@509 -- # rpc_cmd bdev_malloc_create -b Dev_2 128 512 00:12:42.074 05:40:56 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:12:42.074 05:40:56 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:12:42.074 Dev_2 00:12:42.074 05:40:56 blockdev_general.bdev_error -- common/autotest_common.sh@587 
-- # [[ 0 == 0 ]] 00:12:42.074 05:40:56 blockdev_general.bdev_error -- bdev/blockdev.sh@510 -- # waitforbdev Dev_2 00:12:42.074 05:40:56 blockdev_general.bdev_error -- common/autotest_common.sh@897 -- # local bdev_name=Dev_2 00:12:42.074 05:40:56 blockdev_general.bdev_error -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:12:42.074 05:40:56 blockdev_general.bdev_error -- common/autotest_common.sh@899 -- # local i 00:12:42.074 05:40:56 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:12:42.074 05:40:56 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:12:42.074 05:40:56 blockdev_general.bdev_error -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine 00:12:42.074 05:40:56 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:12:42.074 05:40:56 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:12:42.074 05:40:56 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:12:42.074 05:40:56 blockdev_general.bdev_error -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b Dev_2 -t 2000 00:12:42.074 05:40:56 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:12:42.074 05:40:56 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:12:42.074 [ 00:12:42.074 { 00:12:42.074 "name": "Dev_2", 00:12:42.074 "aliases": [ 00:12:42.074 "22006d2f-3e4c-4297-8cb4-1f11fb827c29" 00:12:42.074 ], 00:12:42.074 "product_name": "Malloc disk", 00:12:42.074 "block_size": 512, 00:12:42.074 "num_blocks": 262144, 00:12:42.074 "uuid": "22006d2f-3e4c-4297-8cb4-1f11fb827c29", 00:12:42.074 "assigned_rate_limits": { 00:12:42.074 "rw_ios_per_sec": 0, 00:12:42.074 "rw_mbytes_per_sec": 0, 00:12:42.074 "r_mbytes_per_sec": 0, 00:12:42.074 "w_mbytes_per_sec": 0 00:12:42.074 }, 00:12:42.074 "claimed": false, 00:12:42.074 "zoned": false, 00:12:42.074 "supported_io_types": { 
00:12:42.074 "read": true, 00:12:42.074 "write": true, 00:12:42.074 "unmap": true, 00:12:42.074 "flush": true, 00:12:42.074 "reset": true, 00:12:42.074 "nvme_admin": false, 00:12:42.074 "nvme_io": false, 00:12:42.074 "nvme_io_md": false, 00:12:42.074 "write_zeroes": true, 00:12:42.074 "zcopy": true, 00:12:42.074 "get_zone_info": false, 00:12:42.074 "zone_management": false, 00:12:42.074 "zone_append": false, 00:12:42.074 "compare": false, 00:12:42.074 "compare_and_write": false, 00:12:42.074 "abort": true, 00:12:42.074 "seek_hole": false, 00:12:42.074 "seek_data": false, 00:12:42.074 "copy": true, 00:12:42.074 "nvme_iov_md": false 00:12:42.074 }, 00:12:42.074 "memory_domains": [ 00:12:42.074 { 00:12:42.074 "dma_device_id": "system", 00:12:42.074 "dma_device_type": 1 00:12:42.074 }, 00:12:42.074 { 00:12:42.074 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:42.074 "dma_device_type": 2 00:12:42.074 } 00:12:42.074 ], 00:12:42.074 "driver_specific": {} 00:12:42.074 } 00:12:42.074 ] 00:12:42.074 05:40:56 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:12:42.074 05:40:56 blockdev_general.bdev_error -- common/autotest_common.sh@905 -- # return 0 00:12:42.074 05:40:56 blockdev_general.bdev_error -- bdev/blockdev.sh@511 -- # rpc_cmd bdev_error_inject_error EE_Dev_1 all failure -n 5 00:12:42.074 05:40:56 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:12:42.074 05:40:56 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:12:42.074 05:40:56 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:12:42.074 05:40:56 blockdev_general.bdev_error -- bdev/blockdev.sh@514 -- # NOT wait 1123834 00:12:42.074 05:40:56 blockdev_general.bdev_error -- common/autotest_common.sh@648 -- # local es=0 00:12:42.074 05:40:56 blockdev_general.bdev_error -- common/autotest_common.sh@650 -- # valid_exec_arg wait 1123834 00:12:42.074 05:40:56 blockdev_general.bdev_error -- 
common/autotest_common.sh@636 -- # local arg=wait 00:12:42.074 05:40:56 blockdev_general.bdev_error -- bdev/blockdev.sh@513 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -t 1 perform_tests 00:12:42.074 05:40:56 blockdev_general.bdev_error -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:12:42.074 05:40:56 blockdev_general.bdev_error -- common/autotest_common.sh@640 -- # type -t wait 00:12:42.074 05:40:56 blockdev_general.bdev_error -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:12:42.074 05:40:56 blockdev_general.bdev_error -- common/autotest_common.sh@651 -- # wait 1123834 00:12:42.074 Running I/O for 5 seconds... 00:12:42.074 task offset: 40728 on job bdev=EE_Dev_1 fails 00:12:42.074 00:12:42.074 Latency(us) 00:12:42.074 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:12:42.074 Job: EE_Dev_1 (Core Mask 0x2, workload: randread, depth: 16, IO size: 4096) 00:12:42.074 Job: EE_Dev_1 ended in about 0.00 seconds with error 00:12:42.074 EE_Dev_1 : 0.00 29062.09 113.52 6605.02 0.00 366.58 133.57 662.48 00:12:42.074 Job: Dev_2 (Core Mask 0x2, workload: randread, depth: 16, IO size: 4096) 00:12:42.074 Dev_2 : 0.00 18079.10 70.62 0.00 0.00 651.69 127.33 1210.99 00:12:42.074 =================================================================================================================== 00:12:42.074 Total : 47141.18 184.15 6605.02 0.00 521.21 127.33 1210.99 00:12:42.074 [2024-07-26 05:40:56.909356] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:12:42.074 request: 00:12:42.074 { 00:12:42.074 "method": "perform_tests", 00:12:42.074 "req_id": 1 00:12:42.074 } 00:12:42.074 Got JSON-RPC error response 00:12:42.074 response: 00:12:42.074 { 00:12:42.074 "code": -32603, 00:12:42.074 "message": "bdevperf failed with error Operation not permitted" 00:12:42.074 } 00:12:42.333 05:40:57 blockdev_general.bdev_error -- common/autotest_common.sh@651 -- # 
es=255 00:12:42.333 05:40:57 blockdev_general.bdev_error -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:12:42.333 05:40:57 blockdev_general.bdev_error -- common/autotest_common.sh@660 -- # es=127 00:12:42.333 05:40:57 blockdev_general.bdev_error -- common/autotest_common.sh@661 -- # case "$es" in 00:12:42.333 05:40:57 blockdev_general.bdev_error -- common/autotest_common.sh@668 -- # es=1 00:12:42.333 05:40:57 blockdev_general.bdev_error -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:12:42.333 00:12:42.333 real 0m9.067s 00:12:42.333 user 0m9.505s 00:12:42.333 sys 0m0.856s 00:12:42.333 05:40:57 blockdev_general.bdev_error -- common/autotest_common.sh@1124 -- # xtrace_disable 00:12:42.333 05:40:57 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:12:42.333 ************************************ 00:12:42.333 END TEST bdev_error 00:12:42.333 ************************************ 00:12:42.333 05:40:57 blockdev_general -- common/autotest_common.sh@1142 -- # return 0 00:12:42.333 05:40:57 blockdev_general -- bdev/blockdev.sh@790 -- # run_test bdev_stat stat_test_suite '' 00:12:42.333 05:40:57 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:12:42.333 05:40:57 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:12:42.333 05:40:57 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:12:42.592 ************************************ 00:12:42.592 START TEST bdev_stat 00:12:42.592 ************************************ 00:12:42.592 05:40:57 blockdev_general.bdev_stat -- common/autotest_common.sh@1123 -- # stat_test_suite '' 00:12:42.592 05:40:57 blockdev_general.bdev_stat -- bdev/blockdev.sh@591 -- # STAT_DEV=Malloc_STAT 00:12:42.592 05:40:57 blockdev_general.bdev_stat -- bdev/blockdev.sh@595 -- # STAT_PID=1124065 00:12:42.592 05:40:57 blockdev_general.bdev_stat -- bdev/blockdev.sh@596 -- # echo 'Process Bdev IO statistics testing pid: 1124065' 00:12:42.592 Process Bdev IO statistics 
testing pid: 1124065 00:12:42.592 05:40:57 blockdev_general.bdev_stat -- bdev/blockdev.sh@594 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x3 -q 256 -o 4096 -w randread -t 10 -C '' 00:12:42.592 05:40:57 blockdev_general.bdev_stat -- bdev/blockdev.sh@597 -- # trap 'cleanup; killprocess $STAT_PID; exit 1' SIGINT SIGTERM EXIT 00:12:42.592 05:40:57 blockdev_general.bdev_stat -- bdev/blockdev.sh@598 -- # waitforlisten 1124065 00:12:42.592 05:40:57 blockdev_general.bdev_stat -- common/autotest_common.sh@829 -- # '[' -z 1124065 ']' 00:12:42.592 05:40:57 blockdev_general.bdev_stat -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:12:42.592 05:40:57 blockdev_general.bdev_stat -- common/autotest_common.sh@834 -- # local max_retries=100 00:12:42.593 05:40:57 blockdev_general.bdev_stat -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:12:42.593 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:12:42.593 05:40:57 blockdev_general.bdev_stat -- common/autotest_common.sh@838 -- # xtrace_disable 00:12:42.593 05:40:57 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:12:42.593 [2024-07-26 05:40:57.318064] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 
00:12:42.593 [2024-07-26 05:40:57.318128] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1124065 ] 00:12:42.593 [2024-07-26 05:40:57.446418] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:12:42.852 [2024-07-26 05:40:57.555711] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:12:42.852 [2024-07-26 05:40:57.555719] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:43.420 05:40:58 blockdev_general.bdev_stat -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:12:43.420 05:40:58 blockdev_general.bdev_stat -- common/autotest_common.sh@862 -- # return 0 00:12:43.420 05:40:58 blockdev_general.bdev_stat -- bdev/blockdev.sh@600 -- # rpc_cmd bdev_malloc_create -b Malloc_STAT 128 512 00:12:43.420 05:40:58 blockdev_general.bdev_stat -- common/autotest_common.sh@559 -- # xtrace_disable 00:12:43.420 05:40:58 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:12:43.420 Malloc_STAT 00:12:43.420 05:40:58 blockdev_general.bdev_stat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:12:43.420 05:40:58 blockdev_general.bdev_stat -- bdev/blockdev.sh@601 -- # waitforbdev Malloc_STAT 00:12:43.420 05:40:58 blockdev_general.bdev_stat -- common/autotest_common.sh@897 -- # local bdev_name=Malloc_STAT 00:12:43.420 05:40:58 blockdev_general.bdev_stat -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:12:43.420 05:40:58 blockdev_general.bdev_stat -- common/autotest_common.sh@899 -- # local i 00:12:43.420 05:40:58 blockdev_general.bdev_stat -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:12:43.420 05:40:58 blockdev_general.bdev_stat -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:12:43.421 05:40:58 blockdev_general.bdev_stat -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine 
00:12:43.421 05:40:58 blockdev_general.bdev_stat -- common/autotest_common.sh@559 -- # xtrace_disable 00:12:43.421 05:40:58 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:12:43.421 05:40:58 blockdev_general.bdev_stat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:12:43.421 05:40:58 blockdev_general.bdev_stat -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b Malloc_STAT -t 2000 00:12:43.421 05:40:58 blockdev_general.bdev_stat -- common/autotest_common.sh@559 -- # xtrace_disable 00:12:43.421 05:40:58 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:12:43.421 [ 00:12:43.421 { 00:12:43.421 "name": "Malloc_STAT", 00:12:43.421 "aliases": [ 00:12:43.421 "0ba9302d-b6b4-4cbf-bdf1-f7447bcbd175" 00:12:43.421 ], 00:12:43.421 "product_name": "Malloc disk", 00:12:43.421 "block_size": 512, 00:12:43.421 "num_blocks": 262144, 00:12:43.421 "uuid": "0ba9302d-b6b4-4cbf-bdf1-f7447bcbd175", 00:12:43.421 "assigned_rate_limits": { 00:12:43.421 "rw_ios_per_sec": 0, 00:12:43.421 "rw_mbytes_per_sec": 0, 00:12:43.421 "r_mbytes_per_sec": 0, 00:12:43.421 "w_mbytes_per_sec": 0 00:12:43.421 }, 00:12:43.421 "claimed": false, 00:12:43.421 "zoned": false, 00:12:43.421 "supported_io_types": { 00:12:43.421 "read": true, 00:12:43.421 "write": true, 00:12:43.421 "unmap": true, 00:12:43.421 "flush": true, 00:12:43.421 "reset": true, 00:12:43.421 "nvme_admin": false, 00:12:43.421 "nvme_io": false, 00:12:43.421 "nvme_io_md": false, 00:12:43.421 "write_zeroes": true, 00:12:43.421 "zcopy": true, 00:12:43.421 "get_zone_info": false, 00:12:43.421 "zone_management": false, 00:12:43.421 "zone_append": false, 00:12:43.421 "compare": false, 00:12:43.421 "compare_and_write": false, 00:12:43.421 "abort": true, 00:12:43.421 "seek_hole": false, 00:12:43.421 "seek_data": false, 00:12:43.421 "copy": true, 00:12:43.421 "nvme_iov_md": false 00:12:43.421 }, 00:12:43.421 "memory_domains": [ 00:12:43.421 { 00:12:43.421 "dma_device_id": "system", 
00:12:43.421 "dma_device_type": 1 00:12:43.421 }, 00:12:43.421 { 00:12:43.421 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:43.421 "dma_device_type": 2 00:12:43.421 } 00:12:43.421 ], 00:12:43.421 "driver_specific": {} 00:12:43.421 } 00:12:43.421 ] 00:12:43.421 05:40:58 blockdev_general.bdev_stat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:12:43.421 05:40:58 blockdev_general.bdev_stat -- common/autotest_common.sh@905 -- # return 0 00:12:43.421 05:40:58 blockdev_general.bdev_stat -- bdev/blockdev.sh@604 -- # sleep 2 00:12:43.421 05:40:58 blockdev_general.bdev_stat -- bdev/blockdev.sh@603 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:12:43.680 Running I/O for 10 seconds... 00:12:45.585 05:41:00 blockdev_general.bdev_stat -- bdev/blockdev.sh@605 -- # stat_function_test Malloc_STAT 00:12:45.585 05:41:00 blockdev_general.bdev_stat -- bdev/blockdev.sh@558 -- # local bdev_name=Malloc_STAT 00:12:45.585 05:41:00 blockdev_general.bdev_stat -- bdev/blockdev.sh@559 -- # local iostats 00:12:45.585 05:41:00 blockdev_general.bdev_stat -- bdev/blockdev.sh@560 -- # local io_count1 00:12:45.585 05:41:00 blockdev_general.bdev_stat -- bdev/blockdev.sh@561 -- # local io_count2 00:12:45.585 05:41:00 blockdev_general.bdev_stat -- bdev/blockdev.sh@562 -- # local iostats_per_channel 00:12:45.585 05:41:00 blockdev_general.bdev_stat -- bdev/blockdev.sh@563 -- # local io_count_per_channel1 00:12:45.585 05:41:00 blockdev_general.bdev_stat -- bdev/blockdev.sh@564 -- # local io_count_per_channel2 00:12:45.585 05:41:00 blockdev_general.bdev_stat -- bdev/blockdev.sh@565 -- # local io_count_per_channel_all=0 00:12:45.585 05:41:00 blockdev_general.bdev_stat -- bdev/blockdev.sh@567 -- # rpc_cmd bdev_get_iostat -b Malloc_STAT 00:12:45.585 05:41:00 blockdev_general.bdev_stat -- common/autotest_common.sh@559 -- # xtrace_disable 00:12:45.585 05:41:00 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:12:45.585 
05:41:00 blockdev_general.bdev_stat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:12:45.585 05:41:00 blockdev_general.bdev_stat -- bdev/blockdev.sh@567 -- # iostats='{ 00:12:45.585 "tick_rate": 2300000000, 00:12:45.585 "ticks": 7475829275971250, 00:12:45.585 "bdevs": [ 00:12:45.585 { 00:12:45.585 "name": "Malloc_STAT", 00:12:45.585 "bytes_read": 745583104, 00:12:45.585 "num_read_ops": 182020, 00:12:45.585 "bytes_written": 0, 00:12:45.585 "num_write_ops": 0, 00:12:45.585 "bytes_unmapped": 0, 00:12:45.585 "num_unmap_ops": 0, 00:12:45.585 "bytes_copied": 0, 00:12:45.585 "num_copy_ops": 0, 00:12:45.585 "read_latency_ticks": 2227718949064, 00:12:45.585 "max_read_latency_ticks": 14776598, 00:12:45.585 "min_read_latency_ticks": 275566, 00:12:45.585 "write_latency_ticks": 0, 00:12:45.585 "max_write_latency_ticks": 0, 00:12:45.585 "min_write_latency_ticks": 0, 00:12:45.585 "unmap_latency_ticks": 0, 00:12:45.585 "max_unmap_latency_ticks": 0, 00:12:45.585 "min_unmap_latency_ticks": 0, 00:12:45.585 "copy_latency_ticks": 0, 00:12:45.585 "max_copy_latency_ticks": 0, 00:12:45.585 "min_copy_latency_ticks": 0, 00:12:45.585 "io_error": {} 00:12:45.585 } 00:12:45.585 ] 00:12:45.585 }' 00:12:45.585 05:41:00 blockdev_general.bdev_stat -- bdev/blockdev.sh@568 -- # jq -r '.bdevs[0].num_read_ops' 00:12:45.585 05:41:00 blockdev_general.bdev_stat -- bdev/blockdev.sh@568 -- # io_count1=182020 00:12:45.585 05:41:00 blockdev_general.bdev_stat -- bdev/blockdev.sh@570 -- # rpc_cmd bdev_get_iostat -b Malloc_STAT -c 00:12:45.585 05:41:00 blockdev_general.bdev_stat -- common/autotest_common.sh@559 -- # xtrace_disable 00:12:45.585 05:41:00 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:12:45.585 05:41:00 blockdev_general.bdev_stat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:12:45.585 05:41:00 blockdev_general.bdev_stat -- bdev/blockdev.sh@570 -- # iostats_per_channel='{ 00:12:45.585 "tick_rate": 2300000000, 00:12:45.585 "ticks": 7475829436536260, 
00:12:45.585 "name": "Malloc_STAT", 00:12:45.585 "channels": [ 00:12:45.585 { 00:12:45.585 "thread_id": 2, 00:12:45.585 "bytes_read": 384827392, 00:12:45.585 "num_read_ops": 93952, 00:12:45.585 "bytes_written": 0, 00:12:45.585 "num_write_ops": 0, 00:12:45.585 "bytes_unmapped": 0, 00:12:45.585 "num_unmap_ops": 0, 00:12:45.585 "bytes_copied": 0, 00:12:45.585 "num_copy_ops": 0, 00:12:45.585 "read_latency_ticks": 1154682521172, 00:12:45.585 "max_read_latency_ticks": 13682446, 00:12:45.585 "min_read_latency_ticks": 8058408, 00:12:45.585 "write_latency_ticks": 0, 00:12:45.585 "max_write_latency_ticks": 0, 00:12:45.585 "min_write_latency_ticks": 0, 00:12:45.585 "unmap_latency_ticks": 0, 00:12:45.585 "max_unmap_latency_ticks": 0, 00:12:45.585 "min_unmap_latency_ticks": 0, 00:12:45.585 "copy_latency_ticks": 0, 00:12:45.585 "max_copy_latency_ticks": 0, 00:12:45.585 "min_copy_latency_ticks": 0 00:12:45.585 }, 00:12:45.585 { 00:12:45.585 "thread_id": 3, 00:12:45.585 "bytes_read": 389021696, 00:12:45.585 "num_read_ops": 94976, 00:12:45.585 "bytes_written": 0, 00:12:45.585 "num_write_ops": 0, 00:12:45.585 "bytes_unmapped": 0, 00:12:45.585 "num_unmap_ops": 0, 00:12:45.585 "bytes_copied": 0, 00:12:45.585 "num_copy_ops": 0, 00:12:45.585 "read_latency_ticks": 1157561081684, 00:12:45.585 "max_read_latency_ticks": 14776598, 00:12:45.585 "min_read_latency_ticks": 8082462, 00:12:45.585 "write_latency_ticks": 0, 00:12:45.585 "max_write_latency_ticks": 0, 00:12:45.585 "min_write_latency_ticks": 0, 00:12:45.585 "unmap_latency_ticks": 0, 00:12:45.585 "max_unmap_latency_ticks": 0, 00:12:45.585 "min_unmap_latency_ticks": 0, 00:12:45.585 "copy_latency_ticks": 0, 00:12:45.585 "max_copy_latency_ticks": 0, 00:12:45.585 "min_copy_latency_ticks": 0 00:12:45.585 } 00:12:45.585 ] 00:12:45.585 }' 00:12:45.585 05:41:00 blockdev_general.bdev_stat -- bdev/blockdev.sh@571 -- # jq -r '.channels[0].num_read_ops' 00:12:45.585 05:41:00 blockdev_general.bdev_stat -- bdev/blockdev.sh@571 -- # 
io_count_per_channel1=93952 00:12:45.585 05:41:00 blockdev_general.bdev_stat -- bdev/blockdev.sh@572 -- # io_count_per_channel_all=93952 00:12:45.585 05:41:00 blockdev_general.bdev_stat -- bdev/blockdev.sh@573 -- # jq -r '.channels[1].num_read_ops' 00:12:45.585 05:41:00 blockdev_general.bdev_stat -- bdev/blockdev.sh@573 -- # io_count_per_channel2=94976 00:12:45.585 05:41:00 blockdev_general.bdev_stat -- bdev/blockdev.sh@574 -- # io_count_per_channel_all=188928 00:12:45.585 05:41:00 blockdev_general.bdev_stat -- bdev/blockdev.sh@576 -- # rpc_cmd bdev_get_iostat -b Malloc_STAT 00:12:45.585 05:41:00 blockdev_general.bdev_stat -- common/autotest_common.sh@559 -- # xtrace_disable 00:12:45.585 05:41:00 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:12:45.585 05:41:00 blockdev_general.bdev_stat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:12:45.845 05:41:00 blockdev_general.bdev_stat -- bdev/blockdev.sh@576 -- # iostats='{ 00:12:45.845 "tick_rate": 2300000000, 00:12:45.845 "ticks": 7475829707548980, 00:12:45.845 "bdevs": [ 00:12:45.845 { 00:12:45.845 "name": "Malloc_STAT", 00:12:45.845 "bytes_read": 820032000, 00:12:45.845 "num_read_ops": 200196, 00:12:45.845 "bytes_written": 0, 00:12:45.845 "num_write_ops": 0, 00:12:45.845 "bytes_unmapped": 0, 00:12:45.845 "num_unmap_ops": 0, 00:12:45.845 "bytes_copied": 0, 00:12:45.845 "num_copy_ops": 0, 00:12:45.845 "read_latency_ticks": 2450061775546, 00:12:45.845 "max_read_latency_ticks": 14776598, 00:12:45.845 "min_read_latency_ticks": 275566, 00:12:45.845 "write_latency_ticks": 0, 00:12:45.845 "max_write_latency_ticks": 0, 00:12:45.845 "min_write_latency_ticks": 0, 00:12:45.845 "unmap_latency_ticks": 0, 00:12:45.845 "max_unmap_latency_ticks": 0, 00:12:45.845 "min_unmap_latency_ticks": 0, 00:12:45.845 "copy_latency_ticks": 0, 00:12:45.845 "max_copy_latency_ticks": 0, 00:12:45.845 "min_copy_latency_ticks": 0, 00:12:45.845 "io_error": {} 00:12:45.845 } 00:12:45.845 ] 00:12:45.845 }' 00:12:45.845 
05:41:00 blockdev_general.bdev_stat -- bdev/blockdev.sh@577 -- # jq -r '.bdevs[0].num_read_ops' 00:12:45.845 05:41:00 blockdev_general.bdev_stat -- bdev/blockdev.sh@577 -- # io_count2=200196 00:12:45.845 05:41:00 blockdev_general.bdev_stat -- bdev/blockdev.sh@582 -- # '[' 188928 -lt 182020 ']' 00:12:45.845 05:41:00 blockdev_general.bdev_stat -- bdev/blockdev.sh@582 -- # '[' 188928 -gt 200196 ']' 00:12:45.845 05:41:00 blockdev_general.bdev_stat -- bdev/blockdev.sh@607 -- # rpc_cmd bdev_malloc_delete Malloc_STAT 00:12:45.845 05:41:00 blockdev_general.bdev_stat -- common/autotest_common.sh@559 -- # xtrace_disable 00:12:45.845 05:41:00 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:12:45.845 00:12:45.845 Latency(us) 00:12:45.845 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:12:45.845 Job: Malloc_STAT (Core Mask 0x1, workload: randread, depth: 256, IO size: 4096) 00:12:45.845 Malloc_STAT : 2.16 47829.14 186.83 0.00 0.00 5339.45 1424.70 5955.23 00:12:45.845 Job: Malloc_STAT (Core Mask 0x2, workload: randread, depth: 256, IO size: 4096) 00:12:45.845 Malloc_STAT : 2.16 48272.20 188.56 0.00 0.00 5291.07 940.30 6439.62 00:12:45.845 =================================================================================================================== 00:12:45.845 Total : 96101.33 375.40 0.00 0.00 5315.14 940.30 6439.62 00:12:45.845 0 00:12:45.845 05:41:00 blockdev_general.bdev_stat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:12:45.845 05:41:00 blockdev_general.bdev_stat -- bdev/blockdev.sh@608 -- # killprocess 1124065 00:12:45.845 05:41:00 blockdev_general.bdev_stat -- common/autotest_common.sh@948 -- # '[' -z 1124065 ']' 00:12:45.845 05:41:00 blockdev_general.bdev_stat -- common/autotest_common.sh@952 -- # kill -0 1124065 00:12:45.845 05:41:00 blockdev_general.bdev_stat -- common/autotest_common.sh@953 -- # uname 00:12:45.845 05:41:00 blockdev_general.bdev_stat -- common/autotest_common.sh@953 -- # '[' Linux = 
Linux ']' 00:12:45.845 05:41:00 blockdev_general.bdev_stat -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1124065 00:12:45.845 05:41:00 blockdev_general.bdev_stat -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:12:45.845 05:41:00 blockdev_general.bdev_stat -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:12:45.845 05:41:00 blockdev_general.bdev_stat -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1124065' 00:12:45.845 killing process with pid 1124065 00:12:45.845 05:41:00 blockdev_general.bdev_stat -- common/autotest_common.sh@967 -- # kill 1124065 00:12:45.845 Received shutdown signal, test time was about 2.246182 seconds 00:12:45.845 00:12:45.845 Latency(us) 00:12:45.845 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:12:45.845 =================================================================================================================== 00:12:45.845 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:12:45.845 05:41:00 blockdev_general.bdev_stat -- common/autotest_common.sh@972 -- # wait 1124065 00:12:46.105 05:41:00 blockdev_general.bdev_stat -- bdev/blockdev.sh@609 -- # trap - SIGINT SIGTERM EXIT 00:12:46.105 00:12:46.105 real 0m3.608s 00:12:46.105 user 0m7.138s 00:12:46.105 sys 0m0.482s 00:12:46.105 05:41:00 blockdev_general.bdev_stat -- common/autotest_common.sh@1124 -- # xtrace_disable 00:12:46.105 05:41:00 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:12:46.105 ************************************ 00:12:46.105 END TEST bdev_stat 00:12:46.105 ************************************ 00:12:46.105 05:41:00 blockdev_general -- common/autotest_common.sh@1142 -- # return 0 00:12:46.105 05:41:00 blockdev_general -- bdev/blockdev.sh@793 -- # [[ bdev == gpt ]] 00:12:46.105 05:41:00 blockdev_general -- bdev/blockdev.sh@797 -- # [[ bdev == crypto_sw ]] 00:12:46.105 05:41:00 blockdev_general -- bdev/blockdev.sh@809 -- # trap - SIGINT SIGTERM EXIT 
00:12:46.105 05:41:00 blockdev_general -- bdev/blockdev.sh@810 -- # cleanup 00:12:46.105 05:41:00 blockdev_general -- bdev/blockdev.sh@23 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile 00:12:46.105 05:41:00 blockdev_general -- bdev/blockdev.sh@24 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:12:46.105 05:41:00 blockdev_general -- bdev/blockdev.sh@26 -- # [[ bdev == rbd ]] 00:12:46.105 05:41:00 blockdev_general -- bdev/blockdev.sh@30 -- # [[ bdev == daos ]] 00:12:46.105 05:41:00 blockdev_general -- bdev/blockdev.sh@34 -- # [[ bdev = \g\p\t ]] 00:12:46.105 05:41:00 blockdev_general -- bdev/blockdev.sh@40 -- # [[ bdev == xnvme ]] 00:12:46.105 00:12:46.105 real 1m55.704s 00:12:46.105 user 7m9.907s 00:12:46.105 sys 0m22.298s 00:12:46.105 05:41:00 blockdev_general -- common/autotest_common.sh@1124 -- # xtrace_disable 00:12:46.105 05:41:00 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:12:46.105 ************************************ 00:12:46.105 END TEST blockdev_general 00:12:46.105 ************************************ 00:12:46.105 05:41:00 -- common/autotest_common.sh@1142 -- # return 0 00:12:46.105 05:41:00 -- spdk/autotest.sh@190 -- # run_test bdev_raid /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh 00:12:46.105 05:41:00 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:12:46.105 05:41:00 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:12:46.105 05:41:00 -- common/autotest_common.sh@10 -- # set +x 00:12:46.105 ************************************ 00:12:46.105 START TEST bdev_raid 00:12:46.105 ************************************ 00:12:46.105 05:41:00 bdev_raid -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh 00:12:46.364 * Looking for test storage... 
00:12:46.364 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:12:46.364 05:41:01 bdev_raid -- bdev/bdev_raid.sh@13 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:12:46.364 05:41:01 bdev_raid -- bdev/nbd_common.sh@6 -- # set -e 00:12:46.364 05:41:01 bdev_raid -- bdev/bdev_raid.sh@15 -- # rpc_py='/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock' 00:12:46.364 05:41:01 bdev_raid -- bdev/bdev_raid.sh@851 -- # mkdir -p /raidtest 00:12:46.364 05:41:01 bdev_raid -- bdev/bdev_raid.sh@852 -- # trap 'cleanup; exit 1' EXIT 00:12:46.364 05:41:01 bdev_raid -- bdev/bdev_raid.sh@854 -- # base_blocklen=512 00:12:46.364 05:41:01 bdev_raid -- bdev/bdev_raid.sh@856 -- # uname -s 00:12:46.364 05:41:01 bdev_raid -- bdev/bdev_raid.sh@856 -- # '[' Linux = Linux ']' 00:12:46.364 05:41:01 bdev_raid -- bdev/bdev_raid.sh@856 -- # modprobe -n nbd 00:12:46.364 05:41:01 bdev_raid -- bdev/bdev_raid.sh@857 -- # has_nbd=true 00:12:46.364 05:41:01 bdev_raid -- bdev/bdev_raid.sh@858 -- # modprobe nbd 00:12:46.364 05:41:01 bdev_raid -- bdev/bdev_raid.sh@859 -- # run_test raid_function_test_raid0 raid_function_test raid0 00:12:46.364 05:41:01 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:12:46.364 05:41:01 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:12:46.364 05:41:01 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:12:46.364 ************************************ 00:12:46.364 START TEST raid_function_test_raid0 00:12:46.364 ************************************ 00:12:46.364 05:41:01 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@1123 -- # raid_function_test raid0 00:12:46.364 05:41:01 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@80 -- # local raid_level=raid0 00:12:46.364 05:41:01 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@81 -- # local nbd=/dev/nbd0 00:12:46.364 05:41:01 
bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@82 -- # local raid_bdev 00:12:46.364 05:41:01 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@85 -- # raid_pid=1124673 00:12:46.364 05:41:01 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@86 -- # echo 'Process raid pid: 1124673' 00:12:46.364 Process raid pid: 1124673 00:12:46.364 05:41:01 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@84 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:12:46.364 05:41:01 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@87 -- # waitforlisten 1124673 /var/tmp/spdk-raid.sock 00:12:46.364 05:41:01 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@829 -- # '[' -z 1124673 ']' 00:12:46.364 05:41:01 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:12:46.365 05:41:01 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@834 -- # local max_retries=100 00:12:46.365 05:41:01 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:12:46.365 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:12:46.365 05:41:01 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@838 -- # xtrace_disable 00:12:46.365 05:41:01 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@10 -- # set +x 00:12:46.365 [2024-07-26 05:41:01.231170] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 
00:12:46.365 [2024-07-26 05:41:01.231248] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:12:46.624 [2024-07-26 05:41:01.362155] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:46.624 [2024-07-26 05:41:01.464009] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:46.624 [2024-07-26 05:41:01.527783] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:46.624 [2024-07-26 05:41:01.527819] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:47.562 05:41:02 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:12:47.562 05:41:02 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@862 -- # return 0 00:12:47.562 05:41:02 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@89 -- # configure_raid_bdev raid0 00:12:47.562 05:41:02 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@66 -- # local raid_level=raid0 00:12:47.562 05:41:02 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@67 -- # rm -rf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/rpcs.txt 00:12:47.562 05:41:02 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@69 -- # cat 00:12:47.562 05:41:02 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@74 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 00:12:47.562 [2024-07-26 05:41:02.429485] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_1 is claimed 00:12:47.562 [2024-07-26 05:41:02.431081] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_2 is claimed 00:12:47.562 [2024-07-26 05:41:02.431143] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x24fabd0 00:12:47.562 [2024-07-26 05:41:02.431154] 
bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 131072, blocklen 512 00:12:47.562 [2024-07-26 05:41:02.431351] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x24fab10 00:12:47.562 [2024-07-26 05:41:02.431472] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x24fabd0 00:12:47.562 [2024-07-26 05:41:02.431482] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid, raid_bdev 0x24fabd0 00:12:47.562 [2024-07-26 05:41:02.431588] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:47.562 Base_1 00:12:47.562 Base_2 00:12:47.562 05:41:02 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@76 -- # rm -rf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/rpcs.txt 00:12:47.562 05:41:02 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@90 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:12:47.562 05:41:02 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@90 -- # jq -r '.[0]["name"] | select(.)' 00:12:47.821 05:41:02 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@90 -- # raid_bdev=raid 00:12:47.821 05:41:02 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@91 -- # '[' raid = '' ']' 00:12:47.821 05:41:02 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@96 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid /dev/nbd0 00:12:47.821 05:41:02 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:12:47.821 05:41:02 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@10 -- # bdev_list=('raid') 00:12:47.821 05:41:02 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@10 -- # local bdev_list 00:12:47.821 05:41:02 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:12:47.821 05:41:02 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@11 -- # local 
nbd_list 00:12:47.821 05:41:02 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@12 -- # local i 00:12:47.821 05:41:02 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:12:47.821 05:41:02 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:12:47.821 05:41:02 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid /dev/nbd0 00:12:48.388 [2024-07-26 05:41:03.187484] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x26ae8e0 00:12:48.388 /dev/nbd0 00:12:48.388 05:41:03 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:12:48.388 05:41:03 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:12:48.388 05:41:03 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:12:48.388 05:41:03 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@867 -- # local i 00:12:48.388 05:41:03 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:12:48.388 05:41:03 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:12:48.388 05:41:03 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:12:48.388 05:41:03 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@871 -- # break 00:12:48.388 05:41:03 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:12:48.388 05:41:03 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:12:48.388 05:41:03 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:12:48.388 1+0 records in 00:12:48.388 1+0 
records out 00:12:48.388 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.0002498 s, 16.4 MB/s 00:12:48.388 05:41:03 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:12:48.388 05:41:03 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@884 -- # size=4096 00:12:48.388 05:41:03 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:12:48.388 05:41:03 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:12:48.388 05:41:03 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@887 -- # return 0 00:12:48.388 05:41:03 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:12:48.388 05:41:03 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:12:48.388 05:41:03 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@97 -- # nbd_get_count /var/tmp/spdk-raid.sock 00:12:48.388 05:41:03 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:12:48.388 05:41:03 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_get_disks 00:12:48.646 05:41:03 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:12:48.646 { 00:12:48.646 "nbd_device": "/dev/nbd0", 00:12:48.646 "bdev_name": "raid" 00:12:48.646 } 00:12:48.646 ]' 00:12:48.646 05:41:03 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # echo '[ 00:12:48.646 { 00:12:48.646 "nbd_device": "/dev/nbd0", 00:12:48.646 "bdev_name": "raid" 00:12:48.646 } 00:12:48.646 ]' 00:12:48.646 05:41:03 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:12:48.646 05:41:03 bdev_raid.raid_function_test_raid0 -- 
bdev/nbd_common.sh@64 -- # nbd_disks_name=/dev/nbd0 00:12:48.646 05:41:03 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # echo /dev/nbd0 00:12:48.646 05:41:03 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:12:48.646 05:41:03 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # count=1 00:12:48.646 05:41:03 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@66 -- # echo 1 00:12:48.905 05:41:03 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@97 -- # count=1 00:12:48.905 05:41:03 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@98 -- # '[' 1 -ne 1 ']' 00:12:48.905 05:41:03 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@102 -- # raid_unmap_data_verify /dev/nbd0 /var/tmp/spdk-raid.sock 00:12:48.905 05:41:03 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@18 -- # hash blkdiscard 00:12:48.905 05:41:03 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@19 -- # local nbd=/dev/nbd0 00:12:48.905 05:41:03 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@20 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:12:48.905 05:41:03 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@21 -- # local blksize 00:12:48.905 05:41:03 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@22 -- # lsblk -o LOG-SEC /dev/nbd0 00:12:48.905 05:41:03 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@22 -- # cut -d ' ' -f 5 00:12:48.905 05:41:03 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@22 -- # grep -v LOG-SEC 00:12:48.905 05:41:03 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@22 -- # blksize=512 00:12:48.905 05:41:03 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@23 -- # local rw_blk_num=4096 00:12:48.905 05:41:03 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@24 -- # local rw_len=2097152 00:12:48.905 05:41:03 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@25 -- # unmap_blk_offs=('0' '1028' '321') 00:12:48.905 05:41:03 
bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@25 -- # local unmap_blk_offs 00:12:48.905 05:41:03 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@26 -- # unmap_blk_nums=('128' '2035' '456') 00:12:48.905 05:41:03 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@26 -- # local unmap_blk_nums 00:12:48.905 05:41:03 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@27 -- # local unmap_off 00:12:48.905 05:41:03 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@28 -- # local unmap_len 00:12:48.905 05:41:03 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@31 -- # dd if=/dev/urandom of=/raidtest/raidrandtest bs=512 count=4096 00:12:48.905 4096+0 records in 00:12:48.905 4096+0 records out 00:12:48.905 2097152 bytes (2.1 MB, 2.0 MiB) copied, 0.0295598 s, 70.9 MB/s 00:12:48.905 05:41:03 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@32 -- # dd if=/raidtest/raidrandtest of=/dev/nbd0 bs=512 count=4096 oflag=direct 00:12:49.164 4096+0 records in 00:12:49.164 4096+0 records out 00:12:49.164 2097152 bytes (2.1 MB, 2.0 MiB) copied, 0.323128 s, 6.5 MB/s 00:12:49.164 05:41:03 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@33 -- # blockdev --flushbufs /dev/nbd0 00:12:49.164 05:41:03 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@36 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:12:49.164 05:41:03 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i = 0 )) 00:12:49.164 05:41:03 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:12:49.164 05:41:03 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@39 -- # unmap_off=0 00:12:49.164 05:41:03 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@40 -- # unmap_len=65536 00:12:49.164 05:41:03 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=0 count=128 conv=notrunc 00:12:49.164 128+0 records in 00:12:49.164 128+0 records out 00:12:49.164 65536 
bytes (66 kB, 64 KiB) copied, 0.000838089 s, 78.2 MB/s 00:12:49.164 05:41:03 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 0 -l 65536 /dev/nbd0 00:12:49.164 05:41:03 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:12:49.164 05:41:03 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:12:49.164 05:41:03 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:12:49.164 05:41:03 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:12:49.164 05:41:03 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@39 -- # unmap_off=526336 00:12:49.164 05:41:03 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@40 -- # unmap_len=1041920 00:12:49.164 05:41:03 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=1028 count=2035 conv=notrunc 00:12:49.164 2035+0 records in 00:12:49.164 2035+0 records out 00:12:49.164 1041920 bytes (1.0 MB, 1018 KiB) copied, 0.0106056 s, 98.2 MB/s 00:12:49.164 05:41:03 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 526336 -l 1041920 /dev/nbd0 00:12:49.164 05:41:03 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:12:49.164 05:41:03 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:12:49.164 05:41:04 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:12:49.164 05:41:04 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:12:49.164 05:41:04 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@39 -- # unmap_off=164352 00:12:49.164 05:41:04 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@40 -- # unmap_len=233472 00:12:49.164 05:41:04 bdev_raid.raid_function_test_raid0 -- 
bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=321 count=456 conv=notrunc 00:12:49.164 456+0 records in 00:12:49.164 456+0 records out 00:12:49.164 233472 bytes (233 kB, 228 KiB) copied, 0.0027111 s, 86.1 MB/s 00:12:49.164 05:41:04 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 164352 -l 233472 /dev/nbd0 00:12:49.164 05:41:04 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:12:49.164 05:41:04 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:12:49.164 05:41:04 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:12:49.164 05:41:04 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:12:49.164 05:41:04 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@54 -- # return 0 00:12:49.164 05:41:04 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@104 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:12:49.165 05:41:04 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:12:49.165 05:41:04 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:12:49.165 05:41:04 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@50 -- # local nbd_list 00:12:49.165 05:41:04 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@51 -- # local i 00:12:49.165 05:41:04 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:12:49.165 05:41:04 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:12:49.424 05:41:04 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:12:49.424 [2024-07-26 05:41:04.296715] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: 
raid_bdev_destroy_cb 00:12:49.424 05:41:04 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:12:49.424 05:41:04 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:12:49.424 05:41:04 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:12:49.424 05:41:04 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:12:49.424 05:41:04 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:12:49.424 05:41:04 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@41 -- # break 00:12:49.424 05:41:04 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@45 -- # return 0 00:12:49.424 05:41:04 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@105 -- # nbd_get_count /var/tmp/spdk-raid.sock 00:12:49.424 05:41:04 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:12:49.424 05:41:04 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_get_disks 00:12:49.683 05:41:04 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:12:49.683 05:41:04 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # echo '[]' 00:12:49.683 05:41:04 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:12:49.942 05:41:04 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:12:49.942 05:41:04 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # echo '' 00:12:49.942 05:41:04 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:12:49.942 05:41:04 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # true 00:12:49.942 05:41:04 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # count=0 
00:12:49.942 05:41:04 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@66 -- # echo 0 00:12:49.942 05:41:04 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@105 -- # count=0 00:12:49.942 05:41:04 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@106 -- # '[' 0 -ne 0 ']' 00:12:49.942 05:41:04 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@110 -- # killprocess 1124673 00:12:49.942 05:41:04 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@948 -- # '[' -z 1124673 ']' 00:12:49.943 05:41:04 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@952 -- # kill -0 1124673 00:12:49.943 05:41:04 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@953 -- # uname 00:12:49.943 05:41:04 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:12:49.943 05:41:04 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1124673 00:12:49.943 05:41:04 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:12:49.943 05:41:04 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:12:49.943 05:41:04 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1124673' 00:12:49.943 killing process with pid 1124673 00:12:49.943 05:41:04 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@967 -- # kill 1124673 00:12:49.943 [2024-07-26 05:41:04.670182] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:12:49.943 [2024-07-26 05:41:04.670252] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:49.943 [2024-07-26 05:41:04.670295] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:49.943 [2024-07-26 05:41:04.670315] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x24fabd0 name 
raid, state offline 00:12:49.943 05:41:04 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@972 -- # wait 1124673 00:12:49.943 [2024-07-26 05:41:04.687856] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:12:50.203 05:41:04 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@112 -- # return 0 00:12:50.203 00:12:50.203 real 0m3.743s 00:12:50.203 user 0m4.993s 00:12:50.203 sys 0m1.375s 00:12:50.203 05:41:04 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@1124 -- # xtrace_disable 00:12:50.203 05:41:04 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@10 -- # set +x 00:12:50.203 ************************************ 00:12:50.203 END TEST raid_function_test_raid0 00:12:50.203 ************************************ 00:12:50.203 05:41:04 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:12:50.203 05:41:04 bdev_raid -- bdev/bdev_raid.sh@860 -- # run_test raid_function_test_concat raid_function_test concat 00:12:50.203 05:41:04 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:12:50.203 05:41:04 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:12:50.203 05:41:04 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:12:50.203 ************************************ 00:12:50.203 START TEST raid_function_test_concat 00:12:50.203 ************************************ 00:12:50.203 05:41:04 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@1123 -- # raid_function_test concat 00:12:50.203 05:41:04 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@80 -- # local raid_level=concat 00:12:50.203 05:41:04 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@81 -- # local nbd=/dev/nbd0 00:12:50.203 05:41:04 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@82 -- # local raid_bdev 00:12:50.203 05:41:04 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@85 -- # raid_pid=1125150 00:12:50.203 05:41:04 bdev_raid.raid_function_test_concat -- 
bdev/bdev_raid.sh@86 -- # echo 'Process raid pid: 1125150' 00:12:50.203 Process raid pid: 1125150 00:12:50.203 05:41:04 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@87 -- # waitforlisten 1125150 /var/tmp/spdk-raid.sock 00:12:50.203 05:41:04 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@84 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:12:50.203 05:41:04 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@829 -- # '[' -z 1125150 ']' 00:12:50.203 05:41:04 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:12:50.203 05:41:04 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@834 -- # local max_retries=100 00:12:50.203 05:41:04 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:12:50.203 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:12:50.203 05:41:04 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@838 -- # xtrace_disable 00:12:50.203 05:41:04 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@10 -- # set +x 00:12:50.203 [2024-07-26 05:41:05.048115] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 
00:12:50.203 [2024-07-26 05:41:05.048178] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:12:50.462 [2024-07-26 05:41:05.180071] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:50.462 [2024-07-26 05:41:05.286768] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:50.462 [2024-07-26 05:41:05.357406] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:50.462 [2024-07-26 05:41:05.357442] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:51.404 05:41:05 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:12:51.404 05:41:05 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@862 -- # return 0 00:12:51.404 05:41:05 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@89 -- # configure_raid_bdev concat 00:12:51.404 05:41:05 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@66 -- # local raid_level=concat 00:12:51.404 05:41:05 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@67 -- # rm -rf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/rpcs.txt 00:12:51.404 05:41:05 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@69 -- # cat 00:12:51.404 05:41:06 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@74 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 00:12:51.404 [2024-07-26 05:41:06.258771] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_1 is claimed 00:12:51.404 [2024-07-26 05:41:06.260231] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_2 is claimed 00:12:51.404 [2024-07-26 05:41:06.260288] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x273ebd0 00:12:51.404 [2024-07-26 05:41:06.260299] 
bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 131072, blocklen 512 00:12:51.404 [2024-07-26 05:41:06.260485] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x273eb10 00:12:51.404 [2024-07-26 05:41:06.260604] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x273ebd0 00:12:51.404 [2024-07-26 05:41:06.260614] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid, raid_bdev 0x273ebd0 00:12:51.404 [2024-07-26 05:41:06.260721] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:51.404 Base_1 00:12:51.404 Base_2 00:12:51.404 05:41:06 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@76 -- # rm -rf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/rpcs.txt 00:12:51.404 05:41:06 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@90 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:12:51.404 05:41:06 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@90 -- # jq -r '.[0]["name"] | select(.)' 00:12:51.669 05:41:06 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@90 -- # raid_bdev=raid 00:12:51.669 05:41:06 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@91 -- # '[' raid = '' ']' 00:12:51.669 05:41:06 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@96 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid /dev/nbd0 00:12:51.669 05:41:06 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:12:51.669 05:41:06 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@10 -- # bdev_list=('raid') 00:12:51.669 05:41:06 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:12:51.669 05:41:06 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:12:51.669 05:41:06 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@11 -- # 
local nbd_list 00:12:51.669 05:41:06 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@12 -- # local i 00:12:51.669 05:41:06 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:12:51.669 05:41:06 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:12:51.669 05:41:06 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid /dev/nbd0 00:12:51.928 [2024-07-26 05:41:06.756088] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x28f28e0 00:12:51.928 /dev/nbd0 00:12:51.928 05:41:06 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:12:51.928 05:41:06 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:12:51.928 05:41:06 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:12:51.928 05:41:06 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@867 -- # local i 00:12:51.928 05:41:06 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:12:51.928 05:41:06 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:12:51.928 05:41:06 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:12:51.928 05:41:06 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@871 -- # break 00:12:51.928 05:41:06 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:12:51.928 05:41:06 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:12:51.928 05:41:06 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:12:51.928 1+0 records in 
00:12:51.928 1+0 records out 00:12:51.928 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000262151 s, 15.6 MB/s 00:12:51.928 05:41:06 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:12:51.928 05:41:06 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@884 -- # size=4096 00:12:51.928 05:41:06 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:12:51.928 05:41:06 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:12:51.928 05:41:06 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@887 -- # return 0 00:12:51.928 05:41:06 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:12:51.928 05:41:06 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:12:51.928 05:41:06 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@97 -- # nbd_get_count /var/tmp/spdk-raid.sock 00:12:51.928 05:41:06 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:12:51.928 05:41:06 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_get_disks 00:12:52.187 05:41:06 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:12:52.187 { 00:12:52.187 "nbd_device": "/dev/nbd0", 00:12:52.187 "bdev_name": "raid" 00:12:52.187 } 00:12:52.187 ]' 00:12:52.187 05:41:06 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:12:52.187 05:41:06 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # echo '[ 00:12:52.187 { 00:12:52.187 "nbd_device": "/dev/nbd0", 00:12:52.187 "bdev_name": "raid" 00:12:52.187 } 00:12:52.187 ]' 00:12:52.187 05:41:07 
bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # nbd_disks_name=/dev/nbd0 00:12:52.187 05:41:07 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # echo /dev/nbd0 00:12:52.187 05:41:07 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:12:52.187 05:41:07 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # count=1 00:12:52.187 05:41:07 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@66 -- # echo 1 00:12:52.187 05:41:07 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@97 -- # count=1 00:12:52.187 05:41:07 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@98 -- # '[' 1 -ne 1 ']' 00:12:52.187 05:41:07 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@102 -- # raid_unmap_data_verify /dev/nbd0 /var/tmp/spdk-raid.sock 00:12:52.187 05:41:07 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@18 -- # hash blkdiscard 00:12:52.187 05:41:07 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@19 -- # local nbd=/dev/nbd0 00:12:52.187 05:41:07 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@20 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:12:52.187 05:41:07 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@21 -- # local blksize 00:12:52.187 05:41:07 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@22 -- # lsblk -o LOG-SEC /dev/nbd0 00:12:52.187 05:41:07 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@22 -- # grep -v LOG-SEC 00:12:52.187 05:41:07 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@22 -- # cut -d ' ' -f 5 00:12:52.187 05:41:07 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@22 -- # blksize=512 00:12:52.187 05:41:07 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@23 -- # local rw_blk_num=4096 00:12:52.187 05:41:07 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@24 -- # local rw_len=2097152 00:12:52.187 05:41:07 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@25 -- # 
unmap_blk_offs=('0' '1028' '321') 00:12:52.187 05:41:07 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@25 -- # local unmap_blk_offs 00:12:52.187 05:41:07 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@26 -- # unmap_blk_nums=('128' '2035' '456') 00:12:52.187 05:41:07 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@26 -- # local unmap_blk_nums 00:12:52.187 05:41:07 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@27 -- # local unmap_off 00:12:52.187 05:41:07 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@28 -- # local unmap_len 00:12:52.187 05:41:07 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@31 -- # dd if=/dev/urandom of=/raidtest/raidrandtest bs=512 count=4096 00:12:52.445 4096+0 records in 00:12:52.445 4096+0 records out 00:12:52.445 2097152 bytes (2.1 MB, 2.0 MiB) copied, 0.0306707 s, 68.4 MB/s 00:12:52.445 05:41:07 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@32 -- # dd if=/raidtest/raidrandtest of=/dev/nbd0 bs=512 count=4096 oflag=direct 00:12:52.703 4096+0 records in 00:12:52.703 4096+0 records out 00:12:52.703 2097152 bytes (2.1 MB, 2.0 MiB) copied, 0.322148 s, 6.5 MB/s 00:12:52.703 05:41:07 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@33 -- # blockdev --flushbufs /dev/nbd0 00:12:52.703 05:41:07 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@36 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:12:52.703 05:41:07 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i = 0 )) 00:12:52.703 05:41:07 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:12:52.703 05:41:07 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@39 -- # unmap_off=0 00:12:52.703 05:41:07 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@40 -- # unmap_len=65536 00:12:52.703 05:41:07 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=0 count=128 conv=notrunc 00:12:52.703 
128+0 records in 00:12:52.703 128+0 records out 00:12:52.704 65536 bytes (66 kB, 64 KiB) copied, 0.000830598 s, 78.9 MB/s 00:12:52.704 05:41:07 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 0 -l 65536 /dev/nbd0 00:12:52.704 05:41:07 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:12:52.704 05:41:07 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:12:52.704 05:41:07 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:12:52.704 05:41:07 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:12:52.704 05:41:07 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@39 -- # unmap_off=526336 00:12:52.704 05:41:07 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@40 -- # unmap_len=1041920 00:12:52.704 05:41:07 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=1028 count=2035 conv=notrunc 00:12:52.704 2035+0 records in 00:12:52.704 2035+0 records out 00:12:52.704 1041920 bytes (1.0 MB, 1018 KiB) copied, 0.0106266 s, 98.0 MB/s 00:12:52.704 05:41:07 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 526336 -l 1041920 /dev/nbd0 00:12:52.704 05:41:07 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:12:52.704 05:41:07 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:12:52.704 05:41:07 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:12:52.704 05:41:07 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:12:52.704 05:41:07 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@39 -- # unmap_off=164352 00:12:52.704 05:41:07 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@40 -- # 
unmap_len=233472 00:12:52.704 05:41:07 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=321 count=456 conv=notrunc 00:12:52.704 456+0 records in 00:12:52.704 456+0 records out 00:12:52.704 233472 bytes (233 kB, 228 KiB) copied, 0.00274421 s, 85.1 MB/s 00:12:52.704 05:41:07 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 164352 -l 233472 /dev/nbd0 00:12:52.704 05:41:07 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:12:52.704 05:41:07 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:12:52.704 05:41:07 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:12:52.704 05:41:07 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:12:52.704 05:41:07 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@54 -- # return 0 00:12:52.704 05:41:07 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@104 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:12:52.704 05:41:07 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:12:52.704 05:41:07 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:12:52.704 05:41:07 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:12:52.704 05:41:07 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@51 -- # local i 00:12:52.704 05:41:07 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:12:52.704 05:41:07 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:12:52.962 05:41:07 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 
00:12:52.962 [2024-07-26 05:41:07.780267] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:52.962 05:41:07 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:12:52.962 05:41:07 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:12:52.962 05:41:07 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:12:52.962 05:41:07 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:12:52.962 05:41:07 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:12:52.962 05:41:07 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@41 -- # break 00:12:52.962 05:41:07 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@45 -- # return 0 00:12:52.962 05:41:07 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@105 -- # nbd_get_count /var/tmp/spdk-raid.sock 00:12:52.962 05:41:07 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:12:52.962 05:41:07 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_get_disks 00:12:53.221 05:41:08 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:12:53.221 05:41:08 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:12:53.221 05:41:08 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:12:53.221 05:41:08 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:12:53.221 05:41:08 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # echo '' 00:12:53.221 05:41:08 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:12:53.221 05:41:08 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- 
# true 00:12:53.221 05:41:08 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # count=0 00:12:53.221 05:41:08 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@66 -- # echo 0 00:12:53.221 05:41:08 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@105 -- # count=0 00:12:53.221 05:41:08 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@106 -- # '[' 0 -ne 0 ']' 00:12:53.221 05:41:08 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@110 -- # killprocess 1125150 00:12:53.221 05:41:08 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@948 -- # '[' -z 1125150 ']' 00:12:53.221 05:41:08 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@952 -- # kill -0 1125150 00:12:53.221 05:41:08 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@953 -- # uname 00:12:53.221 05:41:08 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:12:53.221 05:41:08 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1125150 00:12:53.480 05:41:08 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:12:53.480 05:41:08 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:12:53.480 05:41:08 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1125150' 00:12:53.480 killing process with pid 1125150 00:12:53.480 05:41:08 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@967 -- # kill 1125150 00:12:53.480 [2024-07-26 05:41:08.142477] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:12:53.480 [2024-07-26 05:41:08.142543] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:53.480 [2024-07-26 05:41:08.142588] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 
00:12:53.480 [2024-07-26 05:41:08.142605] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x273ebd0 name raid, state offline 00:12:53.480 05:41:08 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@972 -- # wait 1125150 00:12:53.480 [2024-07-26 05:41:08.159783] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:12:53.480 05:41:08 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@112 -- # return 0 00:12:53.480 00:12:53.480 real 0m3.394s 00:12:53.480 user 0m4.438s 00:12:53.480 sys 0m1.241s 00:12:53.480 05:41:08 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@1124 -- # xtrace_disable 00:12:53.480 05:41:08 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@10 -- # set +x 00:12:53.480 ************************************ 00:12:53.480 END TEST raid_function_test_concat 00:12:53.480 ************************************ 00:12:53.739 05:41:08 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:12:53.739 05:41:08 bdev_raid -- bdev/bdev_raid.sh@863 -- # run_test raid0_resize_test raid0_resize_test 00:12:53.739 05:41:08 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:12:53.739 05:41:08 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:12:53.739 05:41:08 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:12:53.739 ************************************ 00:12:53.739 START TEST raid0_resize_test 00:12:53.739 ************************************ 00:12:53.739 05:41:08 bdev_raid.raid0_resize_test -- common/autotest_common.sh@1123 -- # raid0_resize_test 00:12:53.739 05:41:08 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@347 -- # local blksize=512 00:12:53.739 05:41:08 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@348 -- # local bdev_size_mb=32 00:12:53.739 05:41:08 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@349 -- # local new_bdev_size_mb=64 00:12:53.740 05:41:08 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@350 -- # local blkcnt 00:12:53.740 
05:41:08 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@351 -- # local raid_size_mb 00:12:53.740 05:41:08 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@352 -- # local new_raid_size_mb 00:12:53.740 05:41:08 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@354 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:12:53.740 05:41:08 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@355 -- # raid_pid=1125742 00:12:53.740 05:41:08 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@356 -- # echo 'Process raid pid: 1125742' 00:12:53.740 Process raid pid: 1125742 00:12:53.740 05:41:08 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@357 -- # waitforlisten 1125742 /var/tmp/spdk-raid.sock 00:12:53.740 05:41:08 bdev_raid.raid0_resize_test -- common/autotest_common.sh@829 -- # '[' -z 1125742 ']' 00:12:53.740 05:41:08 bdev_raid.raid0_resize_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:12:53.740 05:41:08 bdev_raid.raid0_resize_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:12:53.740 05:41:08 bdev_raid.raid0_resize_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:12:53.740 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:12:53.740 05:41:08 bdev_raid.raid0_resize_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:12:53.740 05:41:08 bdev_raid.raid0_resize_test -- common/autotest_common.sh@10 -- # set +x 00:12:53.740 [2024-07-26 05:41:08.512110] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 
00:12:53.740 [2024-07-26 05:41:08.512172] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:12:53.740 [2024-07-26 05:41:08.640159] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:53.999 [2024-07-26 05:41:08.743663] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:53.999 [2024-07-26 05:41:08.801533] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:53.999 [2024-07-26 05:41:08.801573] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:54.566 05:41:09 bdev_raid.raid0_resize_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:12:54.566 05:41:09 bdev_raid.raid0_resize_test -- common/autotest_common.sh@862 -- # return 0 00:12:54.566 05:41:09 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@359 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_null_create Base_1 32 512 00:12:54.825 Base_1 00:12:54.825 05:41:09 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@360 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_null_create Base_2 32 512 00:12:55.084 Base_2 00:12:55.084 05:41:09 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@362 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r 0 -b 'Base_1 Base_2' -n Raid 00:12:55.343 [2024-07-26 05:41:10.175044] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_1 is claimed 00:12:55.343 [2024-07-26 05:41:10.176403] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_2 is claimed 00:12:55.343 [2024-07-26 05:41:10.176453] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x139b780 00:12:55.343 [2024-07-26 05:41:10.176462] 
bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 131072, blocklen 512 00:12:55.343 [2024-07-26 05:41:10.176675] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xee7020 00:12:55.343 [2024-07-26 05:41:10.176769] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x139b780 00:12:55.343 [2024-07-26 05:41:10.176779] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Raid, raid_bdev 0x139b780 00:12:55.343 [2024-07-26 05:41:10.176884] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:55.343 05:41:10 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@365 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_null_resize Base_1 64 00:12:55.601 [2024-07-26 05:41:10.419674] bdev_raid.c:2288:raid_bdev_resize_base_bdev: *DEBUG*: raid_bdev_resize_base_bdev 00:12:55.601 [2024-07-26 05:41:10.419697] bdev_raid.c:2301:raid_bdev_resize_base_bdev: *NOTICE*: base_bdev 'Base_1' was resized: old size 65536, new size 131072 00:12:55.601 true 00:12:55.601 05:41:10 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@368 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Raid 00:12:55.601 05:41:10 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@368 -- # jq '.[].num_blocks' 00:12:55.860 [2024-07-26 05:41:10.668483] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:55.860 05:41:10 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@368 -- # blkcnt=131072 00:12:55.860 05:41:10 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@369 -- # raid_size_mb=64 00:12:55.860 05:41:10 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@370 -- # '[' 64 '!=' 64 ']' 00:12:55.860 05:41:10 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@376 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_null_resize Base_2 64 00:12:56.119 
[2024-07-26 05:41:10.916957] bdev_raid.c:2288:raid_bdev_resize_base_bdev: *DEBUG*: raid_bdev_resize_base_bdev 00:12:56.119 [2024-07-26 05:41:10.916975] bdev_raid.c:2301:raid_bdev_resize_base_bdev: *NOTICE*: base_bdev 'Base_2' was resized: old size 65536, new size 131072 00:12:56.119 [2024-07-26 05:41:10.916998] bdev_raid.c:2315:raid_bdev_resize_base_bdev: *NOTICE*: raid bdev 'Raid': block count was changed from 131072 to 262144 00:12:56.119 true 00:12:56.119 05:41:10 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@379 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Raid 00:12:56.119 05:41:10 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@379 -- # jq '.[].num_blocks' 00:12:56.378 [2024-07-26 05:41:11.165768] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:56.379 05:41:11 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@379 -- # blkcnt=262144 00:12:56.379 05:41:11 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@380 -- # raid_size_mb=128 00:12:56.379 05:41:11 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@381 -- # '[' 128 '!=' 128 ']' 00:12:56.379 05:41:11 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@386 -- # killprocess 1125742 00:12:56.379 05:41:11 bdev_raid.raid0_resize_test -- common/autotest_common.sh@948 -- # '[' -z 1125742 ']' 00:12:56.379 05:41:11 bdev_raid.raid0_resize_test -- common/autotest_common.sh@952 -- # kill -0 1125742 00:12:56.379 05:41:11 bdev_raid.raid0_resize_test -- common/autotest_common.sh@953 -- # uname 00:12:56.379 05:41:11 bdev_raid.raid0_resize_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:12:56.379 05:41:11 bdev_raid.raid0_resize_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1125742 00:12:56.379 05:41:11 bdev_raid.raid0_resize_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:12:56.379 05:41:11 bdev_raid.raid0_resize_test -- common/autotest_common.sh@958 -- # '[' 
reactor_0 = sudo ']' 00:12:56.379 05:41:11 bdev_raid.raid0_resize_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1125742' 00:12:56.379 killing process with pid 1125742 00:12:56.379 05:41:11 bdev_raid.raid0_resize_test -- common/autotest_common.sh@967 -- # kill 1125742 00:12:56.379 [2024-07-26 05:41:11.237135] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:12:56.379 [2024-07-26 05:41:11.237190] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:56.379 [2024-07-26 05:41:11.237232] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:56.379 [2024-07-26 05:41:11.237243] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x139b780 name Raid, state offline 00:12:56.379 05:41:11 bdev_raid.raid0_resize_test -- common/autotest_common.sh@972 -- # wait 1125742 00:12:56.379 [2024-07-26 05:41:11.238604] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:12:56.638 05:41:11 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@388 -- # return 0 00:12:56.638 00:12:56.638 real 0m2.983s 00:12:56.638 user 0m4.626s 00:12:56.638 sys 0m0.621s 00:12:56.638 05:41:11 bdev_raid.raid0_resize_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:12:56.638 05:41:11 bdev_raid.raid0_resize_test -- common/autotest_common.sh@10 -- # set +x 00:12:56.638 ************************************ 00:12:56.638 END TEST raid0_resize_test 00:12:56.638 ************************************ 00:12:56.638 05:41:11 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:12:56.638 05:41:11 bdev_raid -- bdev/bdev_raid.sh@865 -- # for n in {2..4} 00:12:56.638 05:41:11 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:12:56.638 05:41:11 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test raid0 2 false 00:12:56.638 05:41:11 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 
-le 1 ']' 00:12:56.638 05:41:11 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:12:56.638 05:41:11 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:12:56.638 ************************************ 00:12:56.638 START TEST raid_state_function_test 00:12:56.638 ************************************ 00:12:56.638 05:41:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test raid0 2 false 00:12:56.638 05:41:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:12:56.638 05:41:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:12:56.638 05:41:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:12:56.638 05:41:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:12:56.638 05:41:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:12:56.638 05:41:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:56.638 05:41:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:12:56.638 05:41:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:12:56.639 05:41:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:56.639 05:41:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:12:56.639 05:41:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:12:56.639 05:41:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:56.639 05:41:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:12:56.639 05:41:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:12:56.639 05:41:11 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:12:56.639 05:41:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:12:56.639 05:41:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:12:56.639 05:41:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:12:56.639 05:41:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:12:56.639 05:41:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:12:56.639 05:41:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:12:56.639 05:41:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:12:56.639 05:41:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:12:56.639 05:41:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=1126135 00:12:56.639 05:41:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1126135' 00:12:56.639 Process raid pid: 1126135 00:12:56.639 05:41:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:12:56.639 05:41:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 1126135 /var/tmp/spdk-raid.sock 00:12:56.639 05:41:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 1126135 ']' 00:12:56.639 05:41:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:12:56.639 05:41:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:12:56.639 05:41:11 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:12:56.639 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:12:56.639 05:41:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:12:56.639 05:41:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:56.898 [2024-07-26 05:41:11.589568] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 00:12:56.898 [2024-07-26 05:41:11.589630] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:12:56.898 [2024-07-26 05:41:11.721751] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:57.157 [2024-07-26 05:41:11.825881] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:57.157 [2024-07-26 05:41:11.889240] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:57.157 [2024-07-26 05:41:11.889271] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:57.726 05:41:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:12:57.726 05:41:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:12:57.726 05:41:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:12:57.985 [2024-07-26 05:41:12.744787] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:12:57.985 [2024-07-26 05:41:12.744824] bdev_raid_rpc.c: 
311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:12:57.985 [2024-07-26 05:41:12.744835] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:57.985 [2024-07-26 05:41:12.744846] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:57.985 05:41:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:12:57.985 05:41:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:57.985 05:41:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:57.985 05:41:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:57.985 05:41:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:57.985 05:41:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:57.985 05:41:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:57.985 05:41:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:57.985 05:41:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:57.985 05:41:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:57.985 05:41:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:57.985 05:41:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:58.244 05:41:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:58.244 "name": "Existed_Raid", 00:12:58.244 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:12:58.244 "strip_size_kb": 64, 00:12:58.244 "state": "configuring", 00:12:58.244 "raid_level": "raid0", 00:12:58.244 "superblock": false, 00:12:58.244 "num_base_bdevs": 2, 00:12:58.244 "num_base_bdevs_discovered": 0, 00:12:58.244 "num_base_bdevs_operational": 2, 00:12:58.244 "base_bdevs_list": [ 00:12:58.244 { 00:12:58.244 "name": "BaseBdev1", 00:12:58.245 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:58.245 "is_configured": false, 00:12:58.245 "data_offset": 0, 00:12:58.245 "data_size": 0 00:12:58.245 }, 00:12:58.245 { 00:12:58.245 "name": "BaseBdev2", 00:12:58.245 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:58.245 "is_configured": false, 00:12:58.245 "data_offset": 0, 00:12:58.245 "data_size": 0 00:12:58.245 } 00:12:58.245 ] 00:12:58.245 }' 00:12:58.245 05:41:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:58.245 05:41:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:58.812 05:41:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:12:59.070 [2024-07-26 05:41:13.835536] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:12:59.070 [2024-07-26 05:41:13.835565] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xd60a80 name Existed_Raid, state configuring 00:12:59.070 05:41:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:12:59.329 [2024-07-26 05:41:14.080186] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:12:59.329 [2024-07-26 05:41:14.080211] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't 
exist now 00:12:59.329 [2024-07-26 05:41:14.080220] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:59.329 [2024-07-26 05:41:14.080231] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:59.329 05:41:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:12:59.587 [2024-07-26 05:41:14.334631] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:59.587 BaseBdev1 00:12:59.587 05:41:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:12:59.587 05:41:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:12:59.587 05:41:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:12:59.587 05:41:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:12:59.587 05:41:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:12:59.587 05:41:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:12:59.587 05:41:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:59.845 05:41:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:13:00.103 [ 00:13:00.103 { 00:13:00.103 "name": "BaseBdev1", 00:13:00.103 "aliases": [ 00:13:00.103 "668af008-8951-47ab-b025-ebd5b4727641" 00:13:00.103 ], 00:13:00.103 "product_name": "Malloc disk", 00:13:00.103 "block_size": 512, 00:13:00.103 "num_blocks": 65536, 
00:13:00.103 "uuid": "668af008-8951-47ab-b025-ebd5b4727641", 00:13:00.103 "assigned_rate_limits": { 00:13:00.103 "rw_ios_per_sec": 0, 00:13:00.103 "rw_mbytes_per_sec": 0, 00:13:00.103 "r_mbytes_per_sec": 0, 00:13:00.103 "w_mbytes_per_sec": 0 00:13:00.103 }, 00:13:00.103 "claimed": true, 00:13:00.103 "claim_type": "exclusive_write", 00:13:00.103 "zoned": false, 00:13:00.103 "supported_io_types": { 00:13:00.103 "read": true, 00:13:00.103 "write": true, 00:13:00.103 "unmap": true, 00:13:00.103 "flush": true, 00:13:00.103 "reset": true, 00:13:00.103 "nvme_admin": false, 00:13:00.103 "nvme_io": false, 00:13:00.103 "nvme_io_md": false, 00:13:00.103 "write_zeroes": true, 00:13:00.103 "zcopy": true, 00:13:00.103 "get_zone_info": false, 00:13:00.103 "zone_management": false, 00:13:00.103 "zone_append": false, 00:13:00.103 "compare": false, 00:13:00.104 "compare_and_write": false, 00:13:00.104 "abort": true, 00:13:00.104 "seek_hole": false, 00:13:00.104 "seek_data": false, 00:13:00.104 "copy": true, 00:13:00.104 "nvme_iov_md": false 00:13:00.104 }, 00:13:00.104 "memory_domains": [ 00:13:00.104 { 00:13:00.104 "dma_device_id": "system", 00:13:00.104 "dma_device_type": 1 00:13:00.104 }, 00:13:00.104 { 00:13:00.104 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:00.104 "dma_device_type": 2 00:13:00.104 } 00:13:00.104 ], 00:13:00.104 "driver_specific": {} 00:13:00.104 } 00:13:00.104 ] 00:13:00.104 05:41:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:13:00.104 05:41:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:13:00.104 05:41:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:00.104 05:41:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:00.104 05:41:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 
00:13:00.104 05:41:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:00.104 05:41:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:00.104 05:41:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:00.104 05:41:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:00.104 05:41:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:00.104 05:41:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:00.104 05:41:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:00.104 05:41:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:00.362 05:41:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:00.362 "name": "Existed_Raid", 00:13:00.362 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:00.362 "strip_size_kb": 64, 00:13:00.362 "state": "configuring", 00:13:00.362 "raid_level": "raid0", 00:13:00.362 "superblock": false, 00:13:00.362 "num_base_bdevs": 2, 00:13:00.362 "num_base_bdevs_discovered": 1, 00:13:00.362 "num_base_bdevs_operational": 2, 00:13:00.362 "base_bdevs_list": [ 00:13:00.362 { 00:13:00.362 "name": "BaseBdev1", 00:13:00.362 "uuid": "668af008-8951-47ab-b025-ebd5b4727641", 00:13:00.362 "is_configured": true, 00:13:00.362 "data_offset": 0, 00:13:00.362 "data_size": 65536 00:13:00.362 }, 00:13:00.362 { 00:13:00.362 "name": "BaseBdev2", 00:13:00.362 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:00.362 "is_configured": false, 00:13:00.362 "data_offset": 0, 00:13:00.362 "data_size": 0 00:13:00.362 } 00:13:00.362 ] 00:13:00.362 }' 00:13:00.362 05:41:15 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:00.362 05:41:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:00.929 05:41:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:13:01.187 [2024-07-26 05:41:15.902790] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:13:01.187 [2024-07-26 05:41:15.902826] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xd60350 name Existed_Raid, state configuring 00:13:01.187 05:41:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:13:01.187 [2024-07-26 05:41:16.091314] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:01.187 [2024-07-26 05:41:16.092837] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:13:01.187 [2024-07-26 05:41:16.092872] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:01.445 05:41:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:13:01.445 05:41:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:13:01.445 05:41:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:13:01.445 05:41:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:01.445 05:41:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:01.445 05:41:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local 
raid_level=raid0 00:13:01.445 05:41:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:01.445 05:41:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:01.445 05:41:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:01.445 05:41:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:01.445 05:41:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:01.445 05:41:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:01.445 05:41:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:01.445 05:41:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:01.703 05:41:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:01.703 "name": "Existed_Raid", 00:13:01.703 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:01.703 "strip_size_kb": 64, 00:13:01.703 "state": "configuring", 00:13:01.703 "raid_level": "raid0", 00:13:01.703 "superblock": false, 00:13:01.703 "num_base_bdevs": 2, 00:13:01.703 "num_base_bdevs_discovered": 1, 00:13:01.703 "num_base_bdevs_operational": 2, 00:13:01.703 "base_bdevs_list": [ 00:13:01.703 { 00:13:01.703 "name": "BaseBdev1", 00:13:01.703 "uuid": "668af008-8951-47ab-b025-ebd5b4727641", 00:13:01.703 "is_configured": true, 00:13:01.703 "data_offset": 0, 00:13:01.703 "data_size": 65536 00:13:01.703 }, 00:13:01.703 { 00:13:01.703 "name": "BaseBdev2", 00:13:01.703 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:01.703 "is_configured": false, 00:13:01.703 "data_offset": 0, 00:13:01.703 "data_size": 0 00:13:01.703 } 00:13:01.703 ] 00:13:01.703 }' 
00:13:01.703 05:41:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:01.703 05:41:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:02.270 05:41:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:13:02.528 [2024-07-26 05:41:17.197571] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:02.528 [2024-07-26 05:41:17.197603] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0xd61000 00:13:02.528 [2024-07-26 05:41:17.197611] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 131072, blocklen 512 00:13:02.528 [2024-07-26 05:41:17.197813] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xc7b0c0 00:13:02.528 [2024-07-26 05:41:17.197940] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xd61000 00:13:02.528 [2024-07-26 05:41:17.197951] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xd61000 00:13:02.528 [2024-07-26 05:41:17.198109] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:02.528 BaseBdev2 00:13:02.528 05:41:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:13:02.528 05:41:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:13:02.528 05:41:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:02.528 05:41:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:13:02.528 05:41:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:02.528 05:41:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 
00:13:02.528 05:41:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:02.786 05:41:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:13:03.044 [ 00:13:03.044 { 00:13:03.044 "name": "BaseBdev2", 00:13:03.044 "aliases": [ 00:13:03.044 "e7af1935-89d5-433a-8ecb-6bf5dd3962de" 00:13:03.044 ], 00:13:03.044 "product_name": "Malloc disk", 00:13:03.044 "block_size": 512, 00:13:03.044 "num_blocks": 65536, 00:13:03.044 "uuid": "e7af1935-89d5-433a-8ecb-6bf5dd3962de", 00:13:03.044 "assigned_rate_limits": { 00:13:03.044 "rw_ios_per_sec": 0, 00:13:03.044 "rw_mbytes_per_sec": 0, 00:13:03.044 "r_mbytes_per_sec": 0, 00:13:03.044 "w_mbytes_per_sec": 0 00:13:03.044 }, 00:13:03.044 "claimed": true, 00:13:03.044 "claim_type": "exclusive_write", 00:13:03.044 "zoned": false, 00:13:03.044 "supported_io_types": { 00:13:03.044 "read": true, 00:13:03.044 "write": true, 00:13:03.044 "unmap": true, 00:13:03.044 "flush": true, 00:13:03.044 "reset": true, 00:13:03.044 "nvme_admin": false, 00:13:03.044 "nvme_io": false, 00:13:03.044 "nvme_io_md": false, 00:13:03.044 "write_zeroes": true, 00:13:03.044 "zcopy": true, 00:13:03.044 "get_zone_info": false, 00:13:03.044 "zone_management": false, 00:13:03.044 "zone_append": false, 00:13:03.044 "compare": false, 00:13:03.044 "compare_and_write": false, 00:13:03.044 "abort": true, 00:13:03.044 "seek_hole": false, 00:13:03.044 "seek_data": false, 00:13:03.044 "copy": true, 00:13:03.044 "nvme_iov_md": false 00:13:03.044 }, 00:13:03.044 "memory_domains": [ 00:13:03.044 { 00:13:03.044 "dma_device_id": "system", 00:13:03.044 "dma_device_type": 1 00:13:03.044 }, 00:13:03.044 { 00:13:03.044 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:03.044 "dma_device_type": 2 
00:13:03.044 } 00:13:03.044 ], 00:13:03.044 "driver_specific": {} 00:13:03.044 } 00:13:03.044 ] 00:13:03.044 05:41:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:13:03.044 05:41:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:13:03.044 05:41:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:13:03.044 05:41:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 2 00:13:03.044 05:41:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:03.044 05:41:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:03.044 05:41:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:03.044 05:41:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:03.044 05:41:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:03.044 05:41:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:03.044 05:41:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:03.044 05:41:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:03.045 05:41:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:03.045 05:41:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:03.045 05:41:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:03.303 05:41:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 
00:13:03.303 "name": "Existed_Raid", 00:13:03.303 "uuid": "3c6ef035-17d8-4bf0-ad9d-4e5e3d6193a8", 00:13:03.303 "strip_size_kb": 64, 00:13:03.303 "state": "online", 00:13:03.303 "raid_level": "raid0", 00:13:03.303 "superblock": false, 00:13:03.303 "num_base_bdevs": 2, 00:13:03.303 "num_base_bdevs_discovered": 2, 00:13:03.303 "num_base_bdevs_operational": 2, 00:13:03.303 "base_bdevs_list": [ 00:13:03.303 { 00:13:03.303 "name": "BaseBdev1", 00:13:03.303 "uuid": "668af008-8951-47ab-b025-ebd5b4727641", 00:13:03.303 "is_configured": true, 00:13:03.303 "data_offset": 0, 00:13:03.303 "data_size": 65536 00:13:03.303 }, 00:13:03.303 { 00:13:03.303 "name": "BaseBdev2", 00:13:03.303 "uuid": "e7af1935-89d5-433a-8ecb-6bf5dd3962de", 00:13:03.303 "is_configured": true, 00:13:03.303 "data_offset": 0, 00:13:03.303 "data_size": 65536 00:13:03.303 } 00:13:03.303 ] 00:13:03.303 }' 00:13:03.303 05:41:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:03.303 05:41:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:03.869 05:41:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:13:03.869 05:41:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:13:03.869 05:41:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:13:03.869 05:41:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:13:03.869 05:41:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:13:03.869 05:41:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:13:03.869 05:41:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:13:03.869 05:41:18 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:13:04.128 [2024-07-26 05:41:18.794060] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:04.128 05:41:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:13:04.128 "name": "Existed_Raid", 00:13:04.128 "aliases": [ 00:13:04.128 "3c6ef035-17d8-4bf0-ad9d-4e5e3d6193a8" 00:13:04.128 ], 00:13:04.128 "product_name": "Raid Volume", 00:13:04.128 "block_size": 512, 00:13:04.128 "num_blocks": 131072, 00:13:04.128 "uuid": "3c6ef035-17d8-4bf0-ad9d-4e5e3d6193a8", 00:13:04.128 "assigned_rate_limits": { 00:13:04.128 "rw_ios_per_sec": 0, 00:13:04.128 "rw_mbytes_per_sec": 0, 00:13:04.128 "r_mbytes_per_sec": 0, 00:13:04.128 "w_mbytes_per_sec": 0 00:13:04.128 }, 00:13:04.128 "claimed": false, 00:13:04.128 "zoned": false, 00:13:04.128 "supported_io_types": { 00:13:04.128 "read": true, 00:13:04.128 "write": true, 00:13:04.128 "unmap": true, 00:13:04.128 "flush": true, 00:13:04.128 "reset": true, 00:13:04.128 "nvme_admin": false, 00:13:04.128 "nvme_io": false, 00:13:04.128 "nvme_io_md": false, 00:13:04.128 "write_zeroes": true, 00:13:04.128 "zcopy": false, 00:13:04.128 "get_zone_info": false, 00:13:04.128 "zone_management": false, 00:13:04.128 "zone_append": false, 00:13:04.128 "compare": false, 00:13:04.128 "compare_and_write": false, 00:13:04.128 "abort": false, 00:13:04.128 "seek_hole": false, 00:13:04.128 "seek_data": false, 00:13:04.128 "copy": false, 00:13:04.128 "nvme_iov_md": false 00:13:04.128 }, 00:13:04.128 "memory_domains": [ 00:13:04.128 { 00:13:04.128 "dma_device_id": "system", 00:13:04.128 "dma_device_type": 1 00:13:04.128 }, 00:13:04.128 { 00:13:04.128 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:04.128 "dma_device_type": 2 00:13:04.128 }, 00:13:04.128 { 00:13:04.128 "dma_device_id": "system", 00:13:04.128 "dma_device_type": 1 00:13:04.128 }, 00:13:04.128 { 00:13:04.128 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 
00:13:04.128 "dma_device_type": 2 00:13:04.128 } 00:13:04.128 ], 00:13:04.128 "driver_specific": { 00:13:04.128 "raid": { 00:13:04.128 "uuid": "3c6ef035-17d8-4bf0-ad9d-4e5e3d6193a8", 00:13:04.128 "strip_size_kb": 64, 00:13:04.128 "state": "online", 00:13:04.128 "raid_level": "raid0", 00:13:04.128 "superblock": false, 00:13:04.128 "num_base_bdevs": 2, 00:13:04.128 "num_base_bdevs_discovered": 2, 00:13:04.128 "num_base_bdevs_operational": 2, 00:13:04.128 "base_bdevs_list": [ 00:13:04.128 { 00:13:04.128 "name": "BaseBdev1", 00:13:04.128 "uuid": "668af008-8951-47ab-b025-ebd5b4727641", 00:13:04.128 "is_configured": true, 00:13:04.128 "data_offset": 0, 00:13:04.128 "data_size": 65536 00:13:04.128 }, 00:13:04.128 { 00:13:04.128 "name": "BaseBdev2", 00:13:04.128 "uuid": "e7af1935-89d5-433a-8ecb-6bf5dd3962de", 00:13:04.128 "is_configured": true, 00:13:04.128 "data_offset": 0, 00:13:04.128 "data_size": 65536 00:13:04.128 } 00:13:04.128 ] 00:13:04.128 } 00:13:04.128 } 00:13:04.128 }' 00:13:04.128 05:41:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:13:04.128 05:41:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:13:04.128 BaseBdev2' 00:13:04.128 05:41:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:04.128 05:41:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:13:04.128 05:41:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:04.387 05:41:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:04.387 "name": "BaseBdev1", 00:13:04.387 "aliases": [ 00:13:04.387 "668af008-8951-47ab-b025-ebd5b4727641" 00:13:04.387 ], 00:13:04.387 "product_name": "Malloc disk", 
00:13:04.387 "block_size": 512, 00:13:04.387 "num_blocks": 65536, 00:13:04.387 "uuid": "668af008-8951-47ab-b025-ebd5b4727641", 00:13:04.387 "assigned_rate_limits": { 00:13:04.387 "rw_ios_per_sec": 0, 00:13:04.387 "rw_mbytes_per_sec": 0, 00:13:04.387 "r_mbytes_per_sec": 0, 00:13:04.387 "w_mbytes_per_sec": 0 00:13:04.387 }, 00:13:04.387 "claimed": true, 00:13:04.387 "claim_type": "exclusive_write", 00:13:04.387 "zoned": false, 00:13:04.387 "supported_io_types": { 00:13:04.387 "read": true, 00:13:04.387 "write": true, 00:13:04.387 "unmap": true, 00:13:04.387 "flush": true, 00:13:04.387 "reset": true, 00:13:04.387 "nvme_admin": false, 00:13:04.387 "nvme_io": false, 00:13:04.387 "nvme_io_md": false, 00:13:04.387 "write_zeroes": true, 00:13:04.387 "zcopy": true, 00:13:04.387 "get_zone_info": false, 00:13:04.387 "zone_management": false, 00:13:04.387 "zone_append": false, 00:13:04.387 "compare": false, 00:13:04.387 "compare_and_write": false, 00:13:04.387 "abort": true, 00:13:04.387 "seek_hole": false, 00:13:04.387 "seek_data": false, 00:13:04.387 "copy": true, 00:13:04.387 "nvme_iov_md": false 00:13:04.387 }, 00:13:04.387 "memory_domains": [ 00:13:04.387 { 00:13:04.387 "dma_device_id": "system", 00:13:04.387 "dma_device_type": 1 00:13:04.387 }, 00:13:04.387 { 00:13:04.387 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:04.387 "dma_device_type": 2 00:13:04.387 } 00:13:04.387 ], 00:13:04.387 "driver_specific": {} 00:13:04.387 }' 00:13:04.387 05:41:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:04.387 05:41:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:04.387 05:41:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:04.387 05:41:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:04.387 05:41:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:04.387 05:41:19 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:04.387 05:41:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:04.646 05:41:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:04.646 05:41:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:04.646 05:41:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:04.646 05:41:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:04.646 05:41:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:04.646 05:41:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:04.646 05:41:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:13:04.646 05:41:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:04.646 05:41:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:04.646 "name": "BaseBdev2", 00:13:04.646 "aliases": [ 00:13:04.646 "e7af1935-89d5-433a-8ecb-6bf5dd3962de" 00:13:04.646 ], 00:13:04.646 "product_name": "Malloc disk", 00:13:04.646 "block_size": 512, 00:13:04.646 "num_blocks": 65536, 00:13:04.646 "uuid": "e7af1935-89d5-433a-8ecb-6bf5dd3962de", 00:13:04.646 "assigned_rate_limits": { 00:13:04.646 "rw_ios_per_sec": 0, 00:13:04.646 "rw_mbytes_per_sec": 0, 00:13:04.646 "r_mbytes_per_sec": 0, 00:13:04.646 "w_mbytes_per_sec": 0 00:13:04.646 }, 00:13:04.646 "claimed": true, 00:13:04.646 "claim_type": "exclusive_write", 00:13:04.646 "zoned": false, 00:13:04.646 "supported_io_types": { 00:13:04.646 "read": true, 00:13:04.646 "write": true, 00:13:04.646 "unmap": true, 00:13:04.646 "flush": true, 00:13:04.646 "reset": 
true, 00:13:04.646 "nvme_admin": false, 00:13:04.646 "nvme_io": false, 00:13:04.646 "nvme_io_md": false, 00:13:04.646 "write_zeroes": true, 00:13:04.646 "zcopy": true, 00:13:04.646 "get_zone_info": false, 00:13:04.646 "zone_management": false, 00:13:04.646 "zone_append": false, 00:13:04.646 "compare": false, 00:13:04.646 "compare_and_write": false, 00:13:04.646 "abort": true, 00:13:04.646 "seek_hole": false, 00:13:04.646 "seek_data": false, 00:13:04.646 "copy": true, 00:13:04.646 "nvme_iov_md": false 00:13:04.646 }, 00:13:04.646 "memory_domains": [ 00:13:04.646 { 00:13:04.646 "dma_device_id": "system", 00:13:04.646 "dma_device_type": 1 00:13:04.646 }, 00:13:04.646 { 00:13:04.646 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:04.646 "dma_device_type": 2 00:13:04.646 } 00:13:04.646 ], 00:13:04.646 "driver_specific": {} 00:13:04.646 }' 00:13:04.646 05:41:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:04.904 05:41:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:04.904 05:41:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:04.904 05:41:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:04.904 05:41:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:04.904 05:41:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:04.904 05:41:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:04.904 05:41:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:04.904 05:41:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:04.904 05:41:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:05.162 05:41:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:05.162 05:41:19 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:05.162 05:41:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:13:05.455 [2024-07-26 05:41:20.093285] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:13:05.455 [2024-07-26 05:41:20.093314] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:05.455 [2024-07-26 05:41:20.093357] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:05.455 05:41:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:13:05.455 05:41:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:13:05.455 05:41:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:13:05.455 05:41:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:13:05.455 05:41:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:13:05.455 05:41:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 1 00:13:05.455 05:41:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:05.455 05:41:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:13:05.455 05:41:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:05.455 05:41:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:05.455 05:41:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:13:05.455 05:41:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 
00:13:05.455 05:41:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:05.455 05:41:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:05.455 05:41:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:05.455 05:41:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:05.455 05:41:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:05.714 05:41:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:05.714 "name": "Existed_Raid", 00:13:05.714 "uuid": "3c6ef035-17d8-4bf0-ad9d-4e5e3d6193a8", 00:13:05.714 "strip_size_kb": 64, 00:13:05.714 "state": "offline", 00:13:05.714 "raid_level": "raid0", 00:13:05.714 "superblock": false, 00:13:05.714 "num_base_bdevs": 2, 00:13:05.714 "num_base_bdevs_discovered": 1, 00:13:05.714 "num_base_bdevs_operational": 1, 00:13:05.714 "base_bdevs_list": [ 00:13:05.714 { 00:13:05.714 "name": null, 00:13:05.714 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:05.714 "is_configured": false, 00:13:05.714 "data_offset": 0, 00:13:05.714 "data_size": 65536 00:13:05.714 }, 00:13:05.714 { 00:13:05.714 "name": "BaseBdev2", 00:13:05.714 "uuid": "e7af1935-89d5-433a-8ecb-6bf5dd3962de", 00:13:05.714 "is_configured": true, 00:13:05.714 "data_offset": 0, 00:13:05.714 "data_size": 65536 00:13:05.714 } 00:13:05.714 ] 00:13:05.714 }' 00:13:05.714 05:41:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:05.714 05:41:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:06.282 05:41:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:13:06.282 05:41:20 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:13:06.282 05:41:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:06.282 05:41:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:13:06.542 05:41:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:13:06.542 05:41:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:13:06.542 05:41:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:13:06.542 [2024-07-26 05:41:21.429908] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:13:06.542 [2024-07-26 05:41:21.429957] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xd61000 name Existed_Raid, state offline 00:13:06.801 05:41:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:13:06.801 05:41:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:13:06.801 05:41:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:06.801 05:41:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:13:06.801 05:41:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:13:06.801 05:41:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:13:06.801 05:41:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:13:06.801 05:41:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 1126135 
00:13:06.801 05:41:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 1126135 ']' 00:13:06.801 05:41:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 1126135 00:13:07.061 05:41:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:13:07.061 05:41:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:13:07.061 05:41:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1126135 00:13:07.061 05:41:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:13:07.061 05:41:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:13:07.061 05:41:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1126135' 00:13:07.061 killing process with pid 1126135 00:13:07.061 05:41:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 1126135 00:13:07.061 [2024-07-26 05:41:21.754958] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:13:07.061 05:41:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 1126135 00:13:07.061 [2024-07-26 05:41:21.755829] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:13:07.061 05:41:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:13:07.061 00:13:07.061 real 0m10.434s 00:13:07.061 user 0m18.575s 00:13:07.061 sys 0m1.944s 00:13:07.061 05:41:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:13:07.061 05:41:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:07.061 ************************************ 00:13:07.061 END TEST raid_state_function_test 00:13:07.061 ************************************ 00:13:07.320 05:41:21 bdev_raid 
-- common/autotest_common.sh@1142 -- # return 0 00:13:07.320 05:41:21 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test raid0 2 true 00:13:07.320 05:41:21 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:13:07.320 05:41:21 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:13:07.320 05:41:21 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:13:07.320 ************************************ 00:13:07.320 START TEST raid_state_function_test_sb 00:13:07.320 ************************************ 00:13:07.320 05:41:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test raid0 2 true 00:13:07.320 05:41:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:13:07.320 05:41:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:13:07.320 05:41:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:13:07.320 05:41:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:13:07.320 05:41:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:13:07.320 05:41:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:07.320 05:41:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:13:07.320 05:41:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:13:07.320 05:41:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:07.320 05:41:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:13:07.320 05:41:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:13:07.320 05:41:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- 
# (( i <= num_base_bdevs )) 00:13:07.320 05:41:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:13:07.320 05:41:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:13:07.320 05:41:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:13:07.320 05:41:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:13:07.320 05:41:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:13:07.320 05:41:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:13:07.320 05:41:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:13:07.320 05:41:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:13:07.320 05:41:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:13:07.320 05:41:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:13:07.320 05:41:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:13:07.320 05:41:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=1127766 00:13:07.320 05:41:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1127766' 00:13:07.320 Process raid pid: 1127766 00:13:07.320 05:41:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 1127766 /var/tmp/spdk-raid.sock 00:13:07.320 05:41:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:13:07.320 05:41:22 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@829 -- # '[' -z 1127766 ']' 00:13:07.320 05:41:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:13:07.320 05:41:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:13:07.320 05:41:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:13:07.320 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:13:07.320 05:41:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:13:07.320 05:41:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:07.320 [2024-07-26 05:41:22.084254] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 00:13:07.320 [2024-07-26 05:41:22.084313] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:13:07.320 [2024-07-26 05:41:22.204954] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:13:07.579 [2024-07-26 05:41:22.314010] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:13:07.579 [2024-07-26 05:41:22.392350] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:07.579 [2024-07-26 05:41:22.392385] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:08.515 05:41:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:13:08.515 05:41:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:13:08.515 05:41:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:13:08.773 [2024-07-26 05:41:23.520648] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:13:08.773 [2024-07-26 05:41:23.520686] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:13:08.773 [2024-07-26 05:41:23.520696] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:13:08.773 [2024-07-26 05:41:23.520708] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:08.773 05:41:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:13:08.773 05:41:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:08.773 05:41:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:08.773 05:41:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:08.773 05:41:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:08.773 05:41:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:08.773 05:41:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:08.773 05:41:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:08.773 05:41:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:08.773 05:41:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:08.773 05:41:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:08.773 05:41:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:09.030 05:41:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:09.030 "name": "Existed_Raid", 00:13:09.030 "uuid": "29f4ee43-555e-49cb-87f7-3d0288e4f3dd", 00:13:09.030 "strip_size_kb": 64, 00:13:09.030 "state": "configuring", 00:13:09.030 "raid_level": "raid0", 00:13:09.030 "superblock": true, 00:13:09.030 "num_base_bdevs": 2, 00:13:09.030 "num_base_bdevs_discovered": 0, 00:13:09.030 "num_base_bdevs_operational": 2, 00:13:09.030 "base_bdevs_list": [ 00:13:09.030 { 00:13:09.030 "name": "BaseBdev1", 00:13:09.030 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:09.030 "is_configured": false, 00:13:09.030 "data_offset": 0, 00:13:09.030 "data_size": 0 00:13:09.030 }, 00:13:09.030 { 00:13:09.030 "name": "BaseBdev2", 00:13:09.030 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:09.030 "is_configured": false, 00:13:09.030 "data_offset": 0, 00:13:09.030 "data_size": 0 00:13:09.030 } 00:13:09.030 ] 00:13:09.030 }' 00:13:09.030 05:41:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:09.030 05:41:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:09.594 05:41:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:13:09.852 [2024-07-26 05:41:24.547206] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:13:09.852 [2024-07-26 05:41:24.547233] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2342a80 name Existed_Raid, state configuring 00:13:09.852 05:41:24 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:13:09.852 [2024-07-26 05:41:24.727721] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:13:09.853 [2024-07-26 05:41:24.727747] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:13:09.853 [2024-07-26 05:41:24.727756] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:13:09.853 [2024-07-26 05:41:24.727767] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:09.853 05:41:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:13:10.111 [2024-07-26 05:41:24.910025] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:10.111 BaseBdev1 00:13:10.111 05:41:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:13:10.111 05:41:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:13:10.111 05:41:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:10.111 05:41:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:13:10.111 05:41:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:10.111 05:41:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:10.111 05:41:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:10.368 
05:41:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:13:10.627 [ 00:13:10.627 { 00:13:10.627 "name": "BaseBdev1", 00:13:10.627 "aliases": [ 00:13:10.627 "eedefc26-cfb8-4d1d-94dc-21388f45f6cb" 00:13:10.627 ], 00:13:10.627 "product_name": "Malloc disk", 00:13:10.627 "block_size": 512, 00:13:10.627 "num_blocks": 65536, 00:13:10.627 "uuid": "eedefc26-cfb8-4d1d-94dc-21388f45f6cb", 00:13:10.627 "assigned_rate_limits": { 00:13:10.627 "rw_ios_per_sec": 0, 00:13:10.627 "rw_mbytes_per_sec": 0, 00:13:10.627 "r_mbytes_per_sec": 0, 00:13:10.627 "w_mbytes_per_sec": 0 00:13:10.627 }, 00:13:10.627 "claimed": true, 00:13:10.627 "claim_type": "exclusive_write", 00:13:10.627 "zoned": false, 00:13:10.627 "supported_io_types": { 00:13:10.627 "read": true, 00:13:10.627 "write": true, 00:13:10.627 "unmap": true, 00:13:10.627 "flush": true, 00:13:10.627 "reset": true, 00:13:10.627 "nvme_admin": false, 00:13:10.627 "nvme_io": false, 00:13:10.627 "nvme_io_md": false, 00:13:10.627 "write_zeroes": true, 00:13:10.627 "zcopy": true, 00:13:10.627 "get_zone_info": false, 00:13:10.627 "zone_management": false, 00:13:10.627 "zone_append": false, 00:13:10.627 "compare": false, 00:13:10.627 "compare_and_write": false, 00:13:10.627 "abort": true, 00:13:10.627 "seek_hole": false, 00:13:10.627 "seek_data": false, 00:13:10.627 "copy": true, 00:13:10.627 "nvme_iov_md": false 00:13:10.627 }, 00:13:10.627 "memory_domains": [ 00:13:10.627 { 00:13:10.627 "dma_device_id": "system", 00:13:10.627 "dma_device_type": 1 00:13:10.627 }, 00:13:10.627 { 00:13:10.627 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:10.627 "dma_device_type": 2 00:13:10.627 } 00:13:10.627 ], 00:13:10.627 "driver_specific": {} 00:13:10.627 } 00:13:10.627 ] 00:13:10.627 05:41:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:13:10.627 
05:41:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:13:10.627 05:41:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:10.627 05:41:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:10.627 05:41:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:10.627 05:41:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:10.627 05:41:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:10.627 05:41:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:10.627 05:41:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:10.627 05:41:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:10.627 05:41:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:10.627 05:41:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:10.627 05:41:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:10.886 05:41:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:10.886 "name": "Existed_Raid", 00:13:10.886 "uuid": "23eff1ca-3b2a-4498-a1ae-2e74941364af", 00:13:10.886 "strip_size_kb": 64, 00:13:10.886 "state": "configuring", 00:13:10.886 "raid_level": "raid0", 00:13:10.886 "superblock": true, 00:13:10.886 "num_base_bdevs": 2, 00:13:10.886 "num_base_bdevs_discovered": 1, 00:13:10.886 "num_base_bdevs_operational": 2, 00:13:10.886 
"base_bdevs_list": [ 00:13:10.886 { 00:13:10.886 "name": "BaseBdev1", 00:13:10.886 "uuid": "eedefc26-cfb8-4d1d-94dc-21388f45f6cb", 00:13:10.886 "is_configured": true, 00:13:10.886 "data_offset": 2048, 00:13:10.886 "data_size": 63488 00:13:10.886 }, 00:13:10.886 { 00:13:10.886 "name": "BaseBdev2", 00:13:10.886 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:10.886 "is_configured": false, 00:13:10.886 "data_offset": 0, 00:13:10.886 "data_size": 0 00:13:10.886 } 00:13:10.886 ] 00:13:10.886 }' 00:13:10.886 05:41:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:10.886 05:41:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:11.452 05:41:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:13:11.452 [2024-07-26 05:41:26.357858] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:13:11.452 [2024-07-26 05:41:26.357892] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2342350 name Existed_Raid, state configuring 00:13:11.711 05:41:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:13:11.711 [2024-07-26 05:41:26.538377] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:11.711 [2024-07-26 05:41:26.539859] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:13:11.711 [2024-07-26 05:41:26.539889] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:11.711 05:41:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:13:11.711 05:41:26 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:13:11.711 05:41:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:13:11.711 05:41:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:11.711 05:41:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:11.711 05:41:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:11.711 05:41:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:11.711 05:41:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:11.711 05:41:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:11.711 05:41:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:11.711 05:41:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:11.711 05:41:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:11.711 05:41:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:11.711 05:41:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:11.969 05:41:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:11.969 "name": "Existed_Raid", 00:13:11.969 "uuid": "9cd82e5e-3eb0-4328-b08e-dc558e8e2510", 00:13:11.969 "strip_size_kb": 64, 00:13:11.969 "state": "configuring", 00:13:11.969 "raid_level": "raid0", 00:13:11.969 "superblock": true, 00:13:11.969 "num_base_bdevs": 2, 00:13:11.969 
"num_base_bdevs_discovered": 1, 00:13:11.969 "num_base_bdevs_operational": 2, 00:13:11.969 "base_bdevs_list": [ 00:13:11.969 { 00:13:11.969 "name": "BaseBdev1", 00:13:11.969 "uuid": "eedefc26-cfb8-4d1d-94dc-21388f45f6cb", 00:13:11.969 "is_configured": true, 00:13:11.969 "data_offset": 2048, 00:13:11.969 "data_size": 63488 00:13:11.969 }, 00:13:11.969 { 00:13:11.969 "name": "BaseBdev2", 00:13:11.969 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:11.969 "is_configured": false, 00:13:11.969 "data_offset": 0, 00:13:11.969 "data_size": 0 00:13:11.969 } 00:13:11.969 ] 00:13:11.969 }' 00:13:11.969 05:41:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:11.969 05:41:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:12.536 05:41:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:13:12.795 [2024-07-26 05:41:27.568454] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:12.795 [2024-07-26 05:41:27.568596] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x2343000 00:13:12.795 [2024-07-26 05:41:27.568610] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:13:12.795 [2024-07-26 05:41:27.568796] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x225d0c0 00:13:12.795 [2024-07-26 05:41:27.568913] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2343000 00:13:12.795 [2024-07-26 05:41:27.568923] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x2343000 00:13:12.795 [2024-07-26 05:41:27.569013] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:12.795 BaseBdev2 00:13:12.795 05:41:27 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:13:12.795 05:41:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:13:12.795 05:41:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:12.795 05:41:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:13:12.795 05:41:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:12.795 05:41:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:12.795 05:41:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:13.055 05:41:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:13:13.055 [ 00:13:13.055 { 00:13:13.055 "name": "BaseBdev2", 00:13:13.055 "aliases": [ 00:13:13.055 "1fc07845-fe64-4666-8e0b-7c5fa1922f19" 00:13:13.055 ], 00:13:13.055 "product_name": "Malloc disk", 00:13:13.055 "block_size": 512, 00:13:13.055 "num_blocks": 65536, 00:13:13.055 "uuid": "1fc07845-fe64-4666-8e0b-7c5fa1922f19", 00:13:13.055 "assigned_rate_limits": { 00:13:13.055 "rw_ios_per_sec": 0, 00:13:13.055 "rw_mbytes_per_sec": 0, 00:13:13.055 "r_mbytes_per_sec": 0, 00:13:13.055 "w_mbytes_per_sec": 0 00:13:13.055 }, 00:13:13.055 "claimed": true, 00:13:13.055 "claim_type": "exclusive_write", 00:13:13.055 "zoned": false, 00:13:13.055 "supported_io_types": { 00:13:13.055 "read": true, 00:13:13.055 "write": true, 00:13:13.055 "unmap": true, 00:13:13.055 "flush": true, 00:13:13.055 "reset": true, 00:13:13.055 "nvme_admin": false, 00:13:13.055 "nvme_io": false, 00:13:13.055 "nvme_io_md": false, 00:13:13.055 "write_zeroes": true, 
00:13:13.055 "zcopy": true, 00:13:13.055 "get_zone_info": false, 00:13:13.055 "zone_management": false, 00:13:13.055 "zone_append": false, 00:13:13.055 "compare": false, 00:13:13.055 "compare_and_write": false, 00:13:13.055 "abort": true, 00:13:13.055 "seek_hole": false, 00:13:13.055 "seek_data": false, 00:13:13.055 "copy": true, 00:13:13.055 "nvme_iov_md": false 00:13:13.055 }, 00:13:13.055 "memory_domains": [ 00:13:13.055 { 00:13:13.055 "dma_device_id": "system", 00:13:13.055 "dma_device_type": 1 00:13:13.055 }, 00:13:13.055 { 00:13:13.055 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:13.055 "dma_device_type": 2 00:13:13.055 } 00:13:13.055 ], 00:13:13.055 "driver_specific": {} 00:13:13.055 } 00:13:13.055 ] 00:13:13.055 05:41:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:13:13.055 05:41:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:13:13.055 05:41:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:13:13.055 05:41:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 2 00:13:13.055 05:41:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:13.055 05:41:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:13.055 05:41:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:13.055 05:41:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:13.055 05:41:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:13.055 05:41:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:13.055 05:41:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 
00:13:13.055 05:41:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:13.055 05:41:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:13.055 05:41:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:13.055 05:41:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:13.313 05:41:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:13.313 "name": "Existed_Raid", 00:13:13.313 "uuid": "9cd82e5e-3eb0-4328-b08e-dc558e8e2510", 00:13:13.313 "strip_size_kb": 64, 00:13:13.313 "state": "online", 00:13:13.313 "raid_level": "raid0", 00:13:13.313 "superblock": true, 00:13:13.313 "num_base_bdevs": 2, 00:13:13.313 "num_base_bdevs_discovered": 2, 00:13:13.313 "num_base_bdevs_operational": 2, 00:13:13.313 "base_bdevs_list": [ 00:13:13.313 { 00:13:13.313 "name": "BaseBdev1", 00:13:13.313 "uuid": "eedefc26-cfb8-4d1d-94dc-21388f45f6cb", 00:13:13.313 "is_configured": true, 00:13:13.313 "data_offset": 2048, 00:13:13.313 "data_size": 63488 00:13:13.313 }, 00:13:13.313 { 00:13:13.313 "name": "BaseBdev2", 00:13:13.313 "uuid": "1fc07845-fe64-4666-8e0b-7c5fa1922f19", 00:13:13.313 "is_configured": true, 00:13:13.313 "data_offset": 2048, 00:13:13.313 "data_size": 63488 00:13:13.313 } 00:13:13.313 ] 00:13:13.313 }' 00:13:13.313 05:41:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:13.313 05:41:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:14.247 05:41:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:13:14.247 05:41:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local 
raid_bdev_name=Existed_Raid 00:13:14.247 05:41:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:13:14.247 05:41:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:13:14.247 05:41:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:13:14.247 05:41:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:13:14.247 05:41:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:13:14.247 05:41:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:13:14.247 [2024-07-26 05:41:29.020575] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:14.247 05:41:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:13:14.247 "name": "Existed_Raid", 00:13:14.247 "aliases": [ 00:13:14.247 "9cd82e5e-3eb0-4328-b08e-dc558e8e2510" 00:13:14.247 ], 00:13:14.247 "product_name": "Raid Volume", 00:13:14.247 "block_size": 512, 00:13:14.247 "num_blocks": 126976, 00:13:14.247 "uuid": "9cd82e5e-3eb0-4328-b08e-dc558e8e2510", 00:13:14.247 "assigned_rate_limits": { 00:13:14.247 "rw_ios_per_sec": 0, 00:13:14.247 "rw_mbytes_per_sec": 0, 00:13:14.247 "r_mbytes_per_sec": 0, 00:13:14.247 "w_mbytes_per_sec": 0 00:13:14.247 }, 00:13:14.247 "claimed": false, 00:13:14.247 "zoned": false, 00:13:14.247 "supported_io_types": { 00:13:14.247 "read": true, 00:13:14.247 "write": true, 00:13:14.247 "unmap": true, 00:13:14.247 "flush": true, 00:13:14.247 "reset": true, 00:13:14.247 "nvme_admin": false, 00:13:14.247 "nvme_io": false, 00:13:14.247 "nvme_io_md": false, 00:13:14.247 "write_zeroes": true, 00:13:14.247 "zcopy": false, 00:13:14.247 "get_zone_info": false, 00:13:14.247 "zone_management": false, 00:13:14.247 
"zone_append": false, 00:13:14.247 "compare": false, 00:13:14.247 "compare_and_write": false, 00:13:14.247 "abort": false, 00:13:14.247 "seek_hole": false, 00:13:14.247 "seek_data": false, 00:13:14.247 "copy": false, 00:13:14.247 "nvme_iov_md": false 00:13:14.247 }, 00:13:14.247 "memory_domains": [ 00:13:14.247 { 00:13:14.247 "dma_device_id": "system", 00:13:14.247 "dma_device_type": 1 00:13:14.247 }, 00:13:14.247 { 00:13:14.247 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:14.247 "dma_device_type": 2 00:13:14.247 }, 00:13:14.248 { 00:13:14.248 "dma_device_id": "system", 00:13:14.248 "dma_device_type": 1 00:13:14.248 }, 00:13:14.248 { 00:13:14.248 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:14.248 "dma_device_type": 2 00:13:14.248 } 00:13:14.248 ], 00:13:14.248 "driver_specific": { 00:13:14.248 "raid": { 00:13:14.248 "uuid": "9cd82e5e-3eb0-4328-b08e-dc558e8e2510", 00:13:14.248 "strip_size_kb": 64, 00:13:14.248 "state": "online", 00:13:14.248 "raid_level": "raid0", 00:13:14.248 "superblock": true, 00:13:14.248 "num_base_bdevs": 2, 00:13:14.248 "num_base_bdevs_discovered": 2, 00:13:14.248 "num_base_bdevs_operational": 2, 00:13:14.248 "base_bdevs_list": [ 00:13:14.248 { 00:13:14.248 "name": "BaseBdev1", 00:13:14.248 "uuid": "eedefc26-cfb8-4d1d-94dc-21388f45f6cb", 00:13:14.248 "is_configured": true, 00:13:14.248 "data_offset": 2048, 00:13:14.248 "data_size": 63488 00:13:14.248 }, 00:13:14.248 { 00:13:14.248 "name": "BaseBdev2", 00:13:14.248 "uuid": "1fc07845-fe64-4666-8e0b-7c5fa1922f19", 00:13:14.248 "is_configured": true, 00:13:14.248 "data_offset": 2048, 00:13:14.248 "data_size": 63488 00:13:14.248 } 00:13:14.248 ] 00:13:14.248 } 00:13:14.248 } 00:13:14.248 }' 00:13:14.248 05:41:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:13:14.248 05:41:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:13:14.248 
BaseBdev2' 00:13:14.248 05:41:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:14.248 05:41:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:13:14.248 05:41:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:14.506 05:41:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:14.506 "name": "BaseBdev1", 00:13:14.506 "aliases": [ 00:13:14.506 "eedefc26-cfb8-4d1d-94dc-21388f45f6cb" 00:13:14.506 ], 00:13:14.506 "product_name": "Malloc disk", 00:13:14.506 "block_size": 512, 00:13:14.506 "num_blocks": 65536, 00:13:14.506 "uuid": "eedefc26-cfb8-4d1d-94dc-21388f45f6cb", 00:13:14.506 "assigned_rate_limits": { 00:13:14.506 "rw_ios_per_sec": 0, 00:13:14.506 "rw_mbytes_per_sec": 0, 00:13:14.506 "r_mbytes_per_sec": 0, 00:13:14.506 "w_mbytes_per_sec": 0 00:13:14.506 }, 00:13:14.506 "claimed": true, 00:13:14.506 "claim_type": "exclusive_write", 00:13:14.506 "zoned": false, 00:13:14.506 "supported_io_types": { 00:13:14.506 "read": true, 00:13:14.506 "write": true, 00:13:14.506 "unmap": true, 00:13:14.506 "flush": true, 00:13:14.506 "reset": true, 00:13:14.506 "nvme_admin": false, 00:13:14.506 "nvme_io": false, 00:13:14.506 "nvme_io_md": false, 00:13:14.506 "write_zeroes": true, 00:13:14.506 "zcopy": true, 00:13:14.506 "get_zone_info": false, 00:13:14.506 "zone_management": false, 00:13:14.506 "zone_append": false, 00:13:14.506 "compare": false, 00:13:14.506 "compare_and_write": false, 00:13:14.506 "abort": true, 00:13:14.506 "seek_hole": false, 00:13:14.506 "seek_data": false, 00:13:14.506 "copy": true, 00:13:14.506 "nvme_iov_md": false 00:13:14.506 }, 00:13:14.506 "memory_domains": [ 00:13:14.506 { 00:13:14.506 "dma_device_id": "system", 00:13:14.506 "dma_device_type": 1 00:13:14.506 }, 00:13:14.506 { 
00:13:14.506 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:14.506 "dma_device_type": 2 00:13:14.506 } 00:13:14.506 ], 00:13:14.506 "driver_specific": {} 00:13:14.506 }' 00:13:14.506 05:41:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:14.506 05:41:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:14.763 05:41:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:14.763 05:41:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:14.763 05:41:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:14.763 05:41:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:14.763 05:41:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:14.763 05:41:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:14.763 05:41:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:14.763 05:41:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:14.763 05:41:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:15.021 05:41:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:15.021 05:41:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:15.021 05:41:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:15.021 05:41:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:13:15.021 05:41:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:15.021 "name": 
"BaseBdev2", 00:13:15.021 "aliases": [ 00:13:15.021 "1fc07845-fe64-4666-8e0b-7c5fa1922f19" 00:13:15.021 ], 00:13:15.021 "product_name": "Malloc disk", 00:13:15.021 "block_size": 512, 00:13:15.021 "num_blocks": 65536, 00:13:15.021 "uuid": "1fc07845-fe64-4666-8e0b-7c5fa1922f19", 00:13:15.021 "assigned_rate_limits": { 00:13:15.021 "rw_ios_per_sec": 0, 00:13:15.021 "rw_mbytes_per_sec": 0, 00:13:15.021 "r_mbytes_per_sec": 0, 00:13:15.021 "w_mbytes_per_sec": 0 00:13:15.021 }, 00:13:15.021 "claimed": true, 00:13:15.021 "claim_type": "exclusive_write", 00:13:15.021 "zoned": false, 00:13:15.021 "supported_io_types": { 00:13:15.021 "read": true, 00:13:15.021 "write": true, 00:13:15.021 "unmap": true, 00:13:15.021 "flush": true, 00:13:15.021 "reset": true, 00:13:15.021 "nvme_admin": false, 00:13:15.021 "nvme_io": false, 00:13:15.021 "nvme_io_md": false, 00:13:15.021 "write_zeroes": true, 00:13:15.021 "zcopy": true, 00:13:15.021 "get_zone_info": false, 00:13:15.021 "zone_management": false, 00:13:15.021 "zone_append": false, 00:13:15.021 "compare": false, 00:13:15.021 "compare_and_write": false, 00:13:15.021 "abort": true, 00:13:15.021 "seek_hole": false, 00:13:15.021 "seek_data": false, 00:13:15.021 "copy": true, 00:13:15.021 "nvme_iov_md": false 00:13:15.021 }, 00:13:15.021 "memory_domains": [ 00:13:15.021 { 00:13:15.021 "dma_device_id": "system", 00:13:15.021 "dma_device_type": 1 00:13:15.021 }, 00:13:15.021 { 00:13:15.021 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:15.021 "dma_device_type": 2 00:13:15.021 } 00:13:15.021 ], 00:13:15.021 "driver_specific": {} 00:13:15.021 }' 00:13:15.021 05:41:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:15.021 05:41:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:15.279 05:41:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:15.279 05:41:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 
-- # jq .md_size 00:13:15.279 05:41:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:15.279 05:41:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:15.279 05:41:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:15.279 05:41:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:15.279 05:41:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:15.279 05:41:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:15.279 05:41:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:15.537 05:41:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:15.537 05:41:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:13:15.795 [2024-07-26 05:41:30.452165] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:13:15.795 [2024-07-26 05:41:30.452191] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:15.795 [2024-07-26 05:41:30.452233] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:15.795 05:41:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:13:15.795 05:41:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:13:15.795 05:41:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:13:15.795 05:41:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:13:15.796 05:41:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:13:15.796 05:41:30 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 1 00:13:15.796 05:41:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:15.796 05:41:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:13:15.796 05:41:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:15.796 05:41:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:15.796 05:41:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:13:15.796 05:41:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:15.796 05:41:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:15.796 05:41:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:15.796 05:41:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:15.796 05:41:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:15.796 05:41:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:16.053 05:41:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:16.053 "name": "Existed_Raid", 00:13:16.053 "uuid": "9cd82e5e-3eb0-4328-b08e-dc558e8e2510", 00:13:16.053 "strip_size_kb": 64, 00:13:16.054 "state": "offline", 00:13:16.054 "raid_level": "raid0", 00:13:16.054 "superblock": true, 00:13:16.054 "num_base_bdevs": 2, 00:13:16.054 "num_base_bdevs_discovered": 1, 00:13:16.054 "num_base_bdevs_operational": 1, 00:13:16.054 "base_bdevs_list": [ 
00:13:16.054 { 00:13:16.054 "name": null, 00:13:16.054 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:16.054 "is_configured": false, 00:13:16.054 "data_offset": 2048, 00:13:16.054 "data_size": 63488 00:13:16.054 }, 00:13:16.054 { 00:13:16.054 "name": "BaseBdev2", 00:13:16.054 "uuid": "1fc07845-fe64-4666-8e0b-7c5fa1922f19", 00:13:16.054 "is_configured": true, 00:13:16.054 "data_offset": 2048, 00:13:16.054 "data_size": 63488 00:13:16.054 } 00:13:16.054 ] 00:13:16.054 }' 00:13:16.054 05:41:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:16.054 05:41:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:16.619 05:41:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:13:16.619 05:41:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:13:16.619 05:41:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:16.619 05:41:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:13:16.877 05:41:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:13:16.877 05:41:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:13:16.877 05:41:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:13:17.136 [2024-07-26 05:41:31.796787] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:13:17.136 [2024-07-26 05:41:31.796834] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2343000 name Existed_Raid, state offline 00:13:17.136 05:41:31 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@285 -- # (( i++ )) 00:13:17.136 05:41:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:13:17.136 05:41:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:17.136 05:41:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:13:17.703 05:41:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:13:17.703 05:41:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:13:17.703 05:41:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:13:17.703 05:41:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 1127766 00:13:17.703 05:41:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 1127766 ']' 00:13:17.703 05:41:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # kill -0 1127766 00:13:17.703 05:41:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:13:17.703 05:41:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:13:17.703 05:41:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1127766 00:13:17.703 05:41:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:13:17.703 05:41:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:13:17.703 05:41:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1127766' 00:13:17.703 killing process with pid 1127766 00:13:17.703 05:41:32 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@967 -- # kill 1127766 00:13:17.703 [2024-07-26 05:41:32.397585] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:13:17.703 05:41:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 1127766 00:13:17.703 [2024-07-26 05:41:32.398455] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:13:17.703 05:41:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:13:17.703 00:13:17.703 real 0m10.577s 00:13:17.703 user 0m18.828s 00:13:17.703 sys 0m1.938s 00:13:17.703 05:41:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:13:17.703 05:41:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:17.703 ************************************ 00:13:17.703 END TEST raid_state_function_test_sb 00:13:17.703 ************************************ 00:13:17.962 05:41:32 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:13:17.962 05:41:32 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test raid0 2 00:13:17.962 05:41:32 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:13:17.962 05:41:32 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:13:17.962 05:41:32 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:13:17.962 ************************************ 00:13:17.962 START TEST raid_superblock_test 00:13:17.962 ************************************ 00:13:17.962 05:41:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test raid0 2 00:13:17.962 05:41:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=raid0 00:13:17.962 05:41:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=2 00:13:17.962 05:41:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:13:17.962 05:41:32 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:13:17.962 05:41:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:13:17.962 05:41:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:13:17.962 05:41:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:13:17.962 05:41:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:13:17.962 05:41:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:13:17.962 05:41:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:13:17.962 05:41:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:13:17.962 05:41:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:13:17.962 05:41:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:13:17.962 05:41:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' raid0 '!=' raid1 ']' 00:13:17.962 05:41:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # strip_size=64 00:13:17.962 05:41:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size_create_arg='-z 64' 00:13:17.962 05:41:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=1129396 00:13:17.962 05:41:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:13:17.962 05:41:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 1129396 /var/tmp/spdk-raid.sock 00:13:17.962 05:41:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 1129396 ']' 00:13:17.962 05:41:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local 
rpc_addr=/var/tmp/spdk-raid.sock 00:13:17.962 05:41:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:13:17.962 05:41:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:13:17.962 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:13:17.962 05:41:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:13:17.962 05:41:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:13:17.962 [2024-07-26 05:41:32.746928] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 00:13:17.962 [2024-07-26 05:41:32.746988] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1129396 ] 00:13:17.962 [2024-07-26 05:41:32.865103] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:13:18.221 [2024-07-26 05:41:32.963434] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:13:18.221 [2024-07-26 05:41:33.026552] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:18.221 [2024-07-26 05:41:33.026594] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:18.787 05:41:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:13:18.787 05:41:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:13:18.787 05:41:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:13:18.787 05:41:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:13:18.787 05:41:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local 
bdev_malloc=malloc1 00:13:18.787 05:41:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:13:18.787 05:41:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:13:18.787 05:41:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:13:18.787 05:41:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:13:18.787 05:41:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:13:18.787 05:41:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:13:19.045 malloc1 00:13:19.045 05:41:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:13:19.304 [2024-07-26 05:41:34.059657] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:13:19.304 [2024-07-26 05:41:34.059705] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:19.304 [2024-07-26 05:41:34.059727] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xfcb570 00:13:19.304 [2024-07-26 05:41:34.059739] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:19.304 [2024-07-26 05:41:34.061459] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:19.304 [2024-07-26 05:41:34.061488] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:13:19.304 pt1 00:13:19.304 05:41:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:13:19.304 05:41:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # 
(( i <= num_base_bdevs )) 00:13:19.304 05:41:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:13:19.304 05:41:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:13:19.304 05:41:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:13:19.304 05:41:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:13:19.304 05:41:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:13:19.304 05:41:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:13:19.304 05:41:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:13:19.562 malloc2 00:13:19.562 05:41:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:13:19.854 [2024-07-26 05:41:34.493524] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:13:19.854 [2024-07-26 05:41:34.493568] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:19.854 [2024-07-26 05:41:34.493587] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xfcc970 00:13:19.854 [2024-07-26 05:41:34.493599] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:19.854 [2024-07-26 05:41:34.495262] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:19.854 [2024-07-26 05:41:34.495290] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:13:19.854 pt2 00:13:19.854 05:41:34 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@415 -- # (( i++ )) 00:13:19.854 05:41:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:13:19.854 05:41:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'pt1 pt2' -n raid_bdev1 -s 00:13:19.854 [2024-07-26 05:41:34.738192] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:13:19.854 [2024-07-26 05:41:34.739538] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:13:19.854 [2024-07-26 05:41:34.739693] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x116f270 00:13:19.854 [2024-07-26 05:41:34.739707] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:13:19.855 [2024-07-26 05:41:34.739910] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1164c10 00:13:19.855 [2024-07-26 05:41:34.740057] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x116f270 00:13:19.855 [2024-07-26 05:41:34.740067] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x116f270 00:13:19.855 [2024-07-26 05:41:34.740174] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:19.855 05:41:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:13:19.855 05:41:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:19.855 05:41:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:19.855 05:41:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:19.855 05:41:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:19.855 05:41:34 bdev_raid.raid_superblock_test 
-- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:19.855 05:41:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:19.855 05:41:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:19.855 05:41:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:19.855 05:41:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:20.112 05:41:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:20.112 05:41:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:20.112 05:41:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:20.112 "name": "raid_bdev1", 00:13:20.112 "uuid": "42e978e1-7f30-4be2-8cd6-dbe1e8f3287c", 00:13:20.112 "strip_size_kb": 64, 00:13:20.112 "state": "online", 00:13:20.112 "raid_level": "raid0", 00:13:20.112 "superblock": true, 00:13:20.112 "num_base_bdevs": 2, 00:13:20.112 "num_base_bdevs_discovered": 2, 00:13:20.112 "num_base_bdevs_operational": 2, 00:13:20.112 "base_bdevs_list": [ 00:13:20.112 { 00:13:20.112 "name": "pt1", 00:13:20.112 "uuid": "00000000-0000-0000-0000-000000000001", 00:13:20.112 "is_configured": true, 00:13:20.112 "data_offset": 2048, 00:13:20.112 "data_size": 63488 00:13:20.112 }, 00:13:20.112 { 00:13:20.112 "name": "pt2", 00:13:20.112 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:20.112 "is_configured": true, 00:13:20.112 "data_offset": 2048, 00:13:20.112 "data_size": 63488 00:13:20.112 } 00:13:20.112 ] 00:13:20.112 }' 00:13:20.112 05:41:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:20.112 05:41:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:13:20.680 05:41:35 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:13:20.680 05:41:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:13:20.680 05:41:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:13:20.680 05:41:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:13:20.680 05:41:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:13:20.680 05:41:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:13:20.680 05:41:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:13:20.680 05:41:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:13:20.939 [2024-07-26 05:41:35.809222] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:20.939 05:41:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:13:20.939 "name": "raid_bdev1", 00:13:20.939 "aliases": [ 00:13:20.940 "42e978e1-7f30-4be2-8cd6-dbe1e8f3287c" 00:13:20.940 ], 00:13:20.940 "product_name": "Raid Volume", 00:13:20.940 "block_size": 512, 00:13:20.940 "num_blocks": 126976, 00:13:20.940 "uuid": "42e978e1-7f30-4be2-8cd6-dbe1e8f3287c", 00:13:20.940 "assigned_rate_limits": { 00:13:20.940 "rw_ios_per_sec": 0, 00:13:20.940 "rw_mbytes_per_sec": 0, 00:13:20.940 "r_mbytes_per_sec": 0, 00:13:20.940 "w_mbytes_per_sec": 0 00:13:20.940 }, 00:13:20.940 "claimed": false, 00:13:20.940 "zoned": false, 00:13:20.940 "supported_io_types": { 00:13:20.940 "read": true, 00:13:20.940 "write": true, 00:13:20.940 "unmap": true, 00:13:20.940 "flush": true, 00:13:20.940 "reset": true, 00:13:20.940 "nvme_admin": false, 00:13:20.940 "nvme_io": false, 00:13:20.940 "nvme_io_md": false, 00:13:20.940 "write_zeroes": 
true, 00:13:20.940 "zcopy": false, 00:13:20.940 "get_zone_info": false, 00:13:20.940 "zone_management": false, 00:13:20.940 "zone_append": false, 00:13:20.940 "compare": false, 00:13:20.940 "compare_and_write": false, 00:13:20.940 "abort": false, 00:13:20.940 "seek_hole": false, 00:13:20.940 "seek_data": false, 00:13:20.940 "copy": false, 00:13:20.940 "nvme_iov_md": false 00:13:20.940 }, 00:13:20.940 "memory_domains": [ 00:13:20.940 { 00:13:20.940 "dma_device_id": "system", 00:13:20.940 "dma_device_type": 1 00:13:20.940 }, 00:13:20.940 { 00:13:20.940 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:20.940 "dma_device_type": 2 00:13:20.940 }, 00:13:20.940 { 00:13:20.940 "dma_device_id": "system", 00:13:20.940 "dma_device_type": 1 00:13:20.940 }, 00:13:20.940 { 00:13:20.940 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:20.940 "dma_device_type": 2 00:13:20.940 } 00:13:20.940 ], 00:13:20.940 "driver_specific": { 00:13:20.940 "raid": { 00:13:20.940 "uuid": "42e978e1-7f30-4be2-8cd6-dbe1e8f3287c", 00:13:20.940 "strip_size_kb": 64, 00:13:20.940 "state": "online", 00:13:20.940 "raid_level": "raid0", 00:13:20.940 "superblock": true, 00:13:20.940 "num_base_bdevs": 2, 00:13:20.940 "num_base_bdevs_discovered": 2, 00:13:20.940 "num_base_bdevs_operational": 2, 00:13:20.940 "base_bdevs_list": [ 00:13:20.940 { 00:13:20.940 "name": "pt1", 00:13:20.940 "uuid": "00000000-0000-0000-0000-000000000001", 00:13:20.940 "is_configured": true, 00:13:20.940 "data_offset": 2048, 00:13:20.940 "data_size": 63488 00:13:20.940 }, 00:13:20.940 { 00:13:20.940 "name": "pt2", 00:13:20.940 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:20.940 "is_configured": true, 00:13:20.940 "data_offset": 2048, 00:13:20.940 "data_size": 63488 00:13:20.940 } 00:13:20.940 ] 00:13:20.940 } 00:13:20.940 } 00:13:20.940 }' 00:13:20.940 05:41:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:13:21.199 05:41:35 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:13:21.199 pt2' 00:13:21.199 05:41:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:21.199 05:41:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:13:21.199 05:41:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:21.458 05:41:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:21.458 "name": "pt1", 00:13:21.458 "aliases": [ 00:13:21.458 "00000000-0000-0000-0000-000000000001" 00:13:21.458 ], 00:13:21.458 "product_name": "passthru", 00:13:21.458 "block_size": 512, 00:13:21.458 "num_blocks": 65536, 00:13:21.458 "uuid": "00000000-0000-0000-0000-000000000001", 00:13:21.458 "assigned_rate_limits": { 00:13:21.458 "rw_ios_per_sec": 0, 00:13:21.458 "rw_mbytes_per_sec": 0, 00:13:21.458 "r_mbytes_per_sec": 0, 00:13:21.458 "w_mbytes_per_sec": 0 00:13:21.458 }, 00:13:21.458 "claimed": true, 00:13:21.458 "claim_type": "exclusive_write", 00:13:21.458 "zoned": false, 00:13:21.458 "supported_io_types": { 00:13:21.458 "read": true, 00:13:21.458 "write": true, 00:13:21.458 "unmap": true, 00:13:21.458 "flush": true, 00:13:21.458 "reset": true, 00:13:21.458 "nvme_admin": false, 00:13:21.458 "nvme_io": false, 00:13:21.458 "nvme_io_md": false, 00:13:21.458 "write_zeroes": true, 00:13:21.458 "zcopy": true, 00:13:21.458 "get_zone_info": false, 00:13:21.458 "zone_management": false, 00:13:21.458 "zone_append": false, 00:13:21.458 "compare": false, 00:13:21.458 "compare_and_write": false, 00:13:21.458 "abort": true, 00:13:21.458 "seek_hole": false, 00:13:21.458 "seek_data": false, 00:13:21.458 "copy": true, 00:13:21.458 "nvme_iov_md": false 00:13:21.458 }, 00:13:21.458 "memory_domains": [ 00:13:21.458 { 00:13:21.458 "dma_device_id": "system", 00:13:21.458 
"dma_device_type": 1 00:13:21.458 }, 00:13:21.458 { 00:13:21.458 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:21.458 "dma_device_type": 2 00:13:21.458 } 00:13:21.458 ], 00:13:21.458 "driver_specific": { 00:13:21.458 "passthru": { 00:13:21.458 "name": "pt1", 00:13:21.458 "base_bdev_name": "malloc1" 00:13:21.458 } 00:13:21.458 } 00:13:21.458 }' 00:13:21.458 05:41:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:21.458 05:41:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:21.458 05:41:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:21.458 05:41:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:21.458 05:41:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:21.458 05:41:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:21.458 05:41:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:21.718 05:41:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:21.718 05:41:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:21.718 05:41:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:21.718 05:41:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:21.718 05:41:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:21.718 05:41:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:21.718 05:41:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:13:21.718 05:41:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:21.979 05:41:36 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:21.979 "name": "pt2", 00:13:21.979 "aliases": [ 00:13:21.979 "00000000-0000-0000-0000-000000000002" 00:13:21.979 ], 00:13:21.979 "product_name": "passthru", 00:13:21.979 "block_size": 512, 00:13:21.979 "num_blocks": 65536, 00:13:21.979 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:21.979 "assigned_rate_limits": { 00:13:21.979 "rw_ios_per_sec": 0, 00:13:21.979 "rw_mbytes_per_sec": 0, 00:13:21.979 "r_mbytes_per_sec": 0, 00:13:21.979 "w_mbytes_per_sec": 0 00:13:21.979 }, 00:13:21.979 "claimed": true, 00:13:21.979 "claim_type": "exclusive_write", 00:13:21.979 "zoned": false, 00:13:21.979 "supported_io_types": { 00:13:21.979 "read": true, 00:13:21.979 "write": true, 00:13:21.979 "unmap": true, 00:13:21.979 "flush": true, 00:13:21.979 "reset": true, 00:13:21.979 "nvme_admin": false, 00:13:21.979 "nvme_io": false, 00:13:21.979 "nvme_io_md": false, 00:13:21.979 "write_zeroes": true, 00:13:21.979 "zcopy": true, 00:13:21.979 "get_zone_info": false, 00:13:21.979 "zone_management": false, 00:13:21.979 "zone_append": false, 00:13:21.979 "compare": false, 00:13:21.979 "compare_and_write": false, 00:13:21.979 "abort": true, 00:13:21.979 "seek_hole": false, 00:13:21.980 "seek_data": false, 00:13:21.980 "copy": true, 00:13:21.980 "nvme_iov_md": false 00:13:21.980 }, 00:13:21.980 "memory_domains": [ 00:13:21.980 { 00:13:21.980 "dma_device_id": "system", 00:13:21.980 "dma_device_type": 1 00:13:21.980 }, 00:13:21.980 { 00:13:21.980 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:21.980 "dma_device_type": 2 00:13:21.980 } 00:13:21.980 ], 00:13:21.980 "driver_specific": { 00:13:21.980 "passthru": { 00:13:21.980 "name": "pt2", 00:13:21.980 "base_bdev_name": "malloc2" 00:13:21.980 } 00:13:21.980 } 00:13:21.980 }' 00:13:21.980 05:41:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:21.980 05:41:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:21.980 05:41:36 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:21.980 05:41:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:21.980 05:41:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:22.239 05:41:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:22.239 05:41:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:22.239 05:41:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:22.239 05:41:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:22.239 05:41:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:22.239 05:41:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:22.239 05:41:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:22.239 05:41:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:13:22.239 05:41:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:13:22.498 [2024-07-26 05:41:37.317219] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:22.498 05:41:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=42e978e1-7f30-4be2-8cd6-dbe1e8f3287c 00:13:22.498 05:41:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z 42e978e1-7f30-4be2-8cd6-dbe1e8f3287c ']' 00:13:22.498 05:41:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:13:22.758 [2024-07-26 05:41:37.557616] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:13:22.758 
[2024-07-26 05:41:37.557636] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:22.758 [2024-07-26 05:41:37.557706] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:22.758 [2024-07-26 05:41:37.557751] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:22.758 [2024-07-26 05:41:37.557763] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x116f270 name raid_bdev1, state offline 00:13:22.758 05:41:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:22.758 05:41:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:13:23.017 05:41:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:13:23.017 05:41:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:13:23.017 05:41:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:13:23.017 05:41:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:13:23.017 05:41:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:13:23.017 05:41:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:13:23.276 05:41:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:13:23.276 05:41:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:13:23.843 05:41:38 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:13:23.843 05:41:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2' -n raid_bdev1 00:13:23.843 05:41:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:13:23.843 05:41:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2' -n raid_bdev1 00:13:23.843 05:41:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:13:23.843 05:41:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:13:23.843 05:41:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:13:23.843 05:41:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:13:23.843 05:41:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:13:23.843 05:41:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:13:23.843 05:41:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:13:23.843 05:41:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:13:23.843 05:41:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2' -n raid_bdev1 00:13:24.102 [2024-07-26 05:41:38.885077] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:13:24.102 [2024-07-26 05:41:38.886465] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:13:24.102 [2024-07-26 05:41:38.886529] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:13:24.102 [2024-07-26 05:41:38.886569] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:13:24.102 [2024-07-26 05:41:38.886588] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:13:24.102 [2024-07-26 05:41:38.886597] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x116eff0 name raid_bdev1, state configuring 00:13:24.102 request: 00:13:24.102 { 00:13:24.102 "name": "raid_bdev1", 00:13:24.102 "raid_level": "raid0", 00:13:24.102 "base_bdevs": [ 00:13:24.102 "malloc1", 00:13:24.102 "malloc2" 00:13:24.102 ], 00:13:24.102 "strip_size_kb": 64, 00:13:24.102 "superblock": false, 00:13:24.102 "method": "bdev_raid_create", 00:13:24.102 "req_id": 1 00:13:24.102 } 00:13:24.102 Got JSON-RPC error response 00:13:24.102 response: 00:13:24.102 { 00:13:24.102 "code": -17, 00:13:24.102 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:13:24.102 } 00:13:24.102 05:41:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:13:24.102 05:41:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:13:24.102 05:41:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:13:24.102 05:41:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:13:24.102 05:41:38 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:24.102 05:41:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:13:24.361 05:41:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:13:24.361 05:41:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:13:24.361 05:41:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:13:24.361 [2024-07-26 05:41:39.217904] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:13:24.361 [2024-07-26 05:41:39.217944] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:24.361 [2024-07-26 05:41:39.217965] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xfcb7a0 00:13:24.361 [2024-07-26 05:41:39.217977] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:24.361 [2024-07-26 05:41:39.219592] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:24.361 [2024-07-26 05:41:39.219619] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:13:24.361 [2024-07-26 05:41:39.219693] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:13:24.362 [2024-07-26 05:41:39.219719] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:13:24.362 pt1 00:13:24.362 05:41:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid0 64 2 00:13:24.362 05:41:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:24.362 05:41:39 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:24.362 05:41:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:24.362 05:41:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:24.362 05:41:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:24.362 05:41:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:24.362 05:41:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:24.362 05:41:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:24.362 05:41:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:24.362 05:41:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:24.362 05:41:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:24.620 05:41:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:24.620 "name": "raid_bdev1", 00:13:24.620 "uuid": "42e978e1-7f30-4be2-8cd6-dbe1e8f3287c", 00:13:24.620 "strip_size_kb": 64, 00:13:24.620 "state": "configuring", 00:13:24.620 "raid_level": "raid0", 00:13:24.620 "superblock": true, 00:13:24.620 "num_base_bdevs": 2, 00:13:24.620 "num_base_bdevs_discovered": 1, 00:13:24.620 "num_base_bdevs_operational": 2, 00:13:24.620 "base_bdevs_list": [ 00:13:24.620 { 00:13:24.620 "name": "pt1", 00:13:24.620 "uuid": "00000000-0000-0000-0000-000000000001", 00:13:24.620 "is_configured": true, 00:13:24.620 "data_offset": 2048, 00:13:24.620 "data_size": 63488 00:13:24.620 }, 00:13:24.620 { 00:13:24.620 "name": null, 00:13:24.620 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:24.620 
"is_configured": false, 00:13:24.620 "data_offset": 2048, 00:13:24.620 "data_size": 63488 00:13:24.620 } 00:13:24.620 ] 00:13:24.620 }' 00:13:24.620 05:41:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:24.620 05:41:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:13:25.189 05:41:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 2 -gt 2 ']' 00:13:25.189 05:41:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:13:25.189 05:41:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:13:25.189 05:41:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:13:25.447 [2024-07-26 05:41:40.256674] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:13:25.447 [2024-07-26 05:41:40.256725] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:25.447 [2024-07-26 05:41:40.256743] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1165820 00:13:25.447 [2024-07-26 05:41:40.256756] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:25.447 [2024-07-26 05:41:40.257099] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:25.447 [2024-07-26 05:41:40.257116] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:13:25.447 [2024-07-26 05:41:40.257178] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:13:25.447 [2024-07-26 05:41:40.257197] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:13:25.447 [2024-07-26 05:41:40.257288] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0xfc1ec0 00:13:25.447 [2024-07-26 
05:41:40.257299] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:13:25.447 [2024-07-26 05:41:40.257470] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xfc4530 00:13:25.447 [2024-07-26 05:41:40.257588] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xfc1ec0 00:13:25.447 [2024-07-26 05:41:40.257597] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xfc1ec0 00:13:25.447 [2024-07-26 05:41:40.257715] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:25.447 pt2 00:13:25.447 05:41:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:13:25.447 05:41:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:13:25.447 05:41:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:13:25.447 05:41:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:25.447 05:41:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:25.447 05:41:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:25.447 05:41:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:25.447 05:41:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:25.447 05:41:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:25.447 05:41:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:25.447 05:41:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:25.447 05:41:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:25.447 05:41:40 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:25.447 05:41:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:25.706 05:41:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:25.706 "name": "raid_bdev1", 00:13:25.706 "uuid": "42e978e1-7f30-4be2-8cd6-dbe1e8f3287c", 00:13:25.706 "strip_size_kb": 64, 00:13:25.706 "state": "online", 00:13:25.706 "raid_level": "raid0", 00:13:25.706 "superblock": true, 00:13:25.706 "num_base_bdevs": 2, 00:13:25.706 "num_base_bdevs_discovered": 2, 00:13:25.706 "num_base_bdevs_operational": 2, 00:13:25.706 "base_bdevs_list": [ 00:13:25.706 { 00:13:25.706 "name": "pt1", 00:13:25.706 "uuid": "00000000-0000-0000-0000-000000000001", 00:13:25.706 "is_configured": true, 00:13:25.706 "data_offset": 2048, 00:13:25.706 "data_size": 63488 00:13:25.706 }, 00:13:25.706 { 00:13:25.706 "name": "pt2", 00:13:25.706 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:25.706 "is_configured": true, 00:13:25.706 "data_offset": 2048, 00:13:25.706 "data_size": 63488 00:13:25.706 } 00:13:25.706 ] 00:13:25.706 }' 00:13:25.706 05:41:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:25.706 05:41:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:13:26.273 05:41:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:13:26.273 05:41:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:13:26.273 05:41:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:13:26.273 05:41:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:13:26.273 05:41:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:13:26.273 05:41:41 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:13:26.273 05:41:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:13:26.273 05:41:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:13:26.531 [2024-07-26 05:41:41.279613] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:26.531 05:41:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:13:26.531 "name": "raid_bdev1", 00:13:26.531 "aliases": [ 00:13:26.531 "42e978e1-7f30-4be2-8cd6-dbe1e8f3287c" 00:13:26.531 ], 00:13:26.531 "product_name": "Raid Volume", 00:13:26.531 "block_size": 512, 00:13:26.531 "num_blocks": 126976, 00:13:26.531 "uuid": "42e978e1-7f30-4be2-8cd6-dbe1e8f3287c", 00:13:26.531 "assigned_rate_limits": { 00:13:26.531 "rw_ios_per_sec": 0, 00:13:26.531 "rw_mbytes_per_sec": 0, 00:13:26.531 "r_mbytes_per_sec": 0, 00:13:26.531 "w_mbytes_per_sec": 0 00:13:26.531 }, 00:13:26.531 "claimed": false, 00:13:26.531 "zoned": false, 00:13:26.531 "supported_io_types": { 00:13:26.531 "read": true, 00:13:26.531 "write": true, 00:13:26.531 "unmap": true, 00:13:26.531 "flush": true, 00:13:26.531 "reset": true, 00:13:26.531 "nvme_admin": false, 00:13:26.531 "nvme_io": false, 00:13:26.531 "nvme_io_md": false, 00:13:26.531 "write_zeroes": true, 00:13:26.531 "zcopy": false, 00:13:26.531 "get_zone_info": false, 00:13:26.531 "zone_management": false, 00:13:26.531 "zone_append": false, 00:13:26.531 "compare": false, 00:13:26.531 "compare_and_write": false, 00:13:26.531 "abort": false, 00:13:26.531 "seek_hole": false, 00:13:26.531 "seek_data": false, 00:13:26.531 "copy": false, 00:13:26.531 "nvme_iov_md": false 00:13:26.531 }, 00:13:26.531 "memory_domains": [ 00:13:26.531 { 00:13:26.531 "dma_device_id": "system", 00:13:26.531 "dma_device_type": 1 00:13:26.531 }, 00:13:26.531 { 
00:13:26.531 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:26.531 "dma_device_type": 2 00:13:26.531 }, 00:13:26.531 { 00:13:26.531 "dma_device_id": "system", 00:13:26.531 "dma_device_type": 1 00:13:26.531 }, 00:13:26.531 { 00:13:26.531 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:26.531 "dma_device_type": 2 00:13:26.531 } 00:13:26.531 ], 00:13:26.531 "driver_specific": { 00:13:26.531 "raid": { 00:13:26.531 "uuid": "42e978e1-7f30-4be2-8cd6-dbe1e8f3287c", 00:13:26.531 "strip_size_kb": 64, 00:13:26.531 "state": "online", 00:13:26.531 "raid_level": "raid0", 00:13:26.531 "superblock": true, 00:13:26.531 "num_base_bdevs": 2, 00:13:26.531 "num_base_bdevs_discovered": 2, 00:13:26.531 "num_base_bdevs_operational": 2, 00:13:26.531 "base_bdevs_list": [ 00:13:26.531 { 00:13:26.531 "name": "pt1", 00:13:26.531 "uuid": "00000000-0000-0000-0000-000000000001", 00:13:26.531 "is_configured": true, 00:13:26.531 "data_offset": 2048, 00:13:26.531 "data_size": 63488 00:13:26.531 }, 00:13:26.531 { 00:13:26.531 "name": "pt2", 00:13:26.531 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:26.531 "is_configured": true, 00:13:26.531 "data_offset": 2048, 00:13:26.531 "data_size": 63488 00:13:26.531 } 00:13:26.531 ] 00:13:26.531 } 00:13:26.531 } 00:13:26.531 }' 00:13:26.531 05:41:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:13:26.531 05:41:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:13:26.531 pt2' 00:13:26.532 05:41:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:26.532 05:41:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:13:26.532 05:41:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:26.790 05:41:41 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:26.790 "name": "pt1", 00:13:26.790 "aliases": [ 00:13:26.790 "00000000-0000-0000-0000-000000000001" 00:13:26.790 ], 00:13:26.790 "product_name": "passthru", 00:13:26.790 "block_size": 512, 00:13:26.790 "num_blocks": 65536, 00:13:26.790 "uuid": "00000000-0000-0000-0000-000000000001", 00:13:26.790 "assigned_rate_limits": { 00:13:26.790 "rw_ios_per_sec": 0, 00:13:26.790 "rw_mbytes_per_sec": 0, 00:13:26.790 "r_mbytes_per_sec": 0, 00:13:26.790 "w_mbytes_per_sec": 0 00:13:26.790 }, 00:13:26.790 "claimed": true, 00:13:26.790 "claim_type": "exclusive_write", 00:13:26.790 "zoned": false, 00:13:26.790 "supported_io_types": { 00:13:26.790 "read": true, 00:13:26.790 "write": true, 00:13:26.790 "unmap": true, 00:13:26.790 "flush": true, 00:13:26.790 "reset": true, 00:13:26.790 "nvme_admin": false, 00:13:26.790 "nvme_io": false, 00:13:26.790 "nvme_io_md": false, 00:13:26.790 "write_zeroes": true, 00:13:26.790 "zcopy": true, 00:13:26.790 "get_zone_info": false, 00:13:26.790 "zone_management": false, 00:13:26.790 "zone_append": false, 00:13:26.790 "compare": false, 00:13:26.790 "compare_and_write": false, 00:13:26.790 "abort": true, 00:13:26.790 "seek_hole": false, 00:13:26.790 "seek_data": false, 00:13:26.790 "copy": true, 00:13:26.790 "nvme_iov_md": false 00:13:26.790 }, 00:13:26.790 "memory_domains": [ 00:13:26.790 { 00:13:26.790 "dma_device_id": "system", 00:13:26.790 "dma_device_type": 1 00:13:26.790 }, 00:13:26.790 { 00:13:26.790 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:26.790 "dma_device_type": 2 00:13:26.790 } 00:13:26.790 ], 00:13:26.790 "driver_specific": { 00:13:26.790 "passthru": { 00:13:26.790 "name": "pt1", 00:13:26.790 "base_bdev_name": "malloc1" 00:13:26.790 } 00:13:26.790 } 00:13:26.790 }' 00:13:26.790 05:41:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:26.790 05:41:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq 
.block_size 00:13:26.790 05:41:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:26.790 05:41:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:26.790 05:41:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:26.790 05:41:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:26.790 05:41:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:27.049 05:41:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:27.049 05:41:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:27.049 05:41:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:27.049 05:41:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:27.049 05:41:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:27.049 05:41:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:27.049 05:41:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:13:27.049 05:41:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:27.307 05:41:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:27.307 "name": "pt2", 00:13:27.307 "aliases": [ 00:13:27.307 "00000000-0000-0000-0000-000000000002" 00:13:27.307 ], 00:13:27.307 "product_name": "passthru", 00:13:27.307 "block_size": 512, 00:13:27.307 "num_blocks": 65536, 00:13:27.307 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:27.307 "assigned_rate_limits": { 00:13:27.307 "rw_ios_per_sec": 0, 00:13:27.307 "rw_mbytes_per_sec": 0, 00:13:27.307 "r_mbytes_per_sec": 0, 00:13:27.307 "w_mbytes_per_sec": 0 00:13:27.307 }, 
00:13:27.307 "claimed": true, 00:13:27.307 "claim_type": "exclusive_write", 00:13:27.307 "zoned": false, 00:13:27.307 "supported_io_types": { 00:13:27.307 "read": true, 00:13:27.307 "write": true, 00:13:27.307 "unmap": true, 00:13:27.307 "flush": true, 00:13:27.307 "reset": true, 00:13:27.307 "nvme_admin": false, 00:13:27.307 "nvme_io": false, 00:13:27.307 "nvme_io_md": false, 00:13:27.307 "write_zeroes": true, 00:13:27.307 "zcopy": true, 00:13:27.307 "get_zone_info": false, 00:13:27.307 "zone_management": false, 00:13:27.307 "zone_append": false, 00:13:27.307 "compare": false, 00:13:27.307 "compare_and_write": false, 00:13:27.307 "abort": true, 00:13:27.307 "seek_hole": false, 00:13:27.307 "seek_data": false, 00:13:27.307 "copy": true, 00:13:27.307 "nvme_iov_md": false 00:13:27.307 }, 00:13:27.307 "memory_domains": [ 00:13:27.307 { 00:13:27.308 "dma_device_id": "system", 00:13:27.308 "dma_device_type": 1 00:13:27.308 }, 00:13:27.308 { 00:13:27.308 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:27.308 "dma_device_type": 2 00:13:27.308 } 00:13:27.308 ], 00:13:27.308 "driver_specific": { 00:13:27.308 "passthru": { 00:13:27.308 "name": "pt2", 00:13:27.308 "base_bdev_name": "malloc2" 00:13:27.308 } 00:13:27.308 } 00:13:27.308 }' 00:13:27.308 05:41:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:27.308 05:41:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:27.308 05:41:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:27.308 05:41:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:27.566 05:41:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:27.566 05:41:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:27.566 05:41:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:27.566 05:41:42 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:27.566 05:41:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:27.566 05:41:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:27.566 05:41:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:27.825 05:41:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:27.825 05:41:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:13:27.825 05:41:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:13:27.825 [2024-07-26 05:41:42.703375] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:27.825 05:41:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' 42e978e1-7f30-4be2-8cd6-dbe1e8f3287c '!=' 42e978e1-7f30-4be2-8cd6-dbe1e8f3287c ']' 00:13:27.825 05:41:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy raid0 00:13:27.825 05:41:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:13:27.825 05:41:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:13:27.825 05:41:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 1129396 00:13:27.825 05:41:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 1129396 ']' 00:13:27.825 05:41:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 1129396 00:13:27.825 05:41:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:13:28.084 05:41:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:13:28.085 05:41:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1129396 00:13:28.085 
05:41:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:13:28.085 05:41:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:13:28.085 05:41:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1129396' 00:13:28.085 killing process with pid 1129396 00:13:28.085 05:41:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 1129396 00:13:28.085 [2024-07-26 05:41:42.775457] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:13:28.085 [2024-07-26 05:41:42.775512] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:28.085 [2024-07-26 05:41:42.775556] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:28.085 [2024-07-26 05:41:42.775567] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xfc1ec0 name raid_bdev1, state offline 00:13:28.085 05:41:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 1129396 00:13:28.085 [2024-07-26 05:41:42.791660] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:13:28.344 05:41:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:13:28.344 00:13:28.344 real 0m10.311s 00:13:28.344 user 0m18.461s 00:13:28.344 sys 0m1.883s 00:13:28.344 05:41:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:13:28.344 05:41:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:13:28.344 ************************************ 00:13:28.344 END TEST raid_superblock_test 00:13:28.344 ************************************ 00:13:28.344 05:41:43 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:13:28.344 05:41:43 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test raid0 2 read 00:13:28.344 05:41:43 bdev_raid -- 
common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:13:28.344 05:41:43 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:13:28.344 05:41:43 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:13:28.344 ************************************ 00:13:28.344 START TEST raid_read_error_test 00:13:28.344 ************************************ 00:13:28.344 05:41:43 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid0 2 read 00:13:28.344 05:41:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid0 00:13:28.344 05:41:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=2 00:13:28.344 05:41:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:13:28.344 05:41:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:13:28.344 05:41:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:13:28.344 05:41:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:13:28.344 05:41:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:13:28.344 05:41:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:13:28.344 05:41:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:13:28.344 05:41:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:13:28.344 05:41:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:13:28.344 05:41:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:13:28.344 05:41:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:13:28.344 05:41:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:13:28.344 05:41:43 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@793 -- # local strip_size 00:13:28.344 05:41:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:13:28.344 05:41:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:13:28.344 05:41:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:13:28.344 05:41:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid0 '!=' raid1 ']' 00:13:28.344 05:41:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:13:28.344 05:41:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:13:28.344 05:41:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:13:28.344 05:41:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.7PGNxO5Ame 00:13:28.344 05:41:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:13:28.344 05:41:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=1130971 00:13:28.344 05:41:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 1130971 /var/tmp/spdk-raid.sock 00:13:28.344 05:41:43 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 1130971 ']' 00:13:28.344 05:41:43 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:13:28.344 05:41:43 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:13:28.344 05:41:43 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 
00:13:28.344 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:13:28.344 05:41:43 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:13:28.344 05:41:43 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:28.344 [2024-07-26 05:41:43.143469] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 00:13:28.344 [2024-07-26 05:41:43.143531] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1130971 ] 00:13:28.602 [2024-07-26 05:41:43.274060] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:13:28.602 [2024-07-26 05:41:43.380570] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:13:28.602 [2024-07-26 05:41:43.450101] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:28.602 [2024-07-26 05:41:43.450139] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:29.538 05:41:44 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:13:29.538 05:41:44 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:13:29.538 05:41:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:13:29.538 05:41:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:13:29.538 BaseBdev1_malloc 00:13:29.538 05:41:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:13:29.798 true 00:13:29.798 05:41:44 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:13:30.057 [2024-07-26 05:41:44.797029] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:13:30.057 [2024-07-26 05:41:44.797071] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:30.057 [2024-07-26 05:41:44.797093] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xee40d0 00:13:30.057 [2024-07-26 05:41:44.797105] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:30.057 [2024-07-26 05:41:44.799018] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:30.057 [2024-07-26 05:41:44.799046] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:13:30.057 BaseBdev1 00:13:30.057 05:41:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:13:30.057 05:41:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:13:30.316 BaseBdev2_malloc 00:13:30.316 05:41:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:13:30.575 true 00:13:30.575 05:41:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:13:30.834 [2024-07-26 05:41:45.504716] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:13:30.834 [2024-07-26 05:41:45.504758] vbdev_passthru.c: 
635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:30.834 [2024-07-26 05:41:45.504779] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xee8910 00:13:30.834 [2024-07-26 05:41:45.504791] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:30.834 [2024-07-26 05:41:45.506380] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:30.834 [2024-07-26 05:41:45.506407] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:13:30.834 BaseBdev2 00:13:30.834 05:41:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:13:30.834 [2024-07-26 05:41:45.669182] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:30.834 [2024-07-26 05:41:45.670494] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:30.834 [2024-07-26 05:41:45.670699] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0xeea320 00:13:30.834 [2024-07-26 05:41:45.670713] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:13:30.834 [2024-07-26 05:41:45.670906] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xee9270 00:13:30.834 [2024-07-26 05:41:45.671050] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xeea320 00:13:30.834 [2024-07-26 05:41:45.671060] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xeea320 00:13:30.834 [2024-07-26 05:41:45.671163] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:30.834 05:41:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:13:30.834 05:41:45 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:30.834 05:41:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:30.834 05:41:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:30.834 05:41:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:30.834 05:41:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:30.834 05:41:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:30.834 05:41:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:30.834 05:41:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:30.834 05:41:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:30.834 05:41:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:30.834 05:41:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:31.105 05:41:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:31.105 "name": "raid_bdev1", 00:13:31.105 "uuid": "c0aabc79-cdb3-47d3-864d-4b6801c164ab", 00:13:31.105 "strip_size_kb": 64, 00:13:31.105 "state": "online", 00:13:31.105 "raid_level": "raid0", 00:13:31.105 "superblock": true, 00:13:31.105 "num_base_bdevs": 2, 00:13:31.105 "num_base_bdevs_discovered": 2, 00:13:31.105 "num_base_bdevs_operational": 2, 00:13:31.105 "base_bdevs_list": [ 00:13:31.105 { 00:13:31.105 "name": "BaseBdev1", 00:13:31.105 "uuid": "5f6b96bd-bb65-55bb-a36b-45d03fb34d0a", 00:13:31.105 "is_configured": true, 00:13:31.105 "data_offset": 2048, 00:13:31.105 "data_size": 63488 00:13:31.105 }, 
00:13:31.105 { 00:13:31.105 "name": "BaseBdev2", 00:13:31.105 "uuid": "6e73a1a8-c555-5525-9080-916a474de2c0", 00:13:31.105 "is_configured": true, 00:13:31.105 "data_offset": 2048, 00:13:31.105 "data_size": 63488 00:13:31.105 } 00:13:31.105 ] 00:13:31.105 }' 00:13:31.106 05:41:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:31.106 05:41:45 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:31.681 05:41:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:13:31.681 05:41:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:13:31.940 [2024-07-26 05:41:46.648043] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xee59b0 00:13:32.877 05:41:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:13:33.136 05:41:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:13:33.136 05:41:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid0 = \r\a\i\d\1 ]] 00:13:33.136 05:41:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=2 00:13:33.136 05:41:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:13:33.136 05:41:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:33.136 05:41:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:33.136 05:41:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:33.136 05:41:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 
00:13:33.136 05:41:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:33.136 05:41:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:33.136 05:41:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:33.136 05:41:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:33.136 05:41:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:33.136 05:41:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:33.136 05:41:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:33.395 05:41:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:33.395 "name": "raid_bdev1", 00:13:33.395 "uuid": "c0aabc79-cdb3-47d3-864d-4b6801c164ab", 00:13:33.395 "strip_size_kb": 64, 00:13:33.395 "state": "online", 00:13:33.395 "raid_level": "raid0", 00:13:33.395 "superblock": true, 00:13:33.395 "num_base_bdevs": 2, 00:13:33.395 "num_base_bdevs_discovered": 2, 00:13:33.395 "num_base_bdevs_operational": 2, 00:13:33.395 "base_bdevs_list": [ 00:13:33.395 { 00:13:33.395 "name": "BaseBdev1", 00:13:33.395 "uuid": "5f6b96bd-bb65-55bb-a36b-45d03fb34d0a", 00:13:33.395 "is_configured": true, 00:13:33.395 "data_offset": 2048, 00:13:33.395 "data_size": 63488 00:13:33.395 }, 00:13:33.395 { 00:13:33.395 "name": "BaseBdev2", 00:13:33.395 "uuid": "6e73a1a8-c555-5525-9080-916a474de2c0", 00:13:33.395 "is_configured": true, 00:13:33.395 "data_offset": 2048, 00:13:33.395 "data_size": 63488 00:13:33.395 } 00:13:33.395 ] 00:13:33.395 }' 00:13:33.395 05:41:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:33.395 05:41:48 bdev_raid.raid_read_error_test -- 
common/autotest_common.sh@10 -- # set +x 00:13:34.345 05:41:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:13:34.345 [2024-07-26 05:41:49.123947] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:13:34.345 [2024-07-26 05:41:49.123988] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:34.345 [2024-07-26 05:41:49.127156] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:34.345 [2024-07-26 05:41:49.127187] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:34.345 [2024-07-26 05:41:49.127215] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:34.345 [2024-07-26 05:41:49.127226] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xeea320 name raid_bdev1, state offline 00:13:34.345 0 00:13:34.345 05:41:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 1130971 00:13:34.345 05:41:49 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 1130971 ']' 00:13:34.345 05:41:49 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 1130971 00:13:34.345 05:41:49 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:13:34.345 05:41:49 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:13:34.345 05:41:49 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1130971 00:13:34.345 05:41:49 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:13:34.345 05:41:49 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:13:34.345 05:41:49 bdev_raid.raid_read_error_test -- 
common/autotest_common.sh@966 -- # echo 'killing process with pid 1130971' 00:13:34.345 killing process with pid 1130971 00:13:34.345 05:41:49 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 1130971 00:13:34.345 [2024-07-26 05:41:49.191305] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:13:34.345 05:41:49 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 1130971 00:13:34.345 [2024-07-26 05:41:49.201894] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:13:34.620 05:41:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.7PGNxO5Ame 00:13:34.620 05:41:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:13:34.620 05:41:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:13:34.620 05:41:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.41 00:13:34.620 05:41:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid0 00:13:34.620 05:41:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:13:34.620 05:41:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:13:34.620 05:41:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.41 != \0\.\0\0 ]] 00:13:34.620 00:13:34.620 real 0m6.350s 00:13:34.620 user 0m10.060s 00:13:34.620 sys 0m1.062s 00:13:34.620 05:41:49 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:13:34.620 05:41:49 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:34.620 ************************************ 00:13:34.620 END TEST raid_read_error_test 00:13:34.620 ************************************ 00:13:34.620 05:41:49 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:13:34.620 05:41:49 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test raid0 2 write 
00:13:34.620 05:41:49 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:13:34.620 05:41:49 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:13:34.620 05:41:49 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:13:34.620 ************************************ 00:13:34.620 START TEST raid_write_error_test 00:13:34.620 ************************************ 00:13:34.620 05:41:49 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid0 2 write 00:13:34.620 05:41:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid0 00:13:34.620 05:41:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=2 00:13:34.620 05:41:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:13:34.620 05:41:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:13:34.620 05:41:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:13:34.620 05:41:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:13:34.620 05:41:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:13:34.620 05:41:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:13:34.620 05:41:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:13:34.620 05:41:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:13:34.620 05:41:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:13:34.620 05:41:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:13:34.620 05:41:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:13:34.620 05:41:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 
00:13:34.620 05:41:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:13:34.620 05:41:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:13:34.620 05:41:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:13:34.620 05:41:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:13:34.620 05:41:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid0 '!=' raid1 ']' 00:13:34.620 05:41:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:13:34.620 05:41:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:13:34.620 05:41:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:13:34.620 05:41:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.HacbyPRyas 00:13:34.620 05:41:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=1131843 00:13:34.620 05:41:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 1131843 /var/tmp/spdk-raid.sock 00:13:34.620 05:41:49 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 1131843 ']' 00:13:34.620 05:41:49 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:13:34.620 05:41:49 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:13:34.620 05:41:49 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:13:34.620 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
00:13:34.620 05:41:49 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:13:34.620 05:41:49 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:34.620 05:41:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:13:34.880 [2024-07-26 05:41:49.576291] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 00:13:34.880 [2024-07-26 05:41:49.576359] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1131843 ] 00:13:34.880 [2024-07-26 05:41:49.708520] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:13:35.139 [2024-07-26 05:41:49.816430] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:13:35.139 [2024-07-26 05:41:49.888323] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:35.139 [2024-07-26 05:41:49.888363] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:35.706 05:41:50 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:13:35.706 05:41:50 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:13:35.706 05:41:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:13:35.706 05:41:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:13:36.274 BaseBdev1_malloc 00:13:36.274 05:41:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:13:36.533 true 00:13:36.533 05:41:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:13:36.792 [2024-07-26 05:41:51.500290] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:13:36.792 [2024-07-26 05:41:51.500334] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:36.792 [2024-07-26 05:41:51.500355] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x12b50d0 00:13:36.792 [2024-07-26 05:41:51.500367] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:36.792 [2024-07-26 05:41:51.502180] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:36.792 [2024-07-26 05:41:51.502209] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:13:36.792 BaseBdev1 00:13:36.792 05:41:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:13:36.792 05:41:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:13:37.051 BaseBdev2_malloc 00:13:37.051 05:41:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:13:37.310 true 00:13:37.310 05:41:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:13:37.569 [2024-07-26 05:41:52.258862] 
vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:13:37.569 [2024-07-26 05:41:52.258910] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:37.569 [2024-07-26 05:41:52.258932] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x12b9910 00:13:37.569 [2024-07-26 05:41:52.258945] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:37.569 [2024-07-26 05:41:52.260537] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:37.569 [2024-07-26 05:41:52.260566] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:13:37.569 BaseBdev2 00:13:37.569 05:41:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:13:37.829 [2024-07-26 05:41:52.503538] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:37.829 [2024-07-26 05:41:52.504916] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:37.829 [2024-07-26 05:41:52.505108] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x12bb320 00:13:37.829 [2024-07-26 05:41:52.505121] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:13:37.829 [2024-07-26 05:41:52.505322] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x12ba270 00:13:37.829 [2024-07-26 05:41:52.505468] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x12bb320 00:13:37.829 [2024-07-26 05:41:52.505478] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x12bb320 00:13:37.829 [2024-07-26 05:41:52.505581] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:37.829 05:41:52 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:13:37.829 05:41:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:37.829 05:41:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:37.829 05:41:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:37.829 05:41:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:37.829 05:41:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:37.829 05:41:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:37.829 05:41:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:37.829 05:41:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:37.829 05:41:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:37.829 05:41:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:37.829 05:41:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:38.088 05:41:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:38.088 "name": "raid_bdev1", 00:13:38.088 "uuid": "5f3cb20b-78ab-4883-b371-b2c26118367e", 00:13:38.088 "strip_size_kb": 64, 00:13:38.088 "state": "online", 00:13:38.088 "raid_level": "raid0", 00:13:38.088 "superblock": true, 00:13:38.088 "num_base_bdevs": 2, 00:13:38.088 "num_base_bdevs_discovered": 2, 00:13:38.088 "num_base_bdevs_operational": 2, 00:13:38.088 "base_bdevs_list": [ 00:13:38.088 { 00:13:38.088 "name": "BaseBdev1", 00:13:38.088 "uuid": 
"da4f33ef-a3be-5b44-bbf7-e9c237bd0eed", 00:13:38.088 "is_configured": true, 00:13:38.088 "data_offset": 2048, 00:13:38.088 "data_size": 63488 00:13:38.088 }, 00:13:38.088 { 00:13:38.088 "name": "BaseBdev2", 00:13:38.088 "uuid": "5e4f4042-e99b-5998-96ce-6f4fe846e589", 00:13:38.088 "is_configured": true, 00:13:38.088 "data_offset": 2048, 00:13:38.088 "data_size": 63488 00:13:38.088 } 00:13:38.088 ] 00:13:38.088 }' 00:13:38.088 05:41:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:38.088 05:41:52 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:38.654 05:41:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:13:38.654 05:41:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:13:38.913 [2024-07-26 05:41:53.578686] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x12b69b0 00:13:39.481 05:41:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:13:40.049 05:41:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:13:40.049 05:41:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid0 = \r\a\i\d\1 ]] 00:13:40.049 05:41:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=2 00:13:40.049 05:41:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:13:40.049 05:41:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:40.049 05:41:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:40.049 05:41:54 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:40.049 05:41:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:40.049 05:41:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:40.049 05:41:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:40.049 05:41:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:40.049 05:41:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:40.049 05:41:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:40.049 05:41:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:40.049 05:41:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:40.309 05:41:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:40.309 "name": "raid_bdev1", 00:13:40.309 "uuid": "5f3cb20b-78ab-4883-b371-b2c26118367e", 00:13:40.309 "strip_size_kb": 64, 00:13:40.309 "state": "online", 00:13:40.309 "raid_level": "raid0", 00:13:40.309 "superblock": true, 00:13:40.309 "num_base_bdevs": 2, 00:13:40.309 "num_base_bdevs_discovered": 2, 00:13:40.309 "num_base_bdevs_operational": 2, 00:13:40.309 "base_bdevs_list": [ 00:13:40.309 { 00:13:40.309 "name": "BaseBdev1", 00:13:40.309 "uuid": "da4f33ef-a3be-5b44-bbf7-e9c237bd0eed", 00:13:40.309 "is_configured": true, 00:13:40.309 "data_offset": 2048, 00:13:40.309 "data_size": 63488 00:13:40.309 }, 00:13:40.309 { 00:13:40.309 "name": "BaseBdev2", 00:13:40.309 "uuid": "5e4f4042-e99b-5998-96ce-6f4fe846e589", 00:13:40.309 "is_configured": true, 00:13:40.309 "data_offset": 2048, 00:13:40.309 "data_size": 63488 00:13:40.309 
} 00:13:40.309 ] 00:13:40.309 }' 00:13:40.309 05:41:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:40.309 05:41:55 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:40.877 05:41:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:13:41.136 [2024-07-26 05:41:55.949351] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:13:41.136 [2024-07-26 05:41:55.949392] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:41.136 [2024-07-26 05:41:55.952555] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:41.137 [2024-07-26 05:41:55.952586] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:41.137 [2024-07-26 05:41:55.952613] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:41.137 [2024-07-26 05:41:55.952624] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x12bb320 name raid_bdev1, state offline 00:13:41.137 0 00:13:41.137 05:41:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 1131843 00:13:41.137 05:41:55 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 1131843 ']' 00:13:41.137 05:41:55 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 1131843 00:13:41.137 05:41:55 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:13:41.137 05:41:55 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:13:41.137 05:41:55 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1131843 00:13:41.137 05:41:56 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # 
process_name=reactor_0 00:13:41.137 05:41:56 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:13:41.137 05:41:56 bdev_raid.raid_write_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1131843' 00:13:41.137 killing process with pid 1131843 00:13:41.137 05:41:56 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 1131843 00:13:41.137 [2024-07-26 05:41:56.020668] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:13:41.137 05:41:56 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 1131843 00:13:41.137 [2024-07-26 05:41:56.030915] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:13:41.396 05:41:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.HacbyPRyas 00:13:41.396 05:41:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:13:41.396 05:41:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:13:41.396 05:41:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.42 00:13:41.396 05:41:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid0 00:13:41.396 05:41:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:13:41.396 05:41:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:13:41.396 05:41:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.42 != \0\.\0\0 ]] 00:13:41.396 00:13:41.396 real 0m6.743s 00:13:41.396 user 0m10.804s 00:13:41.396 sys 0m1.174s 00:13:41.396 05:41:56 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:13:41.396 05:41:56 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:41.396 ************************************ 00:13:41.396 END TEST raid_write_error_test 00:13:41.396 
************************************ 00:13:41.396 05:41:56 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:13:41.396 05:41:56 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:13:41.396 05:41:56 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test concat 2 false 00:13:41.396 05:41:56 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:13:41.396 05:41:56 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:13:41.396 05:41:56 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:13:41.656 ************************************ 00:13:41.656 START TEST raid_state_function_test 00:13:41.656 ************************************ 00:13:41.656 05:41:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test concat 2 false 00:13:41.656 05:41:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:13:41.656 05:41:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:13:41.656 05:41:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:13:41.656 05:41:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:13:41.656 05:41:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:13:41.656 05:41:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:41.656 05:41:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:13:41.656 05:41:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:13:41.656 05:41:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:41.656 05:41:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:13:41.656 05:41:56 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:13:41.656 05:41:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:41.656 05:41:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:13:41.656 05:41:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:13:41.656 05:41:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:13:41.656 05:41:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:13:41.656 05:41:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:13:41.656 05:41:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:13:41.656 05:41:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:13:41.656 05:41:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:13:41.656 05:41:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:13:41.656 05:41:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:13:41.656 05:41:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:13:41.656 05:41:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=1132814 00:13:41.656 05:41:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1132814' 00:13:41.656 Process raid pid: 1132814 00:13:41.656 05:41:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:13:41.656 05:41:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 1132814 /var/tmp/spdk-raid.sock 
00:13:41.656 05:41:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 1132814 ']' 00:13:41.656 05:41:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:13:41.656 05:41:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:13:41.656 05:41:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:13:41.656 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:13:41.656 05:41:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:13:41.656 05:41:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:41.656 [2024-07-26 05:41:56.403839] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 00:13:41.656 [2024-07-26 05:41:56.403906] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:13:41.656 [2024-07-26 05:41:56.537443] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:13:41.916 [2024-07-26 05:41:56.645132] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:13:41.916 [2024-07-26 05:41:56.716596] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:41.916 [2024-07-26 05:41:56.716631] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:42.483 05:41:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:13:42.483 05:41:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:13:42.483 05:41:57 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:13:43.050 [2024-07-26 05:41:57.796907] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:13:43.050 [2024-07-26 05:41:57.796947] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:13:43.050 [2024-07-26 05:41:57.796957] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:13:43.050 [2024-07-26 05:41:57.796969] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:43.050 05:41:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:13:43.050 05:41:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:43.050 05:41:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:43.050 05:41:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:13:43.050 05:41:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:43.050 05:41:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:43.050 05:41:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:43.050 05:41:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:43.050 05:41:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:43.050 05:41:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:43.050 05:41:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:43.050 05:41:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:43.308 05:41:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:43.308 "name": "Existed_Raid", 00:13:43.308 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:43.308 "strip_size_kb": 64, 00:13:43.308 "state": "configuring", 00:13:43.308 "raid_level": "concat", 00:13:43.308 "superblock": false, 00:13:43.308 "num_base_bdevs": 2, 00:13:43.308 "num_base_bdevs_discovered": 0, 00:13:43.308 "num_base_bdevs_operational": 2, 00:13:43.308 "base_bdevs_list": [ 00:13:43.308 { 00:13:43.308 "name": "BaseBdev1", 00:13:43.308 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:43.308 "is_configured": false, 00:13:43.308 "data_offset": 0, 00:13:43.308 "data_size": 0 00:13:43.308 }, 00:13:43.308 { 00:13:43.308 "name": "BaseBdev2", 00:13:43.308 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:43.308 "is_configured": false, 00:13:43.308 "data_offset": 0, 00:13:43.308 "data_size": 0 00:13:43.308 } 00:13:43.308 ] 00:13:43.308 }' 00:13:43.308 05:41:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:43.308 05:41:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:43.875 05:41:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:13:44.134 [2024-07-26 05:41:58.911741] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:13:44.134 [2024-07-26 05:41:58.911769] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xe07a80 name Existed_Raid, state configuring 00:13:44.134 05:41:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:13:44.393 [2024-07-26 05:41:59.160415] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:13:44.393 [2024-07-26 05:41:59.160442] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:13:44.393 [2024-07-26 05:41:59.160451] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:13:44.393 [2024-07-26 05:41:59.160462] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:44.393 05:41:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:13:44.652 [2024-07-26 05:41:59.410984] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:44.652 BaseBdev1 00:13:44.652 05:41:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:13:44.652 05:41:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:13:44.652 05:41:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:44.652 05:41:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:13:44.652 05:41:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:44.652 05:41:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:44.653 05:41:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:44.911 05:41:59 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:13:45.169 [ 00:13:45.169 { 00:13:45.169 "name": "BaseBdev1", 00:13:45.169 "aliases": [ 00:13:45.169 "c4cfc237-c6bb-428f-94dd-f005bab60263" 00:13:45.169 ], 00:13:45.169 "product_name": "Malloc disk", 00:13:45.169 "block_size": 512, 00:13:45.169 "num_blocks": 65536, 00:13:45.169 "uuid": "c4cfc237-c6bb-428f-94dd-f005bab60263", 00:13:45.169 "assigned_rate_limits": { 00:13:45.169 "rw_ios_per_sec": 0, 00:13:45.169 "rw_mbytes_per_sec": 0, 00:13:45.169 "r_mbytes_per_sec": 0, 00:13:45.169 "w_mbytes_per_sec": 0 00:13:45.169 }, 00:13:45.169 "claimed": true, 00:13:45.169 "claim_type": "exclusive_write", 00:13:45.169 "zoned": false, 00:13:45.169 "supported_io_types": { 00:13:45.169 "read": true, 00:13:45.169 "write": true, 00:13:45.169 "unmap": true, 00:13:45.169 "flush": true, 00:13:45.169 "reset": true, 00:13:45.169 "nvme_admin": false, 00:13:45.169 "nvme_io": false, 00:13:45.169 "nvme_io_md": false, 00:13:45.169 "write_zeroes": true, 00:13:45.169 "zcopy": true, 00:13:45.169 "get_zone_info": false, 00:13:45.169 "zone_management": false, 00:13:45.169 "zone_append": false, 00:13:45.169 "compare": false, 00:13:45.169 "compare_and_write": false, 00:13:45.169 "abort": true, 00:13:45.169 "seek_hole": false, 00:13:45.169 "seek_data": false, 00:13:45.169 "copy": true, 00:13:45.169 "nvme_iov_md": false 00:13:45.169 }, 00:13:45.169 "memory_domains": [ 00:13:45.169 { 00:13:45.169 "dma_device_id": "system", 00:13:45.169 "dma_device_type": 1 00:13:45.169 }, 00:13:45.169 { 00:13:45.169 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:45.169 "dma_device_type": 2 00:13:45.169 } 00:13:45.169 ], 00:13:45.169 "driver_specific": {} 00:13:45.170 } 00:13:45.170 ] 00:13:45.170 05:41:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:13:45.170 05:41:59 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:13:45.170 05:41:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:45.170 05:41:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:45.170 05:41:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:13:45.170 05:41:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:45.170 05:41:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:45.170 05:41:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:45.170 05:41:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:45.170 05:41:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:45.170 05:41:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:45.170 05:41:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:45.170 05:41:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:45.428 05:42:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:45.428 "name": "Existed_Raid", 00:13:45.428 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:45.428 "strip_size_kb": 64, 00:13:45.428 "state": "configuring", 00:13:45.428 "raid_level": "concat", 00:13:45.428 "superblock": false, 00:13:45.428 "num_base_bdevs": 2, 00:13:45.428 "num_base_bdevs_discovered": 1, 00:13:45.428 "num_base_bdevs_operational": 2, 00:13:45.428 "base_bdevs_list": [ 00:13:45.428 { 00:13:45.428 "name": "BaseBdev1", 00:13:45.428 
"uuid": "c4cfc237-c6bb-428f-94dd-f005bab60263", 00:13:45.428 "is_configured": true, 00:13:45.428 "data_offset": 0, 00:13:45.428 "data_size": 65536 00:13:45.428 }, 00:13:45.428 { 00:13:45.428 "name": "BaseBdev2", 00:13:45.428 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:45.428 "is_configured": false, 00:13:45.428 "data_offset": 0, 00:13:45.428 "data_size": 0 00:13:45.428 } 00:13:45.428 ] 00:13:45.428 }' 00:13:45.428 05:42:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:45.428 05:42:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:45.993 05:42:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:13:46.251 [2024-07-26 05:42:00.955057] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:13:46.251 [2024-07-26 05:42:00.955096] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xe07350 name Existed_Raid, state configuring 00:13:46.251 05:42:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:13:46.251 [2024-07-26 05:42:01.131568] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:46.251 [2024-07-26 05:42:01.133104] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:13:46.251 [2024-07-26 05:42:01.133137] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:46.251 05:42:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:13:46.251 05:42:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:13:46.251 05:42:01 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:13:46.251 05:42:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:46.251 05:42:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:46.251 05:42:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:13:46.251 05:42:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:46.251 05:42:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:46.251 05:42:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:46.251 05:42:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:46.251 05:42:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:46.251 05:42:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:46.508 05:42:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:46.508 05:42:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:46.508 05:42:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:46.508 "name": "Existed_Raid", 00:13:46.508 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:46.508 "strip_size_kb": 64, 00:13:46.508 "state": "configuring", 00:13:46.508 "raid_level": "concat", 00:13:46.508 "superblock": false, 00:13:46.508 "num_base_bdevs": 2, 00:13:46.508 "num_base_bdevs_discovered": 1, 00:13:46.508 "num_base_bdevs_operational": 2, 00:13:46.508 "base_bdevs_list": [ 00:13:46.508 { 00:13:46.508 "name": "BaseBdev1", 
00:13:46.508 "uuid": "c4cfc237-c6bb-428f-94dd-f005bab60263", 00:13:46.508 "is_configured": true, 00:13:46.508 "data_offset": 0, 00:13:46.508 "data_size": 65536 00:13:46.508 }, 00:13:46.508 { 00:13:46.508 "name": "BaseBdev2", 00:13:46.508 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:46.508 "is_configured": false, 00:13:46.508 "data_offset": 0, 00:13:46.508 "data_size": 0 00:13:46.508 } 00:13:46.508 ] 00:13:46.508 }' 00:13:46.508 05:42:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:46.509 05:42:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:47.074 05:42:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:13:47.332 [2024-07-26 05:42:02.169757] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:47.332 [2024-07-26 05:42:02.169792] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0xe08000 00:13:47.332 [2024-07-26 05:42:02.169801] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 131072, blocklen 512 00:13:47.332 [2024-07-26 05:42:02.169990] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xd220c0 00:13:47.332 [2024-07-26 05:42:02.170110] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xe08000 00:13:47.332 [2024-07-26 05:42:02.170125] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xe08000 00:13:47.332 [2024-07-26 05:42:02.170283] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:47.332 BaseBdev2 00:13:47.332 05:42:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:13:47.332 05:42:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:13:47.332 
05:42:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:47.332 05:42:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:13:47.332 05:42:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:47.332 05:42:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:47.332 05:42:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:47.591 05:42:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:13:47.851 [ 00:13:47.851 { 00:13:47.851 "name": "BaseBdev2", 00:13:47.851 "aliases": [ 00:13:47.851 "de0677e1-d725-4edb-858b-4e063c07985e" 00:13:47.851 ], 00:13:47.851 "product_name": "Malloc disk", 00:13:47.851 "block_size": 512, 00:13:47.851 "num_blocks": 65536, 00:13:47.851 "uuid": "de0677e1-d725-4edb-858b-4e063c07985e", 00:13:47.851 "assigned_rate_limits": { 00:13:47.851 "rw_ios_per_sec": 0, 00:13:47.851 "rw_mbytes_per_sec": 0, 00:13:47.851 "r_mbytes_per_sec": 0, 00:13:47.851 "w_mbytes_per_sec": 0 00:13:47.851 }, 00:13:47.851 "claimed": true, 00:13:47.851 "claim_type": "exclusive_write", 00:13:47.851 "zoned": false, 00:13:47.851 "supported_io_types": { 00:13:47.851 "read": true, 00:13:47.852 "write": true, 00:13:47.852 "unmap": true, 00:13:47.852 "flush": true, 00:13:47.852 "reset": true, 00:13:47.852 "nvme_admin": false, 00:13:47.852 "nvme_io": false, 00:13:47.852 "nvme_io_md": false, 00:13:47.852 "write_zeroes": true, 00:13:47.852 "zcopy": true, 00:13:47.852 "get_zone_info": false, 00:13:47.852 "zone_management": false, 00:13:47.852 "zone_append": false, 00:13:47.852 "compare": false, 00:13:47.852 "compare_and_write": 
false, 00:13:47.852 "abort": true, 00:13:47.852 "seek_hole": false, 00:13:47.852 "seek_data": false, 00:13:47.852 "copy": true, 00:13:47.852 "nvme_iov_md": false 00:13:47.852 }, 00:13:47.852 "memory_domains": [ 00:13:47.852 { 00:13:47.852 "dma_device_id": "system", 00:13:47.852 "dma_device_type": 1 00:13:47.852 }, 00:13:47.852 { 00:13:47.852 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:47.852 "dma_device_type": 2 00:13:47.852 } 00:13:47.852 ], 00:13:47.852 "driver_specific": {} 00:13:47.852 } 00:13:47.852 ] 00:13:47.852 05:42:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:13:47.852 05:42:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:13:47.852 05:42:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:13:47.852 05:42:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 2 00:13:47.852 05:42:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:47.852 05:42:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:47.852 05:42:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:13:47.852 05:42:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:47.852 05:42:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:47.852 05:42:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:47.852 05:42:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:47.852 05:42:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:47.852 05:42:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:47.852 
05:42:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:47.852 05:42:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:47.852 05:42:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:47.852 "name": "Existed_Raid", 00:13:47.852 "uuid": "083af477-719c-46a6-94d2-a1ff8d2ed525", 00:13:47.852 "strip_size_kb": 64, 00:13:47.852 "state": "online", 00:13:47.852 "raid_level": "concat", 00:13:47.852 "superblock": false, 00:13:47.852 "num_base_bdevs": 2, 00:13:47.852 "num_base_bdevs_discovered": 2, 00:13:47.852 "num_base_bdevs_operational": 2, 00:13:47.852 "base_bdevs_list": [ 00:13:47.852 { 00:13:47.852 "name": "BaseBdev1", 00:13:47.852 "uuid": "c4cfc237-c6bb-428f-94dd-f005bab60263", 00:13:47.852 "is_configured": true, 00:13:47.852 "data_offset": 0, 00:13:47.852 "data_size": 65536 00:13:47.852 }, 00:13:47.852 { 00:13:47.852 "name": "BaseBdev2", 00:13:47.852 "uuid": "de0677e1-d725-4edb-858b-4e063c07985e", 00:13:47.852 "is_configured": true, 00:13:47.852 "data_offset": 0, 00:13:47.852 "data_size": 65536 00:13:47.852 } 00:13:47.852 ] 00:13:47.852 }' 00:13:47.852 05:42:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:47.852 05:42:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:48.418 05:42:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:13:48.418 05:42:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:13:48.418 05:42:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:13:48.418 05:42:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:13:48.418 05:42:03 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:13:48.418 05:42:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:13:48.418 05:42:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:13:48.418 05:42:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:13:48.717 [2024-07-26 05:42:03.521608] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:48.717 05:42:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:13:48.717 "name": "Existed_Raid", 00:13:48.717 "aliases": [ 00:13:48.717 "083af477-719c-46a6-94d2-a1ff8d2ed525" 00:13:48.717 ], 00:13:48.717 "product_name": "Raid Volume", 00:13:48.717 "block_size": 512, 00:13:48.717 "num_blocks": 131072, 00:13:48.717 "uuid": "083af477-719c-46a6-94d2-a1ff8d2ed525", 00:13:48.717 "assigned_rate_limits": { 00:13:48.717 "rw_ios_per_sec": 0, 00:13:48.717 "rw_mbytes_per_sec": 0, 00:13:48.717 "r_mbytes_per_sec": 0, 00:13:48.717 "w_mbytes_per_sec": 0 00:13:48.717 }, 00:13:48.717 "claimed": false, 00:13:48.717 "zoned": false, 00:13:48.717 "supported_io_types": { 00:13:48.717 "read": true, 00:13:48.717 "write": true, 00:13:48.717 "unmap": true, 00:13:48.717 "flush": true, 00:13:48.717 "reset": true, 00:13:48.717 "nvme_admin": false, 00:13:48.717 "nvme_io": false, 00:13:48.717 "nvme_io_md": false, 00:13:48.717 "write_zeroes": true, 00:13:48.717 "zcopy": false, 00:13:48.717 "get_zone_info": false, 00:13:48.717 "zone_management": false, 00:13:48.717 "zone_append": false, 00:13:48.717 "compare": false, 00:13:48.717 "compare_and_write": false, 00:13:48.717 "abort": false, 00:13:48.717 "seek_hole": false, 00:13:48.717 "seek_data": false, 00:13:48.717 "copy": false, 00:13:48.717 "nvme_iov_md": false 00:13:48.717 }, 00:13:48.717 
"memory_domains": [ 00:13:48.717 { 00:13:48.717 "dma_device_id": "system", 00:13:48.717 "dma_device_type": 1 00:13:48.717 }, 00:13:48.717 { 00:13:48.717 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:48.717 "dma_device_type": 2 00:13:48.717 }, 00:13:48.717 { 00:13:48.717 "dma_device_id": "system", 00:13:48.717 "dma_device_type": 1 00:13:48.717 }, 00:13:48.717 { 00:13:48.717 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:48.717 "dma_device_type": 2 00:13:48.717 } 00:13:48.717 ], 00:13:48.717 "driver_specific": { 00:13:48.717 "raid": { 00:13:48.717 "uuid": "083af477-719c-46a6-94d2-a1ff8d2ed525", 00:13:48.717 "strip_size_kb": 64, 00:13:48.717 "state": "online", 00:13:48.717 "raid_level": "concat", 00:13:48.717 "superblock": false, 00:13:48.717 "num_base_bdevs": 2, 00:13:48.717 "num_base_bdevs_discovered": 2, 00:13:48.717 "num_base_bdevs_operational": 2, 00:13:48.717 "base_bdevs_list": [ 00:13:48.717 { 00:13:48.717 "name": "BaseBdev1", 00:13:48.717 "uuid": "c4cfc237-c6bb-428f-94dd-f005bab60263", 00:13:48.717 "is_configured": true, 00:13:48.717 "data_offset": 0, 00:13:48.717 "data_size": 65536 00:13:48.717 }, 00:13:48.717 { 00:13:48.718 "name": "BaseBdev2", 00:13:48.718 "uuid": "de0677e1-d725-4edb-858b-4e063c07985e", 00:13:48.718 "is_configured": true, 00:13:48.718 "data_offset": 0, 00:13:48.718 "data_size": 65536 00:13:48.718 } 00:13:48.718 ] 00:13:48.718 } 00:13:48.718 } 00:13:48.718 }' 00:13:48.718 05:42:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:13:48.718 05:42:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:13:48.718 BaseBdev2' 00:13:48.718 05:42:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:48.718 05:42:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:13:48.718 05:42:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:48.991 05:42:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:48.991 "name": "BaseBdev1", 00:13:48.991 "aliases": [ 00:13:48.991 "c4cfc237-c6bb-428f-94dd-f005bab60263" 00:13:48.991 ], 00:13:48.991 "product_name": "Malloc disk", 00:13:48.991 "block_size": 512, 00:13:48.991 "num_blocks": 65536, 00:13:48.991 "uuid": "c4cfc237-c6bb-428f-94dd-f005bab60263", 00:13:48.991 "assigned_rate_limits": { 00:13:48.991 "rw_ios_per_sec": 0, 00:13:48.991 "rw_mbytes_per_sec": 0, 00:13:48.991 "r_mbytes_per_sec": 0, 00:13:48.991 "w_mbytes_per_sec": 0 00:13:48.991 }, 00:13:48.991 "claimed": true, 00:13:48.991 "claim_type": "exclusive_write", 00:13:48.991 "zoned": false, 00:13:48.991 "supported_io_types": { 00:13:48.991 "read": true, 00:13:48.991 "write": true, 00:13:48.991 "unmap": true, 00:13:48.991 "flush": true, 00:13:48.991 "reset": true, 00:13:48.991 "nvme_admin": false, 00:13:48.991 "nvme_io": false, 00:13:48.991 "nvme_io_md": false, 00:13:48.991 "write_zeroes": true, 00:13:48.991 "zcopy": true, 00:13:48.991 "get_zone_info": false, 00:13:48.991 "zone_management": false, 00:13:48.991 "zone_append": false, 00:13:48.991 "compare": false, 00:13:48.991 "compare_and_write": false, 00:13:48.991 "abort": true, 00:13:48.991 "seek_hole": false, 00:13:48.991 "seek_data": false, 00:13:48.991 "copy": true, 00:13:48.991 "nvme_iov_md": false 00:13:48.991 }, 00:13:48.991 "memory_domains": [ 00:13:48.991 { 00:13:48.991 "dma_device_id": "system", 00:13:48.991 "dma_device_type": 1 00:13:48.991 }, 00:13:48.991 { 00:13:48.991 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:48.991 "dma_device_type": 2 00:13:48.991 } 00:13:48.991 ], 00:13:48.991 "driver_specific": {} 00:13:48.991 }' 00:13:48.991 05:42:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:48.991 05:42:03 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:48.991 05:42:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:48.991 05:42:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:49.249 05:42:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:49.249 05:42:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:49.249 05:42:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:49.249 05:42:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:49.249 05:42:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:49.249 05:42:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:49.249 05:42:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:49.249 05:42:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:49.249 05:42:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:49.249 05:42:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:13:49.249 05:42:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:49.508 05:42:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:49.508 "name": "BaseBdev2", 00:13:49.508 "aliases": [ 00:13:49.508 "de0677e1-d725-4edb-858b-4e063c07985e" 00:13:49.508 ], 00:13:49.508 "product_name": "Malloc disk", 00:13:49.508 "block_size": 512, 00:13:49.508 "num_blocks": 65536, 00:13:49.508 "uuid": "de0677e1-d725-4edb-858b-4e063c07985e", 00:13:49.508 "assigned_rate_limits": { 00:13:49.508 
"rw_ios_per_sec": 0, 00:13:49.508 "rw_mbytes_per_sec": 0, 00:13:49.508 "r_mbytes_per_sec": 0, 00:13:49.508 "w_mbytes_per_sec": 0 00:13:49.508 }, 00:13:49.508 "claimed": true, 00:13:49.508 "claim_type": "exclusive_write", 00:13:49.508 "zoned": false, 00:13:49.508 "supported_io_types": { 00:13:49.508 "read": true, 00:13:49.508 "write": true, 00:13:49.508 "unmap": true, 00:13:49.508 "flush": true, 00:13:49.508 "reset": true, 00:13:49.508 "nvme_admin": false, 00:13:49.508 "nvme_io": false, 00:13:49.508 "nvme_io_md": false, 00:13:49.508 "write_zeroes": true, 00:13:49.508 "zcopy": true, 00:13:49.508 "get_zone_info": false, 00:13:49.508 "zone_management": false, 00:13:49.508 "zone_append": false, 00:13:49.508 "compare": false, 00:13:49.508 "compare_and_write": false, 00:13:49.508 "abort": true, 00:13:49.508 "seek_hole": false, 00:13:49.508 "seek_data": false, 00:13:49.508 "copy": true, 00:13:49.508 "nvme_iov_md": false 00:13:49.508 }, 00:13:49.508 "memory_domains": [ 00:13:49.508 { 00:13:49.508 "dma_device_id": "system", 00:13:49.508 "dma_device_type": 1 00:13:49.508 }, 00:13:49.508 { 00:13:49.508 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:49.508 "dma_device_type": 2 00:13:49.508 } 00:13:49.508 ], 00:13:49.508 "driver_specific": {} 00:13:49.508 }' 00:13:49.508 05:42:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:49.508 05:42:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:49.766 05:42:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:49.766 05:42:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:49.766 05:42:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:49.766 05:42:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:49.766 05:42:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:49.766 
05:42:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:49.766 05:42:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:49.766 05:42:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:49.766 05:42:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:50.025 05:42:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:50.025 05:42:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:13:50.283 [2024-07-26 05:42:04.933283] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:13:50.283 [2024-07-26 05:42:04.933309] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:50.283 [2024-07-26 05:42:04.933348] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:50.283 05:42:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:13:50.284 05:42:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:13:50.284 05:42:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:13:50.284 05:42:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:13:50.284 05:42:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:13:50.284 05:42:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 1 00:13:50.284 05:42:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:50.284 05:42:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:13:50.284 05:42:04 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:13:50.284 05:42:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:50.284 05:42:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:13:50.284 05:42:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:50.284 05:42:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:50.284 05:42:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:50.284 05:42:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:50.284 05:42:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:50.284 05:42:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:50.542 05:42:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:50.542 "name": "Existed_Raid", 00:13:50.542 "uuid": "083af477-719c-46a6-94d2-a1ff8d2ed525", 00:13:50.542 "strip_size_kb": 64, 00:13:50.542 "state": "offline", 00:13:50.542 "raid_level": "concat", 00:13:50.542 "superblock": false, 00:13:50.542 "num_base_bdevs": 2, 00:13:50.542 "num_base_bdevs_discovered": 1, 00:13:50.542 "num_base_bdevs_operational": 1, 00:13:50.542 "base_bdevs_list": [ 00:13:50.542 { 00:13:50.542 "name": null, 00:13:50.542 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:50.542 "is_configured": false, 00:13:50.542 "data_offset": 0, 00:13:50.542 "data_size": 65536 00:13:50.542 }, 00:13:50.542 { 00:13:50.542 "name": "BaseBdev2", 00:13:50.542 "uuid": "de0677e1-d725-4edb-858b-4e063c07985e", 00:13:50.542 "is_configured": true, 00:13:50.542 "data_offset": 0, 00:13:50.542 
"data_size": 65536 00:13:50.542 } 00:13:50.542 ] 00:13:50.542 }' 00:13:50.542 05:42:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:50.542 05:42:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:51.109 05:42:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:13:51.109 05:42:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:13:51.109 05:42:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:51.109 05:42:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:13:51.368 05:42:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:13:51.368 05:42:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:13:51.368 05:42:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:13:51.368 [2024-07-26 05:42:06.273830] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:13:51.368 [2024-07-26 05:42:06.273877] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xe08000 name Existed_Raid, state offline 00:13:51.626 05:42:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:13:51.626 05:42:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:13:51.626 05:42:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:51.626 05:42:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r 
'.[0]["name"] | select(.)' 00:13:52.193 05:42:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:13:52.193 05:42:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:13:52.193 05:42:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:13:52.193 05:42:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 1132814 00:13:52.193 05:42:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 1132814 ']' 00:13:52.193 05:42:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 1132814 00:13:52.193 05:42:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:13:52.193 05:42:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:13:52.193 05:42:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1132814 00:13:52.193 05:42:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:13:52.193 05:42:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:13:52.193 05:42:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1132814' 00:13:52.193 killing process with pid 1132814 00:13:52.193 05:42:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 1132814 00:13:52.193 [2024-07-26 05:42:06.858613] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:13:52.193 05:42:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 1132814 00:13:52.193 [2024-07-26 05:42:06.859607] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:13:52.193 05:42:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:13:52.193 00:13:52.193 real 0m10.748s 
00:13:52.193 user 0m19.052s 00:13:52.193 sys 0m2.016s 00:13:52.193 05:42:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:13:52.193 05:42:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:52.193 ************************************ 00:13:52.193 END TEST raid_state_function_test 00:13:52.193 ************************************ 00:13:52.451 05:42:07 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:13:52.451 05:42:07 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test concat 2 true 00:13:52.451 05:42:07 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:13:52.451 05:42:07 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:13:52.451 05:42:07 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:13:52.451 ************************************ 00:13:52.451 START TEST raid_state_function_test_sb 00:13:52.451 ************************************ 00:13:52.451 05:42:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test concat 2 true 00:13:52.451 05:42:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:13:52.451 05:42:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:13:52.451 05:42:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:13:52.451 05:42:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:13:52.451 05:42:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:13:52.451 05:42:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:52.451 05:42:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:13:52.451 05:42:07 bdev_raid.raid_state_function_test_sb 
-- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:13:52.451 05:42:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:52.451 05:42:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:13:52.451 05:42:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:13:52.451 05:42:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:52.451 05:42:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:13:52.451 05:42:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:13:52.451 05:42:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:13:52.451 05:42:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:13:52.451 05:42:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:13:52.451 05:42:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:13:52.451 05:42:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:13:52.451 05:42:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:13:52.451 05:42:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:13:52.451 05:42:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:13:52.451 05:42:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:13:52.451 05:42:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=1134958 00:13:52.451 05:42:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1134958' 00:13:52.451 Process raid 
pid: 1134958 00:13:52.451 05:42:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:13:52.451 05:42:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 1134958 /var/tmp/spdk-raid.sock 00:13:52.451 05:42:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 -- # '[' -z 1134958 ']' 00:13:52.451 05:42:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:13:52.451 05:42:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:13:52.451 05:42:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:13:52.451 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:13:52.451 05:42:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:13:52.451 05:42:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:52.451 [2024-07-26 05:42:07.240035] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 
00:13:52.451 [2024-07-26 05:42:07.240111] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:13:52.710 [2024-07-26 05:42:07.374342] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:13:52.710 [2024-07-26 05:42:07.477028] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:13:52.710 [2024-07-26 05:42:07.534671] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:52.710 [2024-07-26 05:42:07.534701] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:53.276 05:42:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:13:53.276 05:42:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:13:53.276 05:42:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:13:53.535 [2024-07-26 05:42:08.388927] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:13:53.535 [2024-07-26 05:42:08.388967] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:13:53.535 [2024-07-26 05:42:08.388977] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:13:53.535 [2024-07-26 05:42:08.388989] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:53.535 05:42:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:13:53.535 05:42:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 
00:13:53.535 05:42:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:53.535 05:42:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:13:53.535 05:42:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:53.535 05:42:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:53.535 05:42:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:53.535 05:42:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:53.535 05:42:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:53.535 05:42:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:53.535 05:42:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:53.535 05:42:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:53.794 05:42:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:53.794 "name": "Existed_Raid", 00:13:53.794 "uuid": "977d4409-f449-4b13-9469-2f3007b51307", 00:13:53.794 "strip_size_kb": 64, 00:13:53.794 "state": "configuring", 00:13:53.794 "raid_level": "concat", 00:13:53.794 "superblock": true, 00:13:53.794 "num_base_bdevs": 2, 00:13:53.794 "num_base_bdevs_discovered": 0, 00:13:53.794 "num_base_bdevs_operational": 2, 00:13:53.794 "base_bdevs_list": [ 00:13:53.794 { 00:13:53.794 "name": "BaseBdev1", 00:13:53.794 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:53.794 "is_configured": false, 00:13:53.794 "data_offset": 0, 00:13:53.794 "data_size": 0 00:13:53.794 }, 00:13:53.794 { 
00:13:53.794 "name": "BaseBdev2", 00:13:53.794 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:53.794 "is_configured": false, 00:13:53.794 "data_offset": 0, 00:13:53.794 "data_size": 0 00:13:53.794 } 00:13:53.794 ] 00:13:53.794 }' 00:13:53.794 05:42:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:53.794 05:42:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:54.361 05:42:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:13:54.621 [2024-07-26 05:42:09.471644] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:13:54.621 [2024-07-26 05:42:09.471676] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x24f6a80 name Existed_Raid, state configuring 00:13:54.621 05:42:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:13:54.880 [2024-07-26 05:42:09.712306] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:13:54.880 [2024-07-26 05:42:09.712337] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:13:54.880 [2024-07-26 05:42:09.712346] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:13:54.880 [2024-07-26 05:42:09.712357] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:54.880 05:42:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:13:55.139 [2024-07-26 05:42:09.964076] 
bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:55.139 BaseBdev1 00:13:55.139 05:42:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:13:55.139 05:42:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:13:55.139 05:42:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:55.139 05:42:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:13:55.139 05:42:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:55.139 05:42:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:55.139 05:42:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:55.398 05:42:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:13:55.657 [ 00:13:55.657 { 00:13:55.657 "name": "BaseBdev1", 00:13:55.657 "aliases": [ 00:13:55.657 "ad29265d-9478-4189-b807-3ecb830a54e7" 00:13:55.657 ], 00:13:55.657 "product_name": "Malloc disk", 00:13:55.657 "block_size": 512, 00:13:55.657 "num_blocks": 65536, 00:13:55.657 "uuid": "ad29265d-9478-4189-b807-3ecb830a54e7", 00:13:55.657 "assigned_rate_limits": { 00:13:55.657 "rw_ios_per_sec": 0, 00:13:55.657 "rw_mbytes_per_sec": 0, 00:13:55.657 "r_mbytes_per_sec": 0, 00:13:55.657 "w_mbytes_per_sec": 0 00:13:55.657 }, 00:13:55.657 "claimed": true, 00:13:55.657 "claim_type": "exclusive_write", 00:13:55.657 "zoned": false, 00:13:55.657 "supported_io_types": { 00:13:55.657 "read": true, 00:13:55.657 "write": true, 00:13:55.657 "unmap": true, 00:13:55.657 "flush": 
true, 00:13:55.657 "reset": true, 00:13:55.657 "nvme_admin": false, 00:13:55.657 "nvme_io": false, 00:13:55.657 "nvme_io_md": false, 00:13:55.657 "write_zeroes": true, 00:13:55.657 "zcopy": true, 00:13:55.657 "get_zone_info": false, 00:13:55.657 "zone_management": false, 00:13:55.657 "zone_append": false, 00:13:55.657 "compare": false, 00:13:55.657 "compare_and_write": false, 00:13:55.657 "abort": true, 00:13:55.657 "seek_hole": false, 00:13:55.657 "seek_data": false, 00:13:55.657 "copy": true, 00:13:55.657 "nvme_iov_md": false 00:13:55.657 }, 00:13:55.657 "memory_domains": [ 00:13:55.657 { 00:13:55.657 "dma_device_id": "system", 00:13:55.657 "dma_device_type": 1 00:13:55.657 }, 00:13:55.657 { 00:13:55.657 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:55.657 "dma_device_type": 2 00:13:55.657 } 00:13:55.657 ], 00:13:55.657 "driver_specific": {} 00:13:55.657 } 00:13:55.657 ] 00:13:55.657 05:42:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:13:55.657 05:42:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:13:55.657 05:42:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:55.657 05:42:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:55.657 05:42:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:13:55.657 05:42:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:55.657 05:42:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:55.657 05:42:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:55.657 05:42:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:55.657 05:42:10 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:55.657 05:42:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:55.657 05:42:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:55.657 05:42:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:55.916 05:42:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:55.916 "name": "Existed_Raid", 00:13:55.916 "uuid": "19a884d9-731f-4274-93d1-46fb5e751a36", 00:13:55.916 "strip_size_kb": 64, 00:13:55.916 "state": "configuring", 00:13:55.916 "raid_level": "concat", 00:13:55.916 "superblock": true, 00:13:55.916 "num_base_bdevs": 2, 00:13:55.916 "num_base_bdevs_discovered": 1, 00:13:55.916 "num_base_bdevs_operational": 2, 00:13:55.916 "base_bdevs_list": [ 00:13:55.916 { 00:13:55.916 "name": "BaseBdev1", 00:13:55.916 "uuid": "ad29265d-9478-4189-b807-3ecb830a54e7", 00:13:55.916 "is_configured": true, 00:13:55.916 "data_offset": 2048, 00:13:55.916 "data_size": 63488 00:13:55.916 }, 00:13:55.916 { 00:13:55.916 "name": "BaseBdev2", 00:13:55.916 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:55.916 "is_configured": false, 00:13:55.916 "data_offset": 0, 00:13:55.916 "data_size": 0 00:13:55.916 } 00:13:55.916 ] 00:13:55.916 }' 00:13:55.916 05:42:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:55.916 05:42:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:56.484 05:42:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:13:56.742 [2024-07-26 05:42:11.504153] 
bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:13:56.742 [2024-07-26 05:42:11.504195] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x24f6350 name Existed_Raid, state configuring 00:13:56.742 05:42:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:13:57.001 [2024-07-26 05:42:11.752853] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:57.001 [2024-07-26 05:42:11.754338] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:13:57.001 [2024-07-26 05:42:11.754369] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:57.001 05:42:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:13:57.001 05:42:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:13:57.001 05:42:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:13:57.001 05:42:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:57.001 05:42:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:57.001 05:42:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:13:57.001 05:42:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:57.001 05:42:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:57.001 05:42:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:57.001 05:42:11 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:57.001 05:42:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:57.001 05:42:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:57.001 05:42:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:57.002 05:42:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:57.276 05:42:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:57.276 "name": "Existed_Raid", 00:13:57.276 "uuid": "834da30d-2519-448c-81ec-330160f0c629", 00:13:57.276 "strip_size_kb": 64, 00:13:57.276 "state": "configuring", 00:13:57.276 "raid_level": "concat", 00:13:57.276 "superblock": true, 00:13:57.276 "num_base_bdevs": 2, 00:13:57.276 "num_base_bdevs_discovered": 1, 00:13:57.276 "num_base_bdevs_operational": 2, 00:13:57.276 "base_bdevs_list": [ 00:13:57.276 { 00:13:57.276 "name": "BaseBdev1", 00:13:57.276 "uuid": "ad29265d-9478-4189-b807-3ecb830a54e7", 00:13:57.276 "is_configured": true, 00:13:57.276 "data_offset": 2048, 00:13:57.276 "data_size": 63488 00:13:57.276 }, 00:13:57.276 { 00:13:57.276 "name": "BaseBdev2", 00:13:57.276 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:57.276 "is_configured": false, 00:13:57.276 "data_offset": 0, 00:13:57.276 "data_size": 0 00:13:57.276 } 00:13:57.276 ] 00:13:57.276 }' 00:13:57.276 05:42:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:57.276 05:42:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:57.852 05:42:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
-s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:13:58.111 [2024-07-26 05:42:12.839125] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:58.112 [2024-07-26 05:42:12.839275] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x24f7000 00:13:58.112 [2024-07-26 05:42:12.839289] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:13:58.112 [2024-07-26 05:42:12.839462] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x24110c0 00:13:58.112 [2024-07-26 05:42:12.839576] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x24f7000 00:13:58.112 [2024-07-26 05:42:12.839586] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x24f7000 00:13:58.112 [2024-07-26 05:42:12.839685] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:58.112 BaseBdev2 00:13:58.112 05:42:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:13:58.112 05:42:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:13:58.112 05:42:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:58.112 05:42:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:13:58.112 05:42:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:58.112 05:42:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:58.112 05:42:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:58.370 05:42:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:13:58.629 [ 00:13:58.629 { 00:13:58.629 "name": "BaseBdev2", 00:13:58.629 "aliases": [ 00:13:58.629 "3f124d3f-d363-492d-ab03-66b3aeb58699" 00:13:58.629 ], 00:13:58.629 "product_name": "Malloc disk", 00:13:58.629 "block_size": 512, 00:13:58.629 "num_blocks": 65536, 00:13:58.629 "uuid": "3f124d3f-d363-492d-ab03-66b3aeb58699", 00:13:58.629 "assigned_rate_limits": { 00:13:58.629 "rw_ios_per_sec": 0, 00:13:58.629 "rw_mbytes_per_sec": 0, 00:13:58.629 "r_mbytes_per_sec": 0, 00:13:58.629 "w_mbytes_per_sec": 0 00:13:58.629 }, 00:13:58.629 "claimed": true, 00:13:58.629 "claim_type": "exclusive_write", 00:13:58.629 "zoned": false, 00:13:58.629 "supported_io_types": { 00:13:58.629 "read": true, 00:13:58.629 "write": true, 00:13:58.629 "unmap": true, 00:13:58.629 "flush": true, 00:13:58.629 "reset": true, 00:13:58.629 "nvme_admin": false, 00:13:58.629 "nvme_io": false, 00:13:58.629 "nvme_io_md": false, 00:13:58.629 "write_zeroes": true, 00:13:58.630 "zcopy": true, 00:13:58.630 "get_zone_info": false, 00:13:58.630 "zone_management": false, 00:13:58.630 "zone_append": false, 00:13:58.630 "compare": false, 00:13:58.630 "compare_and_write": false, 00:13:58.630 "abort": true, 00:13:58.630 "seek_hole": false, 00:13:58.630 "seek_data": false, 00:13:58.630 "copy": true, 00:13:58.630 "nvme_iov_md": false 00:13:58.630 }, 00:13:58.630 "memory_domains": [ 00:13:58.630 { 00:13:58.630 "dma_device_id": "system", 00:13:58.630 "dma_device_type": 1 00:13:58.630 }, 00:13:58.630 { 00:13:58.630 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:58.630 "dma_device_type": 2 00:13:58.630 } 00:13:58.630 ], 00:13:58.630 "driver_specific": {} 00:13:58.630 } 00:13:58.630 ] 00:13:58.630 05:42:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:13:58.630 05:42:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 
00:13:58.630 05:42:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:13:58.630 05:42:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 2 00:13:58.630 05:42:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:58.630 05:42:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:58.630 05:42:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:13:58.630 05:42:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:58.630 05:42:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:58.630 05:42:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:58.630 05:42:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:58.630 05:42:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:58.630 05:42:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:58.630 05:42:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:58.630 05:42:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:58.888 05:42:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:58.888 "name": "Existed_Raid", 00:13:58.888 "uuid": "834da30d-2519-448c-81ec-330160f0c629", 00:13:58.888 "strip_size_kb": 64, 00:13:58.888 "state": "online", 00:13:58.888 "raid_level": "concat", 00:13:58.888 "superblock": true, 00:13:58.888 
"num_base_bdevs": 2, 00:13:58.888 "num_base_bdevs_discovered": 2, 00:13:58.888 "num_base_bdevs_operational": 2, 00:13:58.888 "base_bdevs_list": [ 00:13:58.888 { 00:13:58.888 "name": "BaseBdev1", 00:13:58.888 "uuid": "ad29265d-9478-4189-b807-3ecb830a54e7", 00:13:58.888 "is_configured": true, 00:13:58.888 "data_offset": 2048, 00:13:58.888 "data_size": 63488 00:13:58.888 }, 00:13:58.888 { 00:13:58.888 "name": "BaseBdev2", 00:13:58.888 "uuid": "3f124d3f-d363-492d-ab03-66b3aeb58699", 00:13:58.888 "is_configured": true, 00:13:58.888 "data_offset": 2048, 00:13:58.888 "data_size": 63488 00:13:58.888 } 00:13:58.888 ] 00:13:58.888 }' 00:13:58.888 05:42:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:58.888 05:42:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:59.456 05:42:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:13:59.456 05:42:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:13:59.456 05:42:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:13:59.456 05:42:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:13:59.456 05:42:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:13:59.456 05:42:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:13:59.456 05:42:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:13:59.456 05:42:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:13:59.456 [2024-07-26 05:42:14.323318] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:59.456 05:42:14 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:13:59.456 "name": "Existed_Raid", 00:13:59.456 "aliases": [ 00:13:59.456 "834da30d-2519-448c-81ec-330160f0c629" 00:13:59.456 ], 00:13:59.456 "product_name": "Raid Volume", 00:13:59.456 "block_size": 512, 00:13:59.456 "num_blocks": 126976, 00:13:59.456 "uuid": "834da30d-2519-448c-81ec-330160f0c629", 00:13:59.456 "assigned_rate_limits": { 00:13:59.456 "rw_ios_per_sec": 0, 00:13:59.456 "rw_mbytes_per_sec": 0, 00:13:59.456 "r_mbytes_per_sec": 0, 00:13:59.456 "w_mbytes_per_sec": 0 00:13:59.456 }, 00:13:59.456 "claimed": false, 00:13:59.456 "zoned": false, 00:13:59.456 "supported_io_types": { 00:13:59.456 "read": true, 00:13:59.456 "write": true, 00:13:59.456 "unmap": true, 00:13:59.456 "flush": true, 00:13:59.456 "reset": true, 00:13:59.456 "nvme_admin": false, 00:13:59.456 "nvme_io": false, 00:13:59.456 "nvme_io_md": false, 00:13:59.456 "write_zeroes": true, 00:13:59.456 "zcopy": false, 00:13:59.456 "get_zone_info": false, 00:13:59.456 "zone_management": false, 00:13:59.456 "zone_append": false, 00:13:59.456 "compare": false, 00:13:59.456 "compare_and_write": false, 00:13:59.456 "abort": false, 00:13:59.456 "seek_hole": false, 00:13:59.456 "seek_data": false, 00:13:59.456 "copy": false, 00:13:59.456 "nvme_iov_md": false 00:13:59.456 }, 00:13:59.456 "memory_domains": [ 00:13:59.456 { 00:13:59.456 "dma_device_id": "system", 00:13:59.456 "dma_device_type": 1 00:13:59.456 }, 00:13:59.456 { 00:13:59.456 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:59.456 "dma_device_type": 2 00:13:59.456 }, 00:13:59.456 { 00:13:59.456 "dma_device_id": "system", 00:13:59.456 "dma_device_type": 1 00:13:59.456 }, 00:13:59.456 { 00:13:59.456 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:59.456 "dma_device_type": 2 00:13:59.456 } 00:13:59.456 ], 00:13:59.456 "driver_specific": { 00:13:59.456 "raid": { 00:13:59.456 "uuid": "834da30d-2519-448c-81ec-330160f0c629", 00:13:59.456 "strip_size_kb": 64, 
00:13:59.456 "state": "online", 00:13:59.456 "raid_level": "concat", 00:13:59.456 "superblock": true, 00:13:59.456 "num_base_bdevs": 2, 00:13:59.456 "num_base_bdevs_discovered": 2, 00:13:59.456 "num_base_bdevs_operational": 2, 00:13:59.456 "base_bdevs_list": [ 00:13:59.456 { 00:13:59.456 "name": "BaseBdev1", 00:13:59.456 "uuid": "ad29265d-9478-4189-b807-3ecb830a54e7", 00:13:59.456 "is_configured": true, 00:13:59.456 "data_offset": 2048, 00:13:59.456 "data_size": 63488 00:13:59.456 }, 00:13:59.456 { 00:13:59.456 "name": "BaseBdev2", 00:13:59.456 "uuid": "3f124d3f-d363-492d-ab03-66b3aeb58699", 00:13:59.456 "is_configured": true, 00:13:59.456 "data_offset": 2048, 00:13:59.456 "data_size": 63488 00:13:59.456 } 00:13:59.456 ] 00:13:59.456 } 00:13:59.456 } 00:13:59.456 }' 00:13:59.456 05:42:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:13:59.715 05:42:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:13:59.715 BaseBdev2' 00:13:59.715 05:42:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:59.715 05:42:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:13:59.715 05:42:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:59.715 05:42:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:59.715 "name": "BaseBdev1", 00:13:59.715 "aliases": [ 00:13:59.715 "ad29265d-9478-4189-b807-3ecb830a54e7" 00:13:59.715 ], 00:13:59.715 "product_name": "Malloc disk", 00:13:59.715 "block_size": 512, 00:13:59.715 "num_blocks": 65536, 00:13:59.715 "uuid": "ad29265d-9478-4189-b807-3ecb830a54e7", 00:13:59.715 "assigned_rate_limits": { 00:13:59.715 "rw_ios_per_sec": 0, 
00:13:59.715 "rw_mbytes_per_sec": 0, 00:13:59.715 "r_mbytes_per_sec": 0, 00:13:59.715 "w_mbytes_per_sec": 0 00:13:59.715 }, 00:13:59.715 "claimed": true, 00:13:59.715 "claim_type": "exclusive_write", 00:13:59.715 "zoned": false, 00:13:59.715 "supported_io_types": { 00:13:59.715 "read": true, 00:13:59.715 "write": true, 00:13:59.715 "unmap": true, 00:13:59.715 "flush": true, 00:13:59.715 "reset": true, 00:13:59.715 "nvme_admin": false, 00:13:59.716 "nvme_io": false, 00:13:59.716 "nvme_io_md": false, 00:13:59.716 "write_zeroes": true, 00:13:59.716 "zcopy": true, 00:13:59.716 "get_zone_info": false, 00:13:59.716 "zone_management": false, 00:13:59.716 "zone_append": false, 00:13:59.716 "compare": false, 00:13:59.716 "compare_and_write": false, 00:13:59.716 "abort": true, 00:13:59.716 "seek_hole": false, 00:13:59.716 "seek_data": false, 00:13:59.716 "copy": true, 00:13:59.716 "nvme_iov_md": false 00:13:59.716 }, 00:13:59.716 "memory_domains": [ 00:13:59.716 { 00:13:59.716 "dma_device_id": "system", 00:13:59.716 "dma_device_type": 1 00:13:59.716 }, 00:13:59.716 { 00:13:59.716 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:59.716 "dma_device_type": 2 00:13:59.716 } 00:13:59.716 ], 00:13:59.716 "driver_specific": {} 00:13:59.716 }' 00:13:59.716 05:42:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:59.974 05:42:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:59.974 05:42:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:59.974 05:42:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:59.974 05:42:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:59.974 05:42:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:59.974 05:42:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:59.974 
05:42:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:00.233 05:42:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:00.233 05:42:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:00.233 05:42:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:00.233 05:42:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:00.233 05:42:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:00.233 05:42:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:00.233 05:42:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:14:00.492 05:42:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:00.492 "name": "BaseBdev2", 00:14:00.492 "aliases": [ 00:14:00.492 "3f124d3f-d363-492d-ab03-66b3aeb58699" 00:14:00.492 ], 00:14:00.492 "product_name": "Malloc disk", 00:14:00.492 "block_size": 512, 00:14:00.492 "num_blocks": 65536, 00:14:00.492 "uuid": "3f124d3f-d363-492d-ab03-66b3aeb58699", 00:14:00.492 "assigned_rate_limits": { 00:14:00.492 "rw_ios_per_sec": 0, 00:14:00.492 "rw_mbytes_per_sec": 0, 00:14:00.492 "r_mbytes_per_sec": 0, 00:14:00.492 "w_mbytes_per_sec": 0 00:14:00.492 }, 00:14:00.492 "claimed": true, 00:14:00.492 "claim_type": "exclusive_write", 00:14:00.492 "zoned": false, 00:14:00.492 "supported_io_types": { 00:14:00.492 "read": true, 00:14:00.492 "write": true, 00:14:00.492 "unmap": true, 00:14:00.492 "flush": true, 00:14:00.492 "reset": true, 00:14:00.492 "nvme_admin": false, 00:14:00.492 "nvme_io": false, 00:14:00.492 "nvme_io_md": false, 00:14:00.492 "write_zeroes": true, 00:14:00.492 "zcopy": true, 
00:14:00.492 "get_zone_info": false, 00:14:00.492 "zone_management": false, 00:14:00.492 "zone_append": false, 00:14:00.492 "compare": false, 00:14:00.492 "compare_and_write": false, 00:14:00.492 "abort": true, 00:14:00.492 "seek_hole": false, 00:14:00.492 "seek_data": false, 00:14:00.492 "copy": true, 00:14:00.492 "nvme_iov_md": false 00:14:00.492 }, 00:14:00.492 "memory_domains": [ 00:14:00.492 { 00:14:00.492 "dma_device_id": "system", 00:14:00.492 "dma_device_type": 1 00:14:00.492 }, 00:14:00.492 { 00:14:00.492 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:00.492 "dma_device_type": 2 00:14:00.492 } 00:14:00.492 ], 00:14:00.492 "driver_specific": {} 00:14:00.492 }' 00:14:00.492 05:42:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:00.492 05:42:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:00.492 05:42:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:00.492 05:42:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:00.492 05:42:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:00.492 05:42:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:00.492 05:42:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:00.492 05:42:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:00.492 05:42:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:00.492 05:42:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:00.751 05:42:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:00.751 05:42:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:00.751 05:42:15 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:14:01.010 [2024-07-26 05:42:15.690717] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:14:01.010 [2024-07-26 05:42:15.690745] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:14:01.010 [2024-07-26 05:42:15.690786] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:01.010 05:42:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:14:01.010 05:42:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:14:01.010 05:42:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:14:01.010 05:42:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:14:01.010 05:42:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:14:01.010 05:42:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 1 00:14:01.010 05:42:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:01.010 05:42:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:14:01.010 05:42:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:01.010 05:42:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:01.010 05:42:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:14:01.010 05:42:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:01.010 05:42:15 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:01.010 05:42:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:01.010 05:42:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:01.010 05:42:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:01.010 05:42:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:01.269 05:42:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:01.269 "name": "Existed_Raid", 00:14:01.269 "uuid": "834da30d-2519-448c-81ec-330160f0c629", 00:14:01.269 "strip_size_kb": 64, 00:14:01.269 "state": "offline", 00:14:01.269 "raid_level": "concat", 00:14:01.269 "superblock": true, 00:14:01.269 "num_base_bdevs": 2, 00:14:01.269 "num_base_bdevs_discovered": 1, 00:14:01.269 "num_base_bdevs_operational": 1, 00:14:01.269 "base_bdevs_list": [ 00:14:01.269 { 00:14:01.269 "name": null, 00:14:01.269 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:01.269 "is_configured": false, 00:14:01.269 "data_offset": 2048, 00:14:01.269 "data_size": 63488 00:14:01.269 }, 00:14:01.269 { 00:14:01.269 "name": "BaseBdev2", 00:14:01.269 "uuid": "3f124d3f-d363-492d-ab03-66b3aeb58699", 00:14:01.269 "is_configured": true, 00:14:01.269 "data_offset": 2048, 00:14:01.269 "data_size": 63488 00:14:01.269 } 00:14:01.269 ] 00:14:01.269 }' 00:14:01.269 05:42:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:01.269 05:42:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:01.836 05:42:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:14:01.836 05:42:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < 
num_base_bdevs )) 00:14:01.836 05:42:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:14:01.836 05:42:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:02.095 05:42:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:14:02.095 05:42:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:14:02.095 05:42:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:14:02.391 [2024-07-26 05:42:17.015480] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:14:02.391 [2024-07-26 05:42:17.015530] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x24f7000 name Existed_Raid, state offline 00:14:02.391 05:42:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:14:02.391 05:42:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:14:02.391 05:42:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:02.391 05:42:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:14:02.391 05:42:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:14:02.391 05:42:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:14:02.391 05:42:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:14:02.391 05:42:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 
1134958 00:14:02.391 05:42:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 1134958 ']' 00:14:02.391 05:42:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # kill -0 1134958 00:14:02.673 05:42:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:14:02.673 05:42:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:14:02.673 05:42:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1134958 00:14:02.673 05:42:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:14:02.673 05:42:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:14:02.673 05:42:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1134958' 00:14:02.673 killing process with pid 1134958 00:14:02.673 05:42:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 1134958 00:14:02.673 [2024-07-26 05:42:17.345942] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:14:02.673 05:42:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 1134958 00:14:02.673 [2024-07-26 05:42:17.346922] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:14:02.673 05:42:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:14:02.673 00:14:02.673 real 0m10.405s 00:14:02.674 user 0m18.522s 00:14:02.674 sys 0m1.928s 00:14:02.674 05:42:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:14:02.674 05:42:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:02.674 ************************************ 00:14:02.674 END TEST raid_state_function_test_sb 00:14:02.674 
************************************ 00:14:02.932 05:42:17 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:14:02.932 05:42:17 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test concat 2 00:14:02.932 05:42:17 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:14:02.932 05:42:17 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:14:02.932 05:42:17 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:14:02.932 ************************************ 00:14:02.932 START TEST raid_superblock_test 00:14:02.932 ************************************ 00:14:02.932 05:42:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test concat 2 00:14:02.933 05:42:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=concat 00:14:02.933 05:42:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=2 00:14:02.933 05:42:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:14:02.933 05:42:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:14:02.933 05:42:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:14:02.933 05:42:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:14:02.933 05:42:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:14:02.933 05:42:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:14:02.933 05:42:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:14:02.933 05:42:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:14:02.933 05:42:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:14:02.933 05:42:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 
-- # local raid_bdev_uuid 00:14:02.933 05:42:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:14:02.933 05:42:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' concat '!=' raid1 ']' 00:14:02.933 05:42:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # strip_size=64 00:14:02.933 05:42:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size_create_arg='-z 64' 00:14:02.933 05:42:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=1136590 00:14:02.933 05:42:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 1136590 /var/tmp/spdk-raid.sock 00:14:02.933 05:42:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:14:02.933 05:42:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 1136590 ']' 00:14:02.933 05:42:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:14:02.933 05:42:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:14:02.933 05:42:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:14:02.933 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:14:02.933 05:42:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:14:02.933 05:42:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:02.933 [2024-07-26 05:42:17.725924] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 
00:14:02.933 [2024-07-26 05:42:17.725996] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1136590 ] 00:14:03.191 [2024-07-26 05:42:17.856050] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:14:03.191 [2024-07-26 05:42:17.958189] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:14:03.191 [2024-07-26 05:42:18.019863] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:03.191 [2024-07-26 05:42:18.019901] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:03.757 05:42:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:14:04.015 05:42:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:14:04.015 05:42:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:14:04.015 05:42:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:14:04.016 05:42:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:14:04.016 05:42:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:14:04.016 05:42:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:14:04.016 05:42:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:14:04.016 05:42:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:14:04.016 05:42:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:14:04.016 05:42:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:14:04.016 malloc1 00:14:04.016 05:42:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:14:04.274 [2024-07-26 05:42:19.137741] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:14:04.274 [2024-07-26 05:42:19.137788] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:04.274 [2024-07-26 05:42:19.137808] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x234e570 00:14:04.274 [2024-07-26 05:42:19.137820] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:04.274 [2024-07-26 05:42:19.139393] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:04.274 [2024-07-26 05:42:19.139421] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:14:04.274 pt1 00:14:04.274 05:42:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:14:04.274 05:42:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:14:04.274 05:42:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:14:04.274 05:42:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:14:04.274 05:42:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:14:04.274 05:42:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:14:04.274 05:42:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:14:04.274 05:42:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:14:04.274 05:42:19 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:14:04.533 malloc2 00:14:04.533 05:42:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:14:04.791 [2024-07-26 05:42:19.639747] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:14:04.791 [2024-07-26 05:42:19.639791] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:04.791 [2024-07-26 05:42:19.639809] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x234f970 00:14:04.791 [2024-07-26 05:42:19.639821] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:04.791 [2024-07-26 05:42:19.641303] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:04.791 [2024-07-26 05:42:19.641332] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:14:04.791 pt2 00:14:04.791 05:42:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:14:04.791 05:42:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:14:04.791 05:42:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'pt1 pt2' -n raid_bdev1 -s 00:14:05.050 [2024-07-26 05:42:19.884433] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:14:05.050 [2024-07-26 05:42:19.885701] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:14:05.050 [2024-07-26 05:42:19.885841] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x24f2270 
00:14:05.050 [2024-07-26 05:42:19.885855] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:14:05.050 [2024-07-26 05:42:19.886050] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x24e7c10 00:14:05.050 [2024-07-26 05:42:19.886196] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x24f2270 00:14:05.050 [2024-07-26 05:42:19.886206] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x24f2270 00:14:05.050 [2024-07-26 05:42:19.886309] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:05.050 05:42:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:14:05.050 05:42:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:05.050 05:42:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:05.050 05:42:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:05.050 05:42:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:05.050 05:42:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:14:05.050 05:42:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:05.050 05:42:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:05.050 05:42:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:05.050 05:42:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:05.050 05:42:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:05.050 05:42:19 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:05.308 05:42:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:05.308 "name": "raid_bdev1", 00:14:05.308 "uuid": "7f3085e9-c07c-469c-a2e3-f07df26fdf61", 00:14:05.308 "strip_size_kb": 64, 00:14:05.308 "state": "online", 00:14:05.308 "raid_level": "concat", 00:14:05.308 "superblock": true, 00:14:05.308 "num_base_bdevs": 2, 00:14:05.308 "num_base_bdevs_discovered": 2, 00:14:05.308 "num_base_bdevs_operational": 2, 00:14:05.308 "base_bdevs_list": [ 00:14:05.308 { 00:14:05.308 "name": "pt1", 00:14:05.308 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:05.308 "is_configured": true, 00:14:05.308 "data_offset": 2048, 00:14:05.308 "data_size": 63488 00:14:05.308 }, 00:14:05.308 { 00:14:05.308 "name": "pt2", 00:14:05.308 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:05.308 "is_configured": true, 00:14:05.308 "data_offset": 2048, 00:14:05.308 "data_size": 63488 00:14:05.308 } 00:14:05.308 ] 00:14:05.308 }' 00:14:05.308 05:42:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:05.308 05:42:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:05.875 05:42:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:14:05.875 05:42:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:14:05.875 05:42:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:14:05.875 05:42:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:14:05.875 05:42:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:14:05.875 05:42:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:14:05.875 05:42:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:14:05.875 05:42:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:14:06.133 [2024-07-26 05:42:20.983731] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:06.133 05:42:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:14:06.133 "name": "raid_bdev1", 00:14:06.133 "aliases": [ 00:14:06.133 "7f3085e9-c07c-469c-a2e3-f07df26fdf61" 00:14:06.133 ], 00:14:06.133 "product_name": "Raid Volume", 00:14:06.133 "block_size": 512, 00:14:06.133 "num_blocks": 126976, 00:14:06.133 "uuid": "7f3085e9-c07c-469c-a2e3-f07df26fdf61", 00:14:06.133 "assigned_rate_limits": { 00:14:06.133 "rw_ios_per_sec": 0, 00:14:06.133 "rw_mbytes_per_sec": 0, 00:14:06.133 "r_mbytes_per_sec": 0, 00:14:06.133 "w_mbytes_per_sec": 0 00:14:06.133 }, 00:14:06.133 "claimed": false, 00:14:06.133 "zoned": false, 00:14:06.133 "supported_io_types": { 00:14:06.133 "read": true, 00:14:06.133 "write": true, 00:14:06.133 "unmap": true, 00:14:06.133 "flush": true, 00:14:06.133 "reset": true, 00:14:06.133 "nvme_admin": false, 00:14:06.133 "nvme_io": false, 00:14:06.133 "nvme_io_md": false, 00:14:06.133 "write_zeroes": true, 00:14:06.133 "zcopy": false, 00:14:06.133 "get_zone_info": false, 00:14:06.133 "zone_management": false, 00:14:06.133 "zone_append": false, 00:14:06.133 "compare": false, 00:14:06.133 "compare_and_write": false, 00:14:06.133 "abort": false, 00:14:06.133 "seek_hole": false, 00:14:06.133 "seek_data": false, 00:14:06.133 "copy": false, 00:14:06.133 "nvme_iov_md": false 00:14:06.133 }, 00:14:06.133 "memory_domains": [ 00:14:06.133 { 00:14:06.133 "dma_device_id": "system", 00:14:06.133 "dma_device_type": 1 00:14:06.133 }, 00:14:06.133 { 00:14:06.133 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:06.133 "dma_device_type": 2 00:14:06.133 }, 00:14:06.133 { 00:14:06.133 "dma_device_id": "system", 
00:14:06.133 "dma_device_type": 1 00:14:06.133 }, 00:14:06.133 { 00:14:06.133 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:06.133 "dma_device_type": 2 00:14:06.133 } 00:14:06.133 ], 00:14:06.133 "driver_specific": { 00:14:06.133 "raid": { 00:14:06.133 "uuid": "7f3085e9-c07c-469c-a2e3-f07df26fdf61", 00:14:06.133 "strip_size_kb": 64, 00:14:06.133 "state": "online", 00:14:06.133 "raid_level": "concat", 00:14:06.133 "superblock": true, 00:14:06.133 "num_base_bdevs": 2, 00:14:06.133 "num_base_bdevs_discovered": 2, 00:14:06.133 "num_base_bdevs_operational": 2, 00:14:06.133 "base_bdevs_list": [ 00:14:06.133 { 00:14:06.133 "name": "pt1", 00:14:06.133 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:06.133 "is_configured": true, 00:14:06.133 "data_offset": 2048, 00:14:06.133 "data_size": 63488 00:14:06.133 }, 00:14:06.133 { 00:14:06.133 "name": "pt2", 00:14:06.133 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:06.133 "is_configured": true, 00:14:06.133 "data_offset": 2048, 00:14:06.133 "data_size": 63488 00:14:06.133 } 00:14:06.133 ] 00:14:06.133 } 00:14:06.133 } 00:14:06.133 }' 00:14:06.133 05:42:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:14:06.391 05:42:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:14:06.392 pt2' 00:14:06.392 05:42:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:06.392 05:42:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:14:06.392 05:42:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:06.650 05:42:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:06.650 "name": "pt1", 00:14:06.650 "aliases": [ 00:14:06.650 "00000000-0000-0000-0000-000000000001" 
00:14:06.650 ], 00:14:06.650 "product_name": "passthru", 00:14:06.650 "block_size": 512, 00:14:06.650 "num_blocks": 65536, 00:14:06.650 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:06.650 "assigned_rate_limits": { 00:14:06.650 "rw_ios_per_sec": 0, 00:14:06.650 "rw_mbytes_per_sec": 0, 00:14:06.650 "r_mbytes_per_sec": 0, 00:14:06.650 "w_mbytes_per_sec": 0 00:14:06.650 }, 00:14:06.650 "claimed": true, 00:14:06.650 "claim_type": "exclusive_write", 00:14:06.650 "zoned": false, 00:14:06.650 "supported_io_types": { 00:14:06.650 "read": true, 00:14:06.650 "write": true, 00:14:06.650 "unmap": true, 00:14:06.650 "flush": true, 00:14:06.650 "reset": true, 00:14:06.650 "nvme_admin": false, 00:14:06.650 "nvme_io": false, 00:14:06.650 "nvme_io_md": false, 00:14:06.650 "write_zeroes": true, 00:14:06.650 "zcopy": true, 00:14:06.650 "get_zone_info": false, 00:14:06.650 "zone_management": false, 00:14:06.650 "zone_append": false, 00:14:06.650 "compare": false, 00:14:06.650 "compare_and_write": false, 00:14:06.650 "abort": true, 00:14:06.650 "seek_hole": false, 00:14:06.650 "seek_data": false, 00:14:06.650 "copy": true, 00:14:06.650 "nvme_iov_md": false 00:14:06.650 }, 00:14:06.650 "memory_domains": [ 00:14:06.650 { 00:14:06.650 "dma_device_id": "system", 00:14:06.650 "dma_device_type": 1 00:14:06.650 }, 00:14:06.650 { 00:14:06.650 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:06.650 "dma_device_type": 2 00:14:06.650 } 00:14:06.650 ], 00:14:06.650 "driver_specific": { 00:14:06.650 "passthru": { 00:14:06.650 "name": "pt1", 00:14:06.650 "base_bdev_name": "malloc1" 00:14:06.650 } 00:14:06.650 } 00:14:06.650 }' 00:14:06.650 05:42:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:06.650 05:42:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:06.650 05:42:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:06.650 05:42:21 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:06.650 05:42:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:06.650 05:42:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:06.650 05:42:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:06.650 05:42:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:06.909 05:42:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:06.909 05:42:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:06.909 05:42:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:06.909 05:42:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:06.909 05:42:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:06.909 05:42:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:14:06.909 05:42:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:07.168 05:42:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:07.168 "name": "pt2", 00:14:07.168 "aliases": [ 00:14:07.168 "00000000-0000-0000-0000-000000000002" 00:14:07.168 ], 00:14:07.168 "product_name": "passthru", 00:14:07.168 "block_size": 512, 00:14:07.168 "num_blocks": 65536, 00:14:07.168 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:07.168 "assigned_rate_limits": { 00:14:07.168 "rw_ios_per_sec": 0, 00:14:07.168 "rw_mbytes_per_sec": 0, 00:14:07.168 "r_mbytes_per_sec": 0, 00:14:07.168 "w_mbytes_per_sec": 0 00:14:07.168 }, 00:14:07.168 "claimed": true, 00:14:07.168 "claim_type": "exclusive_write", 00:14:07.168 "zoned": false, 00:14:07.168 "supported_io_types": { 00:14:07.168 "read": true, 
00:14:07.168 "write": true, 00:14:07.168 "unmap": true, 00:14:07.168 "flush": true, 00:14:07.168 "reset": true, 00:14:07.168 "nvme_admin": false, 00:14:07.168 "nvme_io": false, 00:14:07.168 "nvme_io_md": false, 00:14:07.168 "write_zeroes": true, 00:14:07.168 "zcopy": true, 00:14:07.168 "get_zone_info": false, 00:14:07.168 "zone_management": false, 00:14:07.168 "zone_append": false, 00:14:07.168 "compare": false, 00:14:07.168 "compare_and_write": false, 00:14:07.168 "abort": true, 00:14:07.168 "seek_hole": false, 00:14:07.168 "seek_data": false, 00:14:07.168 "copy": true, 00:14:07.168 "nvme_iov_md": false 00:14:07.168 }, 00:14:07.168 "memory_domains": [ 00:14:07.168 { 00:14:07.168 "dma_device_id": "system", 00:14:07.168 "dma_device_type": 1 00:14:07.168 }, 00:14:07.168 { 00:14:07.168 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:07.168 "dma_device_type": 2 00:14:07.168 } 00:14:07.168 ], 00:14:07.168 "driver_specific": { 00:14:07.168 "passthru": { 00:14:07.168 "name": "pt2", 00:14:07.168 "base_bdev_name": "malloc2" 00:14:07.169 } 00:14:07.169 } 00:14:07.169 }' 00:14:07.169 05:42:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:07.169 05:42:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:07.169 05:42:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:07.169 05:42:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:07.169 05:42:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:07.427 05:42:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:07.427 05:42:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:07.427 05:42:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:07.427 05:42:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:07.427 05:42:22 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:07.427 05:42:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:07.427 05:42:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:07.427 05:42:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:14:07.427 05:42:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:14:07.686 [2024-07-26 05:42:22.487663] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:07.686 05:42:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=7f3085e9-c07c-469c-a2e3-f07df26fdf61 00:14:07.686 05:42:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z 7f3085e9-c07c-469c-a2e3-f07df26fdf61 ']' 00:14:07.686 05:42:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:14:07.945 [2024-07-26 05:42:22.736082] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:14:07.945 [2024-07-26 05:42:22.736105] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:14:07.945 [2024-07-26 05:42:22.736166] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:07.945 [2024-07-26 05:42:22.736214] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:14:07.945 [2024-07-26 05:42:22.736226] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x24f2270 name raid_bdev1, state offline 00:14:07.945 05:42:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:07.945 05:42:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:14:08.204 05:42:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:14:08.204 05:42:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:14:08.204 05:42:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:14:08.204 05:42:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:14:08.462 05:42:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:14:08.463 05:42:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:14:08.720 05:42:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:14:08.720 05:42:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:14:08.978 05:42:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:14:08.978 05:42:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2' -n raid_bdev1 00:14:08.978 05:42:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:14:08.978 05:42:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2' -n 
raid_bdev1 00:14:08.978 05:42:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:14:08.978 05:42:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:14:08.978 05:42:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:14:08.978 05:42:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:14:08.978 05:42:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:14:08.978 05:42:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:14:08.978 05:42:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:14:08.978 05:42:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:14:08.978 05:42:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2' -n raid_bdev1 00:14:09.237 [2024-07-26 05:42:23.951360] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:14:09.237 [2024-07-26 05:42:23.952771] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:14:09.237 [2024-07-26 05:42:23.952827] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:14:09.237 [2024-07-26 05:42:23.952869] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on 
bdev malloc2 00:14:09.237 [2024-07-26 05:42:23.952887] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:14:09.237 [2024-07-26 05:42:23.952897] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x24f1ff0 name raid_bdev1, state configuring 00:14:09.237 request: 00:14:09.237 { 00:14:09.237 "name": "raid_bdev1", 00:14:09.237 "raid_level": "concat", 00:14:09.237 "base_bdevs": [ 00:14:09.237 "malloc1", 00:14:09.237 "malloc2" 00:14:09.237 ], 00:14:09.237 "strip_size_kb": 64, 00:14:09.237 "superblock": false, 00:14:09.237 "method": "bdev_raid_create", 00:14:09.237 "req_id": 1 00:14:09.237 } 00:14:09.237 Got JSON-RPC error response 00:14:09.237 response: 00:14:09.237 { 00:14:09.237 "code": -17, 00:14:09.237 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:14:09.237 } 00:14:09.237 05:42:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:14:09.237 05:42:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:14:09.237 05:42:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:14:09.237 05:42:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:14:09.237 05:42:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:09.237 05:42:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:14:09.495 05:42:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:14:09.495 05:42:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:14:09.495 05:42:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:14:09.754 [2024-07-26 
05:42:24.444585] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:14:09.754 [2024-07-26 05:42:24.444626] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:09.754 [2024-07-26 05:42:24.444661] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x234e7a0 00:14:09.754 [2024-07-26 05:42:24.444675] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:09.754 [2024-07-26 05:42:24.446275] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:09.754 [2024-07-26 05:42:24.446304] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:14:09.754 [2024-07-26 05:42:24.446371] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:14:09.754 [2024-07-26 05:42:24.446399] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:14:09.754 pt1 00:14:09.754 05:42:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring concat 64 2 00:14:09.754 05:42:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:09.754 05:42:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:09.754 05:42:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:09.754 05:42:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:09.754 05:42:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:14:09.754 05:42:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:09.754 05:42:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:09.754 05:42:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:09.754 
05:42:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:09.754 05:42:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:09.754 05:42:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:10.013 05:42:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:10.013 "name": "raid_bdev1", 00:14:10.013 "uuid": "7f3085e9-c07c-469c-a2e3-f07df26fdf61", 00:14:10.013 "strip_size_kb": 64, 00:14:10.013 "state": "configuring", 00:14:10.013 "raid_level": "concat", 00:14:10.013 "superblock": true, 00:14:10.013 "num_base_bdevs": 2, 00:14:10.013 "num_base_bdevs_discovered": 1, 00:14:10.013 "num_base_bdevs_operational": 2, 00:14:10.013 "base_bdevs_list": [ 00:14:10.013 { 00:14:10.013 "name": "pt1", 00:14:10.013 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:10.013 "is_configured": true, 00:14:10.013 "data_offset": 2048, 00:14:10.013 "data_size": 63488 00:14:10.013 }, 00:14:10.013 { 00:14:10.013 "name": null, 00:14:10.013 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:10.013 "is_configured": false, 00:14:10.013 "data_offset": 2048, 00:14:10.013 "data_size": 63488 00:14:10.013 } 00:14:10.013 ] 00:14:10.013 }' 00:14:10.013 05:42:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:10.013 05:42:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:10.578 05:42:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 2 -gt 2 ']' 00:14:10.578 05:42:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:14:10.578 05:42:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:14:10.578 05:42:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:14:10.836 [2024-07-26 05:42:25.527463] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:14:10.836 [2024-07-26 05:42:25.527511] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:10.836 [2024-07-26 05:42:25.527529] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x24e8820 00:14:10.836 [2024-07-26 05:42:25.527541] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:10.836 [2024-07-26 05:42:25.527910] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:10.836 [2024-07-26 05:42:25.527930] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:14:10.836 [2024-07-26 05:42:25.527995] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:14:10.836 [2024-07-26 05:42:25.528016] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:14:10.836 [2024-07-26 05:42:25.528112] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x2344ec0 00:14:10.836 [2024-07-26 05:42:25.528122] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:14:10.836 [2024-07-26 05:42:25.528296] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2345f00 00:14:10.836 [2024-07-26 05:42:25.528422] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2344ec0 00:14:10.836 [2024-07-26 05:42:25.528432] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2344ec0 00:14:10.836 [2024-07-26 05:42:25.528532] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:10.836 pt2 00:14:10.836 05:42:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 
00:14:10.836 05:42:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:14:10.836 05:42:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:14:10.836 05:42:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:10.836 05:42:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:10.836 05:42:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:10.836 05:42:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:10.836 05:42:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:14:10.836 05:42:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:10.836 05:42:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:10.837 05:42:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:10.837 05:42:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:10.837 05:42:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:10.837 05:42:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:11.095 05:42:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:11.095 "name": "raid_bdev1", 00:14:11.095 "uuid": "7f3085e9-c07c-469c-a2e3-f07df26fdf61", 00:14:11.095 "strip_size_kb": 64, 00:14:11.095 "state": "online", 00:14:11.095 "raid_level": "concat", 00:14:11.095 "superblock": true, 00:14:11.095 "num_base_bdevs": 2, 00:14:11.095 "num_base_bdevs_discovered": 2, 00:14:11.095 "num_base_bdevs_operational": 2, 
00:14:11.095 "base_bdevs_list": [ 00:14:11.095 { 00:14:11.095 "name": "pt1", 00:14:11.095 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:11.095 "is_configured": true, 00:14:11.095 "data_offset": 2048, 00:14:11.095 "data_size": 63488 00:14:11.095 }, 00:14:11.095 { 00:14:11.095 "name": "pt2", 00:14:11.095 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:11.095 "is_configured": true, 00:14:11.095 "data_offset": 2048, 00:14:11.095 "data_size": 63488 00:14:11.095 } 00:14:11.095 ] 00:14:11.095 }' 00:14:11.095 05:42:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:11.095 05:42:25 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:11.661 05:42:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:14:11.661 05:42:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:14:11.661 05:42:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:14:11.661 05:42:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:14:11.661 05:42:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:14:11.661 05:42:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:14:11.661 05:42:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:14:11.661 05:42:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:14:11.919 [2024-07-26 05:42:26.622699] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:11.919 05:42:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:14:11.919 "name": "raid_bdev1", 00:14:11.919 "aliases": [ 00:14:11.919 "7f3085e9-c07c-469c-a2e3-f07df26fdf61" 00:14:11.919 ], 
00:14:11.919 "product_name": "Raid Volume", 00:14:11.919 "block_size": 512, 00:14:11.919 "num_blocks": 126976, 00:14:11.919 "uuid": "7f3085e9-c07c-469c-a2e3-f07df26fdf61", 00:14:11.919 "assigned_rate_limits": { 00:14:11.919 "rw_ios_per_sec": 0, 00:14:11.919 "rw_mbytes_per_sec": 0, 00:14:11.919 "r_mbytes_per_sec": 0, 00:14:11.919 "w_mbytes_per_sec": 0 00:14:11.919 }, 00:14:11.919 "claimed": false, 00:14:11.919 "zoned": false, 00:14:11.919 "supported_io_types": { 00:14:11.919 "read": true, 00:14:11.919 "write": true, 00:14:11.919 "unmap": true, 00:14:11.919 "flush": true, 00:14:11.919 "reset": true, 00:14:11.919 "nvme_admin": false, 00:14:11.919 "nvme_io": false, 00:14:11.919 "nvme_io_md": false, 00:14:11.919 "write_zeroes": true, 00:14:11.919 "zcopy": false, 00:14:11.919 "get_zone_info": false, 00:14:11.919 "zone_management": false, 00:14:11.919 "zone_append": false, 00:14:11.919 "compare": false, 00:14:11.919 "compare_and_write": false, 00:14:11.919 "abort": false, 00:14:11.919 "seek_hole": false, 00:14:11.919 "seek_data": false, 00:14:11.919 "copy": false, 00:14:11.919 "nvme_iov_md": false 00:14:11.919 }, 00:14:11.919 "memory_domains": [ 00:14:11.919 { 00:14:11.919 "dma_device_id": "system", 00:14:11.919 "dma_device_type": 1 00:14:11.919 }, 00:14:11.919 { 00:14:11.919 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:11.919 "dma_device_type": 2 00:14:11.919 }, 00:14:11.919 { 00:14:11.919 "dma_device_id": "system", 00:14:11.919 "dma_device_type": 1 00:14:11.919 }, 00:14:11.919 { 00:14:11.919 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:11.919 "dma_device_type": 2 00:14:11.919 } 00:14:11.919 ], 00:14:11.919 "driver_specific": { 00:14:11.919 "raid": { 00:14:11.919 "uuid": "7f3085e9-c07c-469c-a2e3-f07df26fdf61", 00:14:11.919 "strip_size_kb": 64, 00:14:11.919 "state": "online", 00:14:11.919 "raid_level": "concat", 00:14:11.919 "superblock": true, 00:14:11.919 "num_base_bdevs": 2, 00:14:11.919 "num_base_bdevs_discovered": 2, 00:14:11.919 "num_base_bdevs_operational": 
2, 00:14:11.919 "base_bdevs_list": [ 00:14:11.919 { 00:14:11.919 "name": "pt1", 00:14:11.919 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:11.919 "is_configured": true, 00:14:11.919 "data_offset": 2048, 00:14:11.919 "data_size": 63488 00:14:11.919 }, 00:14:11.919 { 00:14:11.919 "name": "pt2", 00:14:11.919 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:11.919 "is_configured": true, 00:14:11.919 "data_offset": 2048, 00:14:11.919 "data_size": 63488 00:14:11.919 } 00:14:11.919 ] 00:14:11.919 } 00:14:11.919 } 00:14:11.919 }' 00:14:11.919 05:42:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:14:11.919 05:42:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:14:11.919 pt2' 00:14:11.919 05:42:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:11.919 05:42:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:14:11.919 05:42:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:12.177 05:42:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:12.177 "name": "pt1", 00:14:12.177 "aliases": [ 00:14:12.177 "00000000-0000-0000-0000-000000000001" 00:14:12.177 ], 00:14:12.177 "product_name": "passthru", 00:14:12.177 "block_size": 512, 00:14:12.177 "num_blocks": 65536, 00:14:12.177 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:12.177 "assigned_rate_limits": { 00:14:12.177 "rw_ios_per_sec": 0, 00:14:12.177 "rw_mbytes_per_sec": 0, 00:14:12.177 "r_mbytes_per_sec": 0, 00:14:12.177 "w_mbytes_per_sec": 0 00:14:12.177 }, 00:14:12.177 "claimed": true, 00:14:12.177 "claim_type": "exclusive_write", 00:14:12.177 "zoned": false, 00:14:12.177 "supported_io_types": { 00:14:12.177 "read": true, 
00:14:12.177 "write": true, 00:14:12.177 "unmap": true, 00:14:12.177 "flush": true, 00:14:12.177 "reset": true, 00:14:12.177 "nvme_admin": false, 00:14:12.177 "nvme_io": false, 00:14:12.177 "nvme_io_md": false, 00:14:12.177 "write_zeroes": true, 00:14:12.177 "zcopy": true, 00:14:12.177 "get_zone_info": false, 00:14:12.177 "zone_management": false, 00:14:12.177 "zone_append": false, 00:14:12.177 "compare": false, 00:14:12.177 "compare_and_write": false, 00:14:12.177 "abort": true, 00:14:12.177 "seek_hole": false, 00:14:12.177 "seek_data": false, 00:14:12.177 "copy": true, 00:14:12.177 "nvme_iov_md": false 00:14:12.177 }, 00:14:12.177 "memory_domains": [ 00:14:12.177 { 00:14:12.177 "dma_device_id": "system", 00:14:12.177 "dma_device_type": 1 00:14:12.177 }, 00:14:12.177 { 00:14:12.177 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:12.177 "dma_device_type": 2 00:14:12.177 } 00:14:12.177 ], 00:14:12.177 "driver_specific": { 00:14:12.177 "passthru": { 00:14:12.177 "name": "pt1", 00:14:12.177 "base_bdev_name": "malloc1" 00:14:12.177 } 00:14:12.177 } 00:14:12.177 }' 00:14:12.177 05:42:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:12.177 05:42:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:12.177 05:42:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:12.177 05:42:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:12.177 05:42:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:12.435 05:42:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:12.435 05:42:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:12.435 05:42:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:12.435 05:42:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:12.435 05:42:27 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:12.435 05:42:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:12.435 05:42:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:12.435 05:42:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:12.435 05:42:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:14:12.435 05:42:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:12.693 05:42:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:12.693 "name": "pt2", 00:14:12.693 "aliases": [ 00:14:12.693 "00000000-0000-0000-0000-000000000002" 00:14:12.693 ], 00:14:12.693 "product_name": "passthru", 00:14:12.693 "block_size": 512, 00:14:12.693 "num_blocks": 65536, 00:14:12.693 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:12.693 "assigned_rate_limits": { 00:14:12.693 "rw_ios_per_sec": 0, 00:14:12.693 "rw_mbytes_per_sec": 0, 00:14:12.693 "r_mbytes_per_sec": 0, 00:14:12.693 "w_mbytes_per_sec": 0 00:14:12.693 }, 00:14:12.693 "claimed": true, 00:14:12.693 "claim_type": "exclusive_write", 00:14:12.693 "zoned": false, 00:14:12.693 "supported_io_types": { 00:14:12.693 "read": true, 00:14:12.693 "write": true, 00:14:12.693 "unmap": true, 00:14:12.693 "flush": true, 00:14:12.693 "reset": true, 00:14:12.693 "nvme_admin": false, 00:14:12.693 "nvme_io": false, 00:14:12.693 "nvme_io_md": false, 00:14:12.693 "write_zeroes": true, 00:14:12.693 "zcopy": true, 00:14:12.693 "get_zone_info": false, 00:14:12.693 "zone_management": false, 00:14:12.693 "zone_append": false, 00:14:12.693 "compare": false, 00:14:12.693 "compare_and_write": false, 00:14:12.693 "abort": true, 00:14:12.693 "seek_hole": false, 00:14:12.693 "seek_data": false, 00:14:12.693 "copy": 
true, 00:14:12.693 "nvme_iov_md": false 00:14:12.693 }, 00:14:12.693 "memory_domains": [ 00:14:12.693 { 00:14:12.693 "dma_device_id": "system", 00:14:12.693 "dma_device_type": 1 00:14:12.693 }, 00:14:12.693 { 00:14:12.693 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:12.693 "dma_device_type": 2 00:14:12.693 } 00:14:12.693 ], 00:14:12.693 "driver_specific": { 00:14:12.693 "passthru": { 00:14:12.693 "name": "pt2", 00:14:12.693 "base_bdev_name": "malloc2" 00:14:12.693 } 00:14:12.693 } 00:14:12.693 }' 00:14:12.693 05:42:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:12.693 05:42:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:12.952 05:42:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:12.952 05:42:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:12.952 05:42:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:12.952 05:42:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:12.952 05:42:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:12.952 05:42:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:12.952 05:42:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:12.952 05:42:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:12.952 05:42:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:13.210 05:42:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:13.210 05:42:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:14:13.210 05:42:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 
00:14:13.469 [2024-07-26 05:42:28.122673] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:13.469 05:42:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' 7f3085e9-c07c-469c-a2e3-f07df26fdf61 '!=' 7f3085e9-c07c-469c-a2e3-f07df26fdf61 ']' 00:14:13.469 05:42:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy concat 00:14:13.469 05:42:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:14:13.469 05:42:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:14:13.469 05:42:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 1136590 00:14:13.469 05:42:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 1136590 ']' 00:14:13.469 05:42:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 1136590 00:14:13.469 05:42:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:14:13.469 05:42:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:14:13.469 05:42:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1136590 00:14:13.469 05:42:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:14:13.469 05:42:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:14:13.469 05:42:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1136590' 00:14:13.469 killing process with pid 1136590 00:14:13.469 05:42:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 1136590 00:14:13.469 [2024-07-26 05:42:28.196966] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:14:13.469 [2024-07-26 05:42:28.197023] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:13.469 [2024-07-26 
05:42:28.197066] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:14:13.469 [2024-07-26 05:42:28.197078] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2344ec0 name raid_bdev1, state offline 00:14:13.469 05:42:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 1136590 00:14:13.469 [2024-07-26 05:42:28.214450] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:14:13.728 05:42:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:14:13.728 00:14:13.728 real 0m10.777s 00:14:13.728 user 0m19.218s 00:14:13.728 sys 0m2.012s 00:14:13.728 05:42:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:14:13.728 05:42:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:13.728 ************************************ 00:14:13.728 END TEST raid_superblock_test 00:14:13.728 ************************************ 00:14:13.728 05:42:28 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:14:13.728 05:42:28 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test concat 2 read 00:14:13.728 05:42:28 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:14:13.728 05:42:28 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:14:13.728 05:42:28 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:14:13.728 ************************************ 00:14:13.728 START TEST raid_read_error_test 00:14:13.728 ************************************ 00:14:13.728 05:42:28 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test concat 2 read 00:14:13.728 05:42:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=concat 00:14:13.728 05:42:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=2 00:14:13.728 05:42:28 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:14:13.728 05:42:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:14:13.728 05:42:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:14:13.728 05:42:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:14:13.728 05:42:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:14:13.728 05:42:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:14:13.728 05:42:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:14:13.728 05:42:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:14:13.728 05:42:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:14:13.728 05:42:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:14:13.728 05:42:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:14:13.728 05:42:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:14:13.728 05:42:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:14:13.728 05:42:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:14:13.728 05:42:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:14:13.728 05:42:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:14:13.728 05:42:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' concat '!=' raid1 ']' 00:14:13.728 05:42:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:14:13.728 05:42:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:14:13.728 05:42:28 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:14:13.728 05:42:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.6SiFSjJoVk 00:14:13.728 05:42:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=1138220 00:14:13.728 05:42:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 1138220 /var/tmp/spdk-raid.sock 00:14:13.728 05:42:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:14:13.728 05:42:28 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 1138220 ']' 00:14:13.728 05:42:28 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:14:13.728 05:42:28 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:14:13.728 05:42:28 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:14:13.728 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:14:13.728 05:42:28 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:14:13.728 05:42:28 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:14:13.728 [2024-07-26 05:42:28.597315] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 
00:14:13.728 [2024-07-26 05:42:28.597377] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1138220 ] 00:14:13.986 [2024-07-26 05:42:28.727331] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:14:13.986 [2024-07-26 05:42:28.838391] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:14:14.244 [2024-07-26 05:42:28.900510] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:14.244 [2024-07-26 05:42:28.900541] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:14.244 05:42:29 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:14:14.244 05:42:29 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:14:14.244 05:42:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:14:14.244 05:42:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:14:14.503 BaseBdev1_malloc 00:14:14.503 05:42:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:14:14.761 true 00:14:14.761 05:42:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:14:15.020 [2024-07-26 05:42:29.778259] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:14:15.020 [2024-07-26 05:42:29.778305] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev 
opened 00:14:15.020 [2024-07-26 05:42:29.778327] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1d0b0d0 00:14:15.020 [2024-07-26 05:42:29.778340] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:15.020 [2024-07-26 05:42:29.780223] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:15.020 [2024-07-26 05:42:29.780253] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:14:15.020 BaseBdev1 00:14:15.020 05:42:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:14:15.020 05:42:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:14:15.278 BaseBdev2_malloc 00:14:15.278 05:42:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:14:15.536 true 00:14:15.536 05:42:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:14:15.794 [2024-07-26 05:42:30.522086] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:14:15.794 [2024-07-26 05:42:30.522130] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:15.794 [2024-07-26 05:42:30.522152] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1d0f910 00:14:15.794 [2024-07-26 05:42:30.522164] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:15.794 [2024-07-26 05:42:30.523731] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:15.794 [2024-07-26 05:42:30.523759] 
vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:14:15.794 BaseBdev2 00:14:15.794 05:42:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:14:16.053 [2024-07-26 05:42:30.762760] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:14:16.053 [2024-07-26 05:42:30.764097] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:14:16.053 [2024-07-26 05:42:30.764290] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1d11320 00:14:16.053 [2024-07-26 05:42:30.764309] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:14:16.053 [2024-07-26 05:42:30.764507] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1d12290 00:14:16.053 [2024-07-26 05:42:30.764662] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1d11320 00:14:16.053 [2024-07-26 05:42:30.764673] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1d11320 00:14:16.053 [2024-07-26 05:42:30.764782] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:16.053 05:42:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:14:16.053 05:42:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:16.053 05:42:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:16.053 05:42:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:16.053 05:42:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:16.053 05:42:30 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:14:16.053 05:42:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:16.053 05:42:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:16.053 05:42:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:16.053 05:42:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:16.053 05:42:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:16.053 05:42:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:16.352 05:42:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:16.352 "name": "raid_bdev1", 00:14:16.352 "uuid": "859dbcac-bb2d-4f09-810c-57005a916711", 00:14:16.352 "strip_size_kb": 64, 00:14:16.352 "state": "online", 00:14:16.352 "raid_level": "concat", 00:14:16.352 "superblock": true, 00:14:16.352 "num_base_bdevs": 2, 00:14:16.352 "num_base_bdevs_discovered": 2, 00:14:16.352 "num_base_bdevs_operational": 2, 00:14:16.352 "base_bdevs_list": [ 00:14:16.352 { 00:14:16.352 "name": "BaseBdev1", 00:14:16.352 "uuid": "920d3e6c-2293-5476-b901-3000a722fee1", 00:14:16.352 "is_configured": true, 00:14:16.352 "data_offset": 2048, 00:14:16.352 "data_size": 63488 00:14:16.352 }, 00:14:16.352 { 00:14:16.352 "name": "BaseBdev2", 00:14:16.352 "uuid": "b60ddfae-232e-5d03-85f2-9ece0f9ce270", 00:14:16.352 "is_configured": true, 00:14:16.352 "data_offset": 2048, 00:14:16.352 "data_size": 63488 00:14:16.352 } 00:14:16.352 ] 00:14:16.352 }' 00:14:16.352 05:42:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:16.352 05:42:31 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:14:16.937 05:42:31 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:14:16.937 05:42:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:14:16.937 [2024-07-26 05:42:31.713555] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1d0c9b0 00:14:17.873 05:42:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:14:18.130 05:42:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:14:18.130 05:42:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ concat = \r\a\i\d\1 ]] 00:14:18.130 05:42:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=2 00:14:18.130 05:42:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:14:18.130 05:42:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:18.130 05:42:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:18.130 05:42:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:18.130 05:42:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:18.130 05:42:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:14:18.130 05:42:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:18.130 05:42:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:18.130 05:42:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:18.130 05:42:32 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:18.130 05:42:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:18.130 05:42:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:18.130 05:42:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:18.130 "name": "raid_bdev1", 00:14:18.130 "uuid": "859dbcac-bb2d-4f09-810c-57005a916711", 00:14:18.130 "strip_size_kb": 64, 00:14:18.130 "state": "online", 00:14:18.130 "raid_level": "concat", 00:14:18.130 "superblock": true, 00:14:18.130 "num_base_bdevs": 2, 00:14:18.130 "num_base_bdevs_discovered": 2, 00:14:18.130 "num_base_bdevs_operational": 2, 00:14:18.130 "base_bdevs_list": [ 00:14:18.130 { 00:14:18.130 "name": "BaseBdev1", 00:14:18.130 "uuid": "920d3e6c-2293-5476-b901-3000a722fee1", 00:14:18.130 "is_configured": true, 00:14:18.130 "data_offset": 2048, 00:14:18.130 "data_size": 63488 00:14:18.130 }, 00:14:18.130 { 00:14:18.130 "name": "BaseBdev2", 00:14:18.130 "uuid": "b60ddfae-232e-5d03-85f2-9ece0f9ce270", 00:14:18.130 "is_configured": true, 00:14:18.130 "data_offset": 2048, 00:14:18.130 "data_size": 63488 00:14:18.130 } 00:14:18.130 ] 00:14:18.130 }' 00:14:18.130 05:42:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:18.130 05:42:32 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:14:18.696 05:42:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:14:18.954 [2024-07-26 05:42:33.755799] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:14:18.954 [2024-07-26 05:42:33.755829] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing 
from online to offline 00:14:18.954 [2024-07-26 05:42:33.758993] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:18.954 [2024-07-26 05:42:33.759022] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:18.954 [2024-07-26 05:42:33.759049] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:14:18.954 [2024-07-26 05:42:33.759060] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1d11320 name raid_bdev1, state offline 00:14:18.954 0 00:14:18.954 05:42:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 1138220 00:14:18.954 05:42:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 1138220 ']' 00:14:18.954 05:42:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 1138220 00:14:18.954 05:42:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:14:18.954 05:42:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:14:18.954 05:42:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1138220 00:14:18.954 05:42:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:14:18.954 05:42:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:14:18.954 05:42:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1138220' 00:14:18.954 killing process with pid 1138220 00:14:18.954 05:42:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 1138220 00:14:18.954 [2024-07-26 05:42:33.823240] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:14:18.954 05:42:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 1138220 00:14:18.954 [2024-07-26 05:42:33.833873] 
bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:14:19.213 05:42:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.6SiFSjJoVk 00:14:19.213 05:42:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:14:19.213 05:42:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:14:19.213 05:42:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.49 00:14:19.213 05:42:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy concat 00:14:19.213 05:42:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:14:19.213 05:42:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:14:19.213 05:42:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.49 != \0\.\0\0 ]] 00:14:19.213 00:14:19.213 real 0m5.552s 00:14:19.213 user 0m8.889s 00:14:19.213 sys 0m1.077s 00:14:19.213 05:42:34 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:14:19.213 05:42:34 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:14:19.213 ************************************ 00:14:19.213 END TEST raid_read_error_test 00:14:19.213 ************************************ 00:14:19.213 05:42:34 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:14:19.213 05:42:34 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test concat 2 write 00:14:19.213 05:42:34 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:14:19.213 05:42:34 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:14:19.213 05:42:34 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:14:19.471 ************************************ 00:14:19.471 START TEST raid_write_error_test 00:14:19.471 ************************************ 00:14:19.471 05:42:34 bdev_raid.raid_write_error_test -- 
common/autotest_common.sh@1123 -- # raid_io_error_test concat 2 write 00:14:19.471 05:42:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=concat 00:14:19.471 05:42:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=2 00:14:19.471 05:42:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:14:19.471 05:42:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:14:19.471 05:42:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:14:19.471 05:42:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:14:19.471 05:42:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:14:19.471 05:42:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:14:19.471 05:42:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:14:19.471 05:42:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:14:19.471 05:42:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:14:19.471 05:42:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:14:19.471 05:42:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:14:19.471 05:42:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:14:19.471 05:42:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:14:19.471 05:42:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:14:19.471 05:42:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:14:19.471 05:42:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:14:19.472 05:42:34 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' concat '!=' raid1 ']' 00:14:19.472 05:42:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:14:19.472 05:42:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:14:19.472 05:42:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:14:19.472 05:42:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.sFAZqaz2VE 00:14:19.472 05:42:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=1139025 00:14:19.472 05:42:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 1139025 /var/tmp/spdk-raid.sock 00:14:19.472 05:42:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:14:19.472 05:42:34 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 1139025 ']' 00:14:19.472 05:42:34 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:14:19.472 05:42:34 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:14:19.472 05:42:34 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:14:19.472 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:14:19.472 05:42:34 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:14:19.472 05:42:34 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:14:19.472 [2024-07-26 05:42:34.231095] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 
00:14:19.472 [2024-07-26 05:42:34.231162] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1139025 ] 00:14:19.472 [2024-07-26 05:42:34.362318] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:14:19.731 [2024-07-26 05:42:34.471612] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:14:19.731 [2024-07-26 05:42:34.543343] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:19.731 [2024-07-26 05:42:34.543383] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:20.297 05:42:35 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:14:20.297 05:42:35 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:14:20.297 05:42:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:14:20.297 05:42:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:14:20.556 BaseBdev1_malloc 00:14:20.556 05:42:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:14:20.814 true 00:14:20.814 05:42:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:14:21.072 [2024-07-26 05:42:35.846513] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:14:21.072 [2024-07-26 05:42:35.846557] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base 
bdev opened 00:14:21.072 [2024-07-26 05:42:35.846578] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x264d0d0 00:14:21.072 [2024-07-26 05:42:35.846590] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:21.072 [2024-07-26 05:42:35.848455] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:21.072 [2024-07-26 05:42:35.848484] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:14:21.072 BaseBdev1 00:14:21.072 05:42:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:14:21.072 05:42:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:14:21.330 BaseBdev2_malloc 00:14:21.330 05:42:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:14:21.588 true 00:14:21.588 05:42:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:14:21.845 [2024-07-26 05:42:36.594423] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:14:21.845 [2024-07-26 05:42:36.594471] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:21.845 [2024-07-26 05:42:36.594492] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2651910 00:14:21.845 [2024-07-26 05:42:36.594504] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:21.845 [2024-07-26 05:42:36.596065] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:21.845 [2024-07-26 05:42:36.596092] 
vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:14:21.845 BaseBdev2 00:14:21.845 05:42:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:14:22.103 [2024-07-26 05:42:36.835090] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:14:22.103 [2024-07-26 05:42:36.836414] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:14:22.103 [2024-07-26 05:42:36.836605] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x2653320 00:14:22.103 [2024-07-26 05:42:36.836619] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:14:22.103 [2024-07-26 05:42:36.836827] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2654290 00:14:22.103 [2024-07-26 05:42:36.836973] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2653320 00:14:22.103 [2024-07-26 05:42:36.836983] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2653320 00:14:22.103 [2024-07-26 05:42:36.837087] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:22.103 05:42:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:14:22.103 05:42:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:22.103 05:42:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:22.103 05:42:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:22.103 05:42:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:22.103 05:42:36 bdev_raid.raid_write_error_test 
-- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:14:22.103 05:42:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:22.103 05:42:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:22.103 05:42:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:22.103 05:42:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:22.103 05:42:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:22.103 05:42:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:22.362 05:42:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:22.362 "name": "raid_bdev1", 00:14:22.362 "uuid": "ded3afd1-9845-4f66-8e6c-ce6cdea9a063", 00:14:22.362 "strip_size_kb": 64, 00:14:22.362 "state": "online", 00:14:22.362 "raid_level": "concat", 00:14:22.362 "superblock": true, 00:14:22.362 "num_base_bdevs": 2, 00:14:22.362 "num_base_bdevs_discovered": 2, 00:14:22.362 "num_base_bdevs_operational": 2, 00:14:22.362 "base_bdevs_list": [ 00:14:22.362 { 00:14:22.362 "name": "BaseBdev1", 00:14:22.362 "uuid": "f220371a-2482-5412-85df-bbc62112efb5", 00:14:22.362 "is_configured": true, 00:14:22.362 "data_offset": 2048, 00:14:22.362 "data_size": 63488 00:14:22.362 }, 00:14:22.362 { 00:14:22.362 "name": "BaseBdev2", 00:14:22.362 "uuid": "61c1185b-4321-52bc-a292-21f1b48ec775", 00:14:22.362 "is_configured": true, 00:14:22.362 "data_offset": 2048, 00:14:22.362 "data_size": 63488 00:14:22.362 } 00:14:22.362 ] 00:14:22.362 }' 00:14:22.362 05:42:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:22.362 05:42:37 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:14:22.928 
05:42:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:14:22.928 05:42:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:14:22.928 [2024-07-26 05:42:37.757809] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x264e9b0 00:14:23.863 05:42:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:14:24.121 05:42:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:14:24.121 05:42:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ concat = \r\a\i\d\1 ]] 00:14:24.121 05:42:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=2 00:14:24.121 05:42:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:14:24.121 05:42:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:24.121 05:42:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:24.121 05:42:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:24.121 05:42:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:24.121 05:42:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:14:24.121 05:42:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:24.121 05:42:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:24.121 05:42:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 
00:14:24.121 05:42:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:24.121 05:42:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:24.121 05:42:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:24.380 05:42:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:24.380 "name": "raid_bdev1", 00:14:24.380 "uuid": "ded3afd1-9845-4f66-8e6c-ce6cdea9a063", 00:14:24.380 "strip_size_kb": 64, 00:14:24.380 "state": "online", 00:14:24.380 "raid_level": "concat", 00:14:24.380 "superblock": true, 00:14:24.380 "num_base_bdevs": 2, 00:14:24.380 "num_base_bdevs_discovered": 2, 00:14:24.380 "num_base_bdevs_operational": 2, 00:14:24.380 "base_bdevs_list": [ 00:14:24.380 { 00:14:24.380 "name": "BaseBdev1", 00:14:24.380 "uuid": "f220371a-2482-5412-85df-bbc62112efb5", 00:14:24.380 "is_configured": true, 00:14:24.380 "data_offset": 2048, 00:14:24.380 "data_size": 63488 00:14:24.380 }, 00:14:24.380 { 00:14:24.380 "name": "BaseBdev2", 00:14:24.380 "uuid": "61c1185b-4321-52bc-a292-21f1b48ec775", 00:14:24.380 "is_configured": true, 00:14:24.380 "data_offset": 2048, 00:14:24.380 "data_size": 63488 00:14:24.380 } 00:14:24.380 ] 00:14:24.380 }' 00:14:24.380 05:42:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:24.380 05:42:39 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:14:24.947 05:42:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:14:25.206 [2024-07-26 05:42:39.938920] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:14:25.206 [2024-07-26 05:42:39.938957] bdev_raid.c:1870:raid_bdev_deconfigure: 
*DEBUG*: raid bdev state changing from online to offline 00:14:25.206 [2024-07-26 05:42:39.942136] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:25.206 [2024-07-26 05:42:39.942166] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:25.206 [2024-07-26 05:42:39.942193] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:14:25.206 [2024-07-26 05:42:39.942204] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2653320 name raid_bdev1, state offline 00:14:25.206 0 00:14:25.206 05:42:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 1139025 00:14:25.206 05:42:39 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 1139025 ']' 00:14:25.206 05:42:39 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 1139025 00:14:25.206 05:42:39 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:14:25.206 05:42:39 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:14:25.206 05:42:39 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1139025 00:14:25.206 05:42:40 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:14:25.206 05:42:40 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:14:25.206 05:42:40 bdev_raid.raid_write_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1139025' 00:14:25.206 killing process with pid 1139025 00:14:25.206 05:42:40 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 1139025 00:14:25.206 [2024-07-26 05:42:40.011169] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:14:25.206 05:42:40 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 1139025 
00:14:25.206 [2024-07-26 05:42:40.023400] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:14:25.465 05:42:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.sFAZqaz2VE 00:14:25.465 05:42:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:14:25.465 05:42:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:14:25.465 05:42:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.46 00:14:25.465 05:42:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy concat 00:14:25.465 05:42:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:14:25.465 05:42:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:14:25.465 05:42:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.46 != \0\.\0\0 ]] 00:14:25.465 00:14:25.465 real 0m6.102s 00:14:25.465 user 0m9.519s 00:14:25.465 sys 0m1.063s 00:14:25.465 05:42:40 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:14:25.465 05:42:40 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:14:25.465 ************************************ 00:14:25.465 END TEST raid_write_error_test 00:14:25.465 ************************************ 00:14:25.465 05:42:40 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:14:25.465 05:42:40 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:14:25.465 05:42:40 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test raid1 2 false 00:14:25.465 05:42:40 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:14:25.465 05:42:40 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:14:25.465 05:42:40 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:14:25.465 ************************************ 00:14:25.465 START TEST 
raid_state_function_test 00:14:25.465 ************************************ 00:14:25.465 05:42:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test raid1 2 false 00:14:25.465 05:42:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:14:25.465 05:42:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:14:25.465 05:42:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:14:25.465 05:42:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:14:25.465 05:42:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:14:25.465 05:42:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:14:25.465 05:42:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:14:25.465 05:42:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:14:25.465 05:42:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:14:25.465 05:42:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:14:25.465 05:42:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:14:25.465 05:42:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:14:25.465 05:42:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:14:25.465 05:42:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:14:25.465 05:42:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:14:25.465 05:42:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:14:25.465 05:42:40 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:14:25.465 05:42:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:14:25.465 05:42:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:14:25.465 05:42:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:14:25.465 05:42:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:14:25.465 05:42:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:14:25.465 05:42:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=1139991 00:14:25.465 05:42:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1139991' 00:14:25.465 Process raid pid: 1139991 00:14:25.465 05:42:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:14:25.465 05:42:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 1139991 /var/tmp/spdk-raid.sock 00:14:25.465 05:42:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 1139991 ']' 00:14:25.465 05:42:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:14:25.465 05:42:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:14:25.465 05:42:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:14:25.465 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
00:14:25.465 05:42:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:14:25.465 05:42:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:25.724 [2024-07-26 05:42:40.414046] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 00:14:25.724 [2024-07-26 05:42:40.414110] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:14:25.724 [2024-07-26 05:42:40.546068] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:14:25.982 [2024-07-26 05:42:40.653281] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:14:25.982 [2024-07-26 05:42:40.720630] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:25.982 [2024-07-26 05:42:40.720679] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:26.548 05:42:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:14:26.548 05:42:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:14:26.548 05:42:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:14:26.807 [2024-07-26 05:42:41.567913] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:14:26.807 [2024-07-26 05:42:41.567955] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:14:26.807 [2024-07-26 05:42:41.567966] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:14:26.807 [2024-07-26 05:42:41.567979] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base 
bdev BaseBdev2 doesn't exist now 00:14:26.807 05:42:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:14:26.807 05:42:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:26.807 05:42:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:26.807 05:42:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:14:26.807 05:42:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:14:26.807 05:42:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:14:26.807 05:42:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:26.807 05:42:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:26.807 05:42:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:26.807 05:42:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:26.807 05:42:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:26.807 05:42:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:27.066 05:42:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:27.066 "name": "Existed_Raid", 00:14:27.066 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:27.066 "strip_size_kb": 0, 00:14:27.066 "state": "configuring", 00:14:27.066 "raid_level": "raid1", 00:14:27.066 "superblock": false, 00:14:27.066 "num_base_bdevs": 2, 00:14:27.066 "num_base_bdevs_discovered": 0, 00:14:27.066 "num_base_bdevs_operational": 2, 
00:14:27.066 "base_bdevs_list": [ 00:14:27.066 { 00:14:27.066 "name": "BaseBdev1", 00:14:27.066 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:27.066 "is_configured": false, 00:14:27.066 "data_offset": 0, 00:14:27.066 "data_size": 0 00:14:27.066 }, 00:14:27.066 { 00:14:27.066 "name": "BaseBdev2", 00:14:27.066 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:27.066 "is_configured": false, 00:14:27.066 "data_offset": 0, 00:14:27.066 "data_size": 0 00:14:27.066 } 00:14:27.066 ] 00:14:27.066 }' 00:14:27.066 05:42:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:27.066 05:42:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:27.633 05:42:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:14:27.891 [2024-07-26 05:42:42.630597] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:14:27.891 [2024-07-26 05:42:42.630625] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x820a80 name Existed_Raid, state configuring 00:14:27.891 05:42:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:14:28.149 [2024-07-26 05:42:42.811088] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:14:28.149 [2024-07-26 05:42:42.811117] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:14:28.149 [2024-07-26 05:42:42.811127] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:14:28.149 [2024-07-26 05:42:42.811139] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:14:28.149 05:42:42 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:14:28.408 [2024-07-26 05:42:43.065486] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:14:28.408 BaseBdev1 00:14:28.408 05:42:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:14:28.408 05:42:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:14:28.408 05:42:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:28.408 05:42:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:14:28.408 05:42:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:28.408 05:42:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:28.408 05:42:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:28.677 05:42:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:14:28.677 [ 00:14:28.677 { 00:14:28.677 "name": "BaseBdev1", 00:14:28.677 "aliases": [ 00:14:28.677 "690b7612-6c41-48a1-b744-b8b885d1a283" 00:14:28.677 ], 00:14:28.677 "product_name": "Malloc disk", 00:14:28.677 "block_size": 512, 00:14:28.677 "num_blocks": 65536, 00:14:28.677 "uuid": "690b7612-6c41-48a1-b744-b8b885d1a283", 00:14:28.677 "assigned_rate_limits": { 00:14:28.677 "rw_ios_per_sec": 0, 00:14:28.677 "rw_mbytes_per_sec": 0, 00:14:28.677 "r_mbytes_per_sec": 0, 00:14:28.677 "w_mbytes_per_sec": 0 00:14:28.677 }, 00:14:28.677 "claimed": true, 
00:14:28.677 "claim_type": "exclusive_write", 00:14:28.677 "zoned": false, 00:14:28.677 "supported_io_types": { 00:14:28.677 "read": true, 00:14:28.677 "write": true, 00:14:28.677 "unmap": true, 00:14:28.677 "flush": true, 00:14:28.677 "reset": true, 00:14:28.677 "nvme_admin": false, 00:14:28.677 "nvme_io": false, 00:14:28.677 "nvme_io_md": false, 00:14:28.677 "write_zeroes": true, 00:14:28.677 "zcopy": true, 00:14:28.677 "get_zone_info": false, 00:14:28.678 "zone_management": false, 00:14:28.678 "zone_append": false, 00:14:28.678 "compare": false, 00:14:28.678 "compare_and_write": false, 00:14:28.678 "abort": true, 00:14:28.678 "seek_hole": false, 00:14:28.678 "seek_data": false, 00:14:28.678 "copy": true, 00:14:28.678 "nvme_iov_md": false 00:14:28.678 }, 00:14:28.678 "memory_domains": [ 00:14:28.678 { 00:14:28.678 "dma_device_id": "system", 00:14:28.678 "dma_device_type": 1 00:14:28.678 }, 00:14:28.678 { 00:14:28.678 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:28.678 "dma_device_type": 2 00:14:28.678 } 00:14:28.678 ], 00:14:28.678 "driver_specific": {} 00:14:28.678 } 00:14:28.678 ] 00:14:28.678 05:42:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:14:28.678 05:42:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:14:28.678 05:42:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:28.678 05:42:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:28.678 05:42:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:14:28.678 05:42:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:14:28.678 05:42:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:14:28.678 05:42:43 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:28.678 05:42:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:28.678 05:42:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:28.678 05:42:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:28.678 05:42:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:28.678 05:42:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:28.935 05:42:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:28.935 "name": "Existed_Raid", 00:14:28.935 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:28.935 "strip_size_kb": 0, 00:14:28.935 "state": "configuring", 00:14:28.935 "raid_level": "raid1", 00:14:28.935 "superblock": false, 00:14:28.935 "num_base_bdevs": 2, 00:14:28.935 "num_base_bdevs_discovered": 1, 00:14:28.935 "num_base_bdevs_operational": 2, 00:14:28.935 "base_bdevs_list": [ 00:14:28.935 { 00:14:28.935 "name": "BaseBdev1", 00:14:28.935 "uuid": "690b7612-6c41-48a1-b744-b8b885d1a283", 00:14:28.935 "is_configured": true, 00:14:28.935 "data_offset": 0, 00:14:28.935 "data_size": 65536 00:14:28.935 }, 00:14:28.935 { 00:14:28.935 "name": "BaseBdev2", 00:14:28.935 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:28.935 "is_configured": false, 00:14:28.935 "data_offset": 0, 00:14:28.935 "data_size": 0 00:14:28.935 } 00:14:28.935 ] 00:14:28.935 }' 00:14:28.935 05:42:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:28.935 05:42:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:29.501 05:42:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:14:29.759 [2024-07-26 05:42:44.625619] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:14:29.759 [2024-07-26 05:42:44.625662] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x820350 name Existed_Raid, state configuring 00:14:29.759 05:42:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:14:30.017 [2024-07-26 05:42:44.870280] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:14:30.017 [2024-07-26 05:42:44.871755] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:14:30.017 [2024-07-26 05:42:44.871785] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:14:30.017 05:42:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:14:30.017 05:42:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:14:30.017 05:42:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:14:30.017 05:42:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:30.017 05:42:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:30.017 05:42:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:14:30.017 05:42:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:14:30.017 05:42:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:14:30.017 05:42:44 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:30.017 05:42:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:30.017 05:42:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:30.017 05:42:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:30.017 05:42:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:30.017 05:42:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:30.276 05:42:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:30.276 "name": "Existed_Raid", 00:14:30.276 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:30.276 "strip_size_kb": 0, 00:14:30.276 "state": "configuring", 00:14:30.276 "raid_level": "raid1", 00:14:30.276 "superblock": false, 00:14:30.276 "num_base_bdevs": 2, 00:14:30.276 "num_base_bdevs_discovered": 1, 00:14:30.276 "num_base_bdevs_operational": 2, 00:14:30.276 "base_bdevs_list": [ 00:14:30.276 { 00:14:30.276 "name": "BaseBdev1", 00:14:30.276 "uuid": "690b7612-6c41-48a1-b744-b8b885d1a283", 00:14:30.276 "is_configured": true, 00:14:30.276 "data_offset": 0, 00:14:30.276 "data_size": 65536 00:14:30.276 }, 00:14:30.276 { 00:14:30.276 "name": "BaseBdev2", 00:14:30.276 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:30.276 "is_configured": false, 00:14:30.276 "data_offset": 0, 00:14:30.276 "data_size": 0 00:14:30.276 } 00:14:30.276 ] 00:14:30.276 }' 00:14:30.276 05:42:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:30.276 05:42:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:30.866 05:42:45 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:14:31.176 [2024-07-26 05:42:45.980579] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:14:31.176 [2024-07-26 05:42:45.980616] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x821000 00:14:31.176 [2024-07-26 05:42:45.980625] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:14:31.176 [2024-07-26 05:42:45.980826] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x73b0c0 00:14:31.176 [2024-07-26 05:42:45.980948] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x821000 00:14:31.176 [2024-07-26 05:42:45.980958] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x821000 00:14:31.176 [2024-07-26 05:42:45.981123] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:31.176 BaseBdev2 00:14:31.176 05:42:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:14:31.176 05:42:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:14:31.176 05:42:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:31.176 05:42:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:14:31.176 05:42:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:31.176 05:42:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:31.176 05:42:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:31.434 05:42:46 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:14:31.692 [ 00:14:31.692 { 00:14:31.692 "name": "BaseBdev2", 00:14:31.692 "aliases": [ 00:14:31.692 "8ff00ee8-61c9-4f02-ba15-c0d4282b410a" 00:14:31.692 ], 00:14:31.692 "product_name": "Malloc disk", 00:14:31.692 "block_size": 512, 00:14:31.692 "num_blocks": 65536, 00:14:31.692 "uuid": "8ff00ee8-61c9-4f02-ba15-c0d4282b410a", 00:14:31.692 "assigned_rate_limits": { 00:14:31.692 "rw_ios_per_sec": 0, 00:14:31.692 "rw_mbytes_per_sec": 0, 00:14:31.692 "r_mbytes_per_sec": 0, 00:14:31.692 "w_mbytes_per_sec": 0 00:14:31.692 }, 00:14:31.692 "claimed": true, 00:14:31.692 "claim_type": "exclusive_write", 00:14:31.692 "zoned": false, 00:14:31.692 "supported_io_types": { 00:14:31.692 "read": true, 00:14:31.692 "write": true, 00:14:31.692 "unmap": true, 00:14:31.692 "flush": true, 00:14:31.692 "reset": true, 00:14:31.692 "nvme_admin": false, 00:14:31.692 "nvme_io": false, 00:14:31.692 "nvme_io_md": false, 00:14:31.692 "write_zeroes": true, 00:14:31.692 "zcopy": true, 00:14:31.693 "get_zone_info": false, 00:14:31.693 "zone_management": false, 00:14:31.693 "zone_append": false, 00:14:31.693 "compare": false, 00:14:31.693 "compare_and_write": false, 00:14:31.693 "abort": true, 00:14:31.693 "seek_hole": false, 00:14:31.693 "seek_data": false, 00:14:31.693 "copy": true, 00:14:31.693 "nvme_iov_md": false 00:14:31.693 }, 00:14:31.693 "memory_domains": [ 00:14:31.693 { 00:14:31.693 "dma_device_id": "system", 00:14:31.693 "dma_device_type": 1 00:14:31.693 }, 00:14:31.693 { 00:14:31.693 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:31.693 "dma_device_type": 2 00:14:31.693 } 00:14:31.693 ], 00:14:31.693 "driver_specific": {} 00:14:31.693 } 00:14:31.693 ] 00:14:31.693 05:42:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:14:31.693 05:42:46 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@265 -- # (( i++ )) 00:14:31.693 05:42:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:14:31.693 05:42:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:14:31.693 05:42:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:31.693 05:42:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:31.693 05:42:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:14:31.693 05:42:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:14:31.693 05:42:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:14:31.693 05:42:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:31.693 05:42:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:31.693 05:42:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:31.693 05:42:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:31.693 05:42:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:31.693 05:42:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:31.951 05:42:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:31.951 "name": "Existed_Raid", 00:14:31.951 "uuid": "97912419-c22e-4004-a3d4-52c7806e696b", 00:14:31.951 "strip_size_kb": 0, 00:14:31.951 "state": "online", 00:14:31.951 "raid_level": "raid1", 00:14:31.951 "superblock": false, 00:14:31.951 "num_base_bdevs": 
2, 00:14:31.951 "num_base_bdevs_discovered": 2, 00:14:31.951 "num_base_bdevs_operational": 2, 00:14:31.951 "base_bdevs_list": [ 00:14:31.951 { 00:14:31.951 "name": "BaseBdev1", 00:14:31.951 "uuid": "690b7612-6c41-48a1-b744-b8b885d1a283", 00:14:31.951 "is_configured": true, 00:14:31.951 "data_offset": 0, 00:14:31.951 "data_size": 65536 00:14:31.951 }, 00:14:31.951 { 00:14:31.951 "name": "BaseBdev2", 00:14:31.951 "uuid": "8ff00ee8-61c9-4f02-ba15-c0d4282b410a", 00:14:31.951 "is_configured": true, 00:14:31.951 "data_offset": 0, 00:14:31.951 "data_size": 65536 00:14:31.951 } 00:14:31.951 ] 00:14:31.951 }' 00:14:31.951 05:42:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:31.951 05:42:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:32.518 05:42:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:14:32.518 05:42:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:14:32.518 05:42:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:14:32.518 05:42:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:14:32.518 05:42:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:14:32.518 05:42:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:14:32.518 05:42:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:14:32.518 05:42:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:14:32.776 [2024-07-26 05:42:47.581247] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:32.777 05:42:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # 
raid_bdev_info='{ 00:14:32.777 "name": "Existed_Raid", 00:14:32.777 "aliases": [ 00:14:32.777 "97912419-c22e-4004-a3d4-52c7806e696b" 00:14:32.777 ], 00:14:32.777 "product_name": "Raid Volume", 00:14:32.777 "block_size": 512, 00:14:32.777 "num_blocks": 65536, 00:14:32.777 "uuid": "97912419-c22e-4004-a3d4-52c7806e696b", 00:14:32.777 "assigned_rate_limits": { 00:14:32.777 "rw_ios_per_sec": 0, 00:14:32.777 "rw_mbytes_per_sec": 0, 00:14:32.777 "r_mbytes_per_sec": 0, 00:14:32.777 "w_mbytes_per_sec": 0 00:14:32.777 }, 00:14:32.777 "claimed": false, 00:14:32.777 "zoned": false, 00:14:32.777 "supported_io_types": { 00:14:32.777 "read": true, 00:14:32.777 "write": true, 00:14:32.777 "unmap": false, 00:14:32.777 "flush": false, 00:14:32.777 "reset": true, 00:14:32.777 "nvme_admin": false, 00:14:32.777 "nvme_io": false, 00:14:32.777 "nvme_io_md": false, 00:14:32.777 "write_zeroes": true, 00:14:32.777 "zcopy": false, 00:14:32.777 "get_zone_info": false, 00:14:32.777 "zone_management": false, 00:14:32.777 "zone_append": false, 00:14:32.777 "compare": false, 00:14:32.777 "compare_and_write": false, 00:14:32.777 "abort": false, 00:14:32.777 "seek_hole": false, 00:14:32.777 "seek_data": false, 00:14:32.777 "copy": false, 00:14:32.777 "nvme_iov_md": false 00:14:32.777 }, 00:14:32.777 "memory_domains": [ 00:14:32.777 { 00:14:32.777 "dma_device_id": "system", 00:14:32.777 "dma_device_type": 1 00:14:32.777 }, 00:14:32.777 { 00:14:32.777 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:32.777 "dma_device_type": 2 00:14:32.777 }, 00:14:32.777 { 00:14:32.777 "dma_device_id": "system", 00:14:32.777 "dma_device_type": 1 00:14:32.777 }, 00:14:32.777 { 00:14:32.777 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:32.777 "dma_device_type": 2 00:14:32.777 } 00:14:32.777 ], 00:14:32.777 "driver_specific": { 00:14:32.777 "raid": { 00:14:32.777 "uuid": "97912419-c22e-4004-a3d4-52c7806e696b", 00:14:32.777 "strip_size_kb": 0, 00:14:32.777 "state": "online", 00:14:32.777 "raid_level": "raid1", 
00:14:32.777 "superblock": false, 00:14:32.777 "num_base_bdevs": 2, 00:14:32.777 "num_base_bdevs_discovered": 2, 00:14:32.777 "num_base_bdevs_operational": 2, 00:14:32.777 "base_bdevs_list": [ 00:14:32.777 { 00:14:32.777 "name": "BaseBdev1", 00:14:32.777 "uuid": "690b7612-6c41-48a1-b744-b8b885d1a283", 00:14:32.777 "is_configured": true, 00:14:32.777 "data_offset": 0, 00:14:32.777 "data_size": 65536 00:14:32.777 }, 00:14:32.777 { 00:14:32.777 "name": "BaseBdev2", 00:14:32.777 "uuid": "8ff00ee8-61c9-4f02-ba15-c0d4282b410a", 00:14:32.777 "is_configured": true, 00:14:32.777 "data_offset": 0, 00:14:32.777 "data_size": 65536 00:14:32.777 } 00:14:32.777 ] 00:14:32.777 } 00:14:32.777 } 00:14:32.777 }' 00:14:32.777 05:42:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:14:32.777 05:42:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:14:32.777 BaseBdev2' 00:14:32.777 05:42:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:32.777 05:42:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:14:32.777 05:42:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:33.035 05:42:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:33.035 "name": "BaseBdev1", 00:14:33.035 "aliases": [ 00:14:33.035 "690b7612-6c41-48a1-b744-b8b885d1a283" 00:14:33.035 ], 00:14:33.035 "product_name": "Malloc disk", 00:14:33.035 "block_size": 512, 00:14:33.035 "num_blocks": 65536, 00:14:33.035 "uuid": "690b7612-6c41-48a1-b744-b8b885d1a283", 00:14:33.035 "assigned_rate_limits": { 00:14:33.035 "rw_ios_per_sec": 0, 00:14:33.035 "rw_mbytes_per_sec": 0, 00:14:33.035 "r_mbytes_per_sec": 0, 00:14:33.035 
"w_mbytes_per_sec": 0 00:14:33.035 }, 00:14:33.035 "claimed": true, 00:14:33.035 "claim_type": "exclusive_write", 00:14:33.035 "zoned": false, 00:14:33.035 "supported_io_types": { 00:14:33.035 "read": true, 00:14:33.035 "write": true, 00:14:33.035 "unmap": true, 00:14:33.035 "flush": true, 00:14:33.035 "reset": true, 00:14:33.035 "nvme_admin": false, 00:14:33.035 "nvme_io": false, 00:14:33.035 "nvme_io_md": false, 00:14:33.035 "write_zeroes": true, 00:14:33.035 "zcopy": true, 00:14:33.035 "get_zone_info": false, 00:14:33.035 "zone_management": false, 00:14:33.035 "zone_append": false, 00:14:33.035 "compare": false, 00:14:33.035 "compare_and_write": false, 00:14:33.035 "abort": true, 00:14:33.035 "seek_hole": false, 00:14:33.035 "seek_data": false, 00:14:33.035 "copy": true, 00:14:33.035 "nvme_iov_md": false 00:14:33.035 }, 00:14:33.035 "memory_domains": [ 00:14:33.035 { 00:14:33.035 "dma_device_id": "system", 00:14:33.035 "dma_device_type": 1 00:14:33.035 }, 00:14:33.035 { 00:14:33.035 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:33.035 "dma_device_type": 2 00:14:33.035 } 00:14:33.035 ], 00:14:33.035 "driver_specific": {} 00:14:33.035 }' 00:14:33.035 05:42:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:33.035 05:42:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:33.295 05:42:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:33.295 05:42:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:33.295 05:42:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:33.295 05:42:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:33.295 05:42:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:33.295 05:42:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:33.295 
05:42:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:33.295 05:42:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:33.295 05:42:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:33.295 05:42:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:33.295 05:42:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:33.295 05:42:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:14:33.553 05:42:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:33.553 05:42:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:33.553 "name": "BaseBdev2", 00:14:33.553 "aliases": [ 00:14:33.553 "8ff00ee8-61c9-4f02-ba15-c0d4282b410a" 00:14:33.553 ], 00:14:33.553 "product_name": "Malloc disk", 00:14:33.553 "block_size": 512, 00:14:33.553 "num_blocks": 65536, 00:14:33.553 "uuid": "8ff00ee8-61c9-4f02-ba15-c0d4282b410a", 00:14:33.553 "assigned_rate_limits": { 00:14:33.553 "rw_ios_per_sec": 0, 00:14:33.553 "rw_mbytes_per_sec": 0, 00:14:33.553 "r_mbytes_per_sec": 0, 00:14:33.553 "w_mbytes_per_sec": 0 00:14:33.553 }, 00:14:33.553 "claimed": true, 00:14:33.553 "claim_type": "exclusive_write", 00:14:33.553 "zoned": false, 00:14:33.553 "supported_io_types": { 00:14:33.554 "read": true, 00:14:33.554 "write": true, 00:14:33.554 "unmap": true, 00:14:33.554 "flush": true, 00:14:33.554 "reset": true, 00:14:33.554 "nvme_admin": false, 00:14:33.554 "nvme_io": false, 00:14:33.554 "nvme_io_md": false, 00:14:33.554 "write_zeroes": true, 00:14:33.554 "zcopy": true, 00:14:33.554 "get_zone_info": false, 00:14:33.554 "zone_management": false, 00:14:33.554 "zone_append": false, 00:14:33.554 "compare": 
false, 00:14:33.554 "compare_and_write": false, 00:14:33.554 "abort": true, 00:14:33.554 "seek_hole": false, 00:14:33.554 "seek_data": false, 00:14:33.554 "copy": true, 00:14:33.554 "nvme_iov_md": false 00:14:33.554 }, 00:14:33.554 "memory_domains": [ 00:14:33.554 { 00:14:33.554 "dma_device_id": "system", 00:14:33.554 "dma_device_type": 1 00:14:33.554 }, 00:14:33.554 { 00:14:33.554 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:33.554 "dma_device_type": 2 00:14:33.554 } 00:14:33.554 ], 00:14:33.554 "driver_specific": {} 00:14:33.554 }' 00:14:33.554 05:42:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:33.812 05:42:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:33.812 05:42:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:33.812 05:42:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:33.812 05:42:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:33.812 05:42:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:33.812 05:42:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:33.812 05:42:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:33.812 05:42:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:33.812 05:42:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:34.070 05:42:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:34.070 05:42:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:34.070 05:42:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:14:34.638 
[2024-07-26 05:42:49.289565] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:14:34.638 05:42:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:14:34.638 05:42:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:14:34.638 05:42:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:14:34.638 05:42:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@214 -- # return 0 00:14:34.638 05:42:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:14:34.638 05:42:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 1 00:14:34.638 05:42:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:34.638 05:42:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:34.638 05:42:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:14:34.638 05:42:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:14:34.638 05:42:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:14:34.638 05:42:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:34.638 05:42:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:34.638 05:42:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:34.638 05:42:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:34.638 05:42:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:34.638 05:42:49 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:34.896 05:42:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:34.896 "name": "Existed_Raid", 00:14:34.896 "uuid": "97912419-c22e-4004-a3d4-52c7806e696b", 00:14:34.896 "strip_size_kb": 0, 00:14:34.896 "state": "online", 00:14:34.896 "raid_level": "raid1", 00:14:34.896 "superblock": false, 00:14:34.896 "num_base_bdevs": 2, 00:14:34.896 "num_base_bdevs_discovered": 1, 00:14:34.896 "num_base_bdevs_operational": 1, 00:14:34.896 "base_bdevs_list": [ 00:14:34.896 { 00:14:34.896 "name": null, 00:14:34.896 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:34.896 "is_configured": false, 00:14:34.896 "data_offset": 0, 00:14:34.896 "data_size": 65536 00:14:34.896 }, 00:14:34.896 { 00:14:34.896 "name": "BaseBdev2", 00:14:34.896 "uuid": "8ff00ee8-61c9-4f02-ba15-c0d4282b410a", 00:14:34.896 "is_configured": true, 00:14:34.896 "data_offset": 0, 00:14:34.896 "data_size": 65536 00:14:34.896 } 00:14:34.896 ] 00:14:34.896 }' 00:14:34.896 05:42:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:34.896 05:42:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:35.462 05:42:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:14:35.462 05:42:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:14:35.462 05:42:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:35.462 05:42:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:14:35.721 05:42:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:14:35.721 05:42:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- 
# '[' Existed_Raid '!=' Existed_Raid ']' 00:14:35.721 05:42:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:14:35.979 [2024-07-26 05:42:50.867649] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:14:35.979 [2024-07-26 05:42:50.867732] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:14:35.979 [2024-07-26 05:42:50.880234] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:35.979 [2024-07-26 05:42:50.880266] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:14:35.979 [2024-07-26 05:42:50.880277] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x821000 name Existed_Raid, state offline 00:14:36.237 05:42:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:14:36.237 05:42:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:14:36.237 05:42:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:36.238 05:42:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:14:36.497 05:42:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:14:36.497 05:42:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:14:36.497 05:42:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:14:36.497 05:42:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 1139991 00:14:36.497 05:42:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 1139991 ']' 00:14:36.497 05:42:51 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 1139991 00:14:36.497 05:42:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:14:36.497 05:42:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:14:36.497 05:42:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1139991 00:14:36.497 05:42:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:14:36.497 05:42:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:14:36.497 05:42:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1139991' 00:14:36.497 killing process with pid 1139991 00:14:36.497 05:42:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 1139991 00:14:36.497 [2024-07-26 05:42:51.211652] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:14:36.497 05:42:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 1139991 00:14:36.497 [2024-07-26 05:42:51.212521] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:14:36.755 05:42:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:14:36.755 00:14:36.755 real 0m11.082s 00:14:36.755 user 0m19.716s 00:14:36.755 sys 0m2.071s 00:14:36.755 05:42:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:14:36.755 05:42:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:36.755 ************************************ 00:14:36.755 END TEST raid_state_function_test 00:14:36.755 ************************************ 00:14:36.755 05:42:51 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:14:36.755 05:42:51 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test 
raid_state_function_test_sb raid_state_function_test raid1 2 true 00:14:36.755 05:42:51 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:14:36.755 05:42:51 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:14:36.755 05:42:51 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:14:36.755 ************************************ 00:14:36.755 START TEST raid_state_function_test_sb 00:14:36.755 ************************************ 00:14:36.755 05:42:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test raid1 2 true 00:14:36.755 05:42:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:14:36.755 05:42:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:14:36.755 05:42:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:14:36.756 05:42:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:14:36.756 05:42:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:14:36.756 05:42:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:14:36.756 05:42:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:14:36.756 05:42:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:14:36.756 05:42:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:14:36.756 05:42:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:14:36.756 05:42:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:14:36.756 05:42:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:14:36.756 05:42:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # 
base_bdevs=('BaseBdev1' 'BaseBdev2') 00:14:36.756 05:42:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:14:36.756 05:42:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:14:36.756 05:42:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:14:36.756 05:42:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:14:36.756 05:42:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:14:36.756 05:42:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:14:36.756 05:42:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:14:36.756 05:42:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:14:36.756 05:42:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:14:36.756 05:42:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=1141633 00:14:36.756 05:42:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1141633' 00:14:36.756 Process raid pid: 1141633 00:14:36.756 05:42:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:14:36.756 05:42:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 1141633 /var/tmp/spdk-raid.sock 00:14:36.756 05:42:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 -- # '[' -z 1141633 ']' 00:14:36.756 05:42:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:14:36.756 05:42:51 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@834 -- # local max_retries=100 00:14:36.756 05:42:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:14:36.756 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:14:36.756 05:42:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:14:36.756 05:42:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:36.756 [2024-07-26 05:42:51.584748] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 00:14:36.756 [2024-07-26 05:42:51.584819] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:14:37.014 [2024-07-26 05:42:51.717235] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:14:37.014 [2024-07-26 05:42:51.820184] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:14:37.014 [2024-07-26 05:42:51.884755] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:37.014 [2024-07-26 05:42:51.884791] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:37.949 05:42:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:14:37.949 05:42:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:14:37.949 05:42:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:14:37.949 [2024-07-26 05:42:52.747983] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with 
name: BaseBdev1 00:14:37.949 [2024-07-26 05:42:52.748021] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:14:37.949 [2024-07-26 05:42:52.748032] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:14:37.949 [2024-07-26 05:42:52.748044] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:14:37.949 05:42:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:14:37.949 05:42:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:37.949 05:42:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:37.949 05:42:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:14:37.949 05:42:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:14:37.949 05:42:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:14:37.949 05:42:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:37.949 05:42:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:37.949 05:42:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:37.949 05:42:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:37.949 05:42:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:37.949 05:42:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:38.208 05:42:53 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:38.208 "name": "Existed_Raid", 00:14:38.208 "uuid": "10a6bcd0-5995-4300-8a0c-b209d684d00e", 00:14:38.208 "strip_size_kb": 0, 00:14:38.208 "state": "configuring", 00:14:38.208 "raid_level": "raid1", 00:14:38.208 "superblock": true, 00:14:38.208 "num_base_bdevs": 2, 00:14:38.208 "num_base_bdevs_discovered": 0, 00:14:38.208 "num_base_bdevs_operational": 2, 00:14:38.208 "base_bdevs_list": [ 00:14:38.208 { 00:14:38.208 "name": "BaseBdev1", 00:14:38.208 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:38.208 "is_configured": false, 00:14:38.208 "data_offset": 0, 00:14:38.208 "data_size": 0 00:14:38.208 }, 00:14:38.208 { 00:14:38.208 "name": "BaseBdev2", 00:14:38.208 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:38.208 "is_configured": false, 00:14:38.208 "data_offset": 0, 00:14:38.208 "data_size": 0 00:14:38.208 } 00:14:38.208 ] 00:14:38.208 }' 00:14:38.208 05:42:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:38.208 05:42:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:38.773 05:42:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:14:39.031 [2024-07-26 05:42:53.842745] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:14:39.031 [2024-07-26 05:42:53.842775] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2425a80 name Existed_Raid, state configuring 00:14:39.031 05:42:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:14:39.290 [2024-07-26 05:42:54.087405] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:14:39.290 
[2024-07-26 05:42:54.087433] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:14:39.290 [2024-07-26 05:42:54.087443] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:14:39.290 [2024-07-26 05:42:54.087454] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:14:39.290 05:42:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:14:39.549 [2024-07-26 05:42:54.341991] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:14:39.549 BaseBdev1 00:14:39.549 05:42:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:14:39.549 05:42:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:14:39.549 05:42:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:39.549 05:42:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:14:39.549 05:42:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:39.549 05:42:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:39.549 05:42:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:39.807 05:42:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:14:40.065 [ 00:14:40.065 { 00:14:40.065 "name": "BaseBdev1", 00:14:40.065 "aliases": [ 00:14:40.065 
"69c64f29-523d-4a21-b91b-7663d5869640" 00:14:40.065 ], 00:14:40.065 "product_name": "Malloc disk", 00:14:40.065 "block_size": 512, 00:14:40.065 "num_blocks": 65536, 00:14:40.065 "uuid": "69c64f29-523d-4a21-b91b-7663d5869640", 00:14:40.065 "assigned_rate_limits": { 00:14:40.065 "rw_ios_per_sec": 0, 00:14:40.065 "rw_mbytes_per_sec": 0, 00:14:40.065 "r_mbytes_per_sec": 0, 00:14:40.065 "w_mbytes_per_sec": 0 00:14:40.065 }, 00:14:40.065 "claimed": true, 00:14:40.065 "claim_type": "exclusive_write", 00:14:40.065 "zoned": false, 00:14:40.065 "supported_io_types": { 00:14:40.065 "read": true, 00:14:40.065 "write": true, 00:14:40.065 "unmap": true, 00:14:40.065 "flush": true, 00:14:40.065 "reset": true, 00:14:40.065 "nvme_admin": false, 00:14:40.065 "nvme_io": false, 00:14:40.065 "nvme_io_md": false, 00:14:40.065 "write_zeroes": true, 00:14:40.065 "zcopy": true, 00:14:40.065 "get_zone_info": false, 00:14:40.065 "zone_management": false, 00:14:40.065 "zone_append": false, 00:14:40.065 "compare": false, 00:14:40.065 "compare_and_write": false, 00:14:40.065 "abort": true, 00:14:40.065 "seek_hole": false, 00:14:40.065 "seek_data": false, 00:14:40.065 "copy": true, 00:14:40.066 "nvme_iov_md": false 00:14:40.066 }, 00:14:40.066 "memory_domains": [ 00:14:40.066 { 00:14:40.066 "dma_device_id": "system", 00:14:40.066 "dma_device_type": 1 00:14:40.066 }, 00:14:40.066 { 00:14:40.066 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:40.066 "dma_device_type": 2 00:14:40.066 } 00:14:40.066 ], 00:14:40.066 "driver_specific": {} 00:14:40.066 } 00:14:40.066 ] 00:14:40.066 05:42:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:14:40.066 05:42:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:14:40.066 05:42:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:40.066 05:42:54 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:40.066 05:42:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:14:40.066 05:42:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:14:40.066 05:42:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:14:40.066 05:42:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:40.066 05:42:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:40.066 05:42:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:40.066 05:42:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:40.066 05:42:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:40.066 05:42:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:40.324 05:42:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:40.324 "name": "Existed_Raid", 00:14:40.324 "uuid": "5e1b1533-c89b-4c1a-9609-3b81af3bf572", 00:14:40.324 "strip_size_kb": 0, 00:14:40.324 "state": "configuring", 00:14:40.324 "raid_level": "raid1", 00:14:40.324 "superblock": true, 00:14:40.324 "num_base_bdevs": 2, 00:14:40.324 "num_base_bdevs_discovered": 1, 00:14:40.324 "num_base_bdevs_operational": 2, 00:14:40.324 "base_bdevs_list": [ 00:14:40.324 { 00:14:40.324 "name": "BaseBdev1", 00:14:40.324 "uuid": "69c64f29-523d-4a21-b91b-7663d5869640", 00:14:40.324 "is_configured": true, 00:14:40.324 "data_offset": 2048, 00:14:40.324 "data_size": 63488 00:14:40.324 }, 00:14:40.324 { 00:14:40.324 "name": "BaseBdev2", 00:14:40.324 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:14:40.324 "is_configured": false, 00:14:40.324 "data_offset": 0, 00:14:40.324 "data_size": 0 00:14:40.324 } 00:14:40.324 ] 00:14:40.324 }' 00:14:40.324 05:42:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:40.324 05:42:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:40.891 05:42:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:14:41.149 [2024-07-26 05:42:55.886076] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:14:41.149 [2024-07-26 05:42:55.886117] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2425350 name Existed_Raid, state configuring 00:14:41.149 05:42:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:14:41.408 [2024-07-26 05:42:56.122742] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:14:41.408 [2024-07-26 05:42:56.124231] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:14:41.408 [2024-07-26 05:42:56.124265] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:14:41.408 05:42:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:14:41.408 05:42:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:14:41.408 05:42:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:14:41.408 05:42:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 
00:14:41.408 05:42:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:41.408 05:42:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:14:41.408 05:42:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:14:41.408 05:42:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:14:41.408 05:42:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:41.408 05:42:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:41.408 05:42:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:41.408 05:42:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:41.408 05:42:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:41.408 05:42:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:41.666 05:42:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:41.666 "name": "Existed_Raid", 00:14:41.666 "uuid": "136691c1-95fb-488b-9b4f-a44e0b7a5fa2", 00:14:41.666 "strip_size_kb": 0, 00:14:41.666 "state": "configuring", 00:14:41.666 "raid_level": "raid1", 00:14:41.666 "superblock": true, 00:14:41.666 "num_base_bdevs": 2, 00:14:41.666 "num_base_bdevs_discovered": 1, 00:14:41.666 "num_base_bdevs_operational": 2, 00:14:41.666 "base_bdevs_list": [ 00:14:41.666 { 00:14:41.666 "name": "BaseBdev1", 00:14:41.666 "uuid": "69c64f29-523d-4a21-b91b-7663d5869640", 00:14:41.666 "is_configured": true, 00:14:41.666 "data_offset": 2048, 00:14:41.666 "data_size": 63488 00:14:41.666 }, 00:14:41.666 
{ 00:14:41.666 "name": "BaseBdev2", 00:14:41.666 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:41.666 "is_configured": false, 00:14:41.666 "data_offset": 0, 00:14:41.666 "data_size": 0 00:14:41.666 } 00:14:41.666 ] 00:14:41.666 }' 00:14:41.666 05:42:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:41.666 05:42:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:42.247 05:42:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:14:42.509 [2024-07-26 05:42:57.217041] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:14:42.509 [2024-07-26 05:42:57.217192] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x2426000 00:14:42.509 [2024-07-26 05:42:57.217206] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:14:42.509 [2024-07-26 05:42:57.217383] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x23400c0 00:14:42.509 [2024-07-26 05:42:57.217511] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2426000 00:14:42.509 [2024-07-26 05:42:57.217521] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x2426000 00:14:42.509 [2024-07-26 05:42:57.217612] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:42.509 BaseBdev2 00:14:42.509 05:42:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:14:42.509 05:42:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:14:42.509 05:42:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:42.509 05:42:57 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@899 -- # local i 00:14:42.509 05:42:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:42.509 05:42:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:42.509 05:42:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:42.767 05:42:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:14:43.026 [ 00:14:43.026 { 00:14:43.026 "name": "BaseBdev2", 00:14:43.026 "aliases": [ 00:14:43.026 "bc29711d-dfda-4c7b-bd97-f75d2559f098" 00:14:43.026 ], 00:14:43.026 "product_name": "Malloc disk", 00:14:43.026 "block_size": 512, 00:14:43.026 "num_blocks": 65536, 00:14:43.026 "uuid": "bc29711d-dfda-4c7b-bd97-f75d2559f098", 00:14:43.026 "assigned_rate_limits": { 00:14:43.026 "rw_ios_per_sec": 0, 00:14:43.026 "rw_mbytes_per_sec": 0, 00:14:43.026 "r_mbytes_per_sec": 0, 00:14:43.026 "w_mbytes_per_sec": 0 00:14:43.026 }, 00:14:43.026 "claimed": true, 00:14:43.026 "claim_type": "exclusive_write", 00:14:43.026 "zoned": false, 00:14:43.026 "supported_io_types": { 00:14:43.026 "read": true, 00:14:43.026 "write": true, 00:14:43.026 "unmap": true, 00:14:43.026 "flush": true, 00:14:43.026 "reset": true, 00:14:43.026 "nvme_admin": false, 00:14:43.026 "nvme_io": false, 00:14:43.026 "nvme_io_md": false, 00:14:43.026 "write_zeroes": true, 00:14:43.026 "zcopy": true, 00:14:43.026 "get_zone_info": false, 00:14:43.026 "zone_management": false, 00:14:43.026 "zone_append": false, 00:14:43.026 "compare": false, 00:14:43.026 "compare_and_write": false, 00:14:43.026 "abort": true, 00:14:43.026 "seek_hole": false, 00:14:43.026 "seek_data": false, 00:14:43.026 "copy": true, 00:14:43.026 
"nvme_iov_md": false 00:14:43.026 }, 00:14:43.026 "memory_domains": [ 00:14:43.026 { 00:14:43.026 "dma_device_id": "system", 00:14:43.026 "dma_device_type": 1 00:14:43.026 }, 00:14:43.026 { 00:14:43.026 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:43.026 "dma_device_type": 2 00:14:43.026 } 00:14:43.026 ], 00:14:43.026 "driver_specific": {} 00:14:43.026 } 00:14:43.026 ] 00:14:43.026 05:42:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:14:43.026 05:42:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:14:43.026 05:42:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:14:43.026 05:42:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:14:43.026 05:42:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:43.026 05:42:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:43.026 05:42:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:14:43.026 05:42:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:14:43.026 05:42:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:14:43.026 05:42:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:43.026 05:42:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:43.026 05:42:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:43.026 05:42:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:43.026 05:42:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:43.026 05:42:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:43.284 05:42:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:43.284 "name": "Existed_Raid", 00:14:43.284 "uuid": "136691c1-95fb-488b-9b4f-a44e0b7a5fa2", 00:14:43.284 "strip_size_kb": 0, 00:14:43.284 "state": "online", 00:14:43.284 "raid_level": "raid1", 00:14:43.284 "superblock": true, 00:14:43.285 "num_base_bdevs": 2, 00:14:43.285 "num_base_bdevs_discovered": 2, 00:14:43.285 "num_base_bdevs_operational": 2, 00:14:43.285 "base_bdevs_list": [ 00:14:43.285 { 00:14:43.285 "name": "BaseBdev1", 00:14:43.285 "uuid": "69c64f29-523d-4a21-b91b-7663d5869640", 00:14:43.285 "is_configured": true, 00:14:43.285 "data_offset": 2048, 00:14:43.285 "data_size": 63488 00:14:43.285 }, 00:14:43.285 { 00:14:43.285 "name": "BaseBdev2", 00:14:43.285 "uuid": "bc29711d-dfda-4c7b-bd97-f75d2559f098", 00:14:43.285 "is_configured": true, 00:14:43.285 "data_offset": 2048, 00:14:43.285 "data_size": 63488 00:14:43.285 } 00:14:43.285 ] 00:14:43.285 }' 00:14:43.285 05:42:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:43.285 05:42:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:43.851 05:42:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:14:43.851 05:42:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:14:43.851 05:42:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:14:43.851 05:42:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:14:43.851 05:42:58 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:14:43.851 05:42:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:14:43.851 05:42:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:14:43.851 05:42:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:14:43.851 [2024-07-26 05:42:58.721291] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:43.851 05:42:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:14:43.851 "name": "Existed_Raid", 00:14:43.851 "aliases": [ 00:14:43.851 "136691c1-95fb-488b-9b4f-a44e0b7a5fa2" 00:14:43.851 ], 00:14:43.851 "product_name": "Raid Volume", 00:14:43.851 "block_size": 512, 00:14:43.851 "num_blocks": 63488, 00:14:43.851 "uuid": "136691c1-95fb-488b-9b4f-a44e0b7a5fa2", 00:14:43.851 "assigned_rate_limits": { 00:14:43.851 "rw_ios_per_sec": 0, 00:14:43.851 "rw_mbytes_per_sec": 0, 00:14:43.851 "r_mbytes_per_sec": 0, 00:14:43.851 "w_mbytes_per_sec": 0 00:14:43.851 }, 00:14:43.851 "claimed": false, 00:14:43.851 "zoned": false, 00:14:43.851 "supported_io_types": { 00:14:43.851 "read": true, 00:14:43.851 "write": true, 00:14:43.851 "unmap": false, 00:14:43.851 "flush": false, 00:14:43.851 "reset": true, 00:14:43.851 "nvme_admin": false, 00:14:43.851 "nvme_io": false, 00:14:43.851 "nvme_io_md": false, 00:14:43.851 "write_zeroes": true, 00:14:43.851 "zcopy": false, 00:14:43.851 "get_zone_info": false, 00:14:43.851 "zone_management": false, 00:14:43.851 "zone_append": false, 00:14:43.851 "compare": false, 00:14:43.851 "compare_and_write": false, 00:14:43.851 "abort": false, 00:14:43.851 "seek_hole": false, 00:14:43.851 "seek_data": false, 00:14:43.851 "copy": false, 00:14:43.851 "nvme_iov_md": false 00:14:43.851 }, 00:14:43.851 "memory_domains": [ 00:14:43.851 { 
00:14:43.851 "dma_device_id": "system", 00:14:43.851 "dma_device_type": 1 00:14:43.851 }, 00:14:43.851 { 00:14:43.851 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:43.851 "dma_device_type": 2 00:14:43.851 }, 00:14:43.851 { 00:14:43.851 "dma_device_id": "system", 00:14:43.851 "dma_device_type": 1 00:14:43.851 }, 00:14:43.851 { 00:14:43.851 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:43.851 "dma_device_type": 2 00:14:43.851 } 00:14:43.851 ], 00:14:43.851 "driver_specific": { 00:14:43.851 "raid": { 00:14:43.851 "uuid": "136691c1-95fb-488b-9b4f-a44e0b7a5fa2", 00:14:43.851 "strip_size_kb": 0, 00:14:43.851 "state": "online", 00:14:43.851 "raid_level": "raid1", 00:14:43.851 "superblock": true, 00:14:43.851 "num_base_bdevs": 2, 00:14:43.851 "num_base_bdevs_discovered": 2, 00:14:43.851 "num_base_bdevs_operational": 2, 00:14:43.851 "base_bdevs_list": [ 00:14:43.851 { 00:14:43.851 "name": "BaseBdev1", 00:14:43.851 "uuid": "69c64f29-523d-4a21-b91b-7663d5869640", 00:14:43.851 "is_configured": true, 00:14:43.851 "data_offset": 2048, 00:14:43.851 "data_size": 63488 00:14:43.851 }, 00:14:43.851 { 00:14:43.851 "name": "BaseBdev2", 00:14:43.851 "uuid": "bc29711d-dfda-4c7b-bd97-f75d2559f098", 00:14:43.851 "is_configured": true, 00:14:43.851 "data_offset": 2048, 00:14:43.851 "data_size": 63488 00:14:43.851 } 00:14:43.851 ] 00:14:43.851 } 00:14:43.852 } 00:14:43.852 }' 00:14:43.852 05:42:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:14:44.110 05:42:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:14:44.110 BaseBdev2' 00:14:44.110 05:42:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:44.110 05:42:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_get_bdevs -b BaseBdev1 00:14:44.110 05:42:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:44.368 05:42:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:44.368 "name": "BaseBdev1", 00:14:44.368 "aliases": [ 00:14:44.368 "69c64f29-523d-4a21-b91b-7663d5869640" 00:14:44.368 ], 00:14:44.368 "product_name": "Malloc disk", 00:14:44.368 "block_size": 512, 00:14:44.368 "num_blocks": 65536, 00:14:44.368 "uuid": "69c64f29-523d-4a21-b91b-7663d5869640", 00:14:44.368 "assigned_rate_limits": { 00:14:44.368 "rw_ios_per_sec": 0, 00:14:44.368 "rw_mbytes_per_sec": 0, 00:14:44.368 "r_mbytes_per_sec": 0, 00:14:44.368 "w_mbytes_per_sec": 0 00:14:44.368 }, 00:14:44.368 "claimed": true, 00:14:44.368 "claim_type": "exclusive_write", 00:14:44.368 "zoned": false, 00:14:44.368 "supported_io_types": { 00:14:44.368 "read": true, 00:14:44.368 "write": true, 00:14:44.368 "unmap": true, 00:14:44.368 "flush": true, 00:14:44.369 "reset": true, 00:14:44.369 "nvme_admin": false, 00:14:44.369 "nvme_io": false, 00:14:44.369 "nvme_io_md": false, 00:14:44.369 "write_zeroes": true, 00:14:44.369 "zcopy": true, 00:14:44.369 "get_zone_info": false, 00:14:44.369 "zone_management": false, 00:14:44.369 "zone_append": false, 00:14:44.369 "compare": false, 00:14:44.369 "compare_and_write": false, 00:14:44.369 "abort": true, 00:14:44.369 "seek_hole": false, 00:14:44.369 "seek_data": false, 00:14:44.369 "copy": true, 00:14:44.369 "nvme_iov_md": false 00:14:44.369 }, 00:14:44.369 "memory_domains": [ 00:14:44.369 { 00:14:44.369 "dma_device_id": "system", 00:14:44.369 "dma_device_type": 1 00:14:44.369 }, 00:14:44.369 { 00:14:44.369 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:44.369 "dma_device_type": 2 00:14:44.369 } 00:14:44.369 ], 00:14:44.369 "driver_specific": {} 00:14:44.369 }' 00:14:44.369 05:42:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:44.369 05:42:59 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:44.369 05:42:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:44.369 05:42:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:44.369 05:42:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:44.369 05:42:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:44.369 05:42:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:44.369 05:42:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:44.627 05:42:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:44.627 05:42:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:44.627 05:42:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:44.627 05:42:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:44.627 05:42:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:44.627 05:42:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:44.627 05:42:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:14:44.886 05:42:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:44.886 "name": "BaseBdev2", 00:14:44.886 "aliases": [ 00:14:44.886 "bc29711d-dfda-4c7b-bd97-f75d2559f098" 00:14:44.886 ], 00:14:44.886 "product_name": "Malloc disk", 00:14:44.886 "block_size": 512, 00:14:44.886 "num_blocks": 65536, 00:14:44.886 "uuid": "bc29711d-dfda-4c7b-bd97-f75d2559f098", 00:14:44.886 
"assigned_rate_limits": { 00:14:44.886 "rw_ios_per_sec": 0, 00:14:44.886 "rw_mbytes_per_sec": 0, 00:14:44.886 "r_mbytes_per_sec": 0, 00:14:44.886 "w_mbytes_per_sec": 0 00:14:44.886 }, 00:14:44.886 "claimed": true, 00:14:44.886 "claim_type": "exclusive_write", 00:14:44.886 "zoned": false, 00:14:44.886 "supported_io_types": { 00:14:44.886 "read": true, 00:14:44.886 "write": true, 00:14:44.886 "unmap": true, 00:14:44.886 "flush": true, 00:14:44.886 "reset": true, 00:14:44.886 "nvme_admin": false, 00:14:44.886 "nvme_io": false, 00:14:44.886 "nvme_io_md": false, 00:14:44.886 "write_zeroes": true, 00:14:44.886 "zcopy": true, 00:14:44.886 "get_zone_info": false, 00:14:44.886 "zone_management": false, 00:14:44.886 "zone_append": false, 00:14:44.886 "compare": false, 00:14:44.886 "compare_and_write": false, 00:14:44.886 "abort": true, 00:14:44.886 "seek_hole": false, 00:14:44.886 "seek_data": false, 00:14:44.886 "copy": true, 00:14:44.886 "nvme_iov_md": false 00:14:44.886 }, 00:14:44.886 "memory_domains": [ 00:14:44.886 { 00:14:44.886 "dma_device_id": "system", 00:14:44.886 "dma_device_type": 1 00:14:44.886 }, 00:14:44.886 { 00:14:44.886 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:44.886 "dma_device_type": 2 00:14:44.886 } 00:14:44.886 ], 00:14:44.886 "driver_specific": {} 00:14:44.886 }' 00:14:44.886 05:42:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:44.886 05:42:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:44.886 05:42:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:44.886 05:42:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:44.886 05:42:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:45.192 05:42:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:45.192 05:42:59 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:45.192 05:42:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:45.192 05:42:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:45.192 05:42:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:45.192 05:42:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:45.193 05:42:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:45.193 05:42:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:14:45.462 [2024-07-26 05:43:00.197001] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:14:45.462 05:43:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:14:45.462 05:43:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:14:45.462 05:43:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:14:45.462 05:43:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@214 -- # return 0 00:14:45.462 05:43:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:14:45.462 05:43:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 1 00:14:45.462 05:43:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:45.462 05:43:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:45.462 05:43:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:14:45.462 05:43:00 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@119 -- # local strip_size=0 00:14:45.462 05:43:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:14:45.462 05:43:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:45.462 05:43:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:45.462 05:43:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:45.462 05:43:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:45.462 05:43:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:45.462 05:43:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:45.720 05:43:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:45.720 "name": "Existed_Raid", 00:14:45.720 "uuid": "136691c1-95fb-488b-9b4f-a44e0b7a5fa2", 00:14:45.720 "strip_size_kb": 0, 00:14:45.720 "state": "online", 00:14:45.720 "raid_level": "raid1", 00:14:45.720 "superblock": true, 00:14:45.720 "num_base_bdevs": 2, 00:14:45.720 "num_base_bdevs_discovered": 1, 00:14:45.720 "num_base_bdevs_operational": 1, 00:14:45.720 "base_bdevs_list": [ 00:14:45.720 { 00:14:45.720 "name": null, 00:14:45.720 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:45.720 "is_configured": false, 00:14:45.720 "data_offset": 2048, 00:14:45.720 "data_size": 63488 00:14:45.720 }, 00:14:45.720 { 00:14:45.720 "name": "BaseBdev2", 00:14:45.720 "uuid": "bc29711d-dfda-4c7b-bd97-f75d2559f098", 00:14:45.720 "is_configured": true, 00:14:45.720 "data_offset": 2048, 00:14:45.720 "data_size": 63488 00:14:45.720 } 00:14:45.720 ] 00:14:45.720 }' 00:14:45.720 05:43:00 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:45.720 05:43:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:46.287 05:43:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:14:46.287 05:43:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:14:46.287 05:43:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:46.287 05:43:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:14:46.546 05:43:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:14:46.546 05:43:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:14:46.546 05:43:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:14:46.546 [2024-07-26 05:43:01.421358] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:14:46.546 [2024-07-26 05:43:01.421443] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:14:46.546 [2024-07-26 05:43:01.432317] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:46.546 [2024-07-26 05:43:01.432354] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:14:46.546 [2024-07-26 05:43:01.432365] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2426000 name Existed_Raid, state offline 00:14:46.805 05:43:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:14:46.805 05:43:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < 
num_base_bdevs )) 00:14:46.805 05:43:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:46.805 05:43:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:14:46.805 05:43:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:14:46.805 05:43:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:14:46.805 05:43:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:14:46.805 05:43:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 1141633 00:14:46.805 05:43:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 1141633 ']' 00:14:46.805 05:43:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # kill -0 1141633 00:14:46.805 05:43:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:14:47.062 05:43:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:14:47.063 05:43:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1141633 00:14:47.063 05:43:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:14:47.063 05:43:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:14:47.063 05:43:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1141633' 00:14:47.063 killing process with pid 1141633 00:14:47.063 05:43:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 1141633 00:14:47.063 [2024-07-26 05:43:01.760784] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: 
raid_bdev_fini_start 00:14:47.063 05:43:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 1141633 00:14:47.063 [2024-07-26 05:43:01.761686] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:14:47.321 05:43:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:14:47.321 00:14:47.321 real 0m10.468s 00:14:47.321 user 0m18.548s 00:14:47.321 sys 0m1.991s 00:14:47.321 05:43:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:14:47.321 05:43:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:47.321 ************************************ 00:14:47.321 END TEST raid_state_function_test_sb 00:14:47.321 ************************************ 00:14:47.321 05:43:02 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:14:47.321 05:43:02 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test raid1 2 00:14:47.321 05:43:02 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:14:47.321 05:43:02 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:14:47.321 05:43:02 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:14:47.321 ************************************ 00:14:47.321 START TEST raid_superblock_test 00:14:47.321 ************************************ 00:14:47.321 05:43:02 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test raid1 2 00:14:47.321 05:43:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=raid1 00:14:47.321 05:43:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=2 00:14:47.321 05:43:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:14:47.321 05:43:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:14:47.321 05:43:02 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:14:47.321 05:43:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:14:47.321 05:43:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:14:47.321 05:43:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:14:47.321 05:43:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:14:47.321 05:43:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:14:47.321 05:43:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:14:47.321 05:43:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:14:47.321 05:43:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:14:47.321 05:43:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' raid1 '!=' raid1 ']' 00:14:47.321 05:43:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@407 -- # strip_size=0 00:14:47.321 05:43:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=1143268 00:14:47.321 05:43:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 1143268 /var/tmp/spdk-raid.sock 00:14:47.321 05:43:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:14:47.321 05:43:02 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 1143268 ']' 00:14:47.321 05:43:02 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:14:47.321 05:43:02 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:14:47.321 05:43:02 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start 
up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:14:47.321 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:14:47.321 05:43:02 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:14:47.321 05:43:02 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:47.321 [2024-07-26 05:43:02.129046] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 00:14:47.321 [2024-07-26 05:43:02.129104] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1143268 ] 00:14:47.579 [2024-07-26 05:43:02.242647] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:14:47.579 [2024-07-26 05:43:02.346419] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:14:47.579 [2024-07-26 05:43:02.400811] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:47.579 [2024-07-26 05:43:02.400855] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:48.511 05:43:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:14:48.511 05:43:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:14:48.511 05:43:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:14:48.511 05:43:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:14:48.511 05:43:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:14:48.511 05:43:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:14:48.511 05:43:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:14:48.511 
05:43:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:14:48.511 05:43:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:14:48.511 05:43:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:14:48.511 05:43:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:14:48.511 malloc1 00:14:48.511 05:43:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:14:48.769 [2024-07-26 05:43:03.548615] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:14:48.769 [2024-07-26 05:43:03.548667] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:48.769 [2024-07-26 05:43:03.548687] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1333570 00:14:48.769 [2024-07-26 05:43:03.548700] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:48.769 [2024-07-26 05:43:03.550300] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:48.769 [2024-07-26 05:43:03.550329] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:14:48.769 pt1 00:14:48.769 05:43:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:14:48.769 05:43:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:14:48.769 05:43:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:14:48.769 05:43:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:14:48.769 05:43:03 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:14:48.769 05:43:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:14:48.769 05:43:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:14:48.769 05:43:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:14:48.769 05:43:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:14:49.027 malloc2 00:14:49.027 05:43:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:14:49.286 [2024-07-26 05:43:04.042807] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:14:49.286 [2024-07-26 05:43:04.042852] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:49.286 [2024-07-26 05:43:04.042869] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1334970 00:14:49.286 [2024-07-26 05:43:04.042881] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:49.286 [2024-07-26 05:43:04.044425] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:49.286 [2024-07-26 05:43:04.044453] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:14:49.286 pt2 00:14:49.286 05:43:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:14:49.286 05:43:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:14:49.286 05:43:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2' -n raid_bdev1 -s 00:14:49.545 [2024-07-26 05:43:04.231331] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:14:49.545 [2024-07-26 05:43:04.232520] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:14:49.545 [2024-07-26 05:43:04.232675] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x14d7270 00:14:49.545 [2024-07-26 05:43:04.232689] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:14:49.545 [2024-07-26 05:43:04.232884] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x132b0e0 00:14:49.545 [2024-07-26 05:43:04.233030] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x14d7270 00:14:49.546 [2024-07-26 05:43:04.233040] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x14d7270 00:14:49.546 [2024-07-26 05:43:04.233134] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:49.546 05:43:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:14:49.546 05:43:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:49.546 05:43:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:49.546 05:43:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:14:49.546 05:43:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:14:49.546 05:43:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:14:49.546 05:43:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:49.546 05:43:04 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:49.546 05:43:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:49.546 05:43:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:49.546 05:43:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:49.546 05:43:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:49.546 05:43:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:49.546 "name": "raid_bdev1", 00:14:49.546 "uuid": "54fb1632-831c-44a5-861e-68d066492451", 00:14:49.546 "strip_size_kb": 0, 00:14:49.546 "state": "online", 00:14:49.546 "raid_level": "raid1", 00:14:49.546 "superblock": true, 00:14:49.546 "num_base_bdevs": 2, 00:14:49.546 "num_base_bdevs_discovered": 2, 00:14:49.546 "num_base_bdevs_operational": 2, 00:14:49.546 "base_bdevs_list": [ 00:14:49.546 { 00:14:49.546 "name": "pt1", 00:14:49.546 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:49.546 "is_configured": true, 00:14:49.546 "data_offset": 2048, 00:14:49.546 "data_size": 63488 00:14:49.546 }, 00:14:49.546 { 00:14:49.546 "name": "pt2", 00:14:49.546 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:49.546 "is_configured": true, 00:14:49.546 "data_offset": 2048, 00:14:49.546 "data_size": 63488 00:14:49.546 } 00:14:49.546 ] 00:14:49.546 }' 00:14:49.546 05:43:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:49.546 05:43:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:50.480 05:43:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:14:50.480 05:43:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:14:50.480 05:43:05 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:14:50.480 05:43:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:14:50.480 05:43:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:14:50.480 05:43:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:14:50.480 05:43:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:14:50.480 05:43:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:14:50.480 [2024-07-26 05:43:05.262276] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:50.480 05:43:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:14:50.480 "name": "raid_bdev1", 00:14:50.480 "aliases": [ 00:14:50.480 "54fb1632-831c-44a5-861e-68d066492451" 00:14:50.480 ], 00:14:50.480 "product_name": "Raid Volume", 00:14:50.480 "block_size": 512, 00:14:50.480 "num_blocks": 63488, 00:14:50.480 "uuid": "54fb1632-831c-44a5-861e-68d066492451", 00:14:50.480 "assigned_rate_limits": { 00:14:50.480 "rw_ios_per_sec": 0, 00:14:50.480 "rw_mbytes_per_sec": 0, 00:14:50.480 "r_mbytes_per_sec": 0, 00:14:50.480 "w_mbytes_per_sec": 0 00:14:50.480 }, 00:14:50.480 "claimed": false, 00:14:50.480 "zoned": false, 00:14:50.480 "supported_io_types": { 00:14:50.480 "read": true, 00:14:50.480 "write": true, 00:14:50.480 "unmap": false, 00:14:50.480 "flush": false, 00:14:50.480 "reset": true, 00:14:50.480 "nvme_admin": false, 00:14:50.480 "nvme_io": false, 00:14:50.480 "nvme_io_md": false, 00:14:50.480 "write_zeroes": true, 00:14:50.480 "zcopy": false, 00:14:50.480 "get_zone_info": false, 00:14:50.480 "zone_management": false, 00:14:50.480 "zone_append": false, 00:14:50.480 "compare": false, 00:14:50.480 "compare_and_write": false, 00:14:50.480 
"abort": false, 00:14:50.480 "seek_hole": false, 00:14:50.480 "seek_data": false, 00:14:50.480 "copy": false, 00:14:50.480 "nvme_iov_md": false 00:14:50.480 }, 00:14:50.480 "memory_domains": [ 00:14:50.480 { 00:14:50.480 "dma_device_id": "system", 00:14:50.480 "dma_device_type": 1 00:14:50.480 }, 00:14:50.480 { 00:14:50.480 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:50.480 "dma_device_type": 2 00:14:50.480 }, 00:14:50.480 { 00:14:50.480 "dma_device_id": "system", 00:14:50.480 "dma_device_type": 1 00:14:50.480 }, 00:14:50.480 { 00:14:50.480 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:50.480 "dma_device_type": 2 00:14:50.480 } 00:14:50.480 ], 00:14:50.480 "driver_specific": { 00:14:50.480 "raid": { 00:14:50.480 "uuid": "54fb1632-831c-44a5-861e-68d066492451", 00:14:50.480 "strip_size_kb": 0, 00:14:50.480 "state": "online", 00:14:50.480 "raid_level": "raid1", 00:14:50.480 "superblock": true, 00:14:50.480 "num_base_bdevs": 2, 00:14:50.480 "num_base_bdevs_discovered": 2, 00:14:50.480 "num_base_bdevs_operational": 2, 00:14:50.480 "base_bdevs_list": [ 00:14:50.480 { 00:14:50.480 "name": "pt1", 00:14:50.480 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:50.480 "is_configured": true, 00:14:50.480 "data_offset": 2048, 00:14:50.480 "data_size": 63488 00:14:50.480 }, 00:14:50.480 { 00:14:50.480 "name": "pt2", 00:14:50.480 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:50.480 "is_configured": true, 00:14:50.480 "data_offset": 2048, 00:14:50.480 "data_size": 63488 00:14:50.480 } 00:14:50.480 ] 00:14:50.480 } 00:14:50.480 } 00:14:50.480 }' 00:14:50.480 05:43:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:14:50.480 05:43:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:14:50.480 pt2' 00:14:50.480 05:43:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:50.480 05:43:05 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:14:50.480 05:43:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:50.738 05:43:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:50.738 "name": "pt1", 00:14:50.738 "aliases": [ 00:14:50.738 "00000000-0000-0000-0000-000000000001" 00:14:50.738 ], 00:14:50.738 "product_name": "passthru", 00:14:50.738 "block_size": 512, 00:14:50.738 "num_blocks": 65536, 00:14:50.738 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:50.738 "assigned_rate_limits": { 00:14:50.738 "rw_ios_per_sec": 0, 00:14:50.738 "rw_mbytes_per_sec": 0, 00:14:50.738 "r_mbytes_per_sec": 0, 00:14:50.738 "w_mbytes_per_sec": 0 00:14:50.738 }, 00:14:50.738 "claimed": true, 00:14:50.738 "claim_type": "exclusive_write", 00:14:50.738 "zoned": false, 00:14:50.738 "supported_io_types": { 00:14:50.738 "read": true, 00:14:50.738 "write": true, 00:14:50.738 "unmap": true, 00:14:50.738 "flush": true, 00:14:50.738 "reset": true, 00:14:50.738 "nvme_admin": false, 00:14:50.738 "nvme_io": false, 00:14:50.738 "nvme_io_md": false, 00:14:50.738 "write_zeroes": true, 00:14:50.738 "zcopy": true, 00:14:50.738 "get_zone_info": false, 00:14:50.738 "zone_management": false, 00:14:50.738 "zone_append": false, 00:14:50.738 "compare": false, 00:14:50.738 "compare_and_write": false, 00:14:50.738 "abort": true, 00:14:50.738 "seek_hole": false, 00:14:50.738 "seek_data": false, 00:14:50.738 "copy": true, 00:14:50.738 "nvme_iov_md": false 00:14:50.738 }, 00:14:50.738 "memory_domains": [ 00:14:50.738 { 00:14:50.738 "dma_device_id": "system", 00:14:50.738 "dma_device_type": 1 00:14:50.738 }, 00:14:50.738 { 00:14:50.738 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:50.738 "dma_device_type": 2 00:14:50.738 } 00:14:50.738 ], 00:14:50.738 "driver_specific": { 00:14:50.738 "passthru": { 00:14:50.738 
"name": "pt1", 00:14:50.738 "base_bdev_name": "malloc1" 00:14:50.738 } 00:14:50.738 } 00:14:50.738 }' 00:14:50.738 05:43:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:50.738 05:43:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:50.996 05:43:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:50.996 05:43:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:50.996 05:43:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:50.996 05:43:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:50.996 05:43:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:50.996 05:43:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:50.996 05:43:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:50.996 05:43:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:50.996 05:43:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:51.254 05:43:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:51.254 05:43:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:51.254 05:43:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:14:51.254 05:43:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:51.511 05:43:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:51.511 "name": "pt2", 00:14:51.511 "aliases": [ 00:14:51.511 "00000000-0000-0000-0000-000000000002" 00:14:51.511 ], 00:14:51.511 "product_name": "passthru", 00:14:51.511 "block_size": 512, 00:14:51.511 
"num_blocks": 65536, 00:14:51.511 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:51.511 "assigned_rate_limits": { 00:14:51.511 "rw_ios_per_sec": 0, 00:14:51.511 "rw_mbytes_per_sec": 0, 00:14:51.511 "r_mbytes_per_sec": 0, 00:14:51.511 "w_mbytes_per_sec": 0 00:14:51.511 }, 00:14:51.511 "claimed": true, 00:14:51.511 "claim_type": "exclusive_write", 00:14:51.511 "zoned": false, 00:14:51.511 "supported_io_types": { 00:14:51.511 "read": true, 00:14:51.511 "write": true, 00:14:51.511 "unmap": true, 00:14:51.511 "flush": true, 00:14:51.511 "reset": true, 00:14:51.511 "nvme_admin": false, 00:14:51.511 "nvme_io": false, 00:14:51.511 "nvme_io_md": false, 00:14:51.511 "write_zeroes": true, 00:14:51.511 "zcopy": true, 00:14:51.511 "get_zone_info": false, 00:14:51.511 "zone_management": false, 00:14:51.511 "zone_append": false, 00:14:51.511 "compare": false, 00:14:51.511 "compare_and_write": false, 00:14:51.511 "abort": true, 00:14:51.511 "seek_hole": false, 00:14:51.511 "seek_data": false, 00:14:51.511 "copy": true, 00:14:51.511 "nvme_iov_md": false 00:14:51.511 }, 00:14:51.511 "memory_domains": [ 00:14:51.511 { 00:14:51.512 "dma_device_id": "system", 00:14:51.512 "dma_device_type": 1 00:14:51.512 }, 00:14:51.512 { 00:14:51.512 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:51.512 "dma_device_type": 2 00:14:51.512 } 00:14:51.512 ], 00:14:51.512 "driver_specific": { 00:14:51.512 "passthru": { 00:14:51.512 "name": "pt2", 00:14:51.512 "base_bdev_name": "malloc2" 00:14:51.512 } 00:14:51.512 } 00:14:51.512 }' 00:14:51.512 05:43:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:51.512 05:43:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:51.512 05:43:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:51.512 05:43:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:51.512 05:43:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # 
jq .md_size 00:14:51.512 05:43:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:51.512 05:43:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:51.769 05:43:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:51.769 05:43:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:51.770 05:43:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:51.770 05:43:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:51.770 05:43:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:51.770 05:43:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:14:51.770 05:43:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:14:52.027 [2024-07-26 05:43:06.766238] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:52.027 05:43:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=54fb1632-831c-44a5-861e-68d066492451 00:14:52.027 05:43:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z 54fb1632-831c-44a5-861e-68d066492451 ']' 00:14:52.027 05:43:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:14:52.286 [2024-07-26 05:43:07.010631] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:14:52.286 [2024-07-26 05:43:07.010661] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:14:52.286 [2024-07-26 05:43:07.010718] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:52.286 [2024-07-26 
05:43:07.010776] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:14:52.286 [2024-07-26 05:43:07.010787] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x14d7270 name raid_bdev1, state offline 00:14:52.286 05:43:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:52.286 05:43:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:14:52.544 05:43:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:14:52.544 05:43:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:14:52.544 05:43:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:14:52.544 05:43:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:14:52.802 05:43:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:14:52.802 05:43:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:14:53.061 05:43:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:14:53.061 05:43:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:14:53.319 05:43:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:14:53.319 05:43:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:14:53.319 05:43:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:14:53.319 05:43:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:14:53.319 05:43:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:14:53.319 05:43:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:14:53.319 05:43:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:14:53.319 05:43:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:14:53.319 05:43:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:14:53.319 05:43:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:14:53.319 05:43:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:14:53.319 05:43:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:14:53.319 05:43:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:14:53.319 [2024-07-26 05:43:08.165670] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:14:53.319 [2024-07-26 05:43:08.167017] 
bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:14:53.319 [2024-07-26 05:43:08.167072] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:14:53.320 [2024-07-26 05:43:08.167114] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:14:53.320 [2024-07-26 05:43:08.167133] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:14:53.320 [2024-07-26 05:43:08.167142] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x14d6ff0 name raid_bdev1, state configuring 00:14:53.320 request: 00:14:53.320 { 00:14:53.320 "name": "raid_bdev1", 00:14:53.320 "raid_level": "raid1", 00:14:53.320 "base_bdevs": [ 00:14:53.320 "malloc1", 00:14:53.320 "malloc2" 00:14:53.320 ], 00:14:53.320 "superblock": false, 00:14:53.320 "method": "bdev_raid_create", 00:14:53.320 "req_id": 1 00:14:53.320 } 00:14:53.320 Got JSON-RPC error response 00:14:53.320 response: 00:14:53.320 { 00:14:53.320 "code": -17, 00:14:53.320 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:14:53.320 } 00:14:53.320 05:43:08 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:14:53.320 05:43:08 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:14:53.320 05:43:08 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:14:53.320 05:43:08 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:14:53.320 05:43:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:53.320 05:43:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:14:53.578 05:43:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 
00:14:53.578 05:43:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:14:53.578 05:43:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:14:53.837 [2024-07-26 05:43:08.594762] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:14:53.837 [2024-07-26 05:43:08.594803] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:53.837 [2024-07-26 05:43:08.594822] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x13337a0 00:14:53.837 [2024-07-26 05:43:08.594835] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:53.837 [2024-07-26 05:43:08.596392] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:53.837 [2024-07-26 05:43:08.596419] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:14:53.837 [2024-07-26 05:43:08.596481] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:14:53.837 [2024-07-26 05:43:08.596505] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:14:53.837 pt1 00:14:53.837 05:43:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:14:53.837 05:43:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:53.837 05:43:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:53.837 05:43:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:14:53.837 05:43:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:14:53.837 05:43:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local 
num_base_bdevs_operational=2 00:14:53.837 05:43:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:53.837 05:43:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:53.837 05:43:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:53.837 05:43:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:53.837 05:43:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:53.837 05:43:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:54.095 05:43:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:54.095 "name": "raid_bdev1", 00:14:54.095 "uuid": "54fb1632-831c-44a5-861e-68d066492451", 00:14:54.095 "strip_size_kb": 0, 00:14:54.095 "state": "configuring", 00:14:54.095 "raid_level": "raid1", 00:14:54.095 "superblock": true, 00:14:54.095 "num_base_bdevs": 2, 00:14:54.095 "num_base_bdevs_discovered": 1, 00:14:54.095 "num_base_bdevs_operational": 2, 00:14:54.095 "base_bdevs_list": [ 00:14:54.095 { 00:14:54.095 "name": "pt1", 00:14:54.095 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:54.095 "is_configured": true, 00:14:54.096 "data_offset": 2048, 00:14:54.096 "data_size": 63488 00:14:54.096 }, 00:14:54.096 { 00:14:54.096 "name": null, 00:14:54.096 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:54.096 "is_configured": false, 00:14:54.096 "data_offset": 2048, 00:14:54.096 "data_size": 63488 00:14:54.096 } 00:14:54.096 ] 00:14:54.096 }' 00:14:54.096 05:43:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:54.096 05:43:08 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:54.663 05:43:09 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@469 -- # '[' 2 -gt 2 ']' 00:14:54.663 05:43:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:14:54.663 05:43:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:14:54.663 05:43:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:14:54.663 [2024-07-26 05:43:09.557325] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:14:54.663 [2024-07-26 05:43:09.557379] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:54.663 [2024-07-26 05:43:09.557398] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x14cb6f0 00:14:54.663 [2024-07-26 05:43:09.557411] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:54.663 [2024-07-26 05:43:09.557785] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:54.663 [2024-07-26 05:43:09.557804] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:14:54.663 [2024-07-26 05:43:09.557870] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:14:54.663 [2024-07-26 05:43:09.557889] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:14:54.663 [2024-07-26 05:43:09.557989] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x14cc590 00:14:54.663 [2024-07-26 05:43:09.557999] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:14:54.663 [2024-07-26 05:43:09.558167] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x132d540 00:14:54.663 [2024-07-26 05:43:09.558292] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x14cc590 00:14:54.663 [2024-07-26 05:43:09.558301] 
bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x14cc590 00:14:54.663 [2024-07-26 05:43:09.558395] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:54.663 pt2 00:14:54.922 05:43:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:14:54.922 05:43:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:14:54.922 05:43:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:14:54.922 05:43:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:54.922 05:43:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:54.922 05:43:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:14:54.922 05:43:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:14:54.922 05:43:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:14:54.922 05:43:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:54.922 05:43:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:54.922 05:43:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:54.922 05:43:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:54.922 05:43:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:54.922 05:43:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:54.922 05:43:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:54.922 "name": 
"raid_bdev1", 00:14:54.922 "uuid": "54fb1632-831c-44a5-861e-68d066492451", 00:14:54.922 "strip_size_kb": 0, 00:14:54.922 "state": "online", 00:14:54.922 "raid_level": "raid1", 00:14:54.922 "superblock": true, 00:14:54.922 "num_base_bdevs": 2, 00:14:54.922 "num_base_bdevs_discovered": 2, 00:14:54.922 "num_base_bdevs_operational": 2, 00:14:54.922 "base_bdevs_list": [ 00:14:54.922 { 00:14:54.922 "name": "pt1", 00:14:54.922 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:54.922 "is_configured": true, 00:14:54.922 "data_offset": 2048, 00:14:54.922 "data_size": 63488 00:14:54.922 }, 00:14:54.922 { 00:14:54.922 "name": "pt2", 00:14:54.922 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:54.922 "is_configured": true, 00:14:54.922 "data_offset": 2048, 00:14:54.922 "data_size": 63488 00:14:54.922 } 00:14:54.923 ] 00:14:54.923 }' 00:14:54.923 05:43:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:54.923 05:43:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:55.858 05:43:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:14:55.858 05:43:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:14:55.858 05:43:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:14:55.858 05:43:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:14:55.858 05:43:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:14:55.858 05:43:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:14:55.858 05:43:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:14:55.858 05:43:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:14:55.858 [2024-07-26 
05:43:10.576255] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:55.858 05:43:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:14:55.858 "name": "raid_bdev1", 00:14:55.858 "aliases": [ 00:14:55.858 "54fb1632-831c-44a5-861e-68d066492451" 00:14:55.858 ], 00:14:55.858 "product_name": "Raid Volume", 00:14:55.858 "block_size": 512, 00:14:55.858 "num_blocks": 63488, 00:14:55.858 "uuid": "54fb1632-831c-44a5-861e-68d066492451", 00:14:55.858 "assigned_rate_limits": { 00:14:55.858 "rw_ios_per_sec": 0, 00:14:55.858 "rw_mbytes_per_sec": 0, 00:14:55.858 "r_mbytes_per_sec": 0, 00:14:55.858 "w_mbytes_per_sec": 0 00:14:55.858 }, 00:14:55.858 "claimed": false, 00:14:55.858 "zoned": false, 00:14:55.858 "supported_io_types": { 00:14:55.858 "read": true, 00:14:55.858 "write": true, 00:14:55.858 "unmap": false, 00:14:55.858 "flush": false, 00:14:55.858 "reset": true, 00:14:55.858 "nvme_admin": false, 00:14:55.858 "nvme_io": false, 00:14:55.858 "nvme_io_md": false, 00:14:55.858 "write_zeroes": true, 00:14:55.858 "zcopy": false, 00:14:55.858 "get_zone_info": false, 00:14:55.858 "zone_management": false, 00:14:55.858 "zone_append": false, 00:14:55.858 "compare": false, 00:14:55.858 "compare_and_write": false, 00:14:55.858 "abort": false, 00:14:55.858 "seek_hole": false, 00:14:55.858 "seek_data": false, 00:14:55.858 "copy": false, 00:14:55.858 "nvme_iov_md": false 00:14:55.858 }, 00:14:55.858 "memory_domains": [ 00:14:55.858 { 00:14:55.858 "dma_device_id": "system", 00:14:55.858 "dma_device_type": 1 00:14:55.858 }, 00:14:55.858 { 00:14:55.858 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:55.858 "dma_device_type": 2 00:14:55.858 }, 00:14:55.858 { 00:14:55.858 "dma_device_id": "system", 00:14:55.858 "dma_device_type": 1 00:14:55.858 }, 00:14:55.858 { 00:14:55.858 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:55.858 "dma_device_type": 2 00:14:55.858 } 00:14:55.858 ], 00:14:55.858 "driver_specific": { 00:14:55.858 
"raid": { 00:14:55.858 "uuid": "54fb1632-831c-44a5-861e-68d066492451", 00:14:55.858 "strip_size_kb": 0, 00:14:55.858 "state": "online", 00:14:55.858 "raid_level": "raid1", 00:14:55.858 "superblock": true, 00:14:55.858 "num_base_bdevs": 2, 00:14:55.858 "num_base_bdevs_discovered": 2, 00:14:55.858 "num_base_bdevs_operational": 2, 00:14:55.858 "base_bdevs_list": [ 00:14:55.858 { 00:14:55.858 "name": "pt1", 00:14:55.858 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:55.858 "is_configured": true, 00:14:55.858 "data_offset": 2048, 00:14:55.859 "data_size": 63488 00:14:55.859 }, 00:14:55.859 { 00:14:55.859 "name": "pt2", 00:14:55.859 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:55.859 "is_configured": true, 00:14:55.859 "data_offset": 2048, 00:14:55.859 "data_size": 63488 00:14:55.859 } 00:14:55.859 ] 00:14:55.859 } 00:14:55.859 } 00:14:55.859 }' 00:14:55.859 05:43:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:14:55.859 05:43:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:14:55.859 pt2' 00:14:55.859 05:43:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:55.859 05:43:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:14:55.859 05:43:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:56.425 05:43:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:56.425 "name": "pt1", 00:14:56.425 "aliases": [ 00:14:56.425 "00000000-0000-0000-0000-000000000001" 00:14:56.425 ], 00:14:56.425 "product_name": "passthru", 00:14:56.425 "block_size": 512, 00:14:56.425 "num_blocks": 65536, 00:14:56.425 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:56.425 "assigned_rate_limits": { 
00:14:56.425 "rw_ios_per_sec": 0, 00:14:56.425 "rw_mbytes_per_sec": 0, 00:14:56.425 "r_mbytes_per_sec": 0, 00:14:56.425 "w_mbytes_per_sec": 0 00:14:56.425 }, 00:14:56.425 "claimed": true, 00:14:56.425 "claim_type": "exclusive_write", 00:14:56.425 "zoned": false, 00:14:56.425 "supported_io_types": { 00:14:56.425 "read": true, 00:14:56.425 "write": true, 00:14:56.425 "unmap": true, 00:14:56.425 "flush": true, 00:14:56.425 "reset": true, 00:14:56.425 "nvme_admin": false, 00:14:56.425 "nvme_io": false, 00:14:56.425 "nvme_io_md": false, 00:14:56.425 "write_zeroes": true, 00:14:56.425 "zcopy": true, 00:14:56.425 "get_zone_info": false, 00:14:56.425 "zone_management": false, 00:14:56.425 "zone_append": false, 00:14:56.425 "compare": false, 00:14:56.425 "compare_and_write": false, 00:14:56.425 "abort": true, 00:14:56.425 "seek_hole": false, 00:14:56.425 "seek_data": false, 00:14:56.425 "copy": true, 00:14:56.425 "nvme_iov_md": false 00:14:56.425 }, 00:14:56.425 "memory_domains": [ 00:14:56.425 { 00:14:56.425 "dma_device_id": "system", 00:14:56.425 "dma_device_type": 1 00:14:56.425 }, 00:14:56.425 { 00:14:56.425 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:56.425 "dma_device_type": 2 00:14:56.425 } 00:14:56.425 ], 00:14:56.425 "driver_specific": { 00:14:56.425 "passthru": { 00:14:56.425 "name": "pt1", 00:14:56.425 "base_bdev_name": "malloc1" 00:14:56.425 } 00:14:56.425 } 00:14:56.425 }' 00:14:56.425 05:43:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:56.426 05:43:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:56.426 05:43:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:56.426 05:43:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:56.426 05:43:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:56.685 05:43:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 
00:14:56.685 05:43:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:56.685 05:43:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:56.685 05:43:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:56.685 05:43:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:56.685 05:43:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:56.685 05:43:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:56.685 05:43:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:56.685 05:43:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:14:56.685 05:43:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:56.943 05:43:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:56.943 "name": "pt2", 00:14:56.943 "aliases": [ 00:14:56.943 "00000000-0000-0000-0000-000000000002" 00:14:56.943 ], 00:14:56.944 "product_name": "passthru", 00:14:56.944 "block_size": 512, 00:14:56.944 "num_blocks": 65536, 00:14:56.944 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:56.944 "assigned_rate_limits": { 00:14:56.944 "rw_ios_per_sec": 0, 00:14:56.944 "rw_mbytes_per_sec": 0, 00:14:56.944 "r_mbytes_per_sec": 0, 00:14:56.944 "w_mbytes_per_sec": 0 00:14:56.944 }, 00:14:56.944 "claimed": true, 00:14:56.944 "claim_type": "exclusive_write", 00:14:56.944 "zoned": false, 00:14:56.944 "supported_io_types": { 00:14:56.944 "read": true, 00:14:56.944 "write": true, 00:14:56.944 "unmap": true, 00:14:56.944 "flush": true, 00:14:56.944 "reset": true, 00:14:56.944 "nvme_admin": false, 00:14:56.944 "nvme_io": false, 00:14:56.944 "nvme_io_md": false, 00:14:56.944 "write_zeroes": true, 
00:14:56.944 "zcopy": true, 00:14:56.944 "get_zone_info": false, 00:14:56.944 "zone_management": false, 00:14:56.944 "zone_append": false, 00:14:56.944 "compare": false, 00:14:56.944 "compare_and_write": false, 00:14:56.944 "abort": true, 00:14:56.944 "seek_hole": false, 00:14:56.944 "seek_data": false, 00:14:56.944 "copy": true, 00:14:56.944 "nvme_iov_md": false 00:14:56.944 }, 00:14:56.944 "memory_domains": [ 00:14:56.944 { 00:14:56.944 "dma_device_id": "system", 00:14:56.944 "dma_device_type": 1 00:14:56.944 }, 00:14:56.944 { 00:14:56.944 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:56.944 "dma_device_type": 2 00:14:56.944 } 00:14:56.944 ], 00:14:56.944 "driver_specific": { 00:14:56.944 "passthru": { 00:14:56.944 "name": "pt2", 00:14:56.944 "base_bdev_name": "malloc2" 00:14:56.944 } 00:14:56.944 } 00:14:56.944 }' 00:14:56.944 05:43:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:56.944 05:43:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:56.944 05:43:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:56.944 05:43:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:57.202 05:43:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:57.202 05:43:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:57.202 05:43:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:57.202 05:43:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:57.202 05:43:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:57.202 05:43:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:57.202 05:43:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:57.202 05:43:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # 
[[ null == null ]] 00:14:57.460 05:43:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:14:57.460 05:43:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:14:57.460 [2024-07-26 05:43:12.324893] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:57.460 05:43:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' 54fb1632-831c-44a5-861e-68d066492451 '!=' 54fb1632-831c-44a5-861e-68d066492451 ']' 00:14:57.461 05:43:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy raid1 00:14:57.461 05:43:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:14:57.461 05:43:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@214 -- # return 0 00:14:57.461 05:43:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@492 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:14:57.738 [2024-07-26 05:43:12.561283] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:14:57.738 05:43:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@495 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:14:57.738 05:43:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:57.738 05:43:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:57.738 05:43:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:14:57.738 05:43:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:14:57.738 05:43:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:14:57.738 05:43:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 
00:14:57.738 05:43:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:57.738 05:43:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:57.738 05:43:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:57.738 05:43:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:57.738 05:43:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:57.996 05:43:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:57.996 "name": "raid_bdev1", 00:14:57.996 "uuid": "54fb1632-831c-44a5-861e-68d066492451", 00:14:57.996 "strip_size_kb": 0, 00:14:57.996 "state": "online", 00:14:57.996 "raid_level": "raid1", 00:14:57.996 "superblock": true, 00:14:57.996 "num_base_bdevs": 2, 00:14:57.996 "num_base_bdevs_discovered": 1, 00:14:57.996 "num_base_bdevs_operational": 1, 00:14:57.996 "base_bdevs_list": [ 00:14:57.996 { 00:14:57.996 "name": null, 00:14:57.996 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:57.996 "is_configured": false, 00:14:57.996 "data_offset": 2048, 00:14:57.996 "data_size": 63488 00:14:57.996 }, 00:14:57.996 { 00:14:57.996 "name": "pt2", 00:14:57.996 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:57.996 "is_configured": true, 00:14:57.996 "data_offset": 2048, 00:14:57.996 "data_size": 63488 00:14:57.996 } 00:14:57.996 ] 00:14:57.996 }' 00:14:57.996 05:43:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:57.996 05:43:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:58.563 05:43:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@498 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 
00:14:58.822 [2024-07-26 05:43:13.628080] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:14:58.822 [2024-07-26 05:43:13.628106] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:14:58.822 [2024-07-26 05:43:13.628162] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:58.822 [2024-07-26 05:43:13.628203] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:14:58.822 [2024-07-26 05:43:13.628215] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x14cc590 name raid_bdev1, state offline 00:14:58.822 05:43:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:58.822 05:43:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # jq -r '.[]' 00:14:59.080 05:43:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # raid_bdev= 00:14:59.080 05:43:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@500 -- # '[' -n '' ']' 00:14:59.080 05:43:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i = 1 )) 00:14:59.080 05:43:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:14:59.080 05:43:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:14:59.339 05:43:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:14:59.339 05:43:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:14:59.339 05:43:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i = 1 )) 00:14:59.339 05:43:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:14:59.339 05:43:14 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@518 -- # i=1 00:14:59.339 05:43:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@519 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:14:59.599 [2024-07-26 05:43:14.378021] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:14:59.599 [2024-07-26 05:43:14.378066] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:59.599 [2024-07-26 05:43:14.378083] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1334160 00:14:59.599 [2024-07-26 05:43:14.378096] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:59.599 [2024-07-26 05:43:14.379696] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:59.599 [2024-07-26 05:43:14.379723] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:14:59.599 [2024-07-26 05:43:14.379786] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:14:59.599 [2024-07-26 05:43:14.379810] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:14:59.599 [2024-07-26 05:43:14.379893] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x132a380 00:14:59.599 [2024-07-26 05:43:14.379903] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:14:59.599 [2024-07-26 05:43:14.380072] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x132ba80 00:14:59.599 [2024-07-26 05:43:14.380192] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x132a380 00:14:59.599 [2024-07-26 05:43:14.380202] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x132a380 00:14:59.599 [2024-07-26 05:43:14.380294] bdev_raid.c: 
343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:59.599 pt2 00:14:59.599 05:43:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@522 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:14:59.599 05:43:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:59.599 05:43:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:59.599 05:43:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:14:59.599 05:43:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:14:59.599 05:43:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:14:59.599 05:43:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:59.599 05:43:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:59.599 05:43:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:59.599 05:43:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:59.599 05:43:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:59.599 05:43:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:59.895 05:43:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:59.895 "name": "raid_bdev1", 00:14:59.895 "uuid": "54fb1632-831c-44a5-861e-68d066492451", 00:14:59.895 "strip_size_kb": 0, 00:14:59.895 "state": "online", 00:14:59.895 "raid_level": "raid1", 00:14:59.895 "superblock": true, 00:14:59.895 "num_base_bdevs": 2, 00:14:59.895 "num_base_bdevs_discovered": 1, 00:14:59.895 "num_base_bdevs_operational": 1, 00:14:59.895 "base_bdevs_list": [ 
00:14:59.895 { 00:14:59.895 "name": null, 00:14:59.895 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:59.895 "is_configured": false, 00:14:59.895 "data_offset": 2048, 00:14:59.895 "data_size": 63488 00:14:59.895 }, 00:14:59.895 { 00:14:59.895 "name": "pt2", 00:14:59.895 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:59.895 "is_configured": true, 00:14:59.895 "data_offset": 2048, 00:14:59.895 "data_size": 63488 00:14:59.895 } 00:14:59.895 ] 00:14:59.895 }' 00:14:59.895 05:43:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:59.895 05:43:14 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:15:00.462 05:43:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@525 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:15:00.720 [2024-07-26 05:43:15.428809] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:15:00.720 [2024-07-26 05:43:15.428839] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:00.720 [2024-07-26 05:43:15.428897] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:00.720 [2024-07-26 05:43:15.428944] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:15:00.720 [2024-07-26 05:43:15.428956] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x132a380 name raid_bdev1, state offline 00:15:00.720 05:43:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:00.720 05:43:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # jq -r '.[]' 00:15:00.978 05:43:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # raid_bdev= 00:15:00.978 05:43:15 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@527 -- # '[' -n '' ']' 00:15:00.978 05:43:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@531 -- # '[' 2 -gt 2 ']' 00:15:00.978 05:43:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@539 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:15:00.978 [2024-07-26 05:43:15.833857] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:15:00.978 [2024-07-26 05:43:15.833903] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:00.978 [2024-07-26 05:43:15.833920] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x14d6520 00:15:00.978 [2024-07-26 05:43:15.833933] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:00.978 [2024-07-26 05:43:15.835604] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:00.978 [2024-07-26 05:43:15.835648] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:15:00.978 [2024-07-26 05:43:15.835719] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:15:00.978 [2024-07-26 05:43:15.835745] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:15:00.978 [2024-07-26 05:43:15.835846] bdev_raid.c:3639:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:15:00.978 [2024-07-26 05:43:15.835858] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:15:00.978 [2024-07-26 05:43:15.835871] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x132b3f0 name raid_bdev1, state configuring 00:15:00.978 [2024-07-26 05:43:15.835894] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:15:00.978 [2024-07-26 05:43:15.835952] 
bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x132d2b0 00:15:00.978 [2024-07-26 05:43:15.835962] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:15:00.978 [2024-07-26 05:43:15.836129] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x132a350 00:15:00.978 [2024-07-26 05:43:15.836251] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x132d2b0 00:15:00.978 [2024-07-26 05:43:15.836261] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x132d2b0 00:15:00.978 [2024-07-26 05:43:15.836362] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:00.978 pt1 00:15:00.978 05:43:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@541 -- # '[' 2 -gt 2 ']' 00:15:00.978 05:43:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@553 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:15:00.978 05:43:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:00.978 05:43:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:00.978 05:43:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:15:00.978 05:43:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:15:00.978 05:43:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:15:00.978 05:43:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:00.978 05:43:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:00.978 05:43:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:00.978 05:43:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:00.978 05:43:15 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:00.978 05:43:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:01.237 05:43:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:01.237 "name": "raid_bdev1", 00:15:01.237 "uuid": "54fb1632-831c-44a5-861e-68d066492451", 00:15:01.237 "strip_size_kb": 0, 00:15:01.237 "state": "online", 00:15:01.237 "raid_level": "raid1", 00:15:01.237 "superblock": true, 00:15:01.237 "num_base_bdevs": 2, 00:15:01.237 "num_base_bdevs_discovered": 1, 00:15:01.237 "num_base_bdevs_operational": 1, 00:15:01.237 "base_bdevs_list": [ 00:15:01.237 { 00:15:01.237 "name": null, 00:15:01.237 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:01.237 "is_configured": false, 00:15:01.237 "data_offset": 2048, 00:15:01.237 "data_size": 63488 00:15:01.237 }, 00:15:01.237 { 00:15:01.237 "name": "pt2", 00:15:01.237 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:01.237 "is_configured": true, 00:15:01.237 "data_offset": 2048, 00:15:01.237 "data_size": 63488 00:15:01.237 } 00:15:01.237 ] 00:15:01.237 }' 00:15:01.237 05:43:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:01.237 05:43:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:15:01.802 05:43:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:15:01.802 05:43:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:15:02.061 05:43:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # [[ false == \f\a\l\s\e ]] 00:15:02.061 05:43:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:15:02.061 05:43:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # jq -r '.[] | .uuid' 00:15:02.320 [2024-07-26 05:43:17.145567] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:02.320 05:43:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # '[' 54fb1632-831c-44a5-861e-68d066492451 '!=' 54fb1632-831c-44a5-861e-68d066492451 ']' 00:15:02.320 05:43:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 1143268 00:15:02.320 05:43:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 1143268 ']' 00:15:02.320 05:43:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 1143268 00:15:02.320 05:43:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:15:02.320 05:43:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:15:02.320 05:43:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1143268 00:15:02.320 05:43:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:15:02.320 05:43:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:15:02.320 05:43:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1143268' 00:15:02.320 killing process with pid 1143268 00:15:02.320 05:43:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 1143268 00:15:02.321 [2024-07-26 05:43:17.210459] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:15:02.321 05:43:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 1143268 00:15:02.321 [2024-07-26 05:43:17.210520] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: 
raid_bdev_destruct 00:15:02.321 [2024-07-26 05:43:17.210566] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:15:02.321 [2024-07-26 05:43:17.210577] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x132d2b0 name raid_bdev1, state offline 00:15:02.580 [2024-07-26 05:43:17.229392] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:15:02.580 05:43:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:15:02.580 00:15:02.580 real 0m15.382s 00:15:02.580 user 0m28.002s 00:15:02.580 sys 0m2.758s 00:15:02.580 05:43:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:15:02.580 05:43:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:15:02.580 ************************************ 00:15:02.580 END TEST raid_superblock_test 00:15:02.580 ************************************ 00:15:02.838 05:43:17 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:15:02.838 05:43:17 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test raid1 2 read 00:15:02.838 05:43:17 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:15:02.838 05:43:17 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:15:02.838 05:43:17 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:15:02.838 ************************************ 00:15:02.838 START TEST raid_read_error_test 00:15:02.838 ************************************ 00:15:02.838 05:43:17 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid1 2 read 00:15:02.838 05:43:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid1 00:15:02.838 05:43:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=2 00:15:02.838 05:43:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local 
error_io_type=read 00:15:02.838 05:43:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:15:02.838 05:43:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:15:02.838 05:43:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:15:02.838 05:43:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:15:02.838 05:43:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:15:02.838 05:43:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:15:02.838 05:43:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:15:02.838 05:43:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:15:02.838 05:43:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:15:02.838 05:43:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:15:02.838 05:43:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:15:02.838 05:43:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:15:02.838 05:43:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:15:02.838 05:43:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:15:02.838 05:43:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:15:02.838 05:43:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid1 '!=' raid1 ']' 00:15:02.838 05:43:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@802 -- # strip_size=0 00:15:02.838 05:43:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:15:02.838 05:43:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.JVx7TZRxIk 
00:15:02.838 05:43:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=1145610 00:15:02.838 05:43:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 1145610 /var/tmp/spdk-raid.sock 00:15:02.839 05:43:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:15:02.839 05:43:17 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 1145610 ']' 00:15:02.839 05:43:17 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:15:02.839 05:43:17 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:15:02.839 05:43:17 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:15:02.839 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:15:02.839 05:43:17 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:15:02.839 05:43:17 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:02.839 [2024-07-26 05:43:17.621088] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 
00:15:02.839 [2024-07-26 05:43:17.621165] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1145610 ] 00:15:03.097 [2024-07-26 05:43:17.750087] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:15:03.097 [2024-07-26 05:43:17.854326] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:15:03.097 [2024-07-26 05:43:17.914510] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:03.097 [2024-07-26 05:43:17.914540] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:03.665 05:43:18 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:15:03.665 05:43:18 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:15:03.665 05:43:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:15:03.665 05:43:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:15:03.924 BaseBdev1_malloc 00:15:03.924 05:43:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:15:04.182 true 00:15:04.182 05:43:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:15:04.440 [2024-07-26 05:43:19.203979] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:15:04.440 [2024-07-26 05:43:19.204024] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev 
opened 00:15:04.440 [2024-07-26 05:43:19.204044] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x14090d0 00:15:04.440 [2024-07-26 05:43:19.204057] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:04.440 [2024-07-26 05:43:19.205816] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:04.440 [2024-07-26 05:43:19.205847] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:15:04.440 BaseBdev1 00:15:04.440 05:43:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:15:04.440 05:43:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:15:04.698 BaseBdev2_malloc 00:15:04.698 05:43:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:15:04.956 true 00:15:04.957 05:43:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:15:05.215 [2024-07-26 05:43:19.942540] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:15:05.215 [2024-07-26 05:43:19.942584] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:05.215 [2024-07-26 05:43:19.942603] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x140d910 00:15:05.215 [2024-07-26 05:43:19.942615] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:05.215 [2024-07-26 05:43:19.944015] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:05.215 [2024-07-26 05:43:19.944043] 
vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:15:05.215 BaseBdev2 00:15:05.215 05:43:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:15:05.474 [2024-07-26 05:43:20.191230] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:05.474 [2024-07-26 05:43:20.192488] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:05.474 [2024-07-26 05:43:20.192693] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x140f320 00:15:05.474 [2024-07-26 05:43:20.192707] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:15:05.474 [2024-07-26 05:43:20.192899] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1276d00 00:15:05.474 [2024-07-26 05:43:20.193048] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x140f320 00:15:05.474 [2024-07-26 05:43:20.193058] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x140f320 00:15:05.474 [2024-07-26 05:43:20.193163] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:05.474 05:43:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:15:05.474 05:43:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:05.474 05:43:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:05.474 05:43:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:15:05.474 05:43:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:15:05.474 05:43:20 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:15:05.474 05:43:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:05.474 05:43:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:05.474 05:43:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:05.474 05:43:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:05.474 05:43:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:05.474 05:43:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:05.732 05:43:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:05.732 "name": "raid_bdev1", 00:15:05.732 "uuid": "0aaf3533-7d4e-4fdb-9861-37fd53f7d781", 00:15:05.732 "strip_size_kb": 0, 00:15:05.732 "state": "online", 00:15:05.732 "raid_level": "raid1", 00:15:05.732 "superblock": true, 00:15:05.732 "num_base_bdevs": 2, 00:15:05.732 "num_base_bdevs_discovered": 2, 00:15:05.732 "num_base_bdevs_operational": 2, 00:15:05.732 "base_bdevs_list": [ 00:15:05.732 { 00:15:05.732 "name": "BaseBdev1", 00:15:05.732 "uuid": "c9ec8eb6-76a2-5055-bc55-4a4f4d1e9b23", 00:15:05.732 "is_configured": true, 00:15:05.732 "data_offset": 2048, 00:15:05.732 "data_size": 63488 00:15:05.732 }, 00:15:05.732 { 00:15:05.732 "name": "BaseBdev2", 00:15:05.732 "uuid": "d15eef6c-ac95-5343-b402-cba42702f946", 00:15:05.732 "is_configured": true, 00:15:05.732 "data_offset": 2048, 00:15:05.732 "data_size": 63488 00:15:05.732 } 00:15:05.732 ] 00:15:05.732 }' 00:15:05.732 05:43:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:05.732 05:43:20 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:06.299 05:43:21 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:15:06.299 05:43:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:15:06.299 [2024-07-26 05:43:21.142038] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x140ac70 00:15:07.234 05:43:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:15:07.492 05:43:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:15:07.492 05:43:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid1 = \r\a\i\d\1 ]] 00:15:07.492 05:43:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ read = \w\r\i\t\e ]] 00:15:07.492 05:43:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=2 00:15:07.492 05:43:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:15:07.492 05:43:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:07.492 05:43:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:07.492 05:43:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:15:07.492 05:43:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:15:07.492 05:43:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:15:07.492 05:43:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:07.492 05:43:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:07.492 05:43:22 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:07.492 05:43:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:07.492 05:43:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:07.492 05:43:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:07.751 05:43:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:07.751 "name": "raid_bdev1", 00:15:07.751 "uuid": "0aaf3533-7d4e-4fdb-9861-37fd53f7d781", 00:15:07.751 "strip_size_kb": 0, 00:15:07.751 "state": "online", 00:15:07.751 "raid_level": "raid1", 00:15:07.751 "superblock": true, 00:15:07.751 "num_base_bdevs": 2, 00:15:07.751 "num_base_bdevs_discovered": 2, 00:15:07.751 "num_base_bdevs_operational": 2, 00:15:07.751 "base_bdevs_list": [ 00:15:07.751 { 00:15:07.751 "name": "BaseBdev1", 00:15:07.751 "uuid": "c9ec8eb6-76a2-5055-bc55-4a4f4d1e9b23", 00:15:07.751 "is_configured": true, 00:15:07.751 "data_offset": 2048, 00:15:07.751 "data_size": 63488 00:15:07.751 }, 00:15:07.751 { 00:15:07.751 "name": "BaseBdev2", 00:15:07.751 "uuid": "d15eef6c-ac95-5343-b402-cba42702f946", 00:15:07.751 "is_configured": true, 00:15:07.751 "data_offset": 2048, 00:15:07.751 "data_size": 63488 00:15:07.751 } 00:15:07.751 ] 00:15:07.751 }' 00:15:07.751 05:43:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:07.751 05:43:22 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:08.318 05:43:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:15:08.577 [2024-07-26 05:43:23.357758] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 
00:15:08.577 [2024-07-26 05:43:23.357801] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:08.577 [2024-07-26 05:43:23.360956] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:08.577 [2024-07-26 05:43:23.360986] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:08.577 [2024-07-26 05:43:23.361066] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:15:08.577 [2024-07-26 05:43:23.361077] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x140f320 name raid_bdev1, state offline 00:15:08.577 0 00:15:08.577 05:43:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 1145610 00:15:08.577 05:43:23 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 1145610 ']' 00:15:08.577 05:43:23 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 1145610 00:15:08.577 05:43:23 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:15:08.577 05:43:23 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:15:08.577 05:43:23 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1145610 00:15:08.577 05:43:23 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:15:08.577 05:43:23 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:15:08.577 05:43:23 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1145610' 00:15:08.577 killing process with pid 1145610 00:15:08.577 05:43:23 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 1145610 00:15:08.577 [2024-07-26 05:43:23.426956] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:15:08.577 05:43:23 
bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 1145610 00:15:08.577 [2024-07-26 05:43:23.437786] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:15:08.835 05:43:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.JVx7TZRxIk 00:15:08.835 05:43:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:15:08.835 05:43:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:15:08.835 05:43:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.00 00:15:08.835 05:43:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid1 00:15:08.835 05:43:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:15:08.835 05:43:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:15:08.835 05:43:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@845 -- # [[ 0.00 = \0\.\0\0 ]] 00:15:08.835 00:15:08.835 real 0m6.136s 00:15:08.835 user 0m9.555s 00:15:08.835 sys 0m1.068s 00:15:08.835 05:43:23 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:15:08.835 05:43:23 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:08.835 ************************************ 00:15:08.835 END TEST raid_read_error_test 00:15:08.835 ************************************ 00:15:08.835 05:43:23 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:15:08.835 05:43:23 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test raid1 2 write 00:15:08.835 05:43:23 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:15:08.835 05:43:23 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:15:08.835 05:43:23 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:15:09.094 ************************************ 00:15:09.094 START TEST raid_write_error_test 00:15:09.094 
************************************ 00:15:09.094 05:43:23 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid1 2 write 00:15:09.094 05:43:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid1 00:15:09.094 05:43:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=2 00:15:09.094 05:43:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:15:09.094 05:43:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:15:09.094 05:43:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:15:09.094 05:43:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:15:09.094 05:43:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:15:09.094 05:43:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:15:09.094 05:43:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:15:09.094 05:43:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:15:09.094 05:43:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:15:09.094 05:43:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:15:09.094 05:43:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:15:09.094 05:43:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:15:09.094 05:43:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:15:09.094 05:43:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:15:09.094 05:43:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:15:09.094 05:43:23 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@796 -- # local fail_per_s 00:15:09.094 05:43:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid1 '!=' raid1 ']' 00:15:09.094 05:43:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@802 -- # strip_size=0 00:15:09.094 05:43:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:15:09.094 05:43:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.UILAvkDrzq 00:15:09.094 05:43:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=1146507 00:15:09.094 05:43:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 1146507 /var/tmp/spdk-raid.sock 00:15:09.094 05:43:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:15:09.094 05:43:23 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 1146507 ']' 00:15:09.094 05:43:23 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:15:09.094 05:43:23 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:15:09.094 05:43:23 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:15:09.094 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:15:09.094 05:43:23 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:15:09.094 05:43:23 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:09.094 [2024-07-26 05:43:23.830145] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 
00:15:09.094 [2024-07-26 05:43:23.830210] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1146507 ] 00:15:09.094 [2024-07-26 05:43:23.951326] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:15:09.353 [2024-07-26 05:43:24.058129] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:15:09.353 [2024-07-26 05:43:24.124522] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:09.353 [2024-07-26 05:43:24.124566] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:09.920 05:43:24 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:15:09.920 05:43:24 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:15:09.920 05:43:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:15:09.920 05:43:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:15:10.179 BaseBdev1_malloc 00:15:10.179 05:43:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:15:10.438 true 00:15:10.438 05:43:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:15:11.022 [2024-07-26 05:43:25.730518] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:15:11.022 [2024-07-26 05:43:25.730563] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base 
bdev opened 00:15:11.022 [2024-07-26 05:43:25.730583] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x279d0d0 00:15:11.022 [2024-07-26 05:43:25.730596] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:11.022 [2024-07-26 05:43:25.732478] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:11.022 [2024-07-26 05:43:25.732508] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:15:11.022 BaseBdev1 00:15:11.022 05:43:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:15:11.022 05:43:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:15:11.288 BaseBdev2_malloc 00:15:11.288 05:43:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:15:11.856 true 00:15:11.856 05:43:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:15:12.116 [2024-07-26 05:43:26.999075] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:15:12.116 [2024-07-26 05:43:26.999122] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:12.116 [2024-07-26 05:43:26.999144] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x27a1910 00:15:12.116 [2024-07-26 05:43:26.999157] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:12.116 [2024-07-26 05:43:27.000804] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:12.116 [2024-07-26 05:43:27.000835] 
vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:15:12.116 BaseBdev2 00:15:12.375 05:43:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:15:12.375 [2024-07-26 05:43:27.247751] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:12.375 [2024-07-26 05:43:27.249136] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:12.375 [2024-07-26 05:43:27.249330] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x27a3320 00:15:12.375 [2024-07-26 05:43:27.249343] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:15:12.375 [2024-07-26 05:43:27.249545] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x260ad00 00:15:12.375 [2024-07-26 05:43:27.249708] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x27a3320 00:15:12.375 [2024-07-26 05:43:27.249724] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x27a3320 00:15:12.375 [2024-07-26 05:43:27.249833] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:12.375 05:43:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:15:12.375 05:43:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:12.375 05:43:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:12.375 05:43:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:15:12.375 05:43:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:15:12.375 05:43:27 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:15:12.375 05:43:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:12.375 05:43:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:12.375 05:43:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:12.375 05:43:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:12.375 05:43:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:12.375 05:43:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:12.634 05:43:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:12.634 "name": "raid_bdev1", 00:15:12.634 "uuid": "6f971d6b-c3a9-44f8-9180-8fbea389396c", 00:15:12.634 "strip_size_kb": 0, 00:15:12.634 "state": "online", 00:15:12.634 "raid_level": "raid1", 00:15:12.634 "superblock": true, 00:15:12.634 "num_base_bdevs": 2, 00:15:12.634 "num_base_bdevs_discovered": 2, 00:15:12.634 "num_base_bdevs_operational": 2, 00:15:12.634 "base_bdevs_list": [ 00:15:12.634 { 00:15:12.635 "name": "BaseBdev1", 00:15:12.635 "uuid": "31237ba3-c903-5c88-8ac4-3f80340e9342", 00:15:12.635 "is_configured": true, 00:15:12.635 "data_offset": 2048, 00:15:12.635 "data_size": 63488 00:15:12.635 }, 00:15:12.635 { 00:15:12.635 "name": "BaseBdev2", 00:15:12.635 "uuid": "3cd9fd34-c8e3-50ed-b659-077273878840", 00:15:12.635 "is_configured": true, 00:15:12.635 "data_offset": 2048, 00:15:12.635 "data_size": 63488 00:15:12.635 } 00:15:12.635 ] 00:15:12.635 }' 00:15:12.635 05:43:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:12.635 05:43:27 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:13.572 
05:43:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:15:13.572 05:43:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:15:13.831 [2024-07-26 05:43:28.503370] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x279ec70 00:15:14.840 05:43:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:15:14.840 [2024-07-26 05:43:29.619651] bdev_raid.c:2247:_raid_bdev_fail_base_bdev: *NOTICE*: Failing base bdev in slot 0 ('BaseBdev1') of raid bdev 'raid_bdev1' 00:15:14.840 [2024-07-26 05:43:29.619705] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:15:14.840 [2024-07-26 05:43:29.619888] bdev_raid.c:1945:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x279ec70 00:15:14.840 05:43:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:15:14.840 05:43:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid1 = \r\a\i\d\1 ]] 00:15:14.840 05:43:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ write = \w\r\i\t\e ]] 00:15:14.840 05:43:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # expected_num_base_bdevs=1 00:15:14.840 05:43:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:15:14.840 05:43:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:14.840 05:43:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:14.840 05:43:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:15:14.840 05:43:29 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:15:14.840 05:43:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:15:14.840 05:43:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:14.840 05:43:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:14.840 05:43:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:14.840 05:43:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:14.840 05:43:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:14.840 05:43:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:15.099 05:43:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:15.099 "name": "raid_bdev1", 00:15:15.099 "uuid": "6f971d6b-c3a9-44f8-9180-8fbea389396c", 00:15:15.099 "strip_size_kb": 0, 00:15:15.099 "state": "online", 00:15:15.099 "raid_level": "raid1", 00:15:15.099 "superblock": true, 00:15:15.099 "num_base_bdevs": 2, 00:15:15.099 "num_base_bdevs_discovered": 1, 00:15:15.099 "num_base_bdevs_operational": 1, 00:15:15.099 "base_bdevs_list": [ 00:15:15.099 { 00:15:15.099 "name": null, 00:15:15.099 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:15.099 "is_configured": false, 00:15:15.099 "data_offset": 2048, 00:15:15.099 "data_size": 63488 00:15:15.099 }, 00:15:15.099 { 00:15:15.099 "name": "BaseBdev2", 00:15:15.099 "uuid": "3cd9fd34-c8e3-50ed-b659-077273878840", 00:15:15.099 "is_configured": true, 00:15:15.099 "data_offset": 2048, 00:15:15.099 "data_size": 63488 00:15:15.099 } 00:15:15.099 ] 00:15:15.099 }' 00:15:15.099 05:43:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- 
# xtrace_disable 00:15:15.100 05:43:29 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:15.665 05:43:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:15:15.924 [2024-07-26 05:43:30.731723] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:15:15.924 [2024-07-26 05:43:30.731764] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:15.924 [2024-07-26 05:43:30.734889] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:15.924 [2024-07-26 05:43:30.734917] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:15.924 [2024-07-26 05:43:30.734969] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:15:15.924 [2024-07-26 05:43:30.734980] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x27a3320 name raid_bdev1, state offline 00:15:15.924 0 00:15:15.924 05:43:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 1146507 00:15:15.924 05:43:30 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 1146507 ']' 00:15:15.924 05:43:30 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 1146507 00:15:15.924 05:43:30 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:15:15.924 05:43:30 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:15:15.924 05:43:30 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1146507 00:15:15.924 05:43:30 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:15:15.924 05:43:30 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo 
']' 00:15:15.924 05:43:30 bdev_raid.raid_write_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1146507' 00:15:15.924 killing process with pid 1146507 00:15:15.924 05:43:30 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 1146507 00:15:15.924 [2024-07-26 05:43:30.801401] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:15:15.924 05:43:30 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 1146507 00:15:15.924 [2024-07-26 05:43:30.811693] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:15:16.182 05:43:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.UILAvkDrzq 00:15:16.182 05:43:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:15:16.182 05:43:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:15:16.182 05:43:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.00 00:15:16.182 05:43:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid1 00:15:16.182 05:43:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:15:16.182 05:43:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:15:16.182 05:43:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@845 -- # [[ 0.00 = \0\.\0\0 ]] 00:15:16.182 00:15:16.182 real 0m7.293s 00:15:16.182 user 0m11.727s 00:15:16.182 sys 0m1.221s 00:15:16.182 05:43:31 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:15:16.182 05:43:31 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:16.182 ************************************ 00:15:16.182 END TEST raid_write_error_test 00:15:16.182 ************************************ 00:15:16.442 05:43:31 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:15:16.442 05:43:31 bdev_raid -- 
bdev/bdev_raid.sh@865 -- # for n in {2..4} 00:15:16.442 05:43:31 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:15:16.442 05:43:31 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test raid0 3 false 00:15:16.442 05:43:31 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:15:16.442 05:43:31 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:15:16.442 05:43:31 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:15:16.442 ************************************ 00:15:16.442 START TEST raid_state_function_test 00:15:16.442 ************************************ 00:15:16.442 05:43:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test raid0 3 false 00:15:16.442 05:43:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:15:16.442 05:43:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:15:16.442 05:43:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:15:16.442 05:43:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:15:16.442 05:43:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:15:16.442 05:43:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:16.442 05:43:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:15:16.442 05:43:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:15:16.442 05:43:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:16.442 05:43:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:15:16.442 05:43:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:15:16.442 05:43:31 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs ))
00:15:16.442 05:43:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3
00:15:16.442 05:43:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ ))
00:15:16.442 05:43:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs ))
00:15:16.442 05:43:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3')
00:15:16.442 05:43:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs
00:15:16.442 05:43:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid
00:15:16.442 05:43:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size
00:15:16.442 05:43:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg
00:15:16.442 05:43:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg
00:15:16.442 05:43:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']'
00:15:16.442 05:43:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64
00:15:16.442 05:43:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64'
00:15:16.442 05:43:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']'
00:15:16.442 05:43:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg=
00:15:16.442 05:43:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=1147639
00:15:16.442 05:43:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1147639'
00:15:16.442 Process raid pid: 1147639
00:15:16.442 05:43:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid
00:15:16.442 05:43:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 1147639 /var/tmp/spdk-raid.sock
00:15:16.442 05:43:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 1147639 ']'
00:15:16.442 05:43:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock
00:15:16.442 05:43:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100
00:15:16.442 05:43:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...'
00:15:16.442 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...
00:15:16.442 05:43:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable
00:15:16.443 05:43:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x
00:15:16.443 [2024-07-26 05:43:31.208373] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization...
00:15:16.443 [2024-07-26 05:43:31.208446] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ]
00:15:16.443 [2024-07-26 05:43:31.340406] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:15:16.701 [2024-07-26 05:43:31.443498] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:15:16.701 [2024-07-26 05:43:31.502725] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size
00:15:16.701 [2024-07-26 05:43:31.502763] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size
00:15:17.268 05:43:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 ))
00:15:17.268 05:43:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0
00:15:17.268 05:43:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid
00:15:17.527 [2024-07-26 05:43:32.307281] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1
00:15:17.527 [2024-07-26 05:43:32.307321] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now
00:15:17.527 [2024-07-26 05:43:32.307332] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2
00:15:17.527 [2024-07-26 05:43:32.307344] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now
00:15:17.527 [2024-07-26 05:43:32.307353] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3
00:15:17.527 [2024-07-26 05:43:32.307364] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now
00:15:17.527 05:43:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3
00:15:17.527 05:43:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid
00:15:17.527 05:43:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring
00:15:17.527 05:43:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0
00:15:17.527 05:43:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64
00:15:17.527 05:43:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3
00:15:17.527 05:43:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:15:17.527 05:43:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:15:17.527 05:43:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:15:17.527 05:43:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp
00:15:17.527 05:43:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:15:17.527 05:43:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")'
00:15:17.785 05:43:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:15:17.785 "name": "Existed_Raid",
00:15:17.785 "uuid": "00000000-0000-0000-0000-000000000000",
00:15:17.785 "strip_size_kb": 64,
00:15:17.785 "state": "configuring",
00:15:17.785 "raid_level": "raid0",
00:15:17.785 "superblock": false,
00:15:17.785 "num_base_bdevs": 3,
00:15:17.785 "num_base_bdevs_discovered": 0,
00:15:17.785 "num_base_bdevs_operational": 3,
00:15:17.785 "base_bdevs_list": [
00:15:17.785 {
00:15:17.785 "name": "BaseBdev1",
00:15:17.785 "uuid": "00000000-0000-0000-0000-000000000000",
00:15:17.785 "is_configured": false,
00:15:17.785 "data_offset": 0,
00:15:17.785 "data_size": 0
00:15:17.785 },
00:15:17.785 {
00:15:17.785 "name": "BaseBdev2",
00:15:17.785 "uuid": "00000000-0000-0000-0000-000000000000",
00:15:17.785 "is_configured": false,
00:15:17.785 "data_offset": 0,
00:15:17.785 "data_size": 0
00:15:17.785 },
00:15:17.785 {
00:15:17.785 "name": "BaseBdev3",
00:15:17.785 "uuid": "00000000-0000-0000-0000-000000000000",
00:15:17.785 "is_configured": false,
00:15:17.785 "data_offset": 0,
00:15:17.785 "data_size": 0
00:15:17.785 }
00:15:17.785 ]
00:15:17.785 }'
00:15:17.785 05:43:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:15:17.785 05:43:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x
00:15:18.352 05:43:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid
00:15:18.610 [2024-07-26 05:43:33.422094] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid
00:15:18.610 [2024-07-26 05:43:33.422126] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2390a80 name Existed_Raid, state configuring
00:15:18.610 05:43:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid
00:15:18.869 [2024-07-26 05:43:33.666750] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1
00:15:18.869 [2024-07-26 05:43:33.666779] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now
00:15:18.869 [2024-07-26 05:43:33.666789] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2
00:15:18.869 [2024-07-26 05:43:33.666801] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now
00:15:18.869 [2024-07-26 05:43:33.666810] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3
00:15:18.869 [2024-07-26 05:43:33.666820] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now
00:15:18.869 05:43:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1
00:15:19.127 [2024-07-26 05:43:33.921327] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed
00:15:19.127 BaseBdev1
00:15:19.127 05:43:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1
00:15:19.127 05:43:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1
00:15:19.127 05:43:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout=
00:15:19.127 05:43:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i
00:15:19.127 05:43:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]]
00:15:19.127 05:43:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000
00:15:19.127 05:43:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine
00:15:19.386 05:43:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000
00:15:19.644 [
00:15:19.644 {
00:15:19.644 "name": "BaseBdev1",
00:15:19.644 "aliases": [
00:15:19.644 "8fa16bea-edb2-459c-9140-ce47cfe96a29"
00:15:19.644 ],
00:15:19.644 "product_name": "Malloc disk",
00:15:19.644 "block_size": 512,
00:15:19.644 "num_blocks": 65536,
00:15:19.644 "uuid": "8fa16bea-edb2-459c-9140-ce47cfe96a29",
00:15:19.644 "assigned_rate_limits": {
00:15:19.644 "rw_ios_per_sec": 0,
00:15:19.644 "rw_mbytes_per_sec": 0,
00:15:19.644 "r_mbytes_per_sec": 0,
00:15:19.644 "w_mbytes_per_sec": 0
00:15:19.644 },
00:15:19.644 "claimed": true,
00:15:19.644 "claim_type": "exclusive_write",
00:15:19.644 "zoned": false,
00:15:19.644 "supported_io_types": {
00:15:19.644 "read": true,
00:15:19.644 "write": true,
00:15:19.644 "unmap": true,
00:15:19.644 "flush": true,
00:15:19.644 "reset": true,
00:15:19.644 "nvme_admin": false,
00:15:19.644 "nvme_io": false,
00:15:19.644 "nvme_io_md": false,
00:15:19.644 "write_zeroes": true,
00:15:19.644 "zcopy": true,
00:15:19.644 "get_zone_info": false,
00:15:19.644 "zone_management": false,
00:15:19.644 "zone_append": false,
00:15:19.644 "compare": false,
00:15:19.644 "compare_and_write": false,
00:15:19.644 "abort": true,
00:15:19.644 "seek_hole": false,
00:15:19.644 "seek_data": false,
00:15:19.644 "copy": true,
00:15:19.644 "nvme_iov_md": false
00:15:19.644 },
00:15:19.644 "memory_domains": [
00:15:19.644 {
00:15:19.644 "dma_device_id": "system",
00:15:19.644 "dma_device_type": 1
00:15:19.644 },
00:15:19.644 {
00:15:19.644 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:15:19.644 "dma_device_type": 2
00:15:19.644 }
00:15:19.644 ],
00:15:19.644 "driver_specific": {}
00:15:19.644 }
00:15:19.644 ]
00:15:19.644 05:43:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0
00:15:19.644 05:43:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3
00:15:19.644 05:43:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid
00:15:19.644 05:43:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring
00:15:19.644 05:43:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0
00:15:19.644 05:43:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64
00:15:19.644 05:43:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3
00:15:19.644 05:43:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:15:19.644 05:43:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:15:19.644 05:43:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:15:19.644 05:43:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp
00:15:19.644 05:43:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:15:19.644 05:43:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")'
00:15:19.902 05:43:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:15:19.902 "name": "Existed_Raid",
00:15:19.902 "uuid": "00000000-0000-0000-0000-000000000000",
00:15:19.902 "strip_size_kb": 64,
00:15:19.902 "state": "configuring",
00:15:19.902 "raid_level": "raid0",
00:15:19.902 "superblock": false,
00:15:19.902 "num_base_bdevs": 3,
00:15:19.902 "num_base_bdevs_discovered": 1,
00:15:19.902 "num_base_bdevs_operational": 3,
00:15:19.902 "base_bdevs_list": [
00:15:19.902 {
00:15:19.902 "name": "BaseBdev1",
00:15:19.902 "uuid": "8fa16bea-edb2-459c-9140-ce47cfe96a29",
00:15:19.902 "is_configured": true,
00:15:19.902 "data_offset": 0,
00:15:19.902 "data_size": 65536
00:15:19.902 },
00:15:19.902 {
00:15:19.902 "name": "BaseBdev2",
00:15:19.902 "uuid": "00000000-0000-0000-0000-000000000000",
00:15:19.902 "is_configured": false,
00:15:19.902 "data_offset": 0,
00:15:19.902 "data_size": 0
00:15:19.902 },
00:15:19.902 {
00:15:19.902 "name": "BaseBdev3",
00:15:19.902 "uuid": "00000000-0000-0000-0000-000000000000",
00:15:19.902 "is_configured": false,
00:15:19.902 "data_offset": 0,
00:15:19.902 "data_size": 0
00:15:19.902 }
00:15:19.902 ]
00:15:19.902 }'
00:15:19.902 05:43:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:15:19.902 05:43:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x
00:15:20.466 05:43:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid
00:15:20.724 [2024-07-26 05:43:35.517552] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid
00:15:20.724 [2024-07-26 05:43:35.517592] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2390310 name Existed_Raid, state configuring
00:15:20.724 05:43:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid
00:15:20.982 [2024-07-26 05:43:35.754204] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed
00:15:20.982 [2024-07-26 05:43:35.755721] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2
00:15:20.982 [2024-07-26 05:43:35.755754] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now
00:15:20.982 [2024-07-26 05:43:35.755765] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3
00:15:20.982 [2024-07-26 05:43:35.755776] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now
00:15:20.982 05:43:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 ))
00:15:20.982 05:43:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs ))
00:15:20.982 05:43:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3
00:15:20.982 05:43:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid
00:15:20.982 05:43:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring
00:15:20.982 05:43:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0
00:15:20.982 05:43:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64
00:15:20.982 05:43:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3
00:15:20.982 05:43:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:15:20.982 05:43:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:15:20.982 05:43:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:15:20.982 05:43:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp
00:15:20.982 05:43:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:15:20.982 05:43:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")'
00:15:21.240 05:43:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:15:21.240 "name": "Existed_Raid",
00:15:21.240 "uuid": "00000000-0000-0000-0000-000000000000",
00:15:21.240 "strip_size_kb": 64,
00:15:21.240 "state": "configuring", "raid_level": "raid0",
00:15:21.240 "superblock": false,
00:15:21.240 "num_base_bdevs": 3,
00:15:21.240 "num_base_bdevs_discovered": 1,
00:15:21.240 "num_base_bdevs_operational": 3,
00:15:21.240 "base_bdevs_list": [
00:15:21.240 {
00:15:21.240 "name": "BaseBdev1",
00:15:21.240 "uuid": "8fa16bea-edb2-459c-9140-ce47cfe96a29",
00:15:21.240 "is_configured": true,
00:15:21.241 "data_offset": 0,
00:15:21.241 "data_size": 65536
00:15:21.241 },
00:15:21.241 {
00:15:21.241 "name": "BaseBdev2",
00:15:21.241 "uuid": "00000000-0000-0000-0000-000000000000",
00:15:21.241 "is_configured": false,
00:15:21.241 "data_offset": 0,
00:15:21.241 "data_size": 0
00:15:21.241 },
00:15:21.241 {
00:15:21.241 "name": "BaseBdev3",
00:15:21.241 "uuid": "00000000-0000-0000-0000-000000000000",
00:15:21.241 "is_configured": false,
00:15:21.241 "data_offset": 0,
00:15:21.241 "data_size": 0
00:15:21.241 }
00:15:21.241 ]
00:15:21.241 }'
00:15:21.241 05:43:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:15:21.241 05:43:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x
00:15:21.806 05:43:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2
00:15:22.064 [2024-07-26 05:43:36.860564] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed
00:15:22.064 BaseBdev2
00:15:22.064 05:43:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2
00:15:22.064 05:43:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2
00:15:22.064 05:43:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout=
00:15:22.064 05:43:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i
00:15:22.064 05:43:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]]
00:15:22.064 05:43:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000
00:15:22.064 05:43:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine
00:15:22.322 05:43:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000
00:15:22.322 [
00:15:22.322 {
00:15:22.322 "name": "BaseBdev2",
00:15:22.322 "aliases": [
00:15:22.322 "7bbfc420-94a6-4c07-8cda-78645b34c2a6"
00:15:22.322 ],
00:15:22.322 "product_name": "Malloc disk",
00:15:22.322 "block_size": 512,
00:15:22.322 "num_blocks": 65536,
00:15:22.322 "uuid": "7bbfc420-94a6-4c07-8cda-78645b34c2a6",
00:15:22.322 "assigned_rate_limits": {
00:15:22.322 "rw_ios_per_sec": 0,
00:15:22.322 "rw_mbytes_per_sec": 0,
00:15:22.322 "r_mbytes_per_sec": 0,
00:15:22.322 "w_mbytes_per_sec": 0
00:15:22.322 },
00:15:22.322 "claimed": true,
00:15:22.322 "claim_type": "exclusive_write",
00:15:22.322 "zoned": false,
00:15:22.322 "supported_io_types": {
00:15:22.322 "read": true,
00:15:22.322 "write": true,
00:15:22.322 "unmap": true,
00:15:22.322 "flush": true,
00:15:22.322 "reset": true,
00:15:22.322 "nvme_admin": false,
00:15:22.322 "nvme_io": false,
00:15:22.322 "nvme_io_md": false,
00:15:22.322 "write_zeroes": true,
00:15:22.322 "zcopy": true,
00:15:22.322 "get_zone_info": false,
00:15:22.322 "zone_management": false,
00:15:22.322 "zone_append": false,
00:15:22.322 "compare": false,
00:15:22.322 "compare_and_write": false,
00:15:22.322 "abort": true,
00:15:22.322 "seek_hole": false,
00:15:22.322 "seek_data": false,
00:15:22.322 "copy": true,
00:15:22.322 "nvme_iov_md": false
00:15:22.322 },
00:15:22.322 "memory_domains": [
00:15:22.322 {
00:15:22.322 "dma_device_id": "system",
00:15:22.322 "dma_device_type": 1
00:15:22.322 },
00:15:22.322 {
00:15:22.322 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:15:22.322 "dma_device_type": 2
00:15:22.322 }
00:15:22.322 ],
00:15:22.322 "driver_specific": {}
00:15:22.322 }
00:15:22.322 ]
00:15:22.580 05:43:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0
00:15:22.580 05:43:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ ))
00:15:22.580 05:43:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs ))
00:15:22.580 05:43:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3
00:15:22.580 05:43:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid
00:15:22.580 05:43:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring
00:15:22.580 05:43:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0
00:15:22.580 05:43:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64
00:15:22.580 05:43:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3
00:15:22.580 05:43:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:15:22.580 05:43:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:15:22.580 05:43:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:15:22.580 05:43:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp
00:15:22.580 05:43:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:15:22.580 05:43:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")'
00:15:22.580 05:43:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:15:22.580 "name": "Existed_Raid",
00:15:22.580 "uuid": "00000000-0000-0000-0000-000000000000",
00:15:22.580 "strip_size_kb": 64,
00:15:22.580 "state": "configuring",
00:15:22.580 "raid_level": "raid0",
00:15:22.580 "superblock": false,
00:15:22.580 "num_base_bdevs": 3,
00:15:22.580 "num_base_bdevs_discovered": 2,
00:15:22.580 "num_base_bdevs_operational": 3,
00:15:22.580 "base_bdevs_list": [
00:15:22.580 {
00:15:22.580 "name": "BaseBdev1",
00:15:22.580 "uuid": "8fa16bea-edb2-459c-9140-ce47cfe96a29",
00:15:22.580 "is_configured": true,
00:15:22.580 "data_offset": 0,
00:15:22.580 "data_size": 65536
00:15:22.580 },
00:15:22.580 {
00:15:22.580 "name": "BaseBdev2",
00:15:22.580 "uuid": "7bbfc420-94a6-4c07-8cda-78645b34c2a6",
00:15:22.580 "is_configured": true,
00:15:22.580 "data_offset": 0,
00:15:22.580 "data_size": 65536
00:15:22.580 },
00:15:22.580 {
00:15:22.580 "name": "BaseBdev3",
00:15:22.580 "uuid": "00000000-0000-0000-0000-000000000000",
00:15:22.580 "is_configured": false,
00:15:22.580 "data_offset": 0,
00:15:22.580 "data_size": 0
00:15:22.580 }
00:15:22.580 ]
00:15:22.580 }'
00:15:22.580 05:43:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:15:22.580 05:43:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x
00:15:23.514 05:43:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3
00:15:23.514 [2024-07-26 05:43:38.215518] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed
00:15:23.514 [2024-07-26 05:43:38.215555] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x2391400
00:15:23.514 [2024-07-26 05:43:38.215564] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 196608, blocklen 512
00:15:23.514 [2024-07-26 05:43:38.215819] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2390ef0
00:15:23.514 [2024-07-26 05:43:38.215940] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2391400
00:15:23.514 [2024-07-26 05:43:38.215950] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x2391400
00:15:23.514 [2024-07-26 05:43:38.216111] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb
00:15:23.514 BaseBdev3
00:15:23.514 05:43:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3
00:15:23.514 05:43:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3
00:15:23.514 05:43:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout=
00:15:23.514 05:43:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i
00:15:23.514 05:43:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]]
00:15:23.514 05:43:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000
00:15:23.514 05:43:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine
00:15:23.772 05:43:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000
00:15:24.032 [
00:15:24.032 {
00:15:24.032 "name": "BaseBdev3",
00:15:24.032 "aliases": [
00:15:24.032 "757cfad0-0fdf-430e-8da0-5c72a0379650"
00:15:24.032 ],
00:15:24.032 "product_name": "Malloc disk",
00:15:24.032 "block_size": 512,
00:15:24.032 "num_blocks": 65536,
00:15:24.032 "uuid": "757cfad0-0fdf-430e-8da0-5c72a0379650",
00:15:24.032 "assigned_rate_limits": {
00:15:24.032 "rw_ios_per_sec": 0,
00:15:24.032 "rw_mbytes_per_sec": 0,
00:15:24.032 "r_mbytes_per_sec": 0,
00:15:24.032 "w_mbytes_per_sec": 0
00:15:24.032 },
00:15:24.032 "claimed": true,
00:15:24.032 "claim_type": "exclusive_write",
00:15:24.032 "zoned": false,
00:15:24.032 "supported_io_types": {
00:15:24.032 "read": true,
00:15:24.032 "write": true,
00:15:24.032 "unmap": true,
00:15:24.032 "flush": true,
00:15:24.032 "reset": true,
00:15:24.032 "nvme_admin": false,
00:15:24.032 "nvme_io": false,
00:15:24.032 "nvme_io_md": false,
00:15:24.032 "write_zeroes": true,
00:15:24.032 "zcopy": true,
00:15:24.032 "get_zone_info": false,
00:15:24.032 "zone_management": false,
00:15:24.032 "zone_append": false,
00:15:24.032 "compare": false,
00:15:24.032 "compare_and_write": false,
00:15:24.032 "abort": true,
00:15:24.032 "seek_hole": false,
00:15:24.032 "seek_data": false,
00:15:24.032 "copy": true,
00:15:24.032 "nvme_iov_md": false
00:15:24.032 },
00:15:24.032 "memory_domains": [
00:15:24.032 {
00:15:24.032 "dma_device_id": "system",
00:15:24.032 "dma_device_type": 1
00:15:24.032 },
00:15:24.032 {
00:15:24.032 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:15:24.032 "dma_device_type": 2
00:15:24.032 }
00:15:24.032 ],
00:15:24.032 "driver_specific": {}
00:15:24.032 }
00:15:24.032 ]
00:15:24.032 05:43:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0
00:15:24.032 05:43:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ ))
00:15:24.032 05:43:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs ))
00:15:24.032 05:43:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 3
00:15:24.032 05:43:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid
00:15:24.032 05:43:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online
00:15:24.032 05:43:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0
00:15:24.032 05:43:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64
00:15:24.032 05:43:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3
00:15:24.032 05:43:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:15:24.032 05:43:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:15:24.032 05:43:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:15:24.032 05:43:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp
00:15:24.032 05:43:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")'
00:15:24.032 05:43:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:15:24.032 05:43:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:15:24.032 "name": "Existed_Raid",
00:15:24.032 "uuid": "634557a3-1755-49ca-84f3-8d55e802b835",
00:15:24.032 "strip_size_kb": 64,
00:15:24.032 "state": "online",
00:15:24.032 "raid_level": "raid0",
00:15:24.032 "superblock": false,
00:15:24.032 "num_base_bdevs": 3,
00:15:24.032 "num_base_bdevs_discovered": 3,
00:15:24.032 "num_base_bdevs_operational": 3,
00:15:24.032 "base_bdevs_list": [
00:15:24.032 {
00:15:24.032 "name": "BaseBdev1",
00:15:24.032 "uuid": "8fa16bea-edb2-459c-9140-ce47cfe96a29",
00:15:24.032 "is_configured": true,
00:15:24.032 "data_offset": 0,
00:15:24.032 "data_size": 65536
00:15:24.032 },
00:15:24.032 {
00:15:24.032 "name": "BaseBdev2",
00:15:24.032 "uuid": "7bbfc420-94a6-4c07-8cda-78645b34c2a6",
00:15:24.032 "is_configured": true,
00:15:24.032 "data_offset": 0,
00:15:24.032 "data_size": 65536
00:15:24.032 },
00:15:24.032 {
00:15:24.032 "name": "BaseBdev3",
00:15:24.032 "uuid": "757cfad0-0fdf-430e-8da0-5c72a0379650",
00:15:24.032 "is_configured": true,
00:15:24.032 "data_offset": 0,
00:15:24.032 "data_size": 65536
00:15:24.032 }
00:15:24.032 ]
00:15:24.032 }'
00:15:24.032 05:43:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:15:24.032 05:43:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x
00:15:24.601 05:43:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid
00:15:24.601 05:43:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid
00:15:24.601 05:43:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info
00:15:24.601 05:43:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info
00:15:24.601 05:43:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names
00:15:24.601 05:43:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name
00:15:24.601 05:43:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid
00:15:24.601 05:43:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]'
00:15:24.861 [2024-07-26 05:43:39.719823] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json
00:15:24.861 05:43:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{
00:15:24.861 "name": "Existed_Raid",
00:15:24.861 "aliases": [
00:15:24.861 "634557a3-1755-49ca-84f3-8d55e802b835"
00:15:24.861 ],
00:15:24.861 "product_name": "Raid Volume",
00:15:24.861 "block_size": 512,
00:15:24.861 "num_blocks": 196608,
00:15:24.861 "uuid": "634557a3-1755-49ca-84f3-8d55e802b835",
00:15:24.861 "assigned_rate_limits": {
00:15:24.861 "rw_ios_per_sec": 0,
00:15:24.861 "rw_mbytes_per_sec": 0,
00:15:24.861 "r_mbytes_per_sec": 0,
00:15:24.861 "w_mbytes_per_sec": 0
00:15:24.861 },
00:15:24.861 "claimed": false,
00:15:24.861 "zoned": false,
00:15:24.861 "supported_io_types": {
00:15:24.861 "read": true,
00:15:24.861 "write": true,
00:15:24.861 "unmap": true,
00:15:24.861 "flush": true,
00:15:24.861 "reset": true,
00:15:24.861 "nvme_admin": false,
00:15:24.861 "nvme_io": false,
00:15:24.861 "nvme_io_md": false,
00:15:24.861 "write_zeroes": true,
00:15:24.861 "zcopy": false,
00:15:24.861 "get_zone_info": false,
00:15:24.861 "zone_management": false,
00:15:24.861 "zone_append": false,
00:15:24.861 "compare": false,
00:15:24.861 "compare_and_write": false,
00:15:24.861 "abort": false,
00:15:24.861 "seek_hole": false,
00:15:24.861 "seek_data": false,
00:15:24.861 "copy": false,
00:15:24.861 "nvme_iov_md": false
00:15:24.861 },
00:15:24.861 "memory_domains": [
00:15:24.861 {
00:15:24.861 "dma_device_id": "system",
00:15:24.861 "dma_device_type": 1
00:15:24.861 },
00:15:24.861 {
00:15:24.861 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:15:24.861 "dma_device_type": 2
00:15:24.861 },
00:15:24.861 {
00:15:24.861 "dma_device_id": "system",
00:15:24.861 "dma_device_type": 1
00:15:24.861 },
00:15:24.861 {
00:15:24.861 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:15:24.861 "dma_device_type": 2
00:15:24.861 },
00:15:24.861 {
00:15:24.861 "dma_device_id": "system",
00:15:24.861 "dma_device_type": 1
00:15:24.861 },
00:15:24.861 {
00:15:24.861 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:15:24.861 "dma_device_type": 2
00:15:24.861 }
00:15:24.861 ],
00:15:24.861 "driver_specific": {
00:15:24.861 "raid": {
00:15:24.861 "uuid": "634557a3-1755-49ca-84f3-8d55e802b835",
00:15:24.861 "strip_size_kb": 64,
00:15:24.861 "state": "online",
00:15:24.861 "raid_level": "raid0",
00:15:24.861 "superblock": false,
00:15:24.861 "num_base_bdevs": 3,
00:15:24.861 "num_base_bdevs_discovered": 3,
00:15:24.861 "num_base_bdevs_operational": 3,
00:15:24.861 "base_bdevs_list": [
00:15:24.861 {
00:15:24.861 "name": "BaseBdev1",
00:15:24.861 "uuid": "8fa16bea-edb2-459c-9140-ce47cfe96a29",
00:15:24.861 "is_configured": true,
00:15:24.861 "data_offset": 0,
00:15:24.861 "data_size": 65536
00:15:24.861 },
00:15:24.861 {
00:15:24.861 "name": "BaseBdev2",
00:15:24.861 "uuid": "7bbfc420-94a6-4c07-8cda-78645b34c2a6",
00:15:24.861 "is_configured": true,
00:15:24.861 "data_offset": 0,
00:15:24.861 "data_size": 65536
00:15:24.861 },
00:15:24.861 {
00:15:24.861 "name": "BaseBdev3",
00:15:24.861 "uuid": "757cfad0-0fdf-430e-8da0-5c72a0379650",
00:15:24.861 "is_configured": true,
00:15:24.861 "data_offset": 0,
00:15:24.861 "data_size": 65536
00:15:24.861 }
00:15:24.861 ]
00:15:24.861 }
00:15:24.861 }
00:15:24.861 }'
00:15:24.861 05:43:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name'
00:15:25.119 05:43:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1
00:15:25.119 BaseBdev2
00:15:25.119 BaseBdev3'
00:15:25.119 05:43:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names
00:15:25.119 05:43:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1
00:15:25.119 05:43:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]'
00:15:25.378 05:43:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{
00:15:25.378 "name": "BaseBdev1",
00:15:25.378 "aliases": [
00:15:25.378 "8fa16bea-edb2-459c-9140-ce47cfe96a29"
00:15:25.378 ],
00:15:25.378 "product_name": "Malloc disk",
00:15:25.378 "block_size": 512,
00:15:25.378 "num_blocks": 65536,
00:15:25.378 "uuid": "8fa16bea-edb2-459c-9140-ce47cfe96a29",
00:15:25.378 "assigned_rate_limits": {
00:15:25.378 "rw_ios_per_sec": 0,
00:15:25.378 "rw_mbytes_per_sec": 0,
00:15:25.378 "r_mbytes_per_sec": 0,
00:15:25.378 "w_mbytes_per_sec": 0
00:15:25.378 },
00:15:25.378 "claimed": true,
00:15:25.378 "claim_type": "exclusive_write",
00:15:25.378 "zoned": false,
00:15:25.378 "supported_io_types": {
00:15:25.378 "read": true,
00:15:25.378 "write": true,
00:15:25.378 "unmap": true,
00:15:25.378 "flush": true,
00:15:25.378 "reset": true,
00:15:25.378 "nvme_admin": false,
00:15:25.378 "nvme_io": false,
00:15:25.378 "nvme_io_md": false,
00:15:25.378 "write_zeroes": true,
00:15:25.378 "zcopy": true,
00:15:25.378 "get_zone_info": false,
00:15:25.378 "zone_management": false,
00:15:25.378 "zone_append": false,
00:15:25.378 "compare": false,
00:15:25.378 "compare_and_write": false,
00:15:25.378 "abort": true,
00:15:25.378 "seek_hole": false,
00:15:25.378 "seek_data": false,
00:15:25.378 "copy": true,
00:15:25.378 "nvme_iov_md": false
00:15:25.378 },
00:15:25.378 "memory_domains": [
00:15:25.378 {
00:15:25.378 "dma_device_id": "system",
00:15:25.378 "dma_device_type": 1
00:15:25.378 },
00:15:25.378 {
00:15:25.378 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:15:25.378 "dma_device_type": 2
00:15:25.378 }
00:15:25.378 ],
00:15:25.378 "driver_specific": {}
00:15:25.378 }'
00:15:25.378 05:43:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size
00:15:25.378 05:43:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size
00:15:25.378 05:43:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]]
00:15:25.378 05:43:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size
00:15:25.378 05:43:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size
00:15:25.378 05:43:40 bdev_raid.raid_state_function_test --
bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:25.378 05:43:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:25.378 05:43:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:25.637 05:43:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:25.637 05:43:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:25.637 05:43:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:25.637 05:43:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:25.637 05:43:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:25.637 05:43:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:15:25.637 05:43:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:25.896 05:43:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:25.896 "name": "BaseBdev2", 00:15:25.896 "aliases": [ 00:15:25.896 "7bbfc420-94a6-4c07-8cda-78645b34c2a6" 00:15:25.896 ], 00:15:25.896 "product_name": "Malloc disk", 00:15:25.896 "block_size": 512, 00:15:25.896 "num_blocks": 65536, 00:15:25.896 "uuid": "7bbfc420-94a6-4c07-8cda-78645b34c2a6", 00:15:25.896 "assigned_rate_limits": { 00:15:25.896 "rw_ios_per_sec": 0, 00:15:25.896 "rw_mbytes_per_sec": 0, 00:15:25.896 "r_mbytes_per_sec": 0, 00:15:25.897 "w_mbytes_per_sec": 0 00:15:25.897 }, 00:15:25.897 "claimed": true, 00:15:25.897 "claim_type": "exclusive_write", 00:15:25.897 "zoned": false, 00:15:25.897 "supported_io_types": { 00:15:25.897 "read": true, 00:15:25.897 "write": true, 00:15:25.897 "unmap": true, 00:15:25.897 "flush": true, 00:15:25.897 "reset": true, 00:15:25.897 "nvme_admin": 
false, 00:15:25.897 "nvme_io": false, 00:15:25.897 "nvme_io_md": false, 00:15:25.897 "write_zeroes": true, 00:15:25.897 "zcopy": true, 00:15:25.897 "get_zone_info": false, 00:15:25.897 "zone_management": false, 00:15:25.897 "zone_append": false, 00:15:25.897 "compare": false, 00:15:25.897 "compare_and_write": false, 00:15:25.897 "abort": true, 00:15:25.897 "seek_hole": false, 00:15:25.897 "seek_data": false, 00:15:25.897 "copy": true, 00:15:25.897 "nvme_iov_md": false 00:15:25.897 }, 00:15:25.897 "memory_domains": [ 00:15:25.897 { 00:15:25.897 "dma_device_id": "system", 00:15:25.897 "dma_device_type": 1 00:15:25.897 }, 00:15:25.897 { 00:15:25.897 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:25.897 "dma_device_type": 2 00:15:25.897 } 00:15:25.897 ], 00:15:25.897 "driver_specific": {} 00:15:25.897 }' 00:15:25.897 05:43:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:25.897 05:43:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:25.897 05:43:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:25.897 05:43:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:25.897 05:43:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:26.156 05:43:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:26.156 05:43:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:26.156 05:43:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:26.156 05:43:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:26.156 05:43:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:26.156 05:43:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:26.156 05:43:41 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:26.156 05:43:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:26.156 05:43:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:26.156 05:43:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:15:26.414 05:43:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:26.414 "name": "BaseBdev3", 00:15:26.414 "aliases": [ 00:15:26.414 "757cfad0-0fdf-430e-8da0-5c72a0379650" 00:15:26.414 ], 00:15:26.414 "product_name": "Malloc disk", 00:15:26.414 "block_size": 512, 00:15:26.414 "num_blocks": 65536, 00:15:26.414 "uuid": "757cfad0-0fdf-430e-8da0-5c72a0379650", 00:15:26.414 "assigned_rate_limits": { 00:15:26.415 "rw_ios_per_sec": 0, 00:15:26.415 "rw_mbytes_per_sec": 0, 00:15:26.415 "r_mbytes_per_sec": 0, 00:15:26.415 "w_mbytes_per_sec": 0 00:15:26.415 }, 00:15:26.415 "claimed": true, 00:15:26.415 "claim_type": "exclusive_write", 00:15:26.415 "zoned": false, 00:15:26.415 "supported_io_types": { 00:15:26.415 "read": true, 00:15:26.415 "write": true, 00:15:26.415 "unmap": true, 00:15:26.415 "flush": true, 00:15:26.415 "reset": true, 00:15:26.415 "nvme_admin": false, 00:15:26.415 "nvme_io": false, 00:15:26.415 "nvme_io_md": false, 00:15:26.415 "write_zeroes": true, 00:15:26.415 "zcopy": true, 00:15:26.415 "get_zone_info": false, 00:15:26.415 "zone_management": false, 00:15:26.415 "zone_append": false, 00:15:26.415 "compare": false, 00:15:26.415 "compare_and_write": false, 00:15:26.415 "abort": true, 00:15:26.415 "seek_hole": false, 00:15:26.415 "seek_data": false, 00:15:26.415 "copy": true, 00:15:26.415 "nvme_iov_md": false 00:15:26.415 }, 00:15:26.415 "memory_domains": [ 00:15:26.415 { 00:15:26.415 "dma_device_id": "system", 00:15:26.415 "dma_device_type": 1 00:15:26.415 
}, 00:15:26.415 { 00:15:26.415 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:26.415 "dma_device_type": 2 00:15:26.415 } 00:15:26.415 ], 00:15:26.415 "driver_specific": {} 00:15:26.415 }' 00:15:26.415 05:43:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:26.415 05:43:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:26.415 05:43:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:26.415 05:43:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:26.415 05:43:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:26.673 05:43:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:26.673 05:43:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:26.673 05:43:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:26.673 05:43:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:26.673 05:43:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:26.673 05:43:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:26.673 05:43:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:26.673 05:43:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:15:26.932 [2024-07-26 05:43:41.640670] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:15:26.932 [2024-07-26 05:43:41.640697] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:26.932 [2024-07-26 05:43:41.640736] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:26.932 
05:43:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:15:26.932 05:43:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:15:26.932 05:43:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:15:26.932 05:43:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:15:26.932 05:43:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:15:26.932 05:43:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 2 00:15:26.932 05:43:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:26.932 05:43:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:15:26.933 05:43:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:26.933 05:43:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:26.933 05:43:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:15:26.933 05:43:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:26.933 05:43:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:26.933 05:43:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:26.933 05:43:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:26.933 05:43:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:26.933 05:43:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 
00:15:27.192 05:43:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:27.192 "name": "Existed_Raid", 00:15:27.192 "uuid": "634557a3-1755-49ca-84f3-8d55e802b835", 00:15:27.192 "strip_size_kb": 64, 00:15:27.192 "state": "offline", 00:15:27.192 "raid_level": "raid0", 00:15:27.192 "superblock": false, 00:15:27.192 "num_base_bdevs": 3, 00:15:27.192 "num_base_bdevs_discovered": 2, 00:15:27.192 "num_base_bdevs_operational": 2, 00:15:27.192 "base_bdevs_list": [ 00:15:27.192 { 00:15:27.192 "name": null, 00:15:27.192 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:27.192 "is_configured": false, 00:15:27.192 "data_offset": 0, 00:15:27.192 "data_size": 65536 00:15:27.192 }, 00:15:27.192 { 00:15:27.192 "name": "BaseBdev2", 00:15:27.192 "uuid": "7bbfc420-94a6-4c07-8cda-78645b34c2a6", 00:15:27.192 "is_configured": true, 00:15:27.192 "data_offset": 0, 00:15:27.192 "data_size": 65536 00:15:27.192 }, 00:15:27.192 { 00:15:27.192 "name": "BaseBdev3", 00:15:27.192 "uuid": "757cfad0-0fdf-430e-8da0-5c72a0379650", 00:15:27.192 "is_configured": true, 00:15:27.192 "data_offset": 0, 00:15:27.192 "data_size": 65536 00:15:27.192 } 00:15:27.192 ] 00:15:27.192 }' 00:15:27.192 05:43:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:27.192 05:43:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:27.761 05:43:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:15:27.761 05:43:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:15:27.761 05:43:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:27.761 05:43:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:15:28.020 05:43:42 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:15:28.020 05:43:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:15:28.020 05:43:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:15:28.020 [2024-07-26 05:43:42.917116] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:15:28.277 05:43:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:15:28.277 05:43:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:15:28.277 05:43:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:28.277 05:43:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:15:28.536 05:43:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:15:28.536 05:43:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:15:28.536 05:43:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:15:28.536 [2024-07-26 05:43:43.410802] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:15:28.536 [2024-07-26 05:43:43.410846] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2391400 name Existed_Raid, state offline 00:15:28.536 05:43:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:15:28.536 05:43:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:15:28.536 05:43:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- 
# /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:28.536 05:43:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:15:28.832 05:43:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:15:28.832 05:43:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:15:28.832 05:43:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:15:28.832 05:43:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:15:28.832 05:43:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:15:28.832 05:43:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:15:29.091 BaseBdev2 00:15:29.091 05:43:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:15:29.091 05:43:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:15:29.091 05:43:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:29.091 05:43:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:15:29.091 05:43:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:29.091 05:43:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:29.091 05:43:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:29.349 05:43:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:15:29.607 [ 00:15:29.607 { 00:15:29.607 "name": "BaseBdev2", 00:15:29.607 "aliases": [ 00:15:29.607 "69ab9439-4199-4311-8d1d-b6182d7a9aeb" 00:15:29.607 ], 00:15:29.607 "product_name": "Malloc disk", 00:15:29.607 "block_size": 512, 00:15:29.607 "num_blocks": 65536, 00:15:29.607 "uuid": "69ab9439-4199-4311-8d1d-b6182d7a9aeb", 00:15:29.607 "assigned_rate_limits": { 00:15:29.607 "rw_ios_per_sec": 0, 00:15:29.607 "rw_mbytes_per_sec": 0, 00:15:29.607 "r_mbytes_per_sec": 0, 00:15:29.607 "w_mbytes_per_sec": 0 00:15:29.607 }, 00:15:29.607 "claimed": false, 00:15:29.607 "zoned": false, 00:15:29.607 "supported_io_types": { 00:15:29.607 "read": true, 00:15:29.607 "write": true, 00:15:29.607 "unmap": true, 00:15:29.607 "flush": true, 00:15:29.607 "reset": true, 00:15:29.607 "nvme_admin": false, 00:15:29.607 "nvme_io": false, 00:15:29.607 "nvme_io_md": false, 00:15:29.607 "write_zeroes": true, 00:15:29.607 "zcopy": true, 00:15:29.607 "get_zone_info": false, 00:15:29.607 "zone_management": false, 00:15:29.607 "zone_append": false, 00:15:29.607 "compare": false, 00:15:29.607 "compare_and_write": false, 00:15:29.607 "abort": true, 00:15:29.607 "seek_hole": false, 00:15:29.607 "seek_data": false, 00:15:29.607 "copy": true, 00:15:29.607 "nvme_iov_md": false 00:15:29.607 }, 00:15:29.607 "memory_domains": [ 00:15:29.607 { 00:15:29.608 "dma_device_id": "system", 00:15:29.608 "dma_device_type": 1 00:15:29.608 }, 00:15:29.608 { 00:15:29.608 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:29.608 "dma_device_type": 2 00:15:29.608 } 00:15:29.608 ], 00:15:29.608 "driver_specific": {} 00:15:29.608 } 00:15:29.608 ] 00:15:29.608 05:43:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:15:29.608 05:43:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:15:29.608 05:43:44 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:15:29.608 05:43:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:15:29.866 BaseBdev3 00:15:29.866 05:43:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:15:29.866 05:43:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:15:29.866 05:43:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:29.866 05:43:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:15:29.866 05:43:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:29.866 05:43:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:29.866 05:43:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:30.126 05:43:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:15:30.126 [ 00:15:30.126 { 00:15:30.126 "name": "BaseBdev3", 00:15:30.126 "aliases": [ 00:15:30.126 "6965884c-cbc0-4a5a-8ce3-4de291e4c93b" 00:15:30.126 ], 00:15:30.126 "product_name": "Malloc disk", 00:15:30.126 "block_size": 512, 00:15:30.126 "num_blocks": 65536, 00:15:30.126 "uuid": "6965884c-cbc0-4a5a-8ce3-4de291e4c93b", 00:15:30.126 "assigned_rate_limits": { 00:15:30.126 "rw_ios_per_sec": 0, 00:15:30.126 "rw_mbytes_per_sec": 0, 00:15:30.126 "r_mbytes_per_sec": 0, 00:15:30.126 "w_mbytes_per_sec": 0 00:15:30.126 }, 00:15:30.126 "claimed": false, 00:15:30.126 "zoned": false, 00:15:30.126 
"supported_io_types": { 00:15:30.126 "read": true, 00:15:30.126 "write": true, 00:15:30.126 "unmap": true, 00:15:30.126 "flush": true, 00:15:30.126 "reset": true, 00:15:30.126 "nvme_admin": false, 00:15:30.126 "nvme_io": false, 00:15:30.126 "nvme_io_md": false, 00:15:30.126 "write_zeroes": true, 00:15:30.126 "zcopy": true, 00:15:30.126 "get_zone_info": false, 00:15:30.126 "zone_management": false, 00:15:30.126 "zone_append": false, 00:15:30.126 "compare": false, 00:15:30.126 "compare_and_write": false, 00:15:30.126 "abort": true, 00:15:30.126 "seek_hole": false, 00:15:30.126 "seek_data": false, 00:15:30.126 "copy": true, 00:15:30.126 "nvme_iov_md": false 00:15:30.126 }, 00:15:30.126 "memory_domains": [ 00:15:30.126 { 00:15:30.126 "dma_device_id": "system", 00:15:30.126 "dma_device_type": 1 00:15:30.126 }, 00:15:30.126 { 00:15:30.126 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:30.126 "dma_device_type": 2 00:15:30.126 } 00:15:30.126 ], 00:15:30.126 "driver_specific": {} 00:15:30.126 } 00:15:30.126 ] 00:15:30.126 05:43:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:15:30.126 05:43:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:15:30.126 05:43:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:15:30.126 05:43:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:15:30.384 [2024-07-26 05:43:45.204303] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:15:30.384 [2024-07-26 05:43:45.204348] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:15:30.384 [2024-07-26 05:43:45.204367] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:30.384 
[2024-07-26 05:43:45.205797] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:15:30.384 05:43:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:15:30.384 05:43:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:30.384 05:43:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:30.384 05:43:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:30.384 05:43:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:30.384 05:43:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:30.384 05:43:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:30.384 05:43:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:30.384 05:43:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:30.384 05:43:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:30.384 05:43:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:30.384 05:43:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:30.643 05:43:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:30.643 "name": "Existed_Raid", 00:15:30.643 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:30.643 "strip_size_kb": 64, 00:15:30.643 "state": "configuring", 00:15:30.643 "raid_level": "raid0", 00:15:30.643 "superblock": false, 00:15:30.643 "num_base_bdevs": 3, 00:15:30.643 
"num_base_bdevs_discovered": 2, 00:15:30.643 "num_base_bdevs_operational": 3, 00:15:30.643 "base_bdevs_list": [ 00:15:30.643 { 00:15:30.643 "name": "BaseBdev1", 00:15:30.643 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:30.643 "is_configured": false, 00:15:30.643 "data_offset": 0, 00:15:30.643 "data_size": 0 00:15:30.643 }, 00:15:30.643 { 00:15:30.643 "name": "BaseBdev2", 00:15:30.643 "uuid": "69ab9439-4199-4311-8d1d-b6182d7a9aeb", 00:15:30.643 "is_configured": true, 00:15:30.643 "data_offset": 0, 00:15:30.643 "data_size": 65536 00:15:30.643 }, 00:15:30.643 { 00:15:30.643 "name": "BaseBdev3", 00:15:30.643 "uuid": "6965884c-cbc0-4a5a-8ce3-4de291e4c93b", 00:15:30.643 "is_configured": true, 00:15:30.643 "data_offset": 0, 00:15:30.643 "data_size": 65536 00:15:30.643 } 00:15:30.643 ] 00:15:30.643 }' 00:15:30.643 05:43:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:30.643 05:43:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:31.210 05:43:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:15:31.469 [2024-07-26 05:43:46.202919] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:15:31.469 05:43:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:15:31.469 05:43:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:31.469 05:43:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:31.469 05:43:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:31.469 05:43:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:31.469 05:43:46 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:31.469 05:43:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:31.469 05:43:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:31.469 05:43:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:31.469 05:43:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:31.469 05:43:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:31.469 05:43:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:31.728 05:43:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:31.728 "name": "Existed_Raid", 00:15:31.728 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:31.728 "strip_size_kb": 64, 00:15:31.728 "state": "configuring", 00:15:31.728 "raid_level": "raid0", 00:15:31.728 "superblock": false, 00:15:31.728 "num_base_bdevs": 3, 00:15:31.728 "num_base_bdevs_discovered": 1, 00:15:31.728 "num_base_bdevs_operational": 3, 00:15:31.728 "base_bdevs_list": [ 00:15:31.728 { 00:15:31.728 "name": "BaseBdev1", 00:15:31.728 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:31.728 "is_configured": false, 00:15:31.728 "data_offset": 0, 00:15:31.728 "data_size": 0 00:15:31.728 }, 00:15:31.728 { 00:15:31.728 "name": null, 00:15:31.728 "uuid": "69ab9439-4199-4311-8d1d-b6182d7a9aeb", 00:15:31.728 "is_configured": false, 00:15:31.728 "data_offset": 0, 00:15:31.728 "data_size": 65536 00:15:31.728 }, 00:15:31.728 { 00:15:31.728 "name": "BaseBdev3", 00:15:31.728 "uuid": "6965884c-cbc0-4a5a-8ce3-4de291e4c93b", 00:15:31.728 "is_configured": true, 00:15:31.728 "data_offset": 0, 00:15:31.728 "data_size": 65536 00:15:31.728 } 
00:15:31.728 ] 00:15:31.728 }' 00:15:31.728 05:43:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:31.728 05:43:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:32.295 05:43:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:32.295 05:43:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:15:32.555 05:43:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:15:32.555 05:43:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:15:32.814 [2024-07-26 05:43:47.555069] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:32.814 BaseBdev1 00:15:32.814 05:43:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:15:32.814 05:43:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:15:32.814 05:43:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:32.814 05:43:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:15:32.814 05:43:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:32.814 05:43:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:32.814 05:43:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:33.072 05:43:47 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:15:33.331 [ 00:15:33.331 { 00:15:33.331 "name": "BaseBdev1", 00:15:33.331 "aliases": [ 00:15:33.331 "8dda1de8-8ded-441f-a0ca-c75c7c04114f" 00:15:33.331 ], 00:15:33.331 "product_name": "Malloc disk", 00:15:33.331 "block_size": 512, 00:15:33.331 "num_blocks": 65536, 00:15:33.331 "uuid": "8dda1de8-8ded-441f-a0ca-c75c7c04114f", 00:15:33.331 "assigned_rate_limits": { 00:15:33.331 "rw_ios_per_sec": 0, 00:15:33.331 "rw_mbytes_per_sec": 0, 00:15:33.331 "r_mbytes_per_sec": 0, 00:15:33.331 "w_mbytes_per_sec": 0 00:15:33.331 }, 00:15:33.331 "claimed": true, 00:15:33.331 "claim_type": "exclusive_write", 00:15:33.331 "zoned": false, 00:15:33.331 "supported_io_types": { 00:15:33.331 "read": true, 00:15:33.331 "write": true, 00:15:33.331 "unmap": true, 00:15:33.331 "flush": true, 00:15:33.331 "reset": true, 00:15:33.331 "nvme_admin": false, 00:15:33.331 "nvme_io": false, 00:15:33.331 "nvme_io_md": false, 00:15:33.331 "write_zeroes": true, 00:15:33.331 "zcopy": true, 00:15:33.331 "get_zone_info": false, 00:15:33.331 "zone_management": false, 00:15:33.331 "zone_append": false, 00:15:33.331 "compare": false, 00:15:33.331 "compare_and_write": false, 00:15:33.331 "abort": true, 00:15:33.331 "seek_hole": false, 00:15:33.331 "seek_data": false, 00:15:33.331 "copy": true, 00:15:33.331 "nvme_iov_md": false 00:15:33.331 }, 00:15:33.331 "memory_domains": [ 00:15:33.331 { 00:15:33.331 "dma_device_id": "system", 00:15:33.331 "dma_device_type": 1 00:15:33.331 }, 00:15:33.331 { 00:15:33.331 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:33.331 "dma_device_type": 2 00:15:33.331 } 00:15:33.331 ], 00:15:33.331 "driver_specific": {} 00:15:33.331 } 00:15:33.331 ] 00:15:33.331 05:43:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:15:33.331 05:43:48 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:15:33.331 05:43:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:33.331 05:43:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:33.331 05:43:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:33.331 05:43:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:33.331 05:43:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:33.331 05:43:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:33.331 05:43:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:33.331 05:43:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:33.331 05:43:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:33.331 05:43:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:33.331 05:43:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:33.899 05:43:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:33.899 "name": "Existed_Raid", 00:15:33.899 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:33.899 "strip_size_kb": 64, 00:15:33.899 "state": "configuring", 00:15:33.899 "raid_level": "raid0", 00:15:33.899 "superblock": false, 00:15:33.899 "num_base_bdevs": 3, 00:15:33.899 "num_base_bdevs_discovered": 2, 00:15:33.899 "num_base_bdevs_operational": 3, 00:15:33.899 "base_bdevs_list": [ 00:15:33.899 { 00:15:33.899 "name": "BaseBdev1", 00:15:33.899 
"uuid": "8dda1de8-8ded-441f-a0ca-c75c7c04114f", 00:15:33.899 "is_configured": true, 00:15:33.899 "data_offset": 0, 00:15:33.899 "data_size": 65536 00:15:33.899 }, 00:15:33.899 { 00:15:33.899 "name": null, 00:15:33.899 "uuid": "69ab9439-4199-4311-8d1d-b6182d7a9aeb", 00:15:33.899 "is_configured": false, 00:15:33.899 "data_offset": 0, 00:15:33.899 "data_size": 65536 00:15:33.899 }, 00:15:33.899 { 00:15:33.899 "name": "BaseBdev3", 00:15:33.899 "uuid": "6965884c-cbc0-4a5a-8ce3-4de291e4c93b", 00:15:33.899 "is_configured": true, 00:15:33.899 "data_offset": 0, 00:15:33.899 "data_size": 65536 00:15:33.899 } 00:15:33.899 ] 00:15:33.899 }' 00:15:33.899 05:43:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:33.899 05:43:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:34.466 05:43:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:34.466 05:43:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:15:34.724 05:43:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:15:34.724 05:43:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:15:34.724 [2024-07-26 05:43:49.620564] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:15:34.982 05:43:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:15:34.982 05:43:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:34.982 05:43:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 
00:15:34.982 05:43:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:34.982 05:43:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:34.982 05:43:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:34.982 05:43:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:34.982 05:43:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:34.982 05:43:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:34.982 05:43:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:34.982 05:43:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:34.982 05:43:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:34.982 05:43:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:34.982 "name": "Existed_Raid", 00:15:34.982 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:34.982 "strip_size_kb": 64, 00:15:34.982 "state": "configuring", 00:15:34.982 "raid_level": "raid0", 00:15:34.982 "superblock": false, 00:15:34.982 "num_base_bdevs": 3, 00:15:34.982 "num_base_bdevs_discovered": 1, 00:15:34.982 "num_base_bdevs_operational": 3, 00:15:34.982 "base_bdevs_list": [ 00:15:34.982 { 00:15:34.982 "name": "BaseBdev1", 00:15:34.982 "uuid": "8dda1de8-8ded-441f-a0ca-c75c7c04114f", 00:15:34.982 "is_configured": true, 00:15:34.982 "data_offset": 0, 00:15:34.982 "data_size": 65536 00:15:34.982 }, 00:15:34.982 { 00:15:34.982 "name": null, 00:15:34.982 "uuid": "69ab9439-4199-4311-8d1d-b6182d7a9aeb", 00:15:34.982 "is_configured": false, 00:15:34.982 
"data_offset": 0, 00:15:34.982 "data_size": 65536 00:15:34.982 }, 00:15:34.982 { 00:15:34.982 "name": null, 00:15:34.982 "uuid": "6965884c-cbc0-4a5a-8ce3-4de291e4c93b", 00:15:34.982 "is_configured": false, 00:15:34.982 "data_offset": 0, 00:15:34.982 "data_size": 65536 00:15:34.982 } 00:15:34.982 ] 00:15:34.982 }' 00:15:34.983 05:43:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:34.983 05:43:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:35.915 05:43:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:35.915 05:43:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:15:35.915 05:43:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:15:35.915 05:43:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:15:36.174 [2024-07-26 05:43:50.932042] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:15:36.174 05:43:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:15:36.174 05:43:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:36.174 05:43:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:36.174 05:43:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:36.174 05:43:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:36.174 05:43:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- 
# local num_base_bdevs_operational=3 00:15:36.174 05:43:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:36.174 05:43:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:36.174 05:43:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:36.174 05:43:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:36.174 05:43:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:36.174 05:43:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:36.432 05:43:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:36.432 "name": "Existed_Raid", 00:15:36.432 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:36.432 "strip_size_kb": 64, 00:15:36.432 "state": "configuring", 00:15:36.432 "raid_level": "raid0", 00:15:36.432 "superblock": false, 00:15:36.432 "num_base_bdevs": 3, 00:15:36.432 "num_base_bdevs_discovered": 2, 00:15:36.432 "num_base_bdevs_operational": 3, 00:15:36.432 "base_bdevs_list": [ 00:15:36.432 { 00:15:36.432 "name": "BaseBdev1", 00:15:36.432 "uuid": "8dda1de8-8ded-441f-a0ca-c75c7c04114f", 00:15:36.432 "is_configured": true, 00:15:36.432 "data_offset": 0, 00:15:36.432 "data_size": 65536 00:15:36.432 }, 00:15:36.432 { 00:15:36.432 "name": null, 00:15:36.432 "uuid": "69ab9439-4199-4311-8d1d-b6182d7a9aeb", 00:15:36.432 "is_configured": false, 00:15:36.432 "data_offset": 0, 00:15:36.433 "data_size": 65536 00:15:36.433 }, 00:15:36.433 { 00:15:36.433 "name": "BaseBdev3", 00:15:36.433 "uuid": "6965884c-cbc0-4a5a-8ce3-4de291e4c93b", 00:15:36.433 "is_configured": true, 00:15:36.433 "data_offset": 0, 00:15:36.433 "data_size": 65536 00:15:36.433 } 00:15:36.433 ] 
00:15:36.433 }' 00:15:36.433 05:43:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:36.433 05:43:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:37.000 05:43:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:37.000 05:43:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:15:37.258 05:43:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:15:37.258 05:43:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:15:37.516 [2024-07-26 05:43:52.271644] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:15:37.516 05:43:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:15:37.516 05:43:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:37.516 05:43:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:37.516 05:43:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:37.516 05:43:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:37.516 05:43:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:37.516 05:43:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:37.516 05:43:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:37.516 05:43:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- 
# local num_base_bdevs_discovered 00:15:37.516 05:43:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:37.516 05:43:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:37.516 05:43:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:37.774 05:43:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:37.774 "name": "Existed_Raid", 00:15:37.775 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:37.775 "strip_size_kb": 64, 00:15:37.775 "state": "configuring", 00:15:37.775 "raid_level": "raid0", 00:15:37.775 "superblock": false, 00:15:37.775 "num_base_bdevs": 3, 00:15:37.775 "num_base_bdevs_discovered": 1, 00:15:37.775 "num_base_bdevs_operational": 3, 00:15:37.775 "base_bdevs_list": [ 00:15:37.775 { 00:15:37.775 "name": null, 00:15:37.775 "uuid": "8dda1de8-8ded-441f-a0ca-c75c7c04114f", 00:15:37.775 "is_configured": false, 00:15:37.775 "data_offset": 0, 00:15:37.775 "data_size": 65536 00:15:37.775 }, 00:15:37.775 { 00:15:37.775 "name": null, 00:15:37.775 "uuid": "69ab9439-4199-4311-8d1d-b6182d7a9aeb", 00:15:37.775 "is_configured": false, 00:15:37.775 "data_offset": 0, 00:15:37.775 "data_size": 65536 00:15:37.775 }, 00:15:37.775 { 00:15:37.775 "name": "BaseBdev3", 00:15:37.775 "uuid": "6965884c-cbc0-4a5a-8ce3-4de291e4c93b", 00:15:37.775 "is_configured": true, 00:15:37.775 "data_offset": 0, 00:15:37.775 "data_size": 65536 00:15:37.775 } 00:15:37.775 ] 00:15:37.775 }' 00:15:37.775 05:43:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:37.775 05:43:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:38.340 05:43:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:38.340 05:43:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:15:38.616 05:43:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:15:38.616 05:43:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:15:38.880 [2024-07-26 05:43:53.653814] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:38.880 05:43:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:15:38.880 05:43:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:38.880 05:43:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:38.880 05:43:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:38.880 05:43:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:38.880 05:43:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:38.880 05:43:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:38.880 05:43:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:38.880 05:43:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:38.880 05:43:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:38.880 05:43:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:38.880 05:43:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:39.138 05:43:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:39.138 "name": "Existed_Raid", 00:15:39.138 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:39.138 "strip_size_kb": 64, 00:15:39.138 "state": "configuring", 00:15:39.138 "raid_level": "raid0", 00:15:39.138 "superblock": false, 00:15:39.138 "num_base_bdevs": 3, 00:15:39.138 "num_base_bdevs_discovered": 2, 00:15:39.138 "num_base_bdevs_operational": 3, 00:15:39.138 "base_bdevs_list": [ 00:15:39.138 { 00:15:39.138 "name": null, 00:15:39.138 "uuid": "8dda1de8-8ded-441f-a0ca-c75c7c04114f", 00:15:39.138 "is_configured": false, 00:15:39.138 "data_offset": 0, 00:15:39.138 "data_size": 65536 00:15:39.138 }, 00:15:39.138 { 00:15:39.138 "name": "BaseBdev2", 00:15:39.138 "uuid": "69ab9439-4199-4311-8d1d-b6182d7a9aeb", 00:15:39.138 "is_configured": true, 00:15:39.138 "data_offset": 0, 00:15:39.138 "data_size": 65536 00:15:39.138 }, 00:15:39.138 { 00:15:39.138 "name": "BaseBdev3", 00:15:39.138 "uuid": "6965884c-cbc0-4a5a-8ce3-4de291e4c93b", 00:15:39.138 "is_configured": true, 00:15:39.138 "data_offset": 0, 00:15:39.138 "data_size": 65536 00:15:39.138 } 00:15:39.138 ] 00:15:39.138 }' 00:15:39.138 05:43:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:39.138 05:43:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:39.704 05:43:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:39.704 05:43:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:15:39.962 
05:43:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:15:39.962 05:43:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:39.962 05:43:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:15:40.221 05:43:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 8dda1de8-8ded-441f-a0ca-c75c7c04114f 00:15:40.479 [2024-07-26 05:43:55.265785] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:15:40.479 [2024-07-26 05:43:55.265825] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x238f450 00:15:40.479 [2024-07-26 05:43:55.265833] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 196608, blocklen 512 00:15:40.479 [2024-07-26 05:43:55.266030] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2390a50 00:15:40.479 [2024-07-26 05:43:55.266147] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x238f450 00:15:40.479 [2024-07-26 05:43:55.266157] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x238f450 00:15:40.479 [2024-07-26 05:43:55.266323] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:40.479 NewBaseBdev 00:15:40.479 05:43:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:15:40.479 05:43:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:15:40.479 05:43:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:40.479 05:43:55 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@899 -- # local i 00:15:40.479 05:43:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:40.479 05:43:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:40.479 05:43:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:40.738 05:43:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:15:40.996 [ 00:15:40.996 { 00:15:40.996 "name": "NewBaseBdev", 00:15:40.996 "aliases": [ 00:15:40.996 "8dda1de8-8ded-441f-a0ca-c75c7c04114f" 00:15:40.996 ], 00:15:40.996 "product_name": "Malloc disk", 00:15:40.996 "block_size": 512, 00:15:40.996 "num_blocks": 65536, 00:15:40.996 "uuid": "8dda1de8-8ded-441f-a0ca-c75c7c04114f", 00:15:40.996 "assigned_rate_limits": { 00:15:40.996 "rw_ios_per_sec": 0, 00:15:40.996 "rw_mbytes_per_sec": 0, 00:15:40.996 "r_mbytes_per_sec": 0, 00:15:40.996 "w_mbytes_per_sec": 0 00:15:40.996 }, 00:15:40.996 "claimed": true, 00:15:40.996 "claim_type": "exclusive_write", 00:15:40.996 "zoned": false, 00:15:40.996 "supported_io_types": { 00:15:40.996 "read": true, 00:15:40.996 "write": true, 00:15:40.996 "unmap": true, 00:15:40.996 "flush": true, 00:15:40.996 "reset": true, 00:15:40.996 "nvme_admin": false, 00:15:40.996 "nvme_io": false, 00:15:40.996 "nvme_io_md": false, 00:15:40.996 "write_zeroes": true, 00:15:40.996 "zcopy": true, 00:15:40.996 "get_zone_info": false, 00:15:40.996 "zone_management": false, 00:15:40.996 "zone_append": false, 00:15:40.996 "compare": false, 00:15:40.996 "compare_and_write": false, 00:15:40.996 "abort": true, 00:15:40.996 "seek_hole": false, 00:15:40.996 "seek_data": false, 00:15:40.996 "copy": true, 00:15:40.996 "nvme_iov_md": 
false 00:15:40.996 }, 00:15:40.997 "memory_domains": [ 00:15:40.997 { 00:15:40.997 "dma_device_id": "system", 00:15:40.997 "dma_device_type": 1 00:15:40.997 }, 00:15:40.997 { 00:15:40.997 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:40.997 "dma_device_type": 2 00:15:40.997 } 00:15:40.997 ], 00:15:40.997 "driver_specific": {} 00:15:40.997 } 00:15:40.997 ] 00:15:40.997 05:43:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:15:40.997 05:43:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid0 64 3 00:15:40.997 05:43:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:40.997 05:43:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:40.997 05:43:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:40.997 05:43:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:40.997 05:43:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:40.997 05:43:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:40.997 05:43:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:40.997 05:43:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:40.997 05:43:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:40.997 05:43:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:40.997 05:43:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:41.255 05:43:56 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:41.255 "name": "Existed_Raid", 00:15:41.255 "uuid": "9e73b981-72f0-46af-b81c-a2729dc4ba83", 00:15:41.255 "strip_size_kb": 64, 00:15:41.255 "state": "online", 00:15:41.255 "raid_level": "raid0", 00:15:41.255 "superblock": false, 00:15:41.255 "num_base_bdevs": 3, 00:15:41.255 "num_base_bdevs_discovered": 3, 00:15:41.255 "num_base_bdevs_operational": 3, 00:15:41.255 "base_bdevs_list": [ 00:15:41.255 { 00:15:41.255 "name": "NewBaseBdev", 00:15:41.255 "uuid": "8dda1de8-8ded-441f-a0ca-c75c7c04114f", 00:15:41.255 "is_configured": true, 00:15:41.255 "data_offset": 0, 00:15:41.255 "data_size": 65536 00:15:41.255 }, 00:15:41.255 { 00:15:41.255 "name": "BaseBdev2", 00:15:41.255 "uuid": "69ab9439-4199-4311-8d1d-b6182d7a9aeb", 00:15:41.255 "is_configured": true, 00:15:41.255 "data_offset": 0, 00:15:41.255 "data_size": 65536 00:15:41.255 }, 00:15:41.255 { 00:15:41.255 "name": "BaseBdev3", 00:15:41.255 "uuid": "6965884c-cbc0-4a5a-8ce3-4de291e4c93b", 00:15:41.255 "is_configured": true, 00:15:41.255 "data_offset": 0, 00:15:41.255 "data_size": 65536 00:15:41.255 } 00:15:41.255 ] 00:15:41.255 }' 00:15:41.255 05:43:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:41.255 05:43:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:41.822 05:43:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:15:41.822 05:43:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:15:41.822 05:43:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:15:41.822 05:43:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:15:41.822 05:43:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:15:41.822 05:43:56 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:15:41.822 05:43:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:15:41.822 05:43:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:15:42.081 [2024-07-26 05:43:56.838255] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:42.081 05:43:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:15:42.081 "name": "Existed_Raid", 00:15:42.081 "aliases": [ 00:15:42.081 "9e73b981-72f0-46af-b81c-a2729dc4ba83" 00:15:42.081 ], 00:15:42.081 "product_name": "Raid Volume", 00:15:42.081 "block_size": 512, 00:15:42.081 "num_blocks": 196608, 00:15:42.081 "uuid": "9e73b981-72f0-46af-b81c-a2729dc4ba83", 00:15:42.081 "assigned_rate_limits": { 00:15:42.081 "rw_ios_per_sec": 0, 00:15:42.081 "rw_mbytes_per_sec": 0, 00:15:42.081 "r_mbytes_per_sec": 0, 00:15:42.081 "w_mbytes_per_sec": 0 00:15:42.081 }, 00:15:42.081 "claimed": false, 00:15:42.081 "zoned": false, 00:15:42.081 "supported_io_types": { 00:15:42.081 "read": true, 00:15:42.081 "write": true, 00:15:42.081 "unmap": true, 00:15:42.081 "flush": true, 00:15:42.081 "reset": true, 00:15:42.081 "nvme_admin": false, 00:15:42.081 "nvme_io": false, 00:15:42.081 "nvme_io_md": false, 00:15:42.081 "write_zeroes": true, 00:15:42.081 "zcopy": false, 00:15:42.081 "get_zone_info": false, 00:15:42.081 "zone_management": false, 00:15:42.081 "zone_append": false, 00:15:42.081 "compare": false, 00:15:42.081 "compare_and_write": false, 00:15:42.081 "abort": false, 00:15:42.081 "seek_hole": false, 00:15:42.081 "seek_data": false, 00:15:42.081 "copy": false, 00:15:42.081 "nvme_iov_md": false 00:15:42.081 }, 00:15:42.081 "memory_domains": [ 00:15:42.081 { 00:15:42.081 "dma_device_id": "system", 00:15:42.081 "dma_device_type": 1 00:15:42.081 }, 
00:15:42.081 { 00:15:42.081 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:42.081 "dma_device_type": 2 00:15:42.081 }, 00:15:42.081 { 00:15:42.081 "dma_device_id": "system", 00:15:42.081 "dma_device_type": 1 00:15:42.081 }, 00:15:42.081 { 00:15:42.081 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:42.081 "dma_device_type": 2 00:15:42.081 }, 00:15:42.081 { 00:15:42.081 "dma_device_id": "system", 00:15:42.081 "dma_device_type": 1 00:15:42.081 }, 00:15:42.081 { 00:15:42.081 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:42.081 "dma_device_type": 2 00:15:42.081 } 00:15:42.081 ], 00:15:42.081 "driver_specific": { 00:15:42.081 "raid": { 00:15:42.081 "uuid": "9e73b981-72f0-46af-b81c-a2729dc4ba83", 00:15:42.081 "strip_size_kb": 64, 00:15:42.081 "state": "online", 00:15:42.081 "raid_level": "raid0", 00:15:42.081 "superblock": false, 00:15:42.081 "num_base_bdevs": 3, 00:15:42.081 "num_base_bdevs_discovered": 3, 00:15:42.081 "num_base_bdevs_operational": 3, 00:15:42.081 "base_bdevs_list": [ 00:15:42.081 { 00:15:42.081 "name": "NewBaseBdev", 00:15:42.081 "uuid": "8dda1de8-8ded-441f-a0ca-c75c7c04114f", 00:15:42.081 "is_configured": true, 00:15:42.081 "data_offset": 0, 00:15:42.081 "data_size": 65536 00:15:42.081 }, 00:15:42.081 { 00:15:42.081 "name": "BaseBdev2", 00:15:42.081 "uuid": "69ab9439-4199-4311-8d1d-b6182d7a9aeb", 00:15:42.081 "is_configured": true, 00:15:42.081 "data_offset": 0, 00:15:42.081 "data_size": 65536 00:15:42.081 }, 00:15:42.081 { 00:15:42.081 "name": "BaseBdev3", 00:15:42.081 "uuid": "6965884c-cbc0-4a5a-8ce3-4de291e4c93b", 00:15:42.081 "is_configured": true, 00:15:42.081 "data_offset": 0, 00:15:42.081 "data_size": 65536 00:15:42.081 } 00:15:42.081 ] 00:15:42.081 } 00:15:42.081 } 00:15:42.081 }' 00:15:42.081 05:43:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:15:42.081 05:43:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # 
base_bdev_names='NewBaseBdev 00:15:42.081 BaseBdev2 00:15:42.081 BaseBdev3' 00:15:42.081 05:43:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:42.081 05:43:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:15:42.081 05:43:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:42.340 05:43:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:42.340 "name": "NewBaseBdev", 00:15:42.340 "aliases": [ 00:15:42.340 "8dda1de8-8ded-441f-a0ca-c75c7c04114f" 00:15:42.340 ], 00:15:42.340 "product_name": "Malloc disk", 00:15:42.340 "block_size": 512, 00:15:42.340 "num_blocks": 65536, 00:15:42.340 "uuid": "8dda1de8-8ded-441f-a0ca-c75c7c04114f", 00:15:42.340 "assigned_rate_limits": { 00:15:42.340 "rw_ios_per_sec": 0, 00:15:42.340 "rw_mbytes_per_sec": 0, 00:15:42.340 "r_mbytes_per_sec": 0, 00:15:42.340 "w_mbytes_per_sec": 0 00:15:42.340 }, 00:15:42.340 "claimed": true, 00:15:42.340 "claim_type": "exclusive_write", 00:15:42.340 "zoned": false, 00:15:42.340 "supported_io_types": { 00:15:42.340 "read": true, 00:15:42.340 "write": true, 00:15:42.340 "unmap": true, 00:15:42.340 "flush": true, 00:15:42.340 "reset": true, 00:15:42.340 "nvme_admin": false, 00:15:42.340 "nvme_io": false, 00:15:42.340 "nvme_io_md": false, 00:15:42.340 "write_zeroes": true, 00:15:42.340 "zcopy": true, 00:15:42.340 "get_zone_info": false, 00:15:42.340 "zone_management": false, 00:15:42.340 "zone_append": false, 00:15:42.340 "compare": false, 00:15:42.340 "compare_and_write": false, 00:15:42.340 "abort": true, 00:15:42.340 "seek_hole": false, 00:15:42.340 "seek_data": false, 00:15:42.340 "copy": true, 00:15:42.340 "nvme_iov_md": false 00:15:42.340 }, 00:15:42.340 "memory_domains": [ 00:15:42.340 { 00:15:42.340 "dma_device_id": "system", 00:15:42.340 
"dma_device_type": 1 00:15:42.340 }, 00:15:42.340 { 00:15:42.340 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:42.340 "dma_device_type": 2 00:15:42.340 } 00:15:42.340 ], 00:15:42.340 "driver_specific": {} 00:15:42.340 }' 00:15:42.340 05:43:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:42.340 05:43:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:42.340 05:43:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:42.340 05:43:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:42.638 05:43:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:42.638 05:43:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:42.638 05:43:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:42.638 05:43:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:42.638 05:43:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:42.638 05:43:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:42.638 05:43:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:42.638 05:43:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:42.638 05:43:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:42.638 05:43:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:15:42.638 05:43:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:42.918 05:43:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:42.919 "name": 
"BaseBdev2", 00:15:42.919 "aliases": [ 00:15:42.919 "69ab9439-4199-4311-8d1d-b6182d7a9aeb" 00:15:42.919 ], 00:15:42.919 "product_name": "Malloc disk", 00:15:42.919 "block_size": 512, 00:15:42.919 "num_blocks": 65536, 00:15:42.919 "uuid": "69ab9439-4199-4311-8d1d-b6182d7a9aeb", 00:15:42.919 "assigned_rate_limits": { 00:15:42.919 "rw_ios_per_sec": 0, 00:15:42.919 "rw_mbytes_per_sec": 0, 00:15:42.919 "r_mbytes_per_sec": 0, 00:15:42.919 "w_mbytes_per_sec": 0 00:15:42.919 }, 00:15:42.919 "claimed": true, 00:15:42.919 "claim_type": "exclusive_write", 00:15:42.919 "zoned": false, 00:15:42.919 "supported_io_types": { 00:15:42.919 "read": true, 00:15:42.919 "write": true, 00:15:42.919 "unmap": true, 00:15:42.919 "flush": true, 00:15:42.919 "reset": true, 00:15:42.919 "nvme_admin": false, 00:15:42.919 "nvme_io": false, 00:15:42.919 "nvme_io_md": false, 00:15:42.919 "write_zeroes": true, 00:15:42.919 "zcopy": true, 00:15:42.919 "get_zone_info": false, 00:15:42.919 "zone_management": false, 00:15:42.919 "zone_append": false, 00:15:42.919 "compare": false, 00:15:42.919 "compare_and_write": false, 00:15:42.919 "abort": true, 00:15:42.919 "seek_hole": false, 00:15:42.919 "seek_data": false, 00:15:42.919 "copy": true, 00:15:42.919 "nvme_iov_md": false 00:15:42.919 }, 00:15:42.919 "memory_domains": [ 00:15:42.919 { 00:15:42.919 "dma_device_id": "system", 00:15:42.919 "dma_device_type": 1 00:15:42.919 }, 00:15:42.919 { 00:15:42.919 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:42.919 "dma_device_type": 2 00:15:42.919 } 00:15:42.919 ], 00:15:42.919 "driver_specific": {} 00:15:42.919 }' 00:15:42.919 05:43:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:42.919 05:43:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:43.177 05:43:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:43.177 05:43:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq 
.md_size 00:15:43.177 05:43:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:43.177 05:43:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:43.177 05:43:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:43.177 05:43:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:43.177 05:43:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:43.177 05:43:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:43.177 05:43:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:43.177 05:43:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:43.177 05:43:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:43.436 05:43:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:15:43.436 05:43:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:43.436 05:43:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:43.436 "name": "BaseBdev3", 00:15:43.436 "aliases": [ 00:15:43.436 "6965884c-cbc0-4a5a-8ce3-4de291e4c93b" 00:15:43.436 ], 00:15:43.436 "product_name": "Malloc disk", 00:15:43.436 "block_size": 512, 00:15:43.436 "num_blocks": 65536, 00:15:43.436 "uuid": "6965884c-cbc0-4a5a-8ce3-4de291e4c93b", 00:15:43.436 "assigned_rate_limits": { 00:15:43.436 "rw_ios_per_sec": 0, 00:15:43.436 "rw_mbytes_per_sec": 0, 00:15:43.436 "r_mbytes_per_sec": 0, 00:15:43.436 "w_mbytes_per_sec": 0 00:15:43.436 }, 00:15:43.436 "claimed": true, 00:15:43.436 "claim_type": "exclusive_write", 00:15:43.436 "zoned": false, 00:15:43.436 "supported_io_types": { 
00:15:43.436 "read": true, 00:15:43.436 "write": true, 00:15:43.436 "unmap": true, 00:15:43.436 "flush": true, 00:15:43.436 "reset": true, 00:15:43.436 "nvme_admin": false, 00:15:43.436 "nvme_io": false, 00:15:43.436 "nvme_io_md": false, 00:15:43.436 "write_zeroes": true, 00:15:43.436 "zcopy": true, 00:15:43.436 "get_zone_info": false, 00:15:43.436 "zone_management": false, 00:15:43.436 "zone_append": false, 00:15:43.436 "compare": false, 00:15:43.436 "compare_and_write": false, 00:15:43.436 "abort": true, 00:15:43.436 "seek_hole": false, 00:15:43.436 "seek_data": false, 00:15:43.436 "copy": true, 00:15:43.436 "nvme_iov_md": false 00:15:43.436 }, 00:15:43.436 "memory_domains": [ 00:15:43.436 { 00:15:43.436 "dma_device_id": "system", 00:15:43.436 "dma_device_type": 1 00:15:43.436 }, 00:15:43.436 { 00:15:43.436 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:43.436 "dma_device_type": 2 00:15:43.436 } 00:15:43.436 ], 00:15:43.436 "driver_specific": {} 00:15:43.436 }' 00:15:43.436 05:43:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:43.436 05:43:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:43.693 05:43:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:43.693 05:43:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:43.693 05:43:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:43.693 05:43:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:43.693 05:43:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:43.693 05:43:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:43.693 05:43:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:43.693 05:43:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq 
.dif_type 00:15:43.693 05:43:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:43.952 05:43:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:43.952 05:43:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:15:43.952 [2024-07-26 05:43:58.831235] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:15:43.952 [2024-07-26 05:43:58.831265] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:43.952 [2024-07-26 05:43:58.831328] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:43.952 [2024-07-26 05:43:58.831381] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:15:43.952 [2024-07-26 05:43:58.831393] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x238f450 name Existed_Raid, state offline 00:15:43.952 05:43:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 1147639 00:15:43.952 05:43:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 1147639 ']' 00:15:43.952 05:43:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 1147639 00:15:43.952 05:43:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:15:43.952 05:43:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:15:43.952 05:43:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1147639 00:15:44.210 05:43:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:15:44.210 05:43:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' 
reactor_0 = sudo ']' 00:15:44.210 05:43:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1147639' 00:15:44.210 killing process with pid 1147639 00:15:44.210 05:43:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 1147639 00:15:44.210 [2024-07-26 05:43:58.898147] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:15:44.210 05:43:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 1147639 00:15:44.210 [2024-07-26 05:43:58.928890] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:15:44.571 05:43:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:15:44.571 00:15:44.571 real 0m28.018s 00:15:44.571 user 0m51.430s 00:15:44.571 sys 0m5.028s 00:15:44.571 05:43:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:15:44.571 05:43:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:44.571 ************************************ 00:15:44.571 END TEST raid_state_function_test 00:15:44.571 ************************************ 00:15:44.571 05:43:59 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:15:44.571 05:43:59 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test raid0 3 true 00:15:44.571 05:43:59 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:15:44.571 05:43:59 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:15:44.571 05:43:59 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:15:44.571 ************************************ 00:15:44.571 START TEST raid_state_function_test_sb 00:15:44.571 ************************************ 00:15:44.571 05:43:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test raid0 3 true 00:15:44.571 05:43:59 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:15:44.571 05:43:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:15:44.571 05:43:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:15:44.571 05:43:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:15:44.571 05:43:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:15:44.571 05:43:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:44.571 05:43:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:15:44.571 05:43:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:15:44.571 05:43:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:44.571 05:43:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:15:44.571 05:43:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:15:44.571 05:43:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:44.571 05:43:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:15:44.571 05:43:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:15:44.571 05:43:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:44.571 05:43:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:15:44.571 05:43:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:15:44.571 05:43:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:15:44.571 05:43:59 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@226 -- # local strip_size 00:15:44.571 05:43:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:15:44.571 05:43:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:15:44.571 05:43:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:15:44.571 05:43:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:15:44.571 05:43:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:15:44.571 05:43:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:15:44.571 05:43:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:15:44.571 05:43:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=1151813 00:15:44.571 05:43:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1151813' 00:15:44.571 Process raid pid: 1151813 00:15:44.571 05:43:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:15:44.571 05:43:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 1151813 /var/tmp/spdk-raid.sock 00:15:44.571 05:43:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 -- # '[' -z 1151813 ']' 00:15:44.571 05:43:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:15:44.571 05:43:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:15:44.571 05:43:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain 
socket /var/tmp/spdk-raid.sock...' 00:15:44.571 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:15:44.571 05:43:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:15:44.571 05:43:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:44.571 [2024-07-26 05:43:59.350243] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 00:15:44.571 [2024-07-26 05:43:59.350375] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:15:44.830 [2024-07-26 05:43:59.546374] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:15:44.830 [2024-07-26 05:43:59.643553] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:15:44.830 [2024-07-26 05:43:59.705032] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:44.830 [2024-07-26 05:43:59.705087] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:45.397 05:44:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:15:45.397 05:44:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:15:45.397 05:44:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:15:45.655 [2024-07-26 05:44:00.455396] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:15:45.655 [2024-07-26 05:44:00.455435] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:15:45.655 [2024-07-26 05:44:00.455446] 
bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:15:45.655 [2024-07-26 05:44:00.455457] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:15:45.655 [2024-07-26 05:44:00.455466] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:15:45.655 [2024-07-26 05:44:00.455477] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:15:45.655 05:44:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:15:45.655 05:44:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:45.655 05:44:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:45.655 05:44:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:45.655 05:44:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:45.655 05:44:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:45.655 05:44:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:45.655 05:44:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:45.655 05:44:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:45.655 05:44:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:45.655 05:44:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:45.655 05:44:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == 
"Existed_Raid")' 00:15:45.914 05:44:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:45.914 "name": "Existed_Raid", 00:15:45.914 "uuid": "bdb66e2a-49d9-45ef-8bfa-45a590331af2", 00:15:45.914 "strip_size_kb": 64, 00:15:45.914 "state": "configuring", 00:15:45.914 "raid_level": "raid0", 00:15:45.914 "superblock": true, 00:15:45.914 "num_base_bdevs": 3, 00:15:45.914 "num_base_bdevs_discovered": 0, 00:15:45.914 "num_base_bdevs_operational": 3, 00:15:45.914 "base_bdevs_list": [ 00:15:45.914 { 00:15:45.914 "name": "BaseBdev1", 00:15:45.914 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:45.914 "is_configured": false, 00:15:45.914 "data_offset": 0, 00:15:45.914 "data_size": 0 00:15:45.914 }, 00:15:45.914 { 00:15:45.914 "name": "BaseBdev2", 00:15:45.914 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:45.914 "is_configured": false, 00:15:45.914 "data_offset": 0, 00:15:45.914 "data_size": 0 00:15:45.914 }, 00:15:45.914 { 00:15:45.914 "name": "BaseBdev3", 00:15:45.914 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:45.914 "is_configured": false, 00:15:45.914 "data_offset": 0, 00:15:45.914 "data_size": 0 00:15:45.914 } 00:15:45.914 ] 00:15:45.914 }' 00:15:45.914 05:44:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:45.914 05:44:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:46.480 05:44:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:15:46.739 [2024-07-26 05:44:01.534134] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:15:46.739 [2024-07-26 05:44:01.534165] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x24aca80 name Existed_Raid, state configuring 00:15:46.739 05:44:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:15:46.996 [2024-07-26 05:44:01.782828] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:15:46.996 [2024-07-26 05:44:01.782853] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:15:46.996 [2024-07-26 05:44:01.782862] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:15:46.996 [2024-07-26 05:44:01.782874] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:15:46.996 [2024-07-26 05:44:01.782883] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:15:46.996 [2024-07-26 05:44:01.782893] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:15:46.996 05:44:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:15:47.254 [2024-07-26 05:44:02.037438] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:47.254 BaseBdev1 00:15:47.254 05:44:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:15:47.254 05:44:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:15:47.254 05:44:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:47.254 05:44:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:15:47.254 05:44:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:47.254 05:44:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # 
bdev_timeout=2000 00:15:47.254 05:44:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:47.513 05:44:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:15:47.771 [ 00:15:47.771 { 00:15:47.771 "name": "BaseBdev1", 00:15:47.771 "aliases": [ 00:15:47.771 "092fe639-368c-48d8-891d-97b757aa074d" 00:15:47.771 ], 00:15:47.771 "product_name": "Malloc disk", 00:15:47.771 "block_size": 512, 00:15:47.771 "num_blocks": 65536, 00:15:47.771 "uuid": "092fe639-368c-48d8-891d-97b757aa074d", 00:15:47.771 "assigned_rate_limits": { 00:15:47.771 "rw_ios_per_sec": 0, 00:15:47.771 "rw_mbytes_per_sec": 0, 00:15:47.771 "r_mbytes_per_sec": 0, 00:15:47.771 "w_mbytes_per_sec": 0 00:15:47.771 }, 00:15:47.771 "claimed": true, 00:15:47.771 "claim_type": "exclusive_write", 00:15:47.771 "zoned": false, 00:15:47.771 "supported_io_types": { 00:15:47.771 "read": true, 00:15:47.771 "write": true, 00:15:47.771 "unmap": true, 00:15:47.771 "flush": true, 00:15:47.771 "reset": true, 00:15:47.771 "nvme_admin": false, 00:15:47.771 "nvme_io": false, 00:15:47.771 "nvme_io_md": false, 00:15:47.771 "write_zeroes": true, 00:15:47.771 "zcopy": true, 00:15:47.771 "get_zone_info": false, 00:15:47.771 "zone_management": false, 00:15:47.771 "zone_append": false, 00:15:47.771 "compare": false, 00:15:47.771 "compare_and_write": false, 00:15:47.771 "abort": true, 00:15:47.771 "seek_hole": false, 00:15:47.771 "seek_data": false, 00:15:47.771 "copy": true, 00:15:47.771 "nvme_iov_md": false 00:15:47.771 }, 00:15:47.771 "memory_domains": [ 00:15:47.771 { 00:15:47.771 "dma_device_id": "system", 00:15:47.771 "dma_device_type": 1 00:15:47.771 }, 00:15:47.771 { 00:15:47.771 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:47.771 
"dma_device_type": 2 00:15:47.771 } 00:15:47.771 ], 00:15:47.771 "driver_specific": {} 00:15:47.771 } 00:15:47.771 ] 00:15:47.771 05:44:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:15:47.771 05:44:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:15:47.771 05:44:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:47.771 05:44:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:47.771 05:44:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:47.771 05:44:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:47.771 05:44:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:47.771 05:44:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:47.771 05:44:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:47.771 05:44:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:47.771 05:44:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:47.771 05:44:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:47.771 05:44:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:48.030 05:44:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:48.030 "name": "Existed_Raid", 00:15:48.030 "uuid": "76f7f6db-e894-438f-be23-8259ab86bff3", 00:15:48.030 "strip_size_kb": 64, 
00:15:48.030 "state": "configuring", 00:15:48.030 "raid_level": "raid0", 00:15:48.030 "superblock": true, 00:15:48.030 "num_base_bdevs": 3, 00:15:48.030 "num_base_bdevs_discovered": 1, 00:15:48.030 "num_base_bdevs_operational": 3, 00:15:48.030 "base_bdevs_list": [ 00:15:48.030 { 00:15:48.030 "name": "BaseBdev1", 00:15:48.030 "uuid": "092fe639-368c-48d8-891d-97b757aa074d", 00:15:48.030 "is_configured": true, 00:15:48.030 "data_offset": 2048, 00:15:48.030 "data_size": 63488 00:15:48.030 }, 00:15:48.030 { 00:15:48.030 "name": "BaseBdev2", 00:15:48.030 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:48.030 "is_configured": false, 00:15:48.030 "data_offset": 0, 00:15:48.030 "data_size": 0 00:15:48.030 }, 00:15:48.030 { 00:15:48.030 "name": "BaseBdev3", 00:15:48.030 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:48.030 "is_configured": false, 00:15:48.030 "data_offset": 0, 00:15:48.030 "data_size": 0 00:15:48.030 } 00:15:48.030 ] 00:15:48.030 }' 00:15:48.030 05:44:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:48.030 05:44:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:48.610 05:44:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:15:48.868 [2024-07-26 05:44:03.605580] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:15:48.868 [2024-07-26 05:44:03.605617] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x24ac310 name Existed_Raid, state configuring 00:15:48.868 05:44:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:15:49.126 [2024-07-26 05:44:03.854286] 
bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:49.126 [2024-07-26 05:44:03.855730] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:15:49.126 [2024-07-26 05:44:03.855762] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:15:49.126 [2024-07-26 05:44:03.855772] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:15:49.126 [2024-07-26 05:44:03.855784] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:15:49.126 05:44:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:15:49.126 05:44:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:15:49.126 05:44:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:15:49.126 05:44:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:49.126 05:44:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:49.126 05:44:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:49.126 05:44:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:49.126 05:44:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:49.126 05:44:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:49.126 05:44:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:49.126 05:44:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:49.126 05:44:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- 
# local tmp 00:15:49.126 05:44:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:49.126 05:44:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:49.385 05:44:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:49.385 "name": "Existed_Raid", 00:15:49.385 "uuid": "c3eff658-f198-4023-ab87-343c185a81bd", 00:15:49.385 "strip_size_kb": 64, 00:15:49.385 "state": "configuring", 00:15:49.385 "raid_level": "raid0", 00:15:49.385 "superblock": true, 00:15:49.385 "num_base_bdevs": 3, 00:15:49.385 "num_base_bdevs_discovered": 1, 00:15:49.385 "num_base_bdevs_operational": 3, 00:15:49.385 "base_bdevs_list": [ 00:15:49.385 { 00:15:49.385 "name": "BaseBdev1", 00:15:49.385 "uuid": "092fe639-368c-48d8-891d-97b757aa074d", 00:15:49.385 "is_configured": true, 00:15:49.385 "data_offset": 2048, 00:15:49.385 "data_size": 63488 00:15:49.385 }, 00:15:49.385 { 00:15:49.385 "name": "BaseBdev2", 00:15:49.385 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:49.385 "is_configured": false, 00:15:49.385 "data_offset": 0, 00:15:49.385 "data_size": 0 00:15:49.385 }, 00:15:49.385 { 00:15:49.385 "name": "BaseBdev3", 00:15:49.385 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:49.385 "is_configured": false, 00:15:49.385 "data_offset": 0, 00:15:49.385 "data_size": 0 00:15:49.385 } 00:15:49.385 ] 00:15:49.385 }' 00:15:49.385 05:44:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:49.385 05:44:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:49.951 05:44:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:15:50.210 
[2024-07-26 05:44:04.952595] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:50.210 BaseBdev2 00:15:50.210 05:44:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:15:50.210 05:44:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:15:50.210 05:44:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:50.210 05:44:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:15:50.210 05:44:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:50.210 05:44:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:50.210 05:44:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:50.468 05:44:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:15:50.726 [ 00:15:50.726 { 00:15:50.726 "name": "BaseBdev2", 00:15:50.726 "aliases": [ 00:15:50.726 "b83671ca-dcdb-4f9b-a0a9-2d67720fc2e7" 00:15:50.726 ], 00:15:50.726 "product_name": "Malloc disk", 00:15:50.726 "block_size": 512, 00:15:50.726 "num_blocks": 65536, 00:15:50.726 "uuid": "b83671ca-dcdb-4f9b-a0a9-2d67720fc2e7", 00:15:50.726 "assigned_rate_limits": { 00:15:50.726 "rw_ios_per_sec": 0, 00:15:50.726 "rw_mbytes_per_sec": 0, 00:15:50.726 "r_mbytes_per_sec": 0, 00:15:50.726 "w_mbytes_per_sec": 0 00:15:50.726 }, 00:15:50.726 "claimed": true, 00:15:50.726 "claim_type": "exclusive_write", 00:15:50.726 "zoned": false, 00:15:50.726 "supported_io_types": { 00:15:50.726 "read": true, 00:15:50.726 "write": true, 00:15:50.726 "unmap": 
true, 00:15:50.726 "flush": true, 00:15:50.726 "reset": true, 00:15:50.726 "nvme_admin": false, 00:15:50.726 "nvme_io": false, 00:15:50.726 "nvme_io_md": false, 00:15:50.726 "write_zeroes": true, 00:15:50.726 "zcopy": true, 00:15:50.726 "get_zone_info": false, 00:15:50.726 "zone_management": false, 00:15:50.726 "zone_append": false, 00:15:50.726 "compare": false, 00:15:50.726 "compare_and_write": false, 00:15:50.726 "abort": true, 00:15:50.726 "seek_hole": false, 00:15:50.726 "seek_data": false, 00:15:50.726 "copy": true, 00:15:50.726 "nvme_iov_md": false 00:15:50.726 }, 00:15:50.726 "memory_domains": [ 00:15:50.726 { 00:15:50.726 "dma_device_id": "system", 00:15:50.726 "dma_device_type": 1 00:15:50.726 }, 00:15:50.726 { 00:15:50.726 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:50.726 "dma_device_type": 2 00:15:50.726 } 00:15:50.726 ], 00:15:50.726 "driver_specific": {} 00:15:50.726 } 00:15:50.726 ] 00:15:50.726 05:44:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:15:50.726 05:44:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:15:50.726 05:44:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:15:50.726 05:44:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:15:50.726 05:44:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:50.726 05:44:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:50.726 05:44:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:50.726 05:44:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:50.726 05:44:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:50.726 
05:44:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:50.727 05:44:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:50.727 05:44:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:50.727 05:44:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:50.727 05:44:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:50.727 05:44:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:50.984 05:44:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:50.984 "name": "Existed_Raid", 00:15:50.984 "uuid": "c3eff658-f198-4023-ab87-343c185a81bd", 00:15:50.984 "strip_size_kb": 64, 00:15:50.984 "state": "configuring", 00:15:50.984 "raid_level": "raid0", 00:15:50.984 "superblock": true, 00:15:50.984 "num_base_bdevs": 3, 00:15:50.984 "num_base_bdevs_discovered": 2, 00:15:50.984 "num_base_bdevs_operational": 3, 00:15:50.984 "base_bdevs_list": [ 00:15:50.984 { 00:15:50.984 "name": "BaseBdev1", 00:15:50.984 "uuid": "092fe639-368c-48d8-891d-97b757aa074d", 00:15:50.984 "is_configured": true, 00:15:50.984 "data_offset": 2048, 00:15:50.984 "data_size": 63488 00:15:50.984 }, 00:15:50.984 { 00:15:50.984 "name": "BaseBdev2", 00:15:50.984 "uuid": "b83671ca-dcdb-4f9b-a0a9-2d67720fc2e7", 00:15:50.984 "is_configured": true, 00:15:50.984 "data_offset": 2048, 00:15:50.984 "data_size": 63488 00:15:50.984 }, 00:15:50.984 { 00:15:50.984 "name": "BaseBdev3", 00:15:50.984 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:50.984 "is_configured": false, 00:15:50.984 "data_offset": 0, 00:15:50.984 "data_size": 0 00:15:50.984 } 00:15:50.984 ] 00:15:50.984 }' 00:15:50.985 
05:44:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:50.985 05:44:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:51.551 05:44:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:15:51.551 [2024-07-26 05:44:06.448047] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:15:51.551 [2024-07-26 05:44:06.448204] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x24ad400 00:15:51.551 [2024-07-26 05:44:06.448217] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:15:51.551 [2024-07-26 05:44:06.448384] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x24acef0 00:15:51.551 [2024-07-26 05:44:06.448497] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x24ad400 00:15:51.551 [2024-07-26 05:44:06.448507] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x24ad400 00:15:51.551 [2024-07-26 05:44:06.448595] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:51.551 BaseBdev3 00:15:51.809 05:44:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:15:51.809 05:44:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:15:51.809 05:44:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:51.809 05:44:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:15:51.809 05:44:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:51.809 05:44:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # 
bdev_timeout=2000 00:15:51.809 05:44:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:51.809 05:44:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:15:52.067 [ 00:15:52.067 { 00:15:52.067 "name": "BaseBdev3", 00:15:52.067 "aliases": [ 00:15:52.067 "ef062096-7b63-4f8b-94fb-b279e865270c" 00:15:52.067 ], 00:15:52.067 "product_name": "Malloc disk", 00:15:52.067 "block_size": 512, 00:15:52.067 "num_blocks": 65536, 00:15:52.067 "uuid": "ef062096-7b63-4f8b-94fb-b279e865270c", 00:15:52.067 "assigned_rate_limits": { 00:15:52.067 "rw_ios_per_sec": 0, 00:15:52.067 "rw_mbytes_per_sec": 0, 00:15:52.067 "r_mbytes_per_sec": 0, 00:15:52.067 "w_mbytes_per_sec": 0 00:15:52.067 }, 00:15:52.067 "claimed": true, 00:15:52.067 "claim_type": "exclusive_write", 00:15:52.067 "zoned": false, 00:15:52.067 "supported_io_types": { 00:15:52.067 "read": true, 00:15:52.067 "write": true, 00:15:52.067 "unmap": true, 00:15:52.067 "flush": true, 00:15:52.067 "reset": true, 00:15:52.067 "nvme_admin": false, 00:15:52.067 "nvme_io": false, 00:15:52.067 "nvme_io_md": false, 00:15:52.067 "write_zeroes": true, 00:15:52.067 "zcopy": true, 00:15:52.067 "get_zone_info": false, 00:15:52.067 "zone_management": false, 00:15:52.067 "zone_append": false, 00:15:52.067 "compare": false, 00:15:52.067 "compare_and_write": false, 00:15:52.067 "abort": true, 00:15:52.067 "seek_hole": false, 00:15:52.067 "seek_data": false, 00:15:52.067 "copy": true, 00:15:52.067 "nvme_iov_md": false 00:15:52.067 }, 00:15:52.067 "memory_domains": [ 00:15:52.067 { 00:15:52.067 "dma_device_id": "system", 00:15:52.067 "dma_device_type": 1 00:15:52.067 }, 00:15:52.067 { 00:15:52.067 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:52.067 
"dma_device_type": 2 00:15:52.067 } 00:15:52.067 ], 00:15:52.067 "driver_specific": {} 00:15:52.067 } 00:15:52.067 ] 00:15:52.067 05:44:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:15:52.067 05:44:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:15:52.067 05:44:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:15:52.067 05:44:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 3 00:15:52.067 05:44:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:52.067 05:44:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:52.067 05:44:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:52.067 05:44:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:52.067 05:44:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:52.067 05:44:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:52.067 05:44:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:52.067 05:44:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:52.067 05:44:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:52.067 05:44:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:52.067 05:44:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:52.325 05:44:07 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:52.325 "name": "Existed_Raid", 00:15:52.325 "uuid": "c3eff658-f198-4023-ab87-343c185a81bd", 00:15:52.325 "strip_size_kb": 64, 00:15:52.325 "state": "online", 00:15:52.325 "raid_level": "raid0", 00:15:52.325 "superblock": true, 00:15:52.325 "num_base_bdevs": 3, 00:15:52.325 "num_base_bdevs_discovered": 3, 00:15:52.325 "num_base_bdevs_operational": 3, 00:15:52.325 "base_bdevs_list": [ 00:15:52.325 { 00:15:52.325 "name": "BaseBdev1", 00:15:52.325 "uuid": "092fe639-368c-48d8-891d-97b757aa074d", 00:15:52.325 "is_configured": true, 00:15:52.325 "data_offset": 2048, 00:15:52.325 "data_size": 63488 00:15:52.325 }, 00:15:52.325 { 00:15:52.325 "name": "BaseBdev2", 00:15:52.325 "uuid": "b83671ca-dcdb-4f9b-a0a9-2d67720fc2e7", 00:15:52.325 "is_configured": true, 00:15:52.325 "data_offset": 2048, 00:15:52.325 "data_size": 63488 00:15:52.325 }, 00:15:52.325 { 00:15:52.325 "name": "BaseBdev3", 00:15:52.325 "uuid": "ef062096-7b63-4f8b-94fb-b279e865270c", 00:15:52.325 "is_configured": true, 00:15:52.325 "data_offset": 2048, 00:15:52.325 "data_size": 63488 00:15:52.325 } 00:15:52.325 ] 00:15:52.325 }' 00:15:52.325 05:44:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:52.325 05:44:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:52.891 05:44:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:15:52.891 05:44:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:15:52.891 05:44:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:15:52.891 05:44:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:15:52.891 05:44:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 
00:15:52.891 05:44:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:15:52.891 05:44:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:15:52.891 05:44:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:15:53.149 [2024-07-26 05:44:07.824000] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:53.149 05:44:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:15:53.149 "name": "Existed_Raid", 00:15:53.149 "aliases": [ 00:15:53.149 "c3eff658-f198-4023-ab87-343c185a81bd" 00:15:53.149 ], 00:15:53.149 "product_name": "Raid Volume", 00:15:53.149 "block_size": 512, 00:15:53.149 "num_blocks": 190464, 00:15:53.149 "uuid": "c3eff658-f198-4023-ab87-343c185a81bd", 00:15:53.149 "assigned_rate_limits": { 00:15:53.149 "rw_ios_per_sec": 0, 00:15:53.149 "rw_mbytes_per_sec": 0, 00:15:53.149 "r_mbytes_per_sec": 0, 00:15:53.149 "w_mbytes_per_sec": 0 00:15:53.149 }, 00:15:53.149 "claimed": false, 00:15:53.149 "zoned": false, 00:15:53.149 "supported_io_types": { 00:15:53.149 "read": true, 00:15:53.149 "write": true, 00:15:53.149 "unmap": true, 00:15:53.149 "flush": true, 00:15:53.149 "reset": true, 00:15:53.149 "nvme_admin": false, 00:15:53.149 "nvme_io": false, 00:15:53.149 "nvme_io_md": false, 00:15:53.149 "write_zeroes": true, 00:15:53.149 "zcopy": false, 00:15:53.149 "get_zone_info": false, 00:15:53.149 "zone_management": false, 00:15:53.149 "zone_append": false, 00:15:53.149 "compare": false, 00:15:53.149 "compare_and_write": false, 00:15:53.149 "abort": false, 00:15:53.149 "seek_hole": false, 00:15:53.149 "seek_data": false, 00:15:53.149 "copy": false, 00:15:53.149 "nvme_iov_md": false 00:15:53.149 }, 00:15:53.149 "memory_domains": [ 00:15:53.149 { 00:15:53.149 "dma_device_id": "system", 00:15:53.149 
"dma_device_type": 1 00:15:53.149 }, 00:15:53.149 { 00:15:53.149 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:53.149 "dma_device_type": 2 00:15:53.149 }, 00:15:53.149 { 00:15:53.149 "dma_device_id": "system", 00:15:53.149 "dma_device_type": 1 00:15:53.149 }, 00:15:53.149 { 00:15:53.149 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:53.149 "dma_device_type": 2 00:15:53.149 }, 00:15:53.149 { 00:15:53.149 "dma_device_id": "system", 00:15:53.149 "dma_device_type": 1 00:15:53.149 }, 00:15:53.149 { 00:15:53.149 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:53.149 "dma_device_type": 2 00:15:53.149 } 00:15:53.149 ], 00:15:53.149 "driver_specific": { 00:15:53.149 "raid": { 00:15:53.149 "uuid": "c3eff658-f198-4023-ab87-343c185a81bd", 00:15:53.149 "strip_size_kb": 64, 00:15:53.149 "state": "online", 00:15:53.149 "raid_level": "raid0", 00:15:53.149 "superblock": true, 00:15:53.149 "num_base_bdevs": 3, 00:15:53.149 "num_base_bdevs_discovered": 3, 00:15:53.149 "num_base_bdevs_operational": 3, 00:15:53.149 "base_bdevs_list": [ 00:15:53.149 { 00:15:53.149 "name": "BaseBdev1", 00:15:53.149 "uuid": "092fe639-368c-48d8-891d-97b757aa074d", 00:15:53.149 "is_configured": true, 00:15:53.149 "data_offset": 2048, 00:15:53.149 "data_size": 63488 00:15:53.149 }, 00:15:53.149 { 00:15:53.149 "name": "BaseBdev2", 00:15:53.149 "uuid": "b83671ca-dcdb-4f9b-a0a9-2d67720fc2e7", 00:15:53.149 "is_configured": true, 00:15:53.149 "data_offset": 2048, 00:15:53.149 "data_size": 63488 00:15:53.149 }, 00:15:53.149 { 00:15:53.149 "name": "BaseBdev3", 00:15:53.149 "uuid": "ef062096-7b63-4f8b-94fb-b279e865270c", 00:15:53.149 "is_configured": true, 00:15:53.149 "data_offset": 2048, 00:15:53.149 "data_size": 63488 00:15:53.149 } 00:15:53.149 ] 00:15:53.149 } 00:15:53.149 } 00:15:53.149 }' 00:15:53.149 05:44:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:15:53.149 05:44:07 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:15:53.149 BaseBdev2 00:15:53.149 BaseBdev3' 00:15:53.149 05:44:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:53.149 05:44:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:15:53.149 05:44:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:53.407 05:44:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:53.407 "name": "BaseBdev1", 00:15:53.407 "aliases": [ 00:15:53.407 "092fe639-368c-48d8-891d-97b757aa074d" 00:15:53.407 ], 00:15:53.407 "product_name": "Malloc disk", 00:15:53.407 "block_size": 512, 00:15:53.407 "num_blocks": 65536, 00:15:53.407 "uuid": "092fe639-368c-48d8-891d-97b757aa074d", 00:15:53.407 "assigned_rate_limits": { 00:15:53.407 "rw_ios_per_sec": 0, 00:15:53.407 "rw_mbytes_per_sec": 0, 00:15:53.407 "r_mbytes_per_sec": 0, 00:15:53.407 "w_mbytes_per_sec": 0 00:15:53.407 }, 00:15:53.407 "claimed": true, 00:15:53.407 "claim_type": "exclusive_write", 00:15:53.407 "zoned": false, 00:15:53.407 "supported_io_types": { 00:15:53.407 "read": true, 00:15:53.407 "write": true, 00:15:53.407 "unmap": true, 00:15:53.407 "flush": true, 00:15:53.407 "reset": true, 00:15:53.407 "nvme_admin": false, 00:15:53.407 "nvme_io": false, 00:15:53.407 "nvme_io_md": false, 00:15:53.407 "write_zeroes": true, 00:15:53.407 "zcopy": true, 00:15:53.407 "get_zone_info": false, 00:15:53.407 "zone_management": false, 00:15:53.407 "zone_append": false, 00:15:53.407 "compare": false, 00:15:53.407 "compare_and_write": false, 00:15:53.407 "abort": true, 00:15:53.407 "seek_hole": false, 00:15:53.407 "seek_data": false, 00:15:53.407 "copy": true, 00:15:53.407 "nvme_iov_md": false 00:15:53.407 }, 00:15:53.407 "memory_domains": 
[ 00:15:53.407 { 00:15:53.407 "dma_device_id": "system", 00:15:53.407 "dma_device_type": 1 00:15:53.407 }, 00:15:53.407 { 00:15:53.407 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:53.407 "dma_device_type": 2 00:15:53.407 } 00:15:53.407 ], 00:15:53.407 "driver_specific": {} 00:15:53.407 }' 00:15:53.407 05:44:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:53.407 05:44:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:53.407 05:44:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:53.407 05:44:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:53.407 05:44:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:53.407 05:44:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:53.407 05:44:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:53.407 05:44:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:53.663 05:44:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:53.663 05:44:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:53.663 05:44:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:53.663 05:44:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:53.663 05:44:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:53.663 05:44:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:15:53.663 05:44:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 
00:15:53.920 05:44:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:53.920 "name": "BaseBdev2", 00:15:53.920 "aliases": [ 00:15:53.920 "b83671ca-dcdb-4f9b-a0a9-2d67720fc2e7" 00:15:53.920 ], 00:15:53.920 "product_name": "Malloc disk", 00:15:53.920 "block_size": 512, 00:15:53.920 "num_blocks": 65536, 00:15:53.920 "uuid": "b83671ca-dcdb-4f9b-a0a9-2d67720fc2e7", 00:15:53.920 "assigned_rate_limits": { 00:15:53.920 "rw_ios_per_sec": 0, 00:15:53.920 "rw_mbytes_per_sec": 0, 00:15:53.920 "r_mbytes_per_sec": 0, 00:15:53.920 "w_mbytes_per_sec": 0 00:15:53.920 }, 00:15:53.920 "claimed": true, 00:15:53.920 "claim_type": "exclusive_write", 00:15:53.920 "zoned": false, 00:15:53.920 "supported_io_types": { 00:15:53.920 "read": true, 00:15:53.920 "write": true, 00:15:53.920 "unmap": true, 00:15:53.920 "flush": true, 00:15:53.920 "reset": true, 00:15:53.920 "nvme_admin": false, 00:15:53.920 "nvme_io": false, 00:15:53.920 "nvme_io_md": false, 00:15:53.920 "write_zeroes": true, 00:15:53.920 "zcopy": true, 00:15:53.920 "get_zone_info": false, 00:15:53.920 "zone_management": false, 00:15:53.920 "zone_append": false, 00:15:53.920 "compare": false, 00:15:53.920 "compare_and_write": false, 00:15:53.920 "abort": true, 00:15:53.920 "seek_hole": false, 00:15:53.920 "seek_data": false, 00:15:53.920 "copy": true, 00:15:53.920 "nvme_iov_md": false 00:15:53.920 }, 00:15:53.920 "memory_domains": [ 00:15:53.920 { 00:15:53.920 "dma_device_id": "system", 00:15:53.920 "dma_device_type": 1 00:15:53.920 }, 00:15:53.920 { 00:15:53.920 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:53.920 "dma_device_type": 2 00:15:53.920 } 00:15:53.920 ], 00:15:53.920 "driver_specific": {} 00:15:53.920 }' 00:15:53.920 05:44:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:53.920 05:44:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:53.920 05:44:08 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:53.920 05:44:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:53.920 05:44:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:54.177 05:44:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:54.177 05:44:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:54.177 05:44:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:54.177 05:44:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:54.177 05:44:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:54.177 05:44:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:54.177 05:44:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:54.177 05:44:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:54.177 05:44:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:15:54.177 05:44:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:54.434 05:44:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:54.434 "name": "BaseBdev3", 00:15:54.434 "aliases": [ 00:15:54.434 "ef062096-7b63-4f8b-94fb-b279e865270c" 00:15:54.434 ], 00:15:54.434 "product_name": "Malloc disk", 00:15:54.434 "block_size": 512, 00:15:54.434 "num_blocks": 65536, 00:15:54.434 "uuid": "ef062096-7b63-4f8b-94fb-b279e865270c", 00:15:54.434 "assigned_rate_limits": { 00:15:54.434 "rw_ios_per_sec": 0, 00:15:54.434 "rw_mbytes_per_sec": 0, 00:15:54.434 "r_mbytes_per_sec": 0, 00:15:54.434 
"w_mbytes_per_sec": 0 00:15:54.434 }, 00:15:54.434 "claimed": true, 00:15:54.434 "claim_type": "exclusive_write", 00:15:54.434 "zoned": false, 00:15:54.434 "supported_io_types": { 00:15:54.434 "read": true, 00:15:54.434 "write": true, 00:15:54.434 "unmap": true, 00:15:54.434 "flush": true, 00:15:54.434 "reset": true, 00:15:54.434 "nvme_admin": false, 00:15:54.434 "nvme_io": false, 00:15:54.434 "nvme_io_md": false, 00:15:54.434 "write_zeroes": true, 00:15:54.434 "zcopy": true, 00:15:54.434 "get_zone_info": false, 00:15:54.434 "zone_management": false, 00:15:54.434 "zone_append": false, 00:15:54.434 "compare": false, 00:15:54.434 "compare_and_write": false, 00:15:54.434 "abort": true, 00:15:54.434 "seek_hole": false, 00:15:54.434 "seek_data": false, 00:15:54.434 "copy": true, 00:15:54.434 "nvme_iov_md": false 00:15:54.434 }, 00:15:54.434 "memory_domains": [ 00:15:54.434 { 00:15:54.434 "dma_device_id": "system", 00:15:54.434 "dma_device_type": 1 00:15:54.434 }, 00:15:54.434 { 00:15:54.434 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:54.434 "dma_device_type": 2 00:15:54.434 } 00:15:54.434 ], 00:15:54.434 "driver_specific": {} 00:15:54.434 }' 00:15:54.434 05:44:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:54.434 05:44:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:54.691 05:44:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:54.691 05:44:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:54.691 05:44:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:54.691 05:44:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:54.691 05:44:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:54.691 05:44:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq 
.md_interleave 00:15:54.691 05:44:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:54.691 05:44:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:54.952 05:44:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:54.953 05:44:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:54.953 05:44:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:15:55.216 [2024-07-26 05:44:09.865159] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:15:55.216 [2024-07-26 05:44:09.865185] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:55.216 [2024-07-26 05:44:09.865225] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:55.216 05:44:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:15:55.216 05:44:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:15:55.216 05:44:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:15:55.216 05:44:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:15:55.216 05:44:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:15:55.216 05:44:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 2 00:15:55.216 05:44:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:55.216 05:44:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:15:55.216 05:44:09 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:55.216 05:44:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:55.216 05:44:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:15:55.216 05:44:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:55.216 05:44:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:55.216 05:44:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:55.216 05:44:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:55.216 05:44:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:55.216 05:44:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:55.216 05:44:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:55.216 "name": "Existed_Raid", 00:15:55.216 "uuid": "c3eff658-f198-4023-ab87-343c185a81bd", 00:15:55.216 "strip_size_kb": 64, 00:15:55.216 "state": "offline", 00:15:55.216 "raid_level": "raid0", 00:15:55.216 "superblock": true, 00:15:55.216 "num_base_bdevs": 3, 00:15:55.216 "num_base_bdevs_discovered": 2, 00:15:55.216 "num_base_bdevs_operational": 2, 00:15:55.216 "base_bdevs_list": [ 00:15:55.216 { 00:15:55.216 "name": null, 00:15:55.216 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:55.216 "is_configured": false, 00:15:55.216 "data_offset": 2048, 00:15:55.216 "data_size": 63488 00:15:55.216 }, 00:15:55.216 { 00:15:55.216 "name": "BaseBdev2", 00:15:55.216 "uuid": "b83671ca-dcdb-4f9b-a0a9-2d67720fc2e7", 00:15:55.216 "is_configured": true, 00:15:55.216 "data_offset": 2048, 00:15:55.216 "data_size": 
63488 00:15:55.216 }, 00:15:55.216 { 00:15:55.216 "name": "BaseBdev3", 00:15:55.216 "uuid": "ef062096-7b63-4f8b-94fb-b279e865270c", 00:15:55.216 "is_configured": true, 00:15:55.216 "data_offset": 2048, 00:15:55.216 "data_size": 63488 00:15:55.216 } 00:15:55.216 ] 00:15:55.216 }' 00:15:55.216 05:44:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:55.216 05:44:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:55.781 05:44:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:15:55.781 05:44:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:15:55.781 05:44:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:55.781 05:44:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:15:56.039 05:44:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:15:56.039 05:44:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:15:56.039 05:44:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:15:56.296 [2024-07-26 05:44:11.133732] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:15:56.296 05:44:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:15:56.296 05:44:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:15:56.296 05:44:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 
00:15:56.296 05:44:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:15:56.554 05:44:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:15:56.554 05:44:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:15:56.554 05:44:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:15:56.812 [2024-07-26 05:44:11.641755] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:15:56.812 [2024-07-26 05:44:11.641799] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x24ad400 name Existed_Raid, state offline 00:15:56.812 05:44:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:15:56.812 05:44:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:15:56.812 05:44:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:56.812 05:44:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:15:57.070 05:44:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:15:57.070 05:44:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:15:57.070 05:44:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:15:57.070 05:44:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:15:57.070 05:44:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:15:57.070 05:44:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:15:57.328 BaseBdev2 00:15:57.328 05:44:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:15:57.328 05:44:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:15:57.328 05:44:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:57.328 05:44:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:15:57.328 05:44:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:57.328 05:44:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:57.328 05:44:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:57.586 05:44:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:15:57.845 [ 00:15:57.845 { 00:15:57.845 "name": "BaseBdev2", 00:15:57.845 "aliases": [ 00:15:57.845 "bf9fe0ff-495d-46c5-898e-760733f156d3" 00:15:57.845 ], 00:15:57.845 "product_name": "Malloc disk", 00:15:57.845 "block_size": 512, 00:15:57.845 "num_blocks": 65536, 00:15:57.845 "uuid": "bf9fe0ff-495d-46c5-898e-760733f156d3", 00:15:57.845 "assigned_rate_limits": { 00:15:57.845 "rw_ios_per_sec": 0, 00:15:57.845 "rw_mbytes_per_sec": 0, 00:15:57.845 "r_mbytes_per_sec": 0, 00:15:57.845 "w_mbytes_per_sec": 0 00:15:57.845 }, 00:15:57.845 "claimed": false, 00:15:57.845 "zoned": false, 00:15:57.845 "supported_io_types": { 00:15:57.845 "read": true, 00:15:57.845 "write": true, 00:15:57.845 "unmap": true, 00:15:57.845 "flush": 
true, 00:15:57.845 "reset": true, 00:15:57.845 "nvme_admin": false, 00:15:57.845 "nvme_io": false, 00:15:57.845 "nvme_io_md": false, 00:15:57.845 "write_zeroes": true, 00:15:57.845 "zcopy": true, 00:15:57.845 "get_zone_info": false, 00:15:57.845 "zone_management": false, 00:15:57.845 "zone_append": false, 00:15:57.845 "compare": false, 00:15:57.845 "compare_and_write": false, 00:15:57.845 "abort": true, 00:15:57.845 "seek_hole": false, 00:15:57.845 "seek_data": false, 00:15:57.845 "copy": true, 00:15:57.845 "nvme_iov_md": false 00:15:57.845 }, 00:15:57.845 "memory_domains": [ 00:15:57.845 { 00:15:57.845 "dma_device_id": "system", 00:15:57.845 "dma_device_type": 1 00:15:57.845 }, 00:15:57.845 { 00:15:57.845 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:57.845 "dma_device_type": 2 00:15:57.845 } 00:15:57.845 ], 00:15:57.845 "driver_specific": {} 00:15:57.845 } 00:15:57.845 ] 00:15:57.845 05:44:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:15:57.845 05:44:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:15:57.845 05:44:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:15:57.845 05:44:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:15:58.103 BaseBdev3 00:15:58.103 05:44:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:15:58.103 05:44:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:15:58.103 05:44:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:58.103 05:44:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:15:58.103 05:44:12 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:58.103 05:44:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:58.103 05:44:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:58.360 05:44:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:15:58.619 [ 00:15:58.619 { 00:15:58.619 "name": "BaseBdev3", 00:15:58.619 "aliases": [ 00:15:58.619 "0a08beba-97d1-4dec-b344-c57d9fc23bf7" 00:15:58.619 ], 00:15:58.619 "product_name": "Malloc disk", 00:15:58.619 "block_size": 512, 00:15:58.619 "num_blocks": 65536, 00:15:58.619 "uuid": "0a08beba-97d1-4dec-b344-c57d9fc23bf7", 00:15:58.619 "assigned_rate_limits": { 00:15:58.619 "rw_ios_per_sec": 0, 00:15:58.619 "rw_mbytes_per_sec": 0, 00:15:58.619 "r_mbytes_per_sec": 0, 00:15:58.619 "w_mbytes_per_sec": 0 00:15:58.619 }, 00:15:58.619 "claimed": false, 00:15:58.619 "zoned": false, 00:15:58.619 "supported_io_types": { 00:15:58.619 "read": true, 00:15:58.619 "write": true, 00:15:58.619 "unmap": true, 00:15:58.619 "flush": true, 00:15:58.619 "reset": true, 00:15:58.619 "nvme_admin": false, 00:15:58.619 "nvme_io": false, 00:15:58.619 "nvme_io_md": false, 00:15:58.619 "write_zeroes": true, 00:15:58.619 "zcopy": true, 00:15:58.619 "get_zone_info": false, 00:15:58.619 "zone_management": false, 00:15:58.619 "zone_append": false, 00:15:58.619 "compare": false, 00:15:58.619 "compare_and_write": false, 00:15:58.619 "abort": true, 00:15:58.619 "seek_hole": false, 00:15:58.619 "seek_data": false, 00:15:58.619 "copy": true, 00:15:58.619 "nvme_iov_md": false 00:15:58.619 }, 00:15:58.619 "memory_domains": [ 00:15:58.619 { 00:15:58.619 "dma_device_id": "system", 00:15:58.619 "dma_device_type": 1 
00:15:58.619 }, 00:15:58.619 { 00:15:58.619 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:58.619 "dma_device_type": 2 00:15:58.619 } 00:15:58.619 ], 00:15:58.619 "driver_specific": {} 00:15:58.619 } 00:15:58.619 ] 00:15:58.619 05:44:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:15:58.619 05:44:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:15:58.619 05:44:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:15:58.619 05:44:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:15:58.877 [2024-07-26 05:44:13.617135] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:15:58.877 [2024-07-26 05:44:13.617174] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:15:58.877 [2024-07-26 05:44:13.617194] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:58.877 [2024-07-26 05:44:13.618567] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:15:58.877 05:44:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:15:58.877 05:44:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:58.877 05:44:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:58.877 05:44:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:58.877 05:44:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:58.877 05:44:13 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:58.877 05:44:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:58.877 05:44:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:58.877 05:44:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:58.877 05:44:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:58.877 05:44:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:58.877 05:44:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:59.135 05:44:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:59.135 "name": "Existed_Raid", 00:15:59.135 "uuid": "91ec14fa-f971-4896-ab5d-8ed29f578e66", 00:15:59.135 "strip_size_kb": 64, 00:15:59.135 "state": "configuring", 00:15:59.135 "raid_level": "raid0", 00:15:59.135 "superblock": true, 00:15:59.135 "num_base_bdevs": 3, 00:15:59.135 "num_base_bdevs_discovered": 2, 00:15:59.135 "num_base_bdevs_operational": 3, 00:15:59.135 "base_bdevs_list": [ 00:15:59.135 { 00:15:59.135 "name": "BaseBdev1", 00:15:59.135 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:59.135 "is_configured": false, 00:15:59.135 "data_offset": 0, 00:15:59.135 "data_size": 0 00:15:59.135 }, 00:15:59.135 { 00:15:59.135 "name": "BaseBdev2", 00:15:59.135 "uuid": "bf9fe0ff-495d-46c5-898e-760733f156d3", 00:15:59.135 "is_configured": true, 00:15:59.135 "data_offset": 2048, 00:15:59.135 "data_size": 63488 00:15:59.135 }, 00:15:59.135 { 00:15:59.135 "name": "BaseBdev3", 00:15:59.135 "uuid": "0a08beba-97d1-4dec-b344-c57d9fc23bf7", 00:15:59.135 "is_configured": true, 00:15:59.135 "data_offset": 2048, 00:15:59.135 
"data_size": 63488 00:15:59.135 } 00:15:59.135 ] 00:15:59.135 }' 00:15:59.135 05:44:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:59.135 05:44:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:59.701 05:44:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:15:59.959 [2024-07-26 05:44:14.703986] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:15:59.959 05:44:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:15:59.959 05:44:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:59.959 05:44:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:59.959 05:44:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:59.959 05:44:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:59.959 05:44:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:59.959 05:44:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:59.959 05:44:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:59.959 05:44:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:59.959 05:44:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:59.959 05:44:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 
00:15:59.959 05:44:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:00.217 05:44:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:00.217 "name": "Existed_Raid", 00:16:00.217 "uuid": "91ec14fa-f971-4896-ab5d-8ed29f578e66", 00:16:00.217 "strip_size_kb": 64, 00:16:00.217 "state": "configuring", 00:16:00.217 "raid_level": "raid0", 00:16:00.217 "superblock": true, 00:16:00.217 "num_base_bdevs": 3, 00:16:00.217 "num_base_bdevs_discovered": 1, 00:16:00.217 "num_base_bdevs_operational": 3, 00:16:00.217 "base_bdevs_list": [ 00:16:00.217 { 00:16:00.217 "name": "BaseBdev1", 00:16:00.217 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:00.217 "is_configured": false, 00:16:00.217 "data_offset": 0, 00:16:00.217 "data_size": 0 00:16:00.217 }, 00:16:00.217 { 00:16:00.217 "name": null, 00:16:00.217 "uuid": "bf9fe0ff-495d-46c5-898e-760733f156d3", 00:16:00.217 "is_configured": false, 00:16:00.217 "data_offset": 2048, 00:16:00.217 "data_size": 63488 00:16:00.217 }, 00:16:00.217 { 00:16:00.217 "name": "BaseBdev3", 00:16:00.217 "uuid": "0a08beba-97d1-4dec-b344-c57d9fc23bf7", 00:16:00.217 "is_configured": true, 00:16:00.217 "data_offset": 2048, 00:16:00.217 "data_size": 63488 00:16:00.217 } 00:16:00.217 ] 00:16:00.217 }' 00:16:00.217 05:44:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:00.217 05:44:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:00.783 05:44:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:00.783 05:44:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:16:01.042 05:44:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == 
\f\a\l\s\e ]] 00:16:01.042 05:44:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:16:01.301 [2024-07-26 05:44:16.028088] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:01.301 BaseBdev1 00:16:01.301 05:44:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:16:01.301 05:44:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:16:01.301 05:44:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:01.301 05:44:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:16:01.301 05:44:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:01.301 05:44:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:01.301 05:44:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:01.560 05:44:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:16:01.819 [ 00:16:01.819 { 00:16:01.819 "name": "BaseBdev1", 00:16:01.819 "aliases": [ 00:16:01.819 "e37dc72d-5aa6-44f8-bce2-bf545470650f" 00:16:01.819 ], 00:16:01.819 "product_name": "Malloc disk", 00:16:01.819 "block_size": 512, 00:16:01.819 "num_blocks": 65536, 00:16:01.819 "uuid": "e37dc72d-5aa6-44f8-bce2-bf545470650f", 00:16:01.819 "assigned_rate_limits": { 00:16:01.819 "rw_ios_per_sec": 0, 00:16:01.819 "rw_mbytes_per_sec": 0, 00:16:01.819 "r_mbytes_per_sec": 0, 00:16:01.819 
"w_mbytes_per_sec": 0 00:16:01.819 }, 00:16:01.819 "claimed": true, 00:16:01.819 "claim_type": "exclusive_write", 00:16:01.819 "zoned": false, 00:16:01.819 "supported_io_types": { 00:16:01.819 "read": true, 00:16:01.819 "write": true, 00:16:01.819 "unmap": true, 00:16:01.819 "flush": true, 00:16:01.819 "reset": true, 00:16:01.819 "nvme_admin": false, 00:16:01.819 "nvme_io": false, 00:16:01.819 "nvme_io_md": false, 00:16:01.819 "write_zeroes": true, 00:16:01.819 "zcopy": true, 00:16:01.819 "get_zone_info": false, 00:16:01.819 "zone_management": false, 00:16:01.819 "zone_append": false, 00:16:01.819 "compare": false, 00:16:01.819 "compare_and_write": false, 00:16:01.819 "abort": true, 00:16:01.819 "seek_hole": false, 00:16:01.819 "seek_data": false, 00:16:01.819 "copy": true, 00:16:01.819 "nvme_iov_md": false 00:16:01.819 }, 00:16:01.819 "memory_domains": [ 00:16:01.819 { 00:16:01.819 "dma_device_id": "system", 00:16:01.819 "dma_device_type": 1 00:16:01.819 }, 00:16:01.819 { 00:16:01.819 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:01.819 "dma_device_type": 2 00:16:01.819 } 00:16:01.819 ], 00:16:01.819 "driver_specific": {} 00:16:01.819 } 00:16:01.819 ] 00:16:01.819 05:44:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:16:01.819 05:44:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:16:01.819 05:44:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:01.819 05:44:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:01.819 05:44:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:16:01.819 05:44:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:01.819 05:44:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local 
num_base_bdevs_operational=3 00:16:01.819 05:44:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:01.819 05:44:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:01.819 05:44:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:01.819 05:44:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:01.819 05:44:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:01.819 05:44:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:02.079 05:44:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:02.079 "name": "Existed_Raid", 00:16:02.079 "uuid": "91ec14fa-f971-4896-ab5d-8ed29f578e66", 00:16:02.079 "strip_size_kb": 64, 00:16:02.079 "state": "configuring", 00:16:02.079 "raid_level": "raid0", 00:16:02.079 "superblock": true, 00:16:02.079 "num_base_bdevs": 3, 00:16:02.079 "num_base_bdevs_discovered": 2, 00:16:02.079 "num_base_bdevs_operational": 3, 00:16:02.079 "base_bdevs_list": [ 00:16:02.079 { 00:16:02.079 "name": "BaseBdev1", 00:16:02.079 "uuid": "e37dc72d-5aa6-44f8-bce2-bf545470650f", 00:16:02.079 "is_configured": true, 00:16:02.079 "data_offset": 2048, 00:16:02.079 "data_size": 63488 00:16:02.079 }, 00:16:02.079 { 00:16:02.079 "name": null, 00:16:02.079 "uuid": "bf9fe0ff-495d-46c5-898e-760733f156d3", 00:16:02.079 "is_configured": false, 00:16:02.079 "data_offset": 2048, 00:16:02.079 "data_size": 63488 00:16:02.079 }, 00:16:02.079 { 00:16:02.079 "name": "BaseBdev3", 00:16:02.079 "uuid": "0a08beba-97d1-4dec-b344-c57d9fc23bf7", 00:16:02.079 "is_configured": true, 00:16:02.079 "data_offset": 2048, 00:16:02.079 "data_size": 63488 00:16:02.079 } 
00:16:02.079 ] 00:16:02.079 }' 00:16:02.079 05:44:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:02.079 05:44:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:02.647 05:44:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:16:02.647 05:44:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:02.906 05:44:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:16:02.906 05:44:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:16:03.164 [2024-07-26 05:44:17.864977] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:16:03.164 05:44:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:16:03.164 05:44:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:03.164 05:44:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:03.164 05:44:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:16:03.164 05:44:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:03.164 05:44:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:03.164 05:44:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:03.164 05:44:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:03.164 
05:44:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:03.165 05:44:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:03.165 05:44:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:03.165 05:44:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:03.423 05:44:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:03.423 "name": "Existed_Raid", 00:16:03.423 "uuid": "91ec14fa-f971-4896-ab5d-8ed29f578e66", 00:16:03.424 "strip_size_kb": 64, 00:16:03.424 "state": "configuring", 00:16:03.424 "raid_level": "raid0", 00:16:03.424 "superblock": true, 00:16:03.424 "num_base_bdevs": 3, 00:16:03.424 "num_base_bdevs_discovered": 1, 00:16:03.424 "num_base_bdevs_operational": 3, 00:16:03.424 "base_bdevs_list": [ 00:16:03.424 { 00:16:03.424 "name": "BaseBdev1", 00:16:03.424 "uuid": "e37dc72d-5aa6-44f8-bce2-bf545470650f", 00:16:03.424 "is_configured": true, 00:16:03.424 "data_offset": 2048, 00:16:03.424 "data_size": 63488 00:16:03.424 }, 00:16:03.424 { 00:16:03.424 "name": null, 00:16:03.424 "uuid": "bf9fe0ff-495d-46c5-898e-760733f156d3", 00:16:03.424 "is_configured": false, 00:16:03.424 "data_offset": 2048, 00:16:03.424 "data_size": 63488 00:16:03.424 }, 00:16:03.424 { 00:16:03.424 "name": null, 00:16:03.424 "uuid": "0a08beba-97d1-4dec-b344-c57d9fc23bf7", 00:16:03.424 "is_configured": false, 00:16:03.424 "data_offset": 2048, 00:16:03.424 "data_size": 63488 00:16:03.424 } 00:16:03.424 ] 00:16:03.424 }' 00:16:03.424 05:44:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:03.424 05:44:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:03.991 05:44:18 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:03.991 05:44:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:16:04.250 05:44:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:16:04.250 05:44:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:16:04.509 [2024-07-26 05:44:19.220602] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:16:04.509 05:44:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:16:04.509 05:44:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:04.509 05:44:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:04.509 05:44:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:16:04.509 05:44:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:04.509 05:44:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:04.509 05:44:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:04.509 05:44:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:04.509 05:44:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:04.509 05:44:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:04.509 05:44:19 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:04.509 05:44:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:04.768 05:44:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:04.768 "name": "Existed_Raid", 00:16:04.768 "uuid": "91ec14fa-f971-4896-ab5d-8ed29f578e66", 00:16:04.768 "strip_size_kb": 64, 00:16:04.768 "state": "configuring", 00:16:04.768 "raid_level": "raid0", 00:16:04.768 "superblock": true, 00:16:04.768 "num_base_bdevs": 3, 00:16:04.768 "num_base_bdevs_discovered": 2, 00:16:04.768 "num_base_bdevs_operational": 3, 00:16:04.768 "base_bdevs_list": [ 00:16:04.768 { 00:16:04.768 "name": "BaseBdev1", 00:16:04.768 "uuid": "e37dc72d-5aa6-44f8-bce2-bf545470650f", 00:16:04.768 "is_configured": true, 00:16:04.768 "data_offset": 2048, 00:16:04.768 "data_size": 63488 00:16:04.768 }, 00:16:04.768 { 00:16:04.768 "name": null, 00:16:04.768 "uuid": "bf9fe0ff-495d-46c5-898e-760733f156d3", 00:16:04.768 "is_configured": false, 00:16:04.768 "data_offset": 2048, 00:16:04.768 "data_size": 63488 00:16:04.768 }, 00:16:04.768 { 00:16:04.768 "name": "BaseBdev3", 00:16:04.768 "uuid": "0a08beba-97d1-4dec-b344-c57d9fc23bf7", 00:16:04.768 "is_configured": true, 00:16:04.768 "data_offset": 2048, 00:16:04.768 "data_size": 63488 00:16:04.768 } 00:16:04.768 ] 00:16:04.768 }' 00:16:04.768 05:44:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:04.768 05:44:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:05.336 05:44:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:16:05.336 05:44:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:05.595 05:44:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:16:05.595 05:44:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:16:05.873 [2024-07-26 05:44:20.556155] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:16:05.873 05:44:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:16:05.873 05:44:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:05.873 05:44:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:05.873 05:44:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:16:05.873 05:44:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:05.873 05:44:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:05.873 05:44:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:05.873 05:44:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:05.873 05:44:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:05.873 05:44:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:05.873 05:44:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:05.873 05:44:20 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:06.134 05:44:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:06.134 "name": "Existed_Raid", 00:16:06.134 "uuid": "91ec14fa-f971-4896-ab5d-8ed29f578e66", 00:16:06.134 "strip_size_kb": 64, 00:16:06.134 "state": "configuring", 00:16:06.134 "raid_level": "raid0", 00:16:06.134 "superblock": true, 00:16:06.134 "num_base_bdevs": 3, 00:16:06.134 "num_base_bdevs_discovered": 1, 00:16:06.134 "num_base_bdevs_operational": 3, 00:16:06.134 "base_bdevs_list": [ 00:16:06.134 { 00:16:06.134 "name": null, 00:16:06.134 "uuid": "e37dc72d-5aa6-44f8-bce2-bf545470650f", 00:16:06.134 "is_configured": false, 00:16:06.134 "data_offset": 2048, 00:16:06.134 "data_size": 63488 00:16:06.134 }, 00:16:06.134 { 00:16:06.134 "name": null, 00:16:06.134 "uuid": "bf9fe0ff-495d-46c5-898e-760733f156d3", 00:16:06.134 "is_configured": false, 00:16:06.134 "data_offset": 2048, 00:16:06.134 "data_size": 63488 00:16:06.134 }, 00:16:06.134 { 00:16:06.134 "name": "BaseBdev3", 00:16:06.134 "uuid": "0a08beba-97d1-4dec-b344-c57d9fc23bf7", 00:16:06.134 "is_configured": true, 00:16:06.134 "data_offset": 2048, 00:16:06.134 "data_size": 63488 00:16:06.134 } 00:16:06.134 ] 00:16:06.134 }' 00:16:06.134 05:44:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:06.134 05:44:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:06.702 05:44:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:06.702 05:44:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:16:06.961 05:44:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:16:06.961 05:44:21 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:16:06.961 [2024-07-26 05:44:21.868025] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:07.220 05:44:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:16:07.220 05:44:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:07.220 05:44:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:07.220 05:44:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:16:07.220 05:44:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:07.220 05:44:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:07.220 05:44:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:07.220 05:44:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:07.220 05:44:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:07.220 05:44:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:07.220 05:44:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:07.220 05:44:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:07.479 05:44:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:07.479 "name": 
"Existed_Raid", 00:16:07.479 "uuid": "91ec14fa-f971-4896-ab5d-8ed29f578e66", 00:16:07.479 "strip_size_kb": 64, 00:16:07.479 "state": "configuring", 00:16:07.479 "raid_level": "raid0", 00:16:07.479 "superblock": true, 00:16:07.479 "num_base_bdevs": 3, 00:16:07.479 "num_base_bdevs_discovered": 2, 00:16:07.479 "num_base_bdevs_operational": 3, 00:16:07.479 "base_bdevs_list": [ 00:16:07.479 { 00:16:07.479 "name": null, 00:16:07.479 "uuid": "e37dc72d-5aa6-44f8-bce2-bf545470650f", 00:16:07.479 "is_configured": false, 00:16:07.479 "data_offset": 2048, 00:16:07.479 "data_size": 63488 00:16:07.479 }, 00:16:07.479 { 00:16:07.479 "name": "BaseBdev2", 00:16:07.479 "uuid": "bf9fe0ff-495d-46c5-898e-760733f156d3", 00:16:07.479 "is_configured": true, 00:16:07.479 "data_offset": 2048, 00:16:07.479 "data_size": 63488 00:16:07.479 }, 00:16:07.479 { 00:16:07.479 "name": "BaseBdev3", 00:16:07.479 "uuid": "0a08beba-97d1-4dec-b344-c57d9fc23bf7", 00:16:07.479 "is_configured": true, 00:16:07.479 "data_offset": 2048, 00:16:07.479 "data_size": 63488 00:16:07.479 } 00:16:07.479 ] 00:16:07.479 }' 00:16:07.479 05:44:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:07.479 05:44:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:08.047 05:44:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:08.047 05:44:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:16:08.306 05:44:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:16:08.306 05:44:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:08.306 05:44:23 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:16:08.565 05:44:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u e37dc72d-5aa6-44f8-bce2-bf545470650f 00:16:08.823 [2024-07-26 05:44:23.476783] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:16:08.823 [2024-07-26 05:44:23.476925] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x24abe90 00:16:08.823 [2024-07-26 05:44:23.476939] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:16:08.823 [2024-07-26 05:44:23.477111] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x21b2940 00:16:08.823 [2024-07-26 05:44:23.477225] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x24abe90 00:16:08.823 [2024-07-26 05:44:23.477235] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x24abe90 00:16:08.823 [2024-07-26 05:44:23.477325] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:08.823 NewBaseBdev 00:16:08.824 05:44:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:16:08.824 05:44:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:16:08.824 05:44:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:08.824 05:44:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:16:08.824 05:44:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:08.824 05:44:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:08.824 05:44:23 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:08.824 05:44:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:16:09.082 [ 00:16:09.082 { 00:16:09.082 "name": "NewBaseBdev", 00:16:09.082 "aliases": [ 00:16:09.082 "e37dc72d-5aa6-44f8-bce2-bf545470650f" 00:16:09.082 ], 00:16:09.082 "product_name": "Malloc disk", 00:16:09.082 "block_size": 512, 00:16:09.082 "num_blocks": 65536, 00:16:09.082 "uuid": "e37dc72d-5aa6-44f8-bce2-bf545470650f", 00:16:09.082 "assigned_rate_limits": { 00:16:09.082 "rw_ios_per_sec": 0, 00:16:09.082 "rw_mbytes_per_sec": 0, 00:16:09.082 "r_mbytes_per_sec": 0, 00:16:09.082 "w_mbytes_per_sec": 0 00:16:09.082 }, 00:16:09.082 "claimed": true, 00:16:09.082 "claim_type": "exclusive_write", 00:16:09.082 "zoned": false, 00:16:09.082 "supported_io_types": { 00:16:09.082 "read": true, 00:16:09.082 "write": true, 00:16:09.082 "unmap": true, 00:16:09.082 "flush": true, 00:16:09.082 "reset": true, 00:16:09.082 "nvme_admin": false, 00:16:09.082 "nvme_io": false, 00:16:09.082 "nvme_io_md": false, 00:16:09.082 "write_zeroes": true, 00:16:09.082 "zcopy": true, 00:16:09.082 "get_zone_info": false, 00:16:09.082 "zone_management": false, 00:16:09.082 "zone_append": false, 00:16:09.082 "compare": false, 00:16:09.082 "compare_and_write": false, 00:16:09.082 "abort": true, 00:16:09.082 "seek_hole": false, 00:16:09.082 "seek_data": false, 00:16:09.082 "copy": true, 00:16:09.082 "nvme_iov_md": false 00:16:09.082 }, 00:16:09.082 "memory_domains": [ 00:16:09.082 { 00:16:09.082 "dma_device_id": "system", 00:16:09.082 "dma_device_type": 1 00:16:09.082 }, 00:16:09.082 { 00:16:09.082 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:09.082 "dma_device_type": 2 00:16:09.082 } 
00:16:09.082 ], 00:16:09.082 "driver_specific": {} 00:16:09.082 } 00:16:09.082 ] 00:16:09.082 05:44:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:16:09.082 05:44:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid0 64 3 00:16:09.082 05:44:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:09.082 05:44:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:09.082 05:44:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:16:09.082 05:44:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:09.082 05:44:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:09.082 05:44:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:09.082 05:44:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:09.082 05:44:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:09.082 05:44:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:09.082 05:44:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:09.082 05:44:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:09.340 05:44:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:09.340 "name": "Existed_Raid", 00:16:09.340 "uuid": "91ec14fa-f971-4896-ab5d-8ed29f578e66", 00:16:09.340 "strip_size_kb": 64, 00:16:09.340 "state": "online", 00:16:09.340 
"raid_level": "raid0", 00:16:09.340 "superblock": true, 00:16:09.340 "num_base_bdevs": 3, 00:16:09.340 "num_base_bdevs_discovered": 3, 00:16:09.340 "num_base_bdevs_operational": 3, 00:16:09.340 "base_bdevs_list": [ 00:16:09.340 { 00:16:09.340 "name": "NewBaseBdev", 00:16:09.340 "uuid": "e37dc72d-5aa6-44f8-bce2-bf545470650f", 00:16:09.340 "is_configured": true, 00:16:09.340 "data_offset": 2048, 00:16:09.340 "data_size": 63488 00:16:09.340 }, 00:16:09.340 { 00:16:09.340 "name": "BaseBdev2", 00:16:09.340 "uuid": "bf9fe0ff-495d-46c5-898e-760733f156d3", 00:16:09.340 "is_configured": true, 00:16:09.340 "data_offset": 2048, 00:16:09.340 "data_size": 63488 00:16:09.340 }, 00:16:09.340 { 00:16:09.340 "name": "BaseBdev3", 00:16:09.340 "uuid": "0a08beba-97d1-4dec-b344-c57d9fc23bf7", 00:16:09.340 "is_configured": true, 00:16:09.340 "data_offset": 2048, 00:16:09.341 "data_size": 63488 00:16:09.341 } 00:16:09.341 ] 00:16:09.341 }' 00:16:09.341 05:44:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:09.341 05:44:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:09.907 05:44:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:16:09.907 05:44:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:16:09.907 05:44:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:16:09.907 05:44:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:16:09.907 05:44:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:16:09.907 05:44:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:16:09.907 05:44:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:16:10.166 05:44:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:16:10.166 [2024-07-26 05:44:25.041223] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:10.166 05:44:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:16:10.166 "name": "Existed_Raid", 00:16:10.166 "aliases": [ 00:16:10.166 "91ec14fa-f971-4896-ab5d-8ed29f578e66" 00:16:10.166 ], 00:16:10.166 "product_name": "Raid Volume", 00:16:10.166 "block_size": 512, 00:16:10.166 "num_blocks": 190464, 00:16:10.166 "uuid": "91ec14fa-f971-4896-ab5d-8ed29f578e66", 00:16:10.166 "assigned_rate_limits": { 00:16:10.166 "rw_ios_per_sec": 0, 00:16:10.166 "rw_mbytes_per_sec": 0, 00:16:10.166 "r_mbytes_per_sec": 0, 00:16:10.166 "w_mbytes_per_sec": 0 00:16:10.166 }, 00:16:10.166 "claimed": false, 00:16:10.166 "zoned": false, 00:16:10.166 "supported_io_types": { 00:16:10.166 "read": true, 00:16:10.166 "write": true, 00:16:10.166 "unmap": true, 00:16:10.166 "flush": true, 00:16:10.166 "reset": true, 00:16:10.166 "nvme_admin": false, 00:16:10.166 "nvme_io": false, 00:16:10.166 "nvme_io_md": false, 00:16:10.166 "write_zeroes": true, 00:16:10.166 "zcopy": false, 00:16:10.166 "get_zone_info": false, 00:16:10.166 "zone_management": false, 00:16:10.166 "zone_append": false, 00:16:10.166 "compare": false, 00:16:10.166 "compare_and_write": false, 00:16:10.166 "abort": false, 00:16:10.166 "seek_hole": false, 00:16:10.166 "seek_data": false, 00:16:10.166 "copy": false, 00:16:10.166 "nvme_iov_md": false 00:16:10.166 }, 00:16:10.166 "memory_domains": [ 00:16:10.166 { 00:16:10.166 "dma_device_id": "system", 00:16:10.166 "dma_device_type": 1 00:16:10.166 }, 00:16:10.166 { 00:16:10.166 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:10.166 "dma_device_type": 2 00:16:10.166 }, 00:16:10.166 { 00:16:10.166 "dma_device_id": "system", 00:16:10.166 "dma_device_type": 1 00:16:10.166 
}, 00:16:10.166 { 00:16:10.166 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:10.166 "dma_device_type": 2 00:16:10.166 }, 00:16:10.166 { 00:16:10.166 "dma_device_id": "system", 00:16:10.166 "dma_device_type": 1 00:16:10.166 }, 00:16:10.166 { 00:16:10.167 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:10.167 "dma_device_type": 2 00:16:10.167 } 00:16:10.167 ], 00:16:10.167 "driver_specific": { 00:16:10.167 "raid": { 00:16:10.167 "uuid": "91ec14fa-f971-4896-ab5d-8ed29f578e66", 00:16:10.167 "strip_size_kb": 64, 00:16:10.167 "state": "online", 00:16:10.167 "raid_level": "raid0", 00:16:10.167 "superblock": true, 00:16:10.167 "num_base_bdevs": 3, 00:16:10.167 "num_base_bdevs_discovered": 3, 00:16:10.167 "num_base_bdevs_operational": 3, 00:16:10.167 "base_bdevs_list": [ 00:16:10.167 { 00:16:10.167 "name": "NewBaseBdev", 00:16:10.167 "uuid": "e37dc72d-5aa6-44f8-bce2-bf545470650f", 00:16:10.167 "is_configured": true, 00:16:10.167 "data_offset": 2048, 00:16:10.167 "data_size": 63488 00:16:10.167 }, 00:16:10.167 { 00:16:10.167 "name": "BaseBdev2", 00:16:10.167 "uuid": "bf9fe0ff-495d-46c5-898e-760733f156d3", 00:16:10.167 "is_configured": true, 00:16:10.167 "data_offset": 2048, 00:16:10.167 "data_size": 63488 00:16:10.167 }, 00:16:10.167 { 00:16:10.167 "name": "BaseBdev3", 00:16:10.167 "uuid": "0a08beba-97d1-4dec-b344-c57d9fc23bf7", 00:16:10.167 "is_configured": true, 00:16:10.167 "data_offset": 2048, 00:16:10.167 "data_size": 63488 00:16:10.167 } 00:16:10.167 ] 00:16:10.167 } 00:16:10.167 } 00:16:10.167 }' 00:16:10.167 05:44:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:16:10.448 05:44:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:16:10.448 BaseBdev2 00:16:10.448 BaseBdev3' 00:16:10.448 05:44:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:10.448 
05:44:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:16:10.448 05:44:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:10.796 05:44:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:10.796 "name": "NewBaseBdev", 00:16:10.796 "aliases": [ 00:16:10.796 "e37dc72d-5aa6-44f8-bce2-bf545470650f" 00:16:10.796 ], 00:16:10.796 "product_name": "Malloc disk", 00:16:10.796 "block_size": 512, 00:16:10.796 "num_blocks": 65536, 00:16:10.796 "uuid": "e37dc72d-5aa6-44f8-bce2-bf545470650f", 00:16:10.796 "assigned_rate_limits": { 00:16:10.796 "rw_ios_per_sec": 0, 00:16:10.796 "rw_mbytes_per_sec": 0, 00:16:10.796 "r_mbytes_per_sec": 0, 00:16:10.796 "w_mbytes_per_sec": 0 00:16:10.796 }, 00:16:10.796 "claimed": true, 00:16:10.796 "claim_type": "exclusive_write", 00:16:10.796 "zoned": false, 00:16:10.796 "supported_io_types": { 00:16:10.796 "read": true, 00:16:10.796 "write": true, 00:16:10.796 "unmap": true, 00:16:10.796 "flush": true, 00:16:10.796 "reset": true, 00:16:10.796 "nvme_admin": false, 00:16:10.796 "nvme_io": false, 00:16:10.796 "nvme_io_md": false, 00:16:10.796 "write_zeroes": true, 00:16:10.796 "zcopy": true, 00:16:10.796 "get_zone_info": false, 00:16:10.796 "zone_management": false, 00:16:10.796 "zone_append": false, 00:16:10.796 "compare": false, 00:16:10.796 "compare_and_write": false, 00:16:10.796 "abort": true, 00:16:10.796 "seek_hole": false, 00:16:10.796 "seek_data": false, 00:16:10.796 "copy": true, 00:16:10.796 "nvme_iov_md": false 00:16:10.796 }, 00:16:10.796 "memory_domains": [ 00:16:10.796 { 00:16:10.796 "dma_device_id": "system", 00:16:10.796 "dma_device_type": 1 00:16:10.796 }, 00:16:10.796 { 00:16:10.796 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:10.796 "dma_device_type": 2 00:16:10.796 } 00:16:10.796 ], 00:16:10.796 
"driver_specific": {} 00:16:10.796 }' 00:16:10.796 05:44:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:10.796 05:44:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:11.081 05:44:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:11.081 05:44:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:11.081 05:44:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:11.081 05:44:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:11.081 05:44:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:11.081 05:44:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:11.081 05:44:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:11.081 05:44:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:11.081 05:44:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:11.340 05:44:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:11.340 05:44:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:11.340 05:44:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:16:11.340 05:44:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:11.340 05:44:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:11.340 "name": "BaseBdev2", 00:16:11.340 "aliases": [ 00:16:11.340 "bf9fe0ff-495d-46c5-898e-760733f156d3" 00:16:11.340 ], 00:16:11.340 "product_name": 
"Malloc disk", 00:16:11.340 "block_size": 512, 00:16:11.340 "num_blocks": 65536, 00:16:11.340 "uuid": "bf9fe0ff-495d-46c5-898e-760733f156d3", 00:16:11.340 "assigned_rate_limits": { 00:16:11.340 "rw_ios_per_sec": 0, 00:16:11.340 "rw_mbytes_per_sec": 0, 00:16:11.340 "r_mbytes_per_sec": 0, 00:16:11.340 "w_mbytes_per_sec": 0 00:16:11.340 }, 00:16:11.340 "claimed": true, 00:16:11.340 "claim_type": "exclusive_write", 00:16:11.340 "zoned": false, 00:16:11.340 "supported_io_types": { 00:16:11.340 "read": true, 00:16:11.340 "write": true, 00:16:11.340 "unmap": true, 00:16:11.340 "flush": true, 00:16:11.340 "reset": true, 00:16:11.340 "nvme_admin": false, 00:16:11.340 "nvme_io": false, 00:16:11.340 "nvme_io_md": false, 00:16:11.340 "write_zeroes": true, 00:16:11.340 "zcopy": true, 00:16:11.340 "get_zone_info": false, 00:16:11.340 "zone_management": false, 00:16:11.340 "zone_append": false, 00:16:11.340 "compare": false, 00:16:11.340 "compare_and_write": false, 00:16:11.340 "abort": true, 00:16:11.340 "seek_hole": false, 00:16:11.340 "seek_data": false, 00:16:11.340 "copy": true, 00:16:11.340 "nvme_iov_md": false 00:16:11.340 }, 00:16:11.340 "memory_domains": [ 00:16:11.340 { 00:16:11.340 "dma_device_id": "system", 00:16:11.340 "dma_device_type": 1 00:16:11.340 }, 00:16:11.340 { 00:16:11.340 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:11.340 "dma_device_type": 2 00:16:11.340 } 00:16:11.340 ], 00:16:11.340 "driver_specific": {} 00:16:11.340 }' 00:16:11.340 05:44:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:11.599 05:44:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:11.599 05:44:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:11.599 05:44:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:11.599 05:44:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:11.599 
05:44:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:11.599 05:44:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:11.599 05:44:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:11.599 05:44:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:11.599 05:44:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:11.599 05:44:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:11.857 05:44:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:11.858 05:44:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:11.858 05:44:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:16:11.858 05:44:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:12.425 05:44:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:12.425 "name": "BaseBdev3", 00:16:12.425 "aliases": [ 00:16:12.425 "0a08beba-97d1-4dec-b344-c57d9fc23bf7" 00:16:12.425 ], 00:16:12.425 "product_name": "Malloc disk", 00:16:12.425 "block_size": 512, 00:16:12.425 "num_blocks": 65536, 00:16:12.425 "uuid": "0a08beba-97d1-4dec-b344-c57d9fc23bf7", 00:16:12.425 "assigned_rate_limits": { 00:16:12.425 "rw_ios_per_sec": 0, 00:16:12.425 "rw_mbytes_per_sec": 0, 00:16:12.425 "r_mbytes_per_sec": 0, 00:16:12.425 "w_mbytes_per_sec": 0 00:16:12.425 }, 00:16:12.425 "claimed": true, 00:16:12.425 "claim_type": "exclusive_write", 00:16:12.425 "zoned": false, 00:16:12.425 "supported_io_types": { 00:16:12.425 "read": true, 00:16:12.425 "write": true, 00:16:12.425 "unmap": true, 
00:16:12.425 "flush": true, 00:16:12.425 "reset": true, 00:16:12.425 "nvme_admin": false, 00:16:12.425 "nvme_io": false, 00:16:12.425 "nvme_io_md": false, 00:16:12.425 "write_zeroes": true, 00:16:12.425 "zcopy": true, 00:16:12.425 "get_zone_info": false, 00:16:12.425 "zone_management": false, 00:16:12.425 "zone_append": false, 00:16:12.425 "compare": false, 00:16:12.425 "compare_and_write": false, 00:16:12.425 "abort": true, 00:16:12.425 "seek_hole": false, 00:16:12.425 "seek_data": false, 00:16:12.425 "copy": true, 00:16:12.425 "nvme_iov_md": false 00:16:12.425 }, 00:16:12.425 "memory_domains": [ 00:16:12.425 { 00:16:12.425 "dma_device_id": "system", 00:16:12.425 "dma_device_type": 1 00:16:12.425 }, 00:16:12.425 { 00:16:12.425 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:12.425 "dma_device_type": 2 00:16:12.425 } 00:16:12.425 ], 00:16:12.425 "driver_specific": {} 00:16:12.425 }' 00:16:12.425 05:44:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:12.425 05:44:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:12.425 05:44:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:12.425 05:44:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:12.425 05:44:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:12.425 05:44:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:12.425 05:44:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:12.425 05:44:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:12.425 05:44:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:12.425 05:44:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:12.425 05:44:27 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:12.684 05:44:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:12.684 05:44:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:16:12.684 [2024-07-26 05:44:27.559630] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:16:12.685 [2024-07-26 05:44:27.559664] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:16:12.685 [2024-07-26 05:44:27.559713] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:12.685 [2024-07-26 05:44:27.559766] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:16:12.685 [2024-07-26 05:44:27.559778] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x24abe90 name Existed_Raid, state offline 00:16:12.685 05:44:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 1151813 00:16:12.685 05:44:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 1151813 ']' 00:16:12.685 05:44:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # kill -0 1151813 00:16:12.685 05:44:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:16:12.685 05:44:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:16:12.685 05:44:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1151813 00:16:12.944 05:44:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:16:12.944 05:44:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' 
reactor_0 = sudo ']' 00:16:12.944 05:44:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1151813' 00:16:12.944 killing process with pid 1151813 00:16:12.944 05:44:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 1151813 00:16:12.944 [2024-07-26 05:44:27.619161] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:16:12.944 05:44:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 1151813 00:16:12.944 [2024-07-26 05:44:27.646915] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:16:13.203 05:44:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:16:13.203 00:16:13.203 real 0m28.638s 00:16:13.203 user 0m52.432s 00:16:13.203 sys 0m5.237s 00:16:13.203 05:44:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:16:13.203 05:44:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:13.203 ************************************ 00:16:13.203 END TEST raid_state_function_test_sb 00:16:13.203 ************************************ 00:16:13.203 05:44:27 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:16:13.203 05:44:27 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test raid0 3 00:16:13.203 05:44:27 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:16:13.203 05:44:27 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:16:13.204 05:44:27 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:16:13.204 ************************************ 00:16:13.204 START TEST raid_superblock_test 00:16:13.204 ************************************ 00:16:13.204 05:44:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test raid0 3 00:16:13.204 05:44:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # 
local raid_level=raid0 00:16:13.204 05:44:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=3 00:16:13.204 05:44:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:16:13.204 05:44:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:16:13.204 05:44:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:16:13.204 05:44:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:16:13.204 05:44:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:16:13.204 05:44:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:16:13.204 05:44:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:16:13.204 05:44:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:16:13.204 05:44:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:16:13.204 05:44:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:16:13.204 05:44:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:16:13.204 05:44:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' raid0 '!=' raid1 ']' 00:16:13.204 05:44:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # strip_size=64 00:16:13.204 05:44:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size_create_arg='-z 64' 00:16:13.204 05:44:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=1156244 00:16:13.204 05:44:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 1156244 /var/tmp/spdk-raid.sock 00:16:13.204 05:44:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r 
/var/tmp/spdk-raid.sock -L bdev_raid 00:16:13.204 05:44:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 1156244 ']' 00:16:13.204 05:44:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:16:13.204 05:44:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:16:13.204 05:44:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:16:13.204 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:16:13.204 05:44:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:16:13.204 05:44:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:16:13.204 [2024-07-26 05:44:28.020727] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 
00:16:13.204 [2024-07-26 05:44:28.020791] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1156244 ] 00:16:13.463 [2024-07-26 05:44:28.140313] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:16:13.463 [2024-07-26 05:44:28.245611] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:16:13.463 [2024-07-26 05:44:28.314396] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:13.463 [2024-07-26 05:44:28.314433] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:14.399 05:44:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:16:14.399 05:44:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:16:14.399 05:44:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:16:14.399 05:44:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:16:14.399 05:44:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:16:14.399 05:44:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:16:14.399 05:44:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:16:14.399 05:44:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:16:14.399 05:44:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:16:14.399 05:44:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:16:14.399 05:44:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:16:14.399 malloc1 00:16:14.399 05:44:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:16:14.658 [2024-07-26 05:44:29.428959] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:16:14.658 [2024-07-26 05:44:29.429009] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:14.658 [2024-07-26 05:44:29.429030] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x172d570 00:16:14.658 [2024-07-26 05:44:29.429044] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:14.658 [2024-07-26 05:44:29.430789] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:14.658 [2024-07-26 05:44:29.430817] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:16:14.658 pt1 00:16:14.658 05:44:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:16:14.658 05:44:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:16:14.658 05:44:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:16:14.658 05:44:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:16:14.658 05:44:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:16:14.658 05:44:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:16:14.658 05:44:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:16:14.658 05:44:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:16:14.658 05:44:29 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:16:14.917 malloc2 00:16:14.917 05:44:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:16:15.175 [2024-07-26 05:44:29.916246] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:16:15.175 [2024-07-26 05:44:29.916290] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:15.175 [2024-07-26 05:44:29.916307] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x172e970 00:16:15.175 [2024-07-26 05:44:29.916319] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:15.175 [2024-07-26 05:44:29.917973] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:15.175 [2024-07-26 05:44:29.918001] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:16:15.175 pt2 00:16:15.175 05:44:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:16:15.175 05:44:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:16:15.175 05:44:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc3 00:16:15.175 05:44:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt3 00:16:15.175 05:44:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:16:15.175 05:44:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:16:15.175 05:44:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:16:15.175 05:44:29 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:16:15.175 05:44:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:16:15.434 malloc3 00:16:15.434 05:44:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:16:15.694 [2024-07-26 05:44:30.399287] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:16:15.694 [2024-07-26 05:44:30.399333] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:15.694 [2024-07-26 05:44:30.399350] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x18c5340 00:16:15.694 [2024-07-26 05:44:30.399362] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:15.694 [2024-07-26 05:44:30.400899] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:15.694 [2024-07-26 05:44:30.400927] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:16:15.694 pt3 00:16:15.694 05:44:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:16:15.694 05:44:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:16:15.694 05:44:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'pt1 pt2 pt3' -n raid_bdev1 -s 00:16:15.953 [2024-07-26 05:44:30.635927] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:16:15.953 [2024-07-26 05:44:30.637240] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 
00:16:15.953 [2024-07-26 05:44:30.637294] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:16:15.953 [2024-07-26 05:44:30.637443] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1725ea0 00:16:15.953 [2024-07-26 05:44:30.637454] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:16:15.953 [2024-07-26 05:44:30.637663] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x172d240 00:16:15.953 [2024-07-26 05:44:30.637805] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1725ea0 00:16:15.953 [2024-07-26 05:44:30.637815] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1725ea0 00:16:15.953 [2024-07-26 05:44:30.637913] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:15.953 05:44:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:16:15.953 05:44:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:16:15.953 05:44:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:15.953 05:44:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:16:15.953 05:44:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:15.953 05:44:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:15.953 05:44:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:15.953 05:44:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:15.953 05:44:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:15.953 05:44:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:15.953 
05:44:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:16:15.953 05:44:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:16.212 05:44:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:16.212 "name": "raid_bdev1", 00:16:16.212 "uuid": "42914638-5683-4930-bd85-233a06262167", 00:16:16.212 "strip_size_kb": 64, 00:16:16.212 "state": "online", 00:16:16.212 "raid_level": "raid0", 00:16:16.212 "superblock": true, 00:16:16.212 "num_base_bdevs": 3, 00:16:16.212 "num_base_bdevs_discovered": 3, 00:16:16.212 "num_base_bdevs_operational": 3, 00:16:16.212 "base_bdevs_list": [ 00:16:16.212 { 00:16:16.212 "name": "pt1", 00:16:16.212 "uuid": "00000000-0000-0000-0000-000000000001", 00:16:16.212 "is_configured": true, 00:16:16.212 "data_offset": 2048, 00:16:16.212 "data_size": 63488 00:16:16.212 }, 00:16:16.212 { 00:16:16.212 "name": "pt2", 00:16:16.212 "uuid": "00000000-0000-0000-0000-000000000002", 00:16:16.212 "is_configured": true, 00:16:16.212 "data_offset": 2048, 00:16:16.212 "data_size": 63488 00:16:16.212 }, 00:16:16.212 { 00:16:16.212 "name": "pt3", 00:16:16.212 "uuid": "00000000-0000-0000-0000-000000000003", 00:16:16.212 "is_configured": true, 00:16:16.212 "data_offset": 2048, 00:16:16.212 "data_size": 63488 00:16:16.212 } 00:16:16.212 ] 00:16:16.212 }' 00:16:16.212 05:44:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:16.212 05:44:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:16:16.780 05:44:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:16:16.780 05:44:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:16:16.780 05:44:31 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:16:16.780 05:44:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:16:16.780 05:44:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:16:16.780 05:44:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:16:16.780 05:44:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:16:16.780 05:44:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:16:17.039 [2024-07-26 05:44:31.731072] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:17.039 05:44:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:16:17.039 "name": "raid_bdev1", 00:16:17.039 "aliases": [ 00:16:17.039 "42914638-5683-4930-bd85-233a06262167" 00:16:17.039 ], 00:16:17.039 "product_name": "Raid Volume", 00:16:17.039 "block_size": 512, 00:16:17.039 "num_blocks": 190464, 00:16:17.039 "uuid": "42914638-5683-4930-bd85-233a06262167", 00:16:17.039 "assigned_rate_limits": { 00:16:17.039 "rw_ios_per_sec": 0, 00:16:17.039 "rw_mbytes_per_sec": 0, 00:16:17.039 "r_mbytes_per_sec": 0, 00:16:17.039 "w_mbytes_per_sec": 0 00:16:17.039 }, 00:16:17.039 "claimed": false, 00:16:17.039 "zoned": false, 00:16:17.039 "supported_io_types": { 00:16:17.039 "read": true, 00:16:17.039 "write": true, 00:16:17.039 "unmap": true, 00:16:17.039 "flush": true, 00:16:17.039 "reset": true, 00:16:17.039 "nvme_admin": false, 00:16:17.039 "nvme_io": false, 00:16:17.039 "nvme_io_md": false, 00:16:17.039 "write_zeroes": true, 00:16:17.039 "zcopy": false, 00:16:17.039 "get_zone_info": false, 00:16:17.039 "zone_management": false, 00:16:17.039 "zone_append": false, 00:16:17.039 "compare": false, 00:16:17.039 "compare_and_write": false, 00:16:17.039 "abort": false, 00:16:17.039 
"seek_hole": false, 00:16:17.039 "seek_data": false, 00:16:17.039 "copy": false, 00:16:17.039 "nvme_iov_md": false 00:16:17.039 }, 00:16:17.039 "memory_domains": [ 00:16:17.039 { 00:16:17.039 "dma_device_id": "system", 00:16:17.039 "dma_device_type": 1 00:16:17.039 }, 00:16:17.039 { 00:16:17.039 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:17.039 "dma_device_type": 2 00:16:17.039 }, 00:16:17.039 { 00:16:17.039 "dma_device_id": "system", 00:16:17.039 "dma_device_type": 1 00:16:17.039 }, 00:16:17.039 { 00:16:17.039 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:17.039 "dma_device_type": 2 00:16:17.039 }, 00:16:17.039 { 00:16:17.039 "dma_device_id": "system", 00:16:17.039 "dma_device_type": 1 00:16:17.039 }, 00:16:17.039 { 00:16:17.039 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:17.039 "dma_device_type": 2 00:16:17.039 } 00:16:17.039 ], 00:16:17.039 "driver_specific": { 00:16:17.039 "raid": { 00:16:17.039 "uuid": "42914638-5683-4930-bd85-233a06262167", 00:16:17.039 "strip_size_kb": 64, 00:16:17.039 "state": "online", 00:16:17.039 "raid_level": "raid0", 00:16:17.039 "superblock": true, 00:16:17.039 "num_base_bdevs": 3, 00:16:17.039 "num_base_bdevs_discovered": 3, 00:16:17.039 "num_base_bdevs_operational": 3, 00:16:17.039 "base_bdevs_list": [ 00:16:17.039 { 00:16:17.039 "name": "pt1", 00:16:17.039 "uuid": "00000000-0000-0000-0000-000000000001", 00:16:17.039 "is_configured": true, 00:16:17.039 "data_offset": 2048, 00:16:17.039 "data_size": 63488 00:16:17.039 }, 00:16:17.039 { 00:16:17.039 "name": "pt2", 00:16:17.039 "uuid": "00000000-0000-0000-0000-000000000002", 00:16:17.039 "is_configured": true, 00:16:17.039 "data_offset": 2048, 00:16:17.039 "data_size": 63488 00:16:17.039 }, 00:16:17.039 { 00:16:17.039 "name": "pt3", 00:16:17.039 "uuid": "00000000-0000-0000-0000-000000000003", 00:16:17.039 "is_configured": true, 00:16:17.039 "data_offset": 2048, 00:16:17.039 "data_size": 63488 00:16:17.039 } 00:16:17.039 ] 00:16:17.039 } 00:16:17.039 } 00:16:17.039 }' 
00:16:17.039 05:44:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:16:17.039 05:44:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:16:17.039 pt2 00:16:17.039 pt3' 00:16:17.039 05:44:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:17.040 05:44:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:16:17.040 05:44:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:17.298 05:44:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:17.298 "name": "pt1", 00:16:17.298 "aliases": [ 00:16:17.298 "00000000-0000-0000-0000-000000000001" 00:16:17.298 ], 00:16:17.298 "product_name": "passthru", 00:16:17.298 "block_size": 512, 00:16:17.298 "num_blocks": 65536, 00:16:17.298 "uuid": "00000000-0000-0000-0000-000000000001", 00:16:17.298 "assigned_rate_limits": { 00:16:17.298 "rw_ios_per_sec": 0, 00:16:17.298 "rw_mbytes_per_sec": 0, 00:16:17.298 "r_mbytes_per_sec": 0, 00:16:17.298 "w_mbytes_per_sec": 0 00:16:17.298 }, 00:16:17.298 "claimed": true, 00:16:17.298 "claim_type": "exclusive_write", 00:16:17.298 "zoned": false, 00:16:17.298 "supported_io_types": { 00:16:17.298 "read": true, 00:16:17.298 "write": true, 00:16:17.298 "unmap": true, 00:16:17.298 "flush": true, 00:16:17.298 "reset": true, 00:16:17.298 "nvme_admin": false, 00:16:17.298 "nvme_io": false, 00:16:17.298 "nvme_io_md": false, 00:16:17.298 "write_zeroes": true, 00:16:17.298 "zcopy": true, 00:16:17.298 "get_zone_info": false, 00:16:17.298 "zone_management": false, 00:16:17.298 "zone_append": false, 00:16:17.298 "compare": false, 00:16:17.298 "compare_and_write": false, 00:16:17.298 "abort": true, 00:16:17.298 "seek_hole": false, 00:16:17.298 
"seek_data": false, 00:16:17.298 "copy": true, 00:16:17.298 "nvme_iov_md": false 00:16:17.298 }, 00:16:17.298 "memory_domains": [ 00:16:17.298 { 00:16:17.298 "dma_device_id": "system", 00:16:17.298 "dma_device_type": 1 00:16:17.298 }, 00:16:17.298 { 00:16:17.298 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:17.298 "dma_device_type": 2 00:16:17.298 } 00:16:17.298 ], 00:16:17.298 "driver_specific": { 00:16:17.298 "passthru": { 00:16:17.298 "name": "pt1", 00:16:17.298 "base_bdev_name": "malloc1" 00:16:17.298 } 00:16:17.298 } 00:16:17.298 }' 00:16:17.298 05:44:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:17.298 05:44:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:17.298 05:44:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:17.298 05:44:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:17.298 05:44:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:17.298 05:44:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:17.557 05:44:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:17.557 05:44:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:17.557 05:44:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:17.557 05:44:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:17.557 05:44:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:17.557 05:44:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:17.557 05:44:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:17.557 05:44:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:16:17.557 05:44:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:17.816 05:44:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:17.816 "name": "pt2", 00:16:17.816 "aliases": [ 00:16:17.816 "00000000-0000-0000-0000-000000000002" 00:16:17.816 ], 00:16:17.816 "product_name": "passthru", 00:16:17.816 "block_size": 512, 00:16:17.816 "num_blocks": 65536, 00:16:17.816 "uuid": "00000000-0000-0000-0000-000000000002", 00:16:17.816 "assigned_rate_limits": { 00:16:17.816 "rw_ios_per_sec": 0, 00:16:17.816 "rw_mbytes_per_sec": 0, 00:16:17.816 "r_mbytes_per_sec": 0, 00:16:17.816 "w_mbytes_per_sec": 0 00:16:17.816 }, 00:16:17.816 "claimed": true, 00:16:17.816 "claim_type": "exclusive_write", 00:16:17.816 "zoned": false, 00:16:17.816 "supported_io_types": { 00:16:17.816 "read": true, 00:16:17.816 "write": true, 00:16:17.816 "unmap": true, 00:16:17.816 "flush": true, 00:16:17.816 "reset": true, 00:16:17.816 "nvme_admin": false, 00:16:17.816 "nvme_io": false, 00:16:17.816 "nvme_io_md": false, 00:16:17.816 "write_zeroes": true, 00:16:17.816 "zcopy": true, 00:16:17.816 "get_zone_info": false, 00:16:17.816 "zone_management": false, 00:16:17.816 "zone_append": false, 00:16:17.816 "compare": false, 00:16:17.816 "compare_and_write": false, 00:16:17.816 "abort": true, 00:16:17.816 "seek_hole": false, 00:16:17.816 "seek_data": false, 00:16:17.816 "copy": true, 00:16:17.816 "nvme_iov_md": false 00:16:17.816 }, 00:16:17.816 "memory_domains": [ 00:16:17.816 { 00:16:17.816 "dma_device_id": "system", 00:16:17.816 "dma_device_type": 1 00:16:17.816 }, 00:16:17.816 { 00:16:17.816 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:17.816 "dma_device_type": 2 00:16:17.816 } 00:16:17.816 ], 00:16:17.816 "driver_specific": { 00:16:17.816 "passthru": { 00:16:17.816 "name": "pt2", 00:16:17.816 "base_bdev_name": "malloc2" 00:16:17.816 } 00:16:17.816 } 00:16:17.816 }' 00:16:17.816 05:44:32 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:17.816 05:44:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:18.074 05:44:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:18.074 05:44:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:18.074 05:44:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:18.074 05:44:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:18.074 05:44:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:18.074 05:44:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:18.075 05:44:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:18.075 05:44:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:18.075 05:44:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:18.333 05:44:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:18.333 05:44:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:18.333 05:44:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:16:18.333 05:44:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:18.333 05:44:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:18.333 "name": "pt3", 00:16:18.333 "aliases": [ 00:16:18.333 "00000000-0000-0000-0000-000000000003" 00:16:18.333 ], 00:16:18.333 "product_name": "passthru", 00:16:18.333 "block_size": 512, 00:16:18.333 "num_blocks": 65536, 00:16:18.333 "uuid": "00000000-0000-0000-0000-000000000003", 00:16:18.333 "assigned_rate_limits": { 
00:16:18.333 "rw_ios_per_sec": 0, 00:16:18.333 "rw_mbytes_per_sec": 0, 00:16:18.333 "r_mbytes_per_sec": 0, 00:16:18.333 "w_mbytes_per_sec": 0 00:16:18.333 }, 00:16:18.333 "claimed": true, 00:16:18.333 "claim_type": "exclusive_write", 00:16:18.333 "zoned": false, 00:16:18.333 "supported_io_types": { 00:16:18.333 "read": true, 00:16:18.333 "write": true, 00:16:18.333 "unmap": true, 00:16:18.333 "flush": true, 00:16:18.333 "reset": true, 00:16:18.333 "nvme_admin": false, 00:16:18.333 "nvme_io": false, 00:16:18.333 "nvme_io_md": false, 00:16:18.333 "write_zeroes": true, 00:16:18.333 "zcopy": true, 00:16:18.333 "get_zone_info": false, 00:16:18.333 "zone_management": false, 00:16:18.333 "zone_append": false, 00:16:18.333 "compare": false, 00:16:18.333 "compare_and_write": false, 00:16:18.333 "abort": true, 00:16:18.333 "seek_hole": false, 00:16:18.333 "seek_data": false, 00:16:18.333 "copy": true, 00:16:18.333 "nvme_iov_md": false 00:16:18.333 }, 00:16:18.333 "memory_domains": [ 00:16:18.333 { 00:16:18.333 "dma_device_id": "system", 00:16:18.333 "dma_device_type": 1 00:16:18.333 }, 00:16:18.333 { 00:16:18.333 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:18.333 "dma_device_type": 2 00:16:18.333 } 00:16:18.333 ], 00:16:18.333 "driver_specific": { 00:16:18.333 "passthru": { 00:16:18.333 "name": "pt3", 00:16:18.333 "base_bdev_name": "malloc3" 00:16:18.333 } 00:16:18.333 } 00:16:18.333 }' 00:16:18.333 05:44:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:18.333 05:44:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:18.596 05:44:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:18.596 05:44:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:18.596 05:44:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:18.596 05:44:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 
00:16:18.596 05:44:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:18.596 05:44:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:18.858 05:44:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:18.858 05:44:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:18.858 05:44:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:18.858 05:44:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:18.858 05:44:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:16:18.858 05:44:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:16:19.117 [2024-07-26 05:44:33.824604] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:19.117 05:44:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=42914638-5683-4930-bd85-233a06262167 00:16:19.117 05:44:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z 42914638-5683-4930-bd85-233a06262167 ']' 00:16:19.117 05:44:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:16:19.375 [2024-07-26 05:44:34.060962] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:16:19.375 [2024-07-26 05:44:34.060984] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:16:19.375 [2024-07-26 05:44:34.061031] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:19.375 [2024-07-26 05:44:34.061082] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free 
all in destruct 00:16:19.375 [2024-07-26 05:44:34.061093] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1725ea0 name raid_bdev1, state offline 00:16:19.375 05:44:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:19.375 05:44:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:16:19.633 05:44:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:16:19.633 05:44:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:16:19.633 05:44:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:16:19.633 05:44:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:16:19.891 05:44:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:16:19.891 05:44:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:16:19.891 05:44:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:16:19.891 05:44:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:16:20.150 05:44:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:16:20.150 05:44:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:16:20.716 05:44:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' 
false == true ']' 00:16:20.716 05:44:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:16:20.716 05:44:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:16:20.716 05:44:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:16:20.716 05:44:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:16:20.716 05:44:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:16:20.716 05:44:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:16:20.716 05:44:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:16:20.716 05:44:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:16:20.716 05:44:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:16:20.716 05:44:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:16:20.716 05:44:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:16:20.716 05:44:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:16:20.975 [2024-07-26 05:44:35.649114] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:16:20.975 [2024-07-26 05:44:35.650465] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:16:20.975 [2024-07-26 05:44:35.650506] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:16:20.975 [2024-07-26 05:44:35.650550] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:16:20.975 [2024-07-26 05:44:35.650587] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:16:20.975 [2024-07-26 05:44:35.650610] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:16:20.975 [2024-07-26 05:44:35.650628] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:16:20.975 [2024-07-26 05:44:35.650647] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x18d0ff0 name raid_bdev1, state configuring 00:16:20.975 request: 00:16:20.975 { 00:16:20.975 "name": "raid_bdev1", 00:16:20.975 "raid_level": "raid0", 00:16:20.975 "base_bdevs": [ 00:16:20.975 "malloc1", 00:16:20.975 "malloc2", 00:16:20.975 "malloc3" 00:16:20.975 ], 00:16:20.975 "strip_size_kb": 64, 00:16:20.975 "superblock": false, 00:16:20.975 "method": "bdev_raid_create", 00:16:20.975 "req_id": 1 00:16:20.975 } 00:16:20.975 Got JSON-RPC error response 00:16:20.975 response: 00:16:20.975 { 00:16:20.975 "code": -17, 00:16:20.975 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:16:20.975 } 00:16:20.975 05:44:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:16:20.975 05:44:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 
00:16:20.975 05:44:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:16:20.975 05:44:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:16:20.975 05:44:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:20.975 05:44:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:16:20.975 05:44:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:16:20.975 05:44:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:16:20.975 05:44:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:16:21.234 [2024-07-26 05:44:36.054135] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:16:21.234 [2024-07-26 05:44:36.054181] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:21.234 [2024-07-26 05:44:36.054202] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x172d7a0 00:16:21.234 [2024-07-26 05:44:36.054215] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:21.234 [2024-07-26 05:44:36.055841] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:21.234 [2024-07-26 05:44:36.055870] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:16:21.234 [2024-07-26 05:44:36.055933] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:16:21.234 [2024-07-26 05:44:36.055958] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:16:21.234 pt1 00:16:21.234 05:44:36 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid0 64 3 00:16:21.234 05:44:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:16:21.234 05:44:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:21.234 05:44:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:16:21.234 05:44:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:21.234 05:44:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:21.234 05:44:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:21.234 05:44:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:21.234 05:44:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:21.234 05:44:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:21.234 05:44:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:21.234 05:44:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:16:21.492 05:44:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:21.492 "name": "raid_bdev1", 00:16:21.492 "uuid": "42914638-5683-4930-bd85-233a06262167", 00:16:21.492 "strip_size_kb": 64, 00:16:21.492 "state": "configuring", 00:16:21.492 "raid_level": "raid0", 00:16:21.492 "superblock": true, 00:16:21.492 "num_base_bdevs": 3, 00:16:21.492 "num_base_bdevs_discovered": 1, 00:16:21.492 "num_base_bdevs_operational": 3, 00:16:21.492 "base_bdevs_list": [ 00:16:21.492 { 00:16:21.492 "name": "pt1", 00:16:21.492 "uuid": "00000000-0000-0000-0000-000000000001", 00:16:21.492 
"is_configured": true, 00:16:21.492 "data_offset": 2048, 00:16:21.492 "data_size": 63488 00:16:21.492 }, 00:16:21.492 { 00:16:21.492 "name": null, 00:16:21.492 "uuid": "00000000-0000-0000-0000-000000000002", 00:16:21.492 "is_configured": false, 00:16:21.492 "data_offset": 2048, 00:16:21.492 "data_size": 63488 00:16:21.492 }, 00:16:21.492 { 00:16:21.492 "name": null, 00:16:21.492 "uuid": "00000000-0000-0000-0000-000000000003", 00:16:21.492 "is_configured": false, 00:16:21.492 "data_offset": 2048, 00:16:21.492 "data_size": 63488 00:16:21.492 } 00:16:21.492 ] 00:16:21.492 }' 00:16:21.492 05:44:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:21.492 05:44:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:16:22.058 05:44:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 3 -gt 2 ']' 00:16:22.058 05:44:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:16:22.316 [2024-07-26 05:44:37.064811] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:16:22.316 [2024-07-26 05:44:37.064854] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:22.316 [2024-07-26 05:44:37.064872] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1724c70 00:16:22.316 [2024-07-26 05:44:37.064884] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:22.316 [2024-07-26 05:44:37.065217] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:22.316 [2024-07-26 05:44:37.065234] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:16:22.316 [2024-07-26 05:44:37.065298] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:16:22.316 [2024-07-26 
05:44:37.065317] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:16:22.316 pt2 00:16:22.316 05:44:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:16:22.575 [2024-07-26 05:44:37.309470] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:16:22.575 05:44:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # verify_raid_bdev_state raid_bdev1 configuring raid0 64 3 00:16:22.575 05:44:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:16:22.575 05:44:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:22.575 05:44:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:16:22.575 05:44:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:22.575 05:44:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:22.575 05:44:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:22.575 05:44:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:22.575 05:44:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:22.575 05:44:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:22.575 05:44:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:22.575 05:44:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:16:22.833 05:44:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:22.833 "name": "raid_bdev1", 00:16:22.833 
"uuid": "42914638-5683-4930-bd85-233a06262167", 00:16:22.833 "strip_size_kb": 64, 00:16:22.833 "state": "configuring", 00:16:22.833 "raid_level": "raid0", 00:16:22.833 "superblock": true, 00:16:22.833 "num_base_bdevs": 3, 00:16:22.833 "num_base_bdevs_discovered": 1, 00:16:22.833 "num_base_bdevs_operational": 3, 00:16:22.833 "base_bdevs_list": [ 00:16:22.833 { 00:16:22.833 "name": "pt1", 00:16:22.833 "uuid": "00000000-0000-0000-0000-000000000001", 00:16:22.833 "is_configured": true, 00:16:22.833 "data_offset": 2048, 00:16:22.833 "data_size": 63488 00:16:22.833 }, 00:16:22.833 { 00:16:22.833 "name": null, 00:16:22.833 "uuid": "00000000-0000-0000-0000-000000000002", 00:16:22.833 "is_configured": false, 00:16:22.833 "data_offset": 2048, 00:16:22.833 "data_size": 63488 00:16:22.833 }, 00:16:22.833 { 00:16:22.833 "name": null, 00:16:22.833 "uuid": "00000000-0000-0000-0000-000000000003", 00:16:22.833 "is_configured": false, 00:16:22.833 "data_offset": 2048, 00:16:22.833 "data_size": 63488 00:16:22.833 } 00:16:22.833 ] 00:16:22.833 }' 00:16:22.833 05:44:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:22.833 05:44:37 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:16:23.438 05:44:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:16:23.438 05:44:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:16:23.438 05:44:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:16:23.696 [2024-07-26 05:44:38.400503] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:16:23.696 [2024-07-26 05:44:38.400551] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:23.696 [2024-07-26 05:44:38.400570] vbdev_passthru.c: 
681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x18c5fa0 00:16:23.696 [2024-07-26 05:44:38.400584] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:23.696 [2024-07-26 05:44:38.400927] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:23.696 [2024-07-26 05:44:38.400945] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:16:23.696 [2024-07-26 05:44:38.401006] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:16:23.696 [2024-07-26 05:44:38.401024] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:16:23.696 pt2 00:16:23.696 05:44:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:16:23.696 05:44:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:16:23.696 05:44:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:16:23.954 [2024-07-26 05:44:38.649159] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:16:23.954 [2024-07-26 05:44:38.649193] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:23.954 [2024-07-26 05:44:38.649210] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x18c6b30 00:16:23.954 [2024-07-26 05:44:38.649222] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:23.954 [2024-07-26 05:44:38.649516] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:23.954 [2024-07-26 05:44:38.649532] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:16:23.954 [2024-07-26 05:44:38.649583] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:16:23.954 
[2024-07-26 05:44:38.649601] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:16:23.954 [2024-07-26 05:44:38.649708] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x18c7c00 00:16:23.954 [2024-07-26 05:44:38.649720] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:16:23.954 [2024-07-26 05:44:38.649883] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x18d09b0 00:16:23.954 [2024-07-26 05:44:38.650002] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x18c7c00 00:16:23.954 [2024-07-26 05:44:38.650011] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x18c7c00 00:16:23.954 [2024-07-26 05:44:38.650104] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:23.954 pt3 00:16:23.954 05:44:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:16:23.954 05:44:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:16:23.954 05:44:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:16:23.954 05:44:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:16:23.954 05:44:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:23.954 05:44:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:16:23.954 05:44:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:23.954 05:44:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:23.954 05:44:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:23.954 05:44:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:23.954 05:44:38 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:23.954 05:44:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:23.954 05:44:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:23.954 05:44:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:16:24.213 05:44:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:24.213 "name": "raid_bdev1", 00:16:24.213 "uuid": "42914638-5683-4930-bd85-233a06262167", 00:16:24.213 "strip_size_kb": 64, 00:16:24.213 "state": "online", 00:16:24.213 "raid_level": "raid0", 00:16:24.213 "superblock": true, 00:16:24.213 "num_base_bdevs": 3, 00:16:24.213 "num_base_bdevs_discovered": 3, 00:16:24.213 "num_base_bdevs_operational": 3, 00:16:24.213 "base_bdevs_list": [ 00:16:24.213 { 00:16:24.213 "name": "pt1", 00:16:24.213 "uuid": "00000000-0000-0000-0000-000000000001", 00:16:24.213 "is_configured": true, 00:16:24.213 "data_offset": 2048, 00:16:24.213 "data_size": 63488 00:16:24.213 }, 00:16:24.213 { 00:16:24.213 "name": "pt2", 00:16:24.213 "uuid": "00000000-0000-0000-0000-000000000002", 00:16:24.213 "is_configured": true, 00:16:24.213 "data_offset": 2048, 00:16:24.213 "data_size": 63488 00:16:24.213 }, 00:16:24.213 { 00:16:24.213 "name": "pt3", 00:16:24.213 "uuid": "00000000-0000-0000-0000-000000000003", 00:16:24.213 "is_configured": true, 00:16:24.213 "data_offset": 2048, 00:16:24.213 "data_size": 63488 00:16:24.213 } 00:16:24.213 ] 00:16:24.213 }' 00:16:24.213 05:44:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:24.213 05:44:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:16:24.778 05:44:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # 
verify_raid_bdev_properties raid_bdev1 00:16:24.778 05:44:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:16:24.778 05:44:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:16:24.778 05:44:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:16:24.778 05:44:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:16:24.778 05:44:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:16:24.778 05:44:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:16:24.778 05:44:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:16:25.036 [2024-07-26 05:44:39.700223] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:25.036 05:44:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:16:25.036 "name": "raid_bdev1", 00:16:25.036 "aliases": [ 00:16:25.036 "42914638-5683-4930-bd85-233a06262167" 00:16:25.036 ], 00:16:25.036 "product_name": "Raid Volume", 00:16:25.036 "block_size": 512, 00:16:25.036 "num_blocks": 190464, 00:16:25.036 "uuid": "42914638-5683-4930-bd85-233a06262167", 00:16:25.036 "assigned_rate_limits": { 00:16:25.036 "rw_ios_per_sec": 0, 00:16:25.036 "rw_mbytes_per_sec": 0, 00:16:25.036 "r_mbytes_per_sec": 0, 00:16:25.036 "w_mbytes_per_sec": 0 00:16:25.036 }, 00:16:25.036 "claimed": false, 00:16:25.036 "zoned": false, 00:16:25.036 "supported_io_types": { 00:16:25.037 "read": true, 00:16:25.037 "write": true, 00:16:25.037 "unmap": true, 00:16:25.037 "flush": true, 00:16:25.037 "reset": true, 00:16:25.037 "nvme_admin": false, 00:16:25.037 "nvme_io": false, 00:16:25.037 "nvme_io_md": false, 00:16:25.037 "write_zeroes": true, 00:16:25.037 "zcopy": false, 00:16:25.037 
"get_zone_info": false, 00:16:25.037 "zone_management": false, 00:16:25.037 "zone_append": false, 00:16:25.037 "compare": false, 00:16:25.037 "compare_and_write": false, 00:16:25.037 "abort": false, 00:16:25.037 "seek_hole": false, 00:16:25.037 "seek_data": false, 00:16:25.037 "copy": false, 00:16:25.037 "nvme_iov_md": false 00:16:25.037 }, 00:16:25.037 "memory_domains": [ 00:16:25.037 { 00:16:25.037 "dma_device_id": "system", 00:16:25.037 "dma_device_type": 1 00:16:25.037 }, 00:16:25.037 { 00:16:25.037 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:25.037 "dma_device_type": 2 00:16:25.037 }, 00:16:25.037 { 00:16:25.037 "dma_device_id": "system", 00:16:25.037 "dma_device_type": 1 00:16:25.037 }, 00:16:25.037 { 00:16:25.037 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:25.037 "dma_device_type": 2 00:16:25.037 }, 00:16:25.037 { 00:16:25.037 "dma_device_id": "system", 00:16:25.037 "dma_device_type": 1 00:16:25.037 }, 00:16:25.037 { 00:16:25.037 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:25.037 "dma_device_type": 2 00:16:25.037 } 00:16:25.037 ], 00:16:25.037 "driver_specific": { 00:16:25.037 "raid": { 00:16:25.037 "uuid": "42914638-5683-4930-bd85-233a06262167", 00:16:25.037 "strip_size_kb": 64, 00:16:25.037 "state": "online", 00:16:25.037 "raid_level": "raid0", 00:16:25.037 "superblock": true, 00:16:25.037 "num_base_bdevs": 3, 00:16:25.037 "num_base_bdevs_discovered": 3, 00:16:25.037 "num_base_bdevs_operational": 3, 00:16:25.037 "base_bdevs_list": [ 00:16:25.037 { 00:16:25.037 "name": "pt1", 00:16:25.037 "uuid": "00000000-0000-0000-0000-000000000001", 00:16:25.037 "is_configured": true, 00:16:25.037 "data_offset": 2048, 00:16:25.037 "data_size": 63488 00:16:25.037 }, 00:16:25.037 { 00:16:25.037 "name": "pt2", 00:16:25.037 "uuid": "00000000-0000-0000-0000-000000000002", 00:16:25.037 "is_configured": true, 00:16:25.037 "data_offset": 2048, 00:16:25.037 "data_size": 63488 00:16:25.037 }, 00:16:25.037 { 00:16:25.037 "name": "pt3", 00:16:25.037 "uuid": 
"00000000-0000-0000-0000-000000000003", 00:16:25.037 "is_configured": true, 00:16:25.037 "data_offset": 2048, 00:16:25.037 "data_size": 63488 00:16:25.037 } 00:16:25.037 ] 00:16:25.037 } 00:16:25.037 } 00:16:25.037 }' 00:16:25.037 05:44:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:16:25.037 05:44:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:16:25.037 pt2 00:16:25.037 pt3' 00:16:25.037 05:44:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:25.037 05:44:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:16:25.037 05:44:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:25.296 05:44:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:25.296 "name": "pt1", 00:16:25.296 "aliases": [ 00:16:25.296 "00000000-0000-0000-0000-000000000001" 00:16:25.296 ], 00:16:25.296 "product_name": "passthru", 00:16:25.296 "block_size": 512, 00:16:25.296 "num_blocks": 65536, 00:16:25.296 "uuid": "00000000-0000-0000-0000-000000000001", 00:16:25.296 "assigned_rate_limits": { 00:16:25.296 "rw_ios_per_sec": 0, 00:16:25.296 "rw_mbytes_per_sec": 0, 00:16:25.296 "r_mbytes_per_sec": 0, 00:16:25.296 "w_mbytes_per_sec": 0 00:16:25.296 }, 00:16:25.296 "claimed": true, 00:16:25.296 "claim_type": "exclusive_write", 00:16:25.296 "zoned": false, 00:16:25.296 "supported_io_types": { 00:16:25.296 "read": true, 00:16:25.296 "write": true, 00:16:25.296 "unmap": true, 00:16:25.296 "flush": true, 00:16:25.296 "reset": true, 00:16:25.296 "nvme_admin": false, 00:16:25.296 "nvme_io": false, 00:16:25.296 "nvme_io_md": false, 00:16:25.296 "write_zeroes": true, 00:16:25.296 "zcopy": true, 00:16:25.296 "get_zone_info": false, 
00:16:25.296 "zone_management": false, 00:16:25.296 "zone_append": false, 00:16:25.296 "compare": false, 00:16:25.296 "compare_and_write": false, 00:16:25.296 "abort": true, 00:16:25.296 "seek_hole": false, 00:16:25.296 "seek_data": false, 00:16:25.296 "copy": true, 00:16:25.296 "nvme_iov_md": false 00:16:25.296 }, 00:16:25.296 "memory_domains": [ 00:16:25.296 { 00:16:25.296 "dma_device_id": "system", 00:16:25.296 "dma_device_type": 1 00:16:25.296 }, 00:16:25.296 { 00:16:25.296 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:25.296 "dma_device_type": 2 00:16:25.296 } 00:16:25.296 ], 00:16:25.296 "driver_specific": { 00:16:25.296 "passthru": { 00:16:25.296 "name": "pt1", 00:16:25.296 "base_bdev_name": "malloc1" 00:16:25.296 } 00:16:25.296 } 00:16:25.296 }' 00:16:25.296 05:44:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:25.296 05:44:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:25.296 05:44:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:25.296 05:44:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:25.296 05:44:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:25.296 05:44:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:25.296 05:44:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:25.555 05:44:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:25.555 05:44:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:25.555 05:44:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:25.555 05:44:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:25.555 05:44:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:25.555 05:44:40 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:25.555 05:44:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:16:25.555 05:44:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:25.813 05:44:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:25.813 "name": "pt2", 00:16:25.813 "aliases": [ 00:16:25.813 "00000000-0000-0000-0000-000000000002" 00:16:25.813 ], 00:16:25.813 "product_name": "passthru", 00:16:25.813 "block_size": 512, 00:16:25.813 "num_blocks": 65536, 00:16:25.813 "uuid": "00000000-0000-0000-0000-000000000002", 00:16:25.813 "assigned_rate_limits": { 00:16:25.813 "rw_ios_per_sec": 0, 00:16:25.813 "rw_mbytes_per_sec": 0, 00:16:25.813 "r_mbytes_per_sec": 0, 00:16:25.813 "w_mbytes_per_sec": 0 00:16:25.813 }, 00:16:25.813 "claimed": true, 00:16:25.813 "claim_type": "exclusive_write", 00:16:25.813 "zoned": false, 00:16:25.813 "supported_io_types": { 00:16:25.813 "read": true, 00:16:25.813 "write": true, 00:16:25.813 "unmap": true, 00:16:25.813 "flush": true, 00:16:25.813 "reset": true, 00:16:25.813 "nvme_admin": false, 00:16:25.813 "nvme_io": false, 00:16:25.813 "nvme_io_md": false, 00:16:25.813 "write_zeroes": true, 00:16:25.813 "zcopy": true, 00:16:25.813 "get_zone_info": false, 00:16:25.813 "zone_management": false, 00:16:25.813 "zone_append": false, 00:16:25.813 "compare": false, 00:16:25.813 "compare_and_write": false, 00:16:25.813 "abort": true, 00:16:25.813 "seek_hole": false, 00:16:25.813 "seek_data": false, 00:16:25.813 "copy": true, 00:16:25.813 "nvme_iov_md": false 00:16:25.813 }, 00:16:25.813 "memory_domains": [ 00:16:25.813 { 00:16:25.813 "dma_device_id": "system", 00:16:25.813 "dma_device_type": 1 00:16:25.813 }, 00:16:25.813 { 00:16:25.813 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:25.813 
"dma_device_type": 2 00:16:25.813 } 00:16:25.813 ], 00:16:25.813 "driver_specific": { 00:16:25.813 "passthru": { 00:16:25.813 "name": "pt2", 00:16:25.813 "base_bdev_name": "malloc2" 00:16:25.813 } 00:16:25.813 } 00:16:25.813 }' 00:16:25.813 05:44:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:25.813 05:44:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:25.813 05:44:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:25.813 05:44:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:25.813 05:44:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:25.813 05:44:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:25.813 05:44:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:25.813 05:44:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:26.072 05:44:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:26.072 05:44:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:26.072 05:44:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:26.072 05:44:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:26.072 05:44:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:26.072 05:44:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:16:26.072 05:44:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:26.331 05:44:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:26.331 "name": "pt3", 00:16:26.331 "aliases": [ 00:16:26.331 
"00000000-0000-0000-0000-000000000003" 00:16:26.331 ], 00:16:26.331 "product_name": "passthru", 00:16:26.331 "block_size": 512, 00:16:26.331 "num_blocks": 65536, 00:16:26.331 "uuid": "00000000-0000-0000-0000-000000000003", 00:16:26.331 "assigned_rate_limits": { 00:16:26.331 "rw_ios_per_sec": 0, 00:16:26.331 "rw_mbytes_per_sec": 0, 00:16:26.331 "r_mbytes_per_sec": 0, 00:16:26.331 "w_mbytes_per_sec": 0 00:16:26.331 }, 00:16:26.331 "claimed": true, 00:16:26.331 "claim_type": "exclusive_write", 00:16:26.331 "zoned": false, 00:16:26.331 "supported_io_types": { 00:16:26.331 "read": true, 00:16:26.331 "write": true, 00:16:26.331 "unmap": true, 00:16:26.331 "flush": true, 00:16:26.331 "reset": true, 00:16:26.331 "nvme_admin": false, 00:16:26.331 "nvme_io": false, 00:16:26.331 "nvme_io_md": false, 00:16:26.331 "write_zeroes": true, 00:16:26.331 "zcopy": true, 00:16:26.331 "get_zone_info": false, 00:16:26.331 "zone_management": false, 00:16:26.331 "zone_append": false, 00:16:26.331 "compare": false, 00:16:26.331 "compare_and_write": false, 00:16:26.331 "abort": true, 00:16:26.331 "seek_hole": false, 00:16:26.331 "seek_data": false, 00:16:26.331 "copy": true, 00:16:26.331 "nvme_iov_md": false 00:16:26.331 }, 00:16:26.331 "memory_domains": [ 00:16:26.331 { 00:16:26.331 "dma_device_id": "system", 00:16:26.331 "dma_device_type": 1 00:16:26.331 }, 00:16:26.331 { 00:16:26.331 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:26.331 "dma_device_type": 2 00:16:26.331 } 00:16:26.331 ], 00:16:26.331 "driver_specific": { 00:16:26.331 "passthru": { 00:16:26.331 "name": "pt3", 00:16:26.331 "base_bdev_name": "malloc3" 00:16:26.331 } 00:16:26.331 } 00:16:26.331 }' 00:16:26.331 05:44:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:26.331 05:44:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:26.331 05:44:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:26.331 05:44:41 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:26.331 05:44:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:26.588 05:44:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:26.588 05:44:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:26.588 05:44:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:26.588 05:44:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:26.588 05:44:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:26.588 05:44:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:26.847 05:44:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:26.847 05:44:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:16:26.847 05:44:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:16:26.847 [2024-07-26 05:44:41.737626] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:27.106 05:44:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' 42914638-5683-4930-bd85-233a06262167 '!=' 42914638-5683-4930-bd85-233a06262167 ']' 00:16:27.106 05:44:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy raid0 00:16:27.106 05:44:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:16:27.106 05:44:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:16:27.106 05:44:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 1156244 00:16:27.106 05:44:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 1156244 ']' 00:16:27.106 05:44:41 
bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 1156244 00:16:27.106 05:44:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:16:27.106 05:44:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:16:27.106 05:44:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1156244 00:16:27.106 05:44:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:16:27.106 05:44:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:16:27.106 05:44:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1156244' 00:16:27.106 killing process with pid 1156244 00:16:27.106 05:44:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 1156244 00:16:27.106 [2024-07-26 05:44:41.798044] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:16:27.106 [2024-07-26 05:44:41.798093] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:27.106 [2024-07-26 05:44:41.798144] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:16:27.106 [2024-07-26 05:44:41.798155] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x18c7c00 name raid_bdev1, state offline 00:16:27.106 05:44:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 1156244 00:16:27.106 [2024-07-26 05:44:41.824474] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:16:27.365 05:44:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:16:27.365 00:16:27.365 real 0m14.071s 00:16:27.365 user 0m25.471s 00:16:27.365 sys 0m2.502s 00:16:27.365 05:44:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:16:27.365 05:44:42 
bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:16:27.365 ************************************ 00:16:27.365 END TEST raid_superblock_test 00:16:27.365 ************************************ 00:16:27.365 05:44:42 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:16:27.365 05:44:42 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test raid0 3 read 00:16:27.365 05:44:42 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:16:27.365 05:44:42 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:16:27.365 05:44:42 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:16:27.365 ************************************ 00:16:27.365 START TEST raid_read_error_test 00:16:27.365 ************************************ 00:16:27.365 05:44:42 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid0 3 read 00:16:27.365 05:44:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid0 00:16:27.365 05:44:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=3 00:16:27.365 05:44:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:16:27.365 05:44:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:16:27.365 05:44:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:16:27.365 05:44:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:16:27.365 05:44:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:16:27.365 05:44:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:16:27.365 05:44:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:16:27.365 05:44:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:16:27.365 05:44:42 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:16:27.365 05:44:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:16:27.365 05:44:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:16:27.365 05:44:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:16:27.365 05:44:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:16:27.365 05:44:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:16:27.365 05:44:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:16:27.365 05:44:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:16:27.365 05:44:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:16:27.365 05:44:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:16:27.365 05:44:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:16:27.365 05:44:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid0 '!=' raid1 ']' 00:16:27.365 05:44:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:16:27.365 05:44:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:16:27.365 05:44:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:16:27.365 05:44:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.lUoutJQKoH 00:16:27.365 05:44:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=1158330 00:16:27.365 05:44:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 1158330 /var/tmp/spdk-raid.sock 00:16:27.365 05:44:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:16:27.365 05:44:42 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 1158330 ']' 00:16:27.365 05:44:42 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:16:27.366 05:44:42 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:16:27.366 05:44:42 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:16:27.366 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:16:27.366 05:44:42 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:16:27.366 05:44:42 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:16:27.366 [2024-07-26 05:44:42.169691] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 
00:16:27.366 [2024-07-26 05:44:42.169756] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1158330 ] 00:16:27.625 [2024-07-26 05:44:42.292507] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:16:27.625 [2024-07-26 05:44:42.397601] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:16:27.625 [2024-07-26 05:44:42.465029] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:27.625 [2024-07-26 05:44:42.465073] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:28.191 05:44:43 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:16:28.191 05:44:43 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:16:28.191 05:44:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:16:28.191 05:44:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:16:28.450 BaseBdev1_malloc 00:16:28.450 05:44:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:16:28.709 true 00:16:28.709 05:44:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:16:28.967 [2024-07-26 05:44:43.643595] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:16:28.967 [2024-07-26 05:44:43.643645] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev 
opened 00:16:28.967 [2024-07-26 05:44:43.643669] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xfb00d0 00:16:28.967 [2024-07-26 05:44:43.643682] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:28.967 [2024-07-26 05:44:43.645516] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:28.967 [2024-07-26 05:44:43.645546] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:16:28.967 BaseBdev1 00:16:28.967 05:44:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:16:28.967 05:44:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:16:29.226 BaseBdev2_malloc 00:16:29.226 05:44:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:16:29.226 true 00:16:29.226 05:44:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:16:29.485 [2024-07-26 05:44:44.297849] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:16:29.485 [2024-07-26 05:44:44.297898] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:29.485 [2024-07-26 05:44:44.297922] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xfb4910 00:16:29.485 [2024-07-26 05:44:44.297935] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:29.485 [2024-07-26 05:44:44.299564] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:29.485 [2024-07-26 05:44:44.299593] 
vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:16:29.485 BaseBdev2 00:16:29.485 05:44:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:16:29.485 05:44:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:16:30.052 BaseBdev3_malloc 00:16:30.052 05:44:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:16:30.311 true 00:16:30.311 05:44:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:16:30.311 [2024-07-26 05:44:45.205889] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:16:30.311 [2024-07-26 05:44:45.205935] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:30.311 [2024-07-26 05:44:45.205957] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xfb6bd0 00:16:30.311 [2024-07-26 05:44:45.205970] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:30.311 [2024-07-26 05:44:45.207520] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:30.311 [2024-07-26 05:44:45.207549] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:16:30.311 BaseBdev3 00:16:30.570 05:44:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:16:30.570 [2024-07-26 05:44:45.370363] 
bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:30.570 [2024-07-26 05:44:45.371694] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:30.570 [2024-07-26 05:44:45.371766] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:16:30.570 [2024-07-26 05:44:45.371976] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0xfb8280 00:16:30.570 [2024-07-26 05:44:45.371989] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:16:30.570 [2024-07-26 05:44:45.372183] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xfb7e20 00:16:30.570 [2024-07-26 05:44:45.372330] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xfb8280 00:16:30.570 [2024-07-26 05:44:45.372339] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xfb8280 00:16:30.570 [2024-07-26 05:44:45.372441] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:30.570 05:44:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:16:30.570 05:44:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:16:30.570 05:44:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:30.570 05:44:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:16:30.570 05:44:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:30.570 05:44:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:30.570 05:44:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:30.570 05:44:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:30.570 
05:44:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:30.570 05:44:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:30.570 05:44:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:30.570 05:44:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:16:31.137 05:44:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:31.137 "name": "raid_bdev1", 00:16:31.137 "uuid": "08ee223c-afcf-43f7-9a45-45264664d115", 00:16:31.137 "strip_size_kb": 64, 00:16:31.137 "state": "online", 00:16:31.137 "raid_level": "raid0", 00:16:31.137 "superblock": true, 00:16:31.137 "num_base_bdevs": 3, 00:16:31.137 "num_base_bdevs_discovered": 3, 00:16:31.137 "num_base_bdevs_operational": 3, 00:16:31.137 "base_bdevs_list": [ 00:16:31.137 { 00:16:31.137 "name": "BaseBdev1", 00:16:31.137 "uuid": "832123c1-e2ab-5f38-b2fd-842069616b28", 00:16:31.137 "is_configured": true, 00:16:31.137 "data_offset": 2048, 00:16:31.137 "data_size": 63488 00:16:31.137 }, 00:16:31.137 { 00:16:31.137 "name": "BaseBdev2", 00:16:31.137 "uuid": "d453c8f0-4ef8-5403-a47a-f0bd136686f7", 00:16:31.137 "is_configured": true, 00:16:31.137 "data_offset": 2048, 00:16:31.137 "data_size": 63488 00:16:31.137 }, 00:16:31.137 { 00:16:31.137 "name": "BaseBdev3", 00:16:31.137 "uuid": "1e47b493-495b-56e8-8366-cb2fd17154e6", 00:16:31.137 "is_configured": true, 00:16:31.137 "data_offset": 2048, 00:16:31.137 "data_size": 63488 00:16:31.137 } 00:16:31.137 ] 00:16:31.137 }' 00:16:31.137 05:44:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:31.137 05:44:45 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:16:31.704 05:44:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- 
# sleep 1 00:16:31.704 05:44:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:16:31.704 [2024-07-26 05:44:46.565809] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xe065b0 00:16:32.640 05:44:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:16:32.898 05:44:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:16:32.898 05:44:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid0 = \r\a\i\d\1 ]] 00:16:32.898 05:44:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=3 00:16:32.898 05:44:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:16:32.898 05:44:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:16:32.898 05:44:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:32.898 05:44:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:16:32.898 05:44:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:32.898 05:44:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:32.898 05:44:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:32.898 05:44:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:32.898 05:44:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:32.898 05:44:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 
00:16:32.898 05:44:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:16:32.898 05:44:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:33.156 05:44:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:33.156 "name": "raid_bdev1", 00:16:33.156 "uuid": "08ee223c-afcf-43f7-9a45-45264664d115", 00:16:33.156 "strip_size_kb": 64, 00:16:33.156 "state": "online", 00:16:33.156 "raid_level": "raid0", 00:16:33.156 "superblock": true, 00:16:33.156 "num_base_bdevs": 3, 00:16:33.156 "num_base_bdevs_discovered": 3, 00:16:33.156 "num_base_bdevs_operational": 3, 00:16:33.156 "base_bdevs_list": [ 00:16:33.156 { 00:16:33.156 "name": "BaseBdev1", 00:16:33.156 "uuid": "832123c1-e2ab-5f38-b2fd-842069616b28", 00:16:33.156 "is_configured": true, 00:16:33.156 "data_offset": 2048, 00:16:33.156 "data_size": 63488 00:16:33.156 }, 00:16:33.156 { 00:16:33.156 "name": "BaseBdev2", 00:16:33.156 "uuid": "d453c8f0-4ef8-5403-a47a-f0bd136686f7", 00:16:33.156 "is_configured": true, 00:16:33.156 "data_offset": 2048, 00:16:33.156 "data_size": 63488 00:16:33.156 }, 00:16:33.156 { 00:16:33.156 "name": "BaseBdev3", 00:16:33.156 "uuid": "1e47b493-495b-56e8-8366-cb2fd17154e6", 00:16:33.156 "is_configured": true, 00:16:33.156 "data_offset": 2048, 00:16:33.156 "data_size": 63488 00:16:33.156 } 00:16:33.156 ] 00:16:33.156 }' 00:16:33.156 05:44:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:33.156 05:44:47 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:16:33.724 05:44:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:16:33.983 [2024-07-26 05:44:48.734107] 
bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:16:33.983 [2024-07-26 05:44:48.734148] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:16:33.983 [2024-07-26 05:44:48.737297] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:33.983 [2024-07-26 05:44:48.737334] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:33.983 [2024-07-26 05:44:48.737368] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:16:33.983 [2024-07-26 05:44:48.737379] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xfb8280 name raid_bdev1, state offline 00:16:33.983 0 00:16:33.983 05:44:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 1158330 00:16:33.983 05:44:48 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 1158330 ']' 00:16:33.983 05:44:48 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 1158330 00:16:33.983 05:44:48 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:16:33.983 05:44:48 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:16:33.983 05:44:48 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1158330 00:16:33.983 05:44:48 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:16:33.983 05:44:48 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:16:33.983 05:44:48 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1158330' 00:16:33.983 killing process with pid 1158330 00:16:33.983 05:44:48 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 1158330 00:16:33.983 [2024-07-26 05:44:48.801528] bdev_raid.c:1373:raid_bdev_fini_start: 
*DEBUG*: raid_bdev_fini_start 00:16:33.983 05:44:48 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 1158330 00:16:33.983 [2024-07-26 05:44:48.822352] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:16:34.243 05:44:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.lUoutJQKoH 00:16:34.243 05:44:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:16:34.243 05:44:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:16:34.243 05:44:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.46 00:16:34.243 05:44:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid0 00:16:34.243 05:44:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:16:34.243 05:44:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:16:34.243 05:44:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.46 != \0\.\0\0 ]] 00:16:34.243 00:16:34.243 real 0m6.966s 00:16:34.243 user 0m11.028s 00:16:34.243 sys 0m1.210s 00:16:34.243 05:44:49 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:16:34.243 05:44:49 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:16:34.243 ************************************ 00:16:34.243 END TEST raid_read_error_test 00:16:34.243 ************************************ 00:16:34.243 05:44:49 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:16:34.243 05:44:49 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test raid0 3 write 00:16:34.243 05:44:49 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:16:34.243 05:44:49 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:16:34.243 05:44:49 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:16:34.243 ************************************ 
00:16:34.243 START TEST raid_write_error_test 00:16:34.243 ************************************ 00:16:34.243 05:44:49 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid0 3 write 00:16:34.243 05:44:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid0 00:16:34.243 05:44:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=3 00:16:34.243 05:44:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:16:34.243 05:44:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:16:34.243 05:44:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:16:34.243 05:44:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:16:34.243 05:44:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:16:34.243 05:44:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:16:34.243 05:44:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:16:34.243 05:44:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:16:34.243 05:44:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:16:34.243 05:44:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:16:34.243 05:44:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:16:34.243 05:44:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:16:34.243 05:44:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:16:34.243 05:44:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:16:34.243 05:44:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local 
raid_bdev_name=raid_bdev1 00:16:34.243 05:44:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:16:34.243 05:44:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:16:34.243 05:44:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:16:34.243 05:44:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:16:34.243 05:44:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid0 '!=' raid1 ']' 00:16:34.243 05:44:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:16:34.243 05:44:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:16:34.243 05:44:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:16:34.243 05:44:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.Tmw8PqXUcx 00:16:34.243 05:44:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=1159310 00:16:34.243 05:44:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 1159310 /var/tmp/spdk-raid.sock 00:16:34.243 05:44:49 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 1159310 ']' 00:16:34.243 05:44:49 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:16:34.243 05:44:49 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:16:34.243 05:44:49 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:16:34.243 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
00:16:34.243 05:44:49 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:16:34.243 05:44:49 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:16:34.243 05:44:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:16:34.503 [2024-07-26 05:44:49.201515] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 00:16:34.503 [2024-07-26 05:44:49.201586] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1159310 ] 00:16:34.503 [2024-07-26 05:44:49.333722] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:16:34.762 [2024-07-26 05:44:49.444707] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:16:34.762 [2024-07-26 05:44:49.510896] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:34.762 [2024-07-26 05:44:49.510925] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:35.330 05:44:50 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:16:35.330 05:44:50 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:16:35.330 05:44:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:16:35.330 05:44:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:16:35.589 BaseBdev1_malloc 00:16:35.589 05:44:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:16:35.589 true 00:16:35.589 05:44:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:16:35.848 [2024-07-26 05:44:50.633221] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:16:35.848 [2024-07-26 05:44:50.633267] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:35.848 [2024-07-26 05:44:50.633289] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x210a0d0 00:16:35.848 [2024-07-26 05:44:50.633302] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:35.848 [2024-07-26 05:44:50.635188] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:35.848 [2024-07-26 05:44:50.635217] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:16:35.848 BaseBdev1 00:16:35.848 05:44:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:16:35.848 05:44:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:16:36.107 BaseBdev2_malloc 00:16:36.107 05:44:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:16:36.366 true 00:16:36.366 05:44:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:16:36.625 [2024-07-26 05:44:51.332898] 
vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:16:36.625 [2024-07-26 05:44:51.332944] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:36.625 [2024-07-26 05:44:51.332966] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x210e910 00:16:36.625 [2024-07-26 05:44:51.332979] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:36.625 [2024-07-26 05:44:51.334540] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:36.625 [2024-07-26 05:44:51.334568] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:16:36.625 BaseBdev2 00:16:36.625 05:44:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:16:36.625 05:44:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:16:37.193 BaseBdev3_malloc 00:16:37.194 05:44:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:16:37.194 true 00:16:37.452 05:44:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:16:37.452 [2024-07-26 05:44:52.313262] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:16:37.452 [2024-07-26 05:44:52.313309] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:37.452 [2024-07-26 05:44:52.313330] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2110bd0 00:16:37.452 [2024-07-26 05:44:52.313343] vbdev_passthru.c: 
696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:37.452 [2024-07-26 05:44:52.314933] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:37.453 [2024-07-26 05:44:52.314964] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:16:37.453 BaseBdev3 00:16:37.453 05:44:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:16:38.020 [2024-07-26 05:44:52.810594] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:38.020 [2024-07-26 05:44:52.811931] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:38.020 [2024-07-26 05:44:52.812000] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:16:38.020 [2024-07-26 05:44:52.812206] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x2112280 00:16:38.020 [2024-07-26 05:44:52.812218] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:16:38.020 [2024-07-26 05:44:52.812419] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2111e20 00:16:38.020 [2024-07-26 05:44:52.812565] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2112280 00:16:38.020 [2024-07-26 05:44:52.812580] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2112280 00:16:38.020 [2024-07-26 05:44:52.812702] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:38.020 05:44:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:16:38.020 05:44:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:16:38.020 05:44:52 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:38.020 05:44:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:16:38.020 05:44:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:38.020 05:44:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:38.020 05:44:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:38.020 05:44:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:38.020 05:44:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:38.020 05:44:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:38.020 05:44:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:38.020 05:44:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:16:38.279 05:44:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:38.279 "name": "raid_bdev1", 00:16:38.279 "uuid": "837687ce-d39f-4996-836b-50290b66f66a", 00:16:38.279 "strip_size_kb": 64, 00:16:38.279 "state": "online", 00:16:38.279 "raid_level": "raid0", 00:16:38.279 "superblock": true, 00:16:38.279 "num_base_bdevs": 3, 00:16:38.279 "num_base_bdevs_discovered": 3, 00:16:38.279 "num_base_bdevs_operational": 3, 00:16:38.279 "base_bdevs_list": [ 00:16:38.279 { 00:16:38.279 "name": "BaseBdev1", 00:16:38.279 "uuid": "7dd16ea9-16a5-571a-9233-a840c7f5965e", 00:16:38.279 "is_configured": true, 00:16:38.279 "data_offset": 2048, 00:16:38.279 "data_size": 63488 00:16:38.279 }, 00:16:38.279 { 00:16:38.279 "name": "BaseBdev2", 00:16:38.279 "uuid": "d4ca90d2-64fa-5428-9b0d-3e1a665507ee", 
00:16:38.279 "is_configured": true, 00:16:38.279 "data_offset": 2048, 00:16:38.279 "data_size": 63488 00:16:38.279 }, 00:16:38.279 { 00:16:38.279 "name": "BaseBdev3", 00:16:38.279 "uuid": "02eab942-1ea1-5dab-ab1f-77dc99b49478", 00:16:38.279 "is_configured": true, 00:16:38.279 "data_offset": 2048, 00:16:38.279 "data_size": 63488 00:16:38.279 } 00:16:38.279 ] 00:16:38.279 }' 00:16:38.279 05:44:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:38.279 05:44:53 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:16:39.216 05:44:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:16:39.216 05:44:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:16:39.216 [2024-07-26 05:44:54.062190] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1f605b0 00:16:40.152 05:44:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:16:40.412 05:44:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:16:40.412 05:44:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid0 = \r\a\i\d\1 ]] 00:16:40.412 05:44:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=3 00:16:40.412 05:44:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:16:40.412 05:44:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:16:40.412 05:44:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:40.412 05:44:55 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:16:40.412 05:44:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:40.412 05:44:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:40.412 05:44:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:40.412 05:44:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:40.412 05:44:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:40.412 05:44:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:40.412 05:44:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:40.412 05:44:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:16:40.671 05:44:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:40.671 "name": "raid_bdev1", 00:16:40.671 "uuid": "837687ce-d39f-4996-836b-50290b66f66a", 00:16:40.671 "strip_size_kb": 64, 00:16:40.671 "state": "online", 00:16:40.671 "raid_level": "raid0", 00:16:40.671 "superblock": true, 00:16:40.671 "num_base_bdevs": 3, 00:16:40.671 "num_base_bdevs_discovered": 3, 00:16:40.671 "num_base_bdevs_operational": 3, 00:16:40.671 "base_bdevs_list": [ 00:16:40.671 { 00:16:40.671 "name": "BaseBdev1", 00:16:40.671 "uuid": "7dd16ea9-16a5-571a-9233-a840c7f5965e", 00:16:40.671 "is_configured": true, 00:16:40.671 "data_offset": 2048, 00:16:40.671 "data_size": 63488 00:16:40.671 }, 00:16:40.671 { 00:16:40.671 "name": "BaseBdev2", 00:16:40.671 "uuid": "d4ca90d2-64fa-5428-9b0d-3e1a665507ee", 00:16:40.671 "is_configured": true, 00:16:40.671 "data_offset": 2048, 00:16:40.671 "data_size": 63488 00:16:40.671 }, 00:16:40.671 { 00:16:40.671 
"name": "BaseBdev3", 00:16:40.671 "uuid": "02eab942-1ea1-5dab-ab1f-77dc99b49478", 00:16:40.671 "is_configured": true, 00:16:40.671 "data_offset": 2048, 00:16:40.671 "data_size": 63488 00:16:40.671 } 00:16:40.671 ] 00:16:40.671 }' 00:16:40.671 05:44:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:40.671 05:44:55 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:16:41.240 05:44:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:16:41.499 [2024-07-26 05:44:56.291613] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:16:41.499 [2024-07-26 05:44:56.291654] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:16:41.499 [2024-07-26 05:44:56.294864] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:41.499 [2024-07-26 05:44:56.294900] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:41.499 [2024-07-26 05:44:56.294936] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:16:41.499 [2024-07-26 05:44:56.294947] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2112280 name raid_bdev1, state offline 00:16:41.499 0 00:16:41.499 05:44:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 1159310 00:16:41.499 05:44:56 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 1159310 ']' 00:16:41.499 05:44:56 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 1159310 00:16:41.499 05:44:56 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:16:41.499 05:44:56 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:16:41.499 05:44:56 
bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1159310 00:16:41.499 05:44:56 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:16:41.499 05:44:56 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:16:41.499 05:44:56 bdev_raid.raid_write_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1159310' 00:16:41.499 killing process with pid 1159310 00:16:41.499 05:44:56 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 1159310 00:16:41.499 [2024-07-26 05:44:56.361727] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:16:41.499 05:44:56 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 1159310 00:16:41.499 [2024-07-26 05:44:56.382707] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:16:41.769 05:44:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.Tmw8PqXUcx 00:16:41.769 05:44:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:16:41.769 05:44:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:16:41.769 05:44:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.45 00:16:41.769 05:44:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid0 00:16:41.769 05:44:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:16:41.769 05:44:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:16:41.769 05:44:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.45 != \0\.\0\0 ]] 00:16:41.769 00:16:41.769 real 0m7.495s 00:16:41.769 user 0m12.035s 00:16:41.769 sys 0m1.269s 00:16:41.769 05:44:56 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:16:41.769 05:44:56 
bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:16:41.770 ************************************ 00:16:41.770 END TEST raid_write_error_test 00:16:41.770 ************************************ 00:16:42.068 05:44:56 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:16:42.068 05:44:56 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:16:42.068 05:44:56 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test concat 3 false 00:16:42.068 05:44:56 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:16:42.068 05:44:56 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:16:42.068 05:44:56 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:16:42.068 ************************************ 00:16:42.068 START TEST raid_state_function_test 00:16:42.068 ************************************ 00:16:42.068 05:44:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test concat 3 false 00:16:42.068 05:44:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:16:42.068 05:44:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:16:42.068 05:44:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:16:42.068 05:44:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:16:42.068 05:44:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:16:42.068 05:44:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:16:42.068 05:44:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:16:42.068 05:44:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:16:42.068 05:44:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( 
i <= num_base_bdevs )) 00:16:42.068 05:44:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:16:42.068 05:44:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:16:42.068 05:44:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:16:42.068 05:44:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:16:42.068 05:44:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:16:42.068 05:44:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:16:42.068 05:44:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:16:42.068 05:44:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:16:42.068 05:44:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:16:42.068 05:44:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:16:42.068 05:44:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:16:42.068 05:44:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:16:42.068 05:44:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:16:42.068 05:44:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:16:42.068 05:44:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:16:42.068 05:44:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:16:42.068 05:44:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:16:42.068 05:44:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=1160454 
00:16:42.068 05:44:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1160454' 00:16:42.068 Process raid pid: 1160454 00:16:42.068 05:44:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:16:42.068 05:44:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 1160454 /var/tmp/spdk-raid.sock 00:16:42.068 05:44:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 1160454 ']' 00:16:42.068 05:44:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:16:42.068 05:44:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:16:42.068 05:44:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:16:42.068 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:16:42.068 05:44:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:16:42.068 05:44:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:42.068 [2024-07-26 05:44:56.777314] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 
00:16:42.068 [2024-07-26 05:44:56.777368] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:16:42.068 [2024-07-26 05:44:56.891575] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:16:42.331 [2024-07-26 05:44:56.994781] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:16:42.331 [2024-07-26 05:44:57.062575] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:42.331 [2024-07-26 05:44:57.062612] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:42.898 05:44:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:16:42.898 05:44:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:16:42.898 05:44:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:16:43.466 [2024-07-26 05:44:58.197427] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:16:43.466 [2024-07-26 05:44:58.197472] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:16:43.466 [2024-07-26 05:44:58.197483] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:16:43.466 [2024-07-26 05:44:58.197495] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:16:43.466 [2024-07-26 05:44:58.197504] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:16:43.466 [2024-07-26 05:44:58.197515] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:16:43.466 
05:44:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:16:43.466 05:44:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:43.466 05:44:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:43.466 05:44:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:43.466 05:44:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:43.466 05:44:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:43.466 05:44:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:43.466 05:44:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:43.466 05:44:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:43.466 05:44:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:43.466 05:44:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:43.466 05:44:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:43.725 05:44:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:43.725 "name": "Existed_Raid", 00:16:43.725 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:43.725 "strip_size_kb": 64, 00:16:43.725 "state": "configuring", 00:16:43.725 "raid_level": "concat", 00:16:43.725 "superblock": false, 00:16:43.725 "num_base_bdevs": 3, 00:16:43.725 "num_base_bdevs_discovered": 0, 00:16:43.725 "num_base_bdevs_operational": 3, 00:16:43.726 "base_bdevs_list": [ 00:16:43.726 { 
00:16:43.726 "name": "BaseBdev1", 00:16:43.726 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:43.726 "is_configured": false, 00:16:43.726 "data_offset": 0, 00:16:43.726 "data_size": 0 00:16:43.726 }, 00:16:43.726 { 00:16:43.726 "name": "BaseBdev2", 00:16:43.726 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:43.726 "is_configured": false, 00:16:43.726 "data_offset": 0, 00:16:43.726 "data_size": 0 00:16:43.726 }, 00:16:43.726 { 00:16:43.726 "name": "BaseBdev3", 00:16:43.726 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:43.726 "is_configured": false, 00:16:43.726 "data_offset": 0, 00:16:43.726 "data_size": 0 00:16:43.726 } 00:16:43.726 ] 00:16:43.726 }' 00:16:43.726 05:44:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:43.726 05:44:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:44.292 05:44:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:16:44.551 [2024-07-26 05:44:59.207971] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:16:44.551 [2024-07-26 05:44:59.208002] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x24c6a80 name Existed_Raid, state configuring 00:16:44.551 05:44:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:16:44.551 [2024-07-26 05:44:59.448617] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:16:44.551 [2024-07-26 05:44:59.448653] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:16:44.551 [2024-07-26 05:44:59.448663] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with 
name: BaseBdev2 00:16:44.551 [2024-07-26 05:44:59.448675] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:16:44.551 [2024-07-26 05:44:59.448683] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:16:44.551 [2024-07-26 05:44:59.448694] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:16:44.810 05:44:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:16:44.810 [2024-07-26 05:44:59.703130] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:44.810 BaseBdev1 00:16:45.069 05:44:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:16:45.069 05:44:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:16:45.069 05:44:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:45.069 05:44:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:16:45.069 05:44:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:45.069 05:44:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:45.069 05:44:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:45.069 05:44:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:16:45.328 [ 00:16:45.328 { 00:16:45.328 "name": "BaseBdev1", 00:16:45.328 "aliases": [ 00:16:45.328 
"cf5759e3-4275-4668-ad7a-186a5bf802dc" 00:16:45.328 ], 00:16:45.328 "product_name": "Malloc disk", 00:16:45.328 "block_size": 512, 00:16:45.328 "num_blocks": 65536, 00:16:45.328 "uuid": "cf5759e3-4275-4668-ad7a-186a5bf802dc", 00:16:45.328 "assigned_rate_limits": { 00:16:45.328 "rw_ios_per_sec": 0, 00:16:45.328 "rw_mbytes_per_sec": 0, 00:16:45.328 "r_mbytes_per_sec": 0, 00:16:45.328 "w_mbytes_per_sec": 0 00:16:45.328 }, 00:16:45.328 "claimed": true, 00:16:45.328 "claim_type": "exclusive_write", 00:16:45.328 "zoned": false, 00:16:45.328 "supported_io_types": { 00:16:45.328 "read": true, 00:16:45.328 "write": true, 00:16:45.328 "unmap": true, 00:16:45.328 "flush": true, 00:16:45.328 "reset": true, 00:16:45.328 "nvme_admin": false, 00:16:45.328 "nvme_io": false, 00:16:45.328 "nvme_io_md": false, 00:16:45.328 "write_zeroes": true, 00:16:45.328 "zcopy": true, 00:16:45.328 "get_zone_info": false, 00:16:45.328 "zone_management": false, 00:16:45.328 "zone_append": false, 00:16:45.328 "compare": false, 00:16:45.328 "compare_and_write": false, 00:16:45.328 "abort": true, 00:16:45.328 "seek_hole": false, 00:16:45.328 "seek_data": false, 00:16:45.328 "copy": true, 00:16:45.328 "nvme_iov_md": false 00:16:45.328 }, 00:16:45.328 "memory_domains": [ 00:16:45.328 { 00:16:45.328 "dma_device_id": "system", 00:16:45.328 "dma_device_type": 1 00:16:45.328 }, 00:16:45.328 { 00:16:45.328 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:45.328 "dma_device_type": 2 00:16:45.328 } 00:16:45.328 ], 00:16:45.328 "driver_specific": {} 00:16:45.328 } 00:16:45.328 ] 00:16:45.328 05:45:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:16:45.328 05:45:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:16:45.328 05:45:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:45.328 05:45:00 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:45.328 05:45:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:45.328 05:45:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:45.328 05:45:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:45.328 05:45:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:45.328 05:45:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:45.328 05:45:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:45.328 05:45:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:45.328 05:45:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:45.328 05:45:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:45.587 05:45:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:45.587 "name": "Existed_Raid", 00:16:45.587 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:45.587 "strip_size_kb": 64, 00:16:45.587 "state": "configuring", 00:16:45.587 "raid_level": "concat", 00:16:45.587 "superblock": false, 00:16:45.587 "num_base_bdevs": 3, 00:16:45.587 "num_base_bdevs_discovered": 1, 00:16:45.587 "num_base_bdevs_operational": 3, 00:16:45.587 "base_bdevs_list": [ 00:16:45.587 { 00:16:45.587 "name": "BaseBdev1", 00:16:45.587 "uuid": "cf5759e3-4275-4668-ad7a-186a5bf802dc", 00:16:45.587 "is_configured": true, 00:16:45.587 "data_offset": 0, 00:16:45.587 "data_size": 65536 00:16:45.587 }, 00:16:45.587 { 00:16:45.587 "name": "BaseBdev2", 00:16:45.587 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:16:45.587 "is_configured": false, 00:16:45.587 "data_offset": 0, 00:16:45.587 "data_size": 0 00:16:45.587 }, 00:16:45.587 { 00:16:45.587 "name": "BaseBdev3", 00:16:45.587 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:45.587 "is_configured": false, 00:16:45.587 "data_offset": 0, 00:16:45.587 "data_size": 0 00:16:45.587 } 00:16:45.587 ] 00:16:45.587 }' 00:16:45.587 05:45:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:45.587 05:45:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:46.154 05:45:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:16:46.721 [2024-07-26 05:45:01.475832] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:16:46.721 [2024-07-26 05:45:01.475874] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x24c6310 name Existed_Raid, state configuring 00:16:46.721 05:45:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:16:46.979 [2024-07-26 05:45:01.732546] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:46.979 [2024-07-26 05:45:01.734059] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:16:46.979 [2024-07-26 05:45:01.734098] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:16:46.979 [2024-07-26 05:45:01.734108] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:16:46.979 [2024-07-26 05:45:01.734120] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 
00:16:46.979 05:45:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:16:46.979 05:45:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:16:46.979 05:45:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:16:46.979 05:45:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:46.979 05:45:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:46.979 05:45:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:46.979 05:45:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:46.980 05:45:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:46.980 05:45:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:46.980 05:45:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:46.980 05:45:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:46.980 05:45:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:46.980 05:45:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:46.980 05:45:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:47.238 05:45:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:47.238 "name": "Existed_Raid", 00:16:47.238 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:47.238 "strip_size_kb": 64, 00:16:47.238 "state": "configuring", 00:16:47.238 
"raid_level": "concat", 00:16:47.238 "superblock": false, 00:16:47.238 "num_base_bdevs": 3, 00:16:47.238 "num_base_bdevs_discovered": 1, 00:16:47.238 "num_base_bdevs_operational": 3, 00:16:47.238 "base_bdevs_list": [ 00:16:47.238 { 00:16:47.238 "name": "BaseBdev1", 00:16:47.238 "uuid": "cf5759e3-4275-4668-ad7a-186a5bf802dc", 00:16:47.238 "is_configured": true, 00:16:47.238 "data_offset": 0, 00:16:47.238 "data_size": 65536 00:16:47.238 }, 00:16:47.238 { 00:16:47.238 "name": "BaseBdev2", 00:16:47.238 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:47.238 "is_configured": false, 00:16:47.238 "data_offset": 0, 00:16:47.238 "data_size": 0 00:16:47.238 }, 00:16:47.238 { 00:16:47.238 "name": "BaseBdev3", 00:16:47.238 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:47.238 "is_configured": false, 00:16:47.238 "data_offset": 0, 00:16:47.238 "data_size": 0 00:16:47.238 } 00:16:47.238 ] 00:16:47.238 }' 00:16:47.238 05:45:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:47.238 05:45:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:47.804 05:45:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:16:48.063 [2024-07-26 05:45:02.830851] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:48.063 BaseBdev2 00:16:48.063 05:45:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:16:48.063 05:45:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:16:48.063 05:45:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:48.063 05:45:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:16:48.063 05:45:02 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:48.063 05:45:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:48.063 05:45:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:48.321 05:45:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:16:48.593 [ 00:16:48.593 { 00:16:48.593 "name": "BaseBdev2", 00:16:48.593 "aliases": [ 00:16:48.593 "43c7e974-8bd4-4d74-9e99-e873ec8d7553" 00:16:48.593 ], 00:16:48.593 "product_name": "Malloc disk", 00:16:48.593 "block_size": 512, 00:16:48.593 "num_blocks": 65536, 00:16:48.593 "uuid": "43c7e974-8bd4-4d74-9e99-e873ec8d7553", 00:16:48.593 "assigned_rate_limits": { 00:16:48.593 "rw_ios_per_sec": 0, 00:16:48.593 "rw_mbytes_per_sec": 0, 00:16:48.593 "r_mbytes_per_sec": 0, 00:16:48.593 "w_mbytes_per_sec": 0 00:16:48.593 }, 00:16:48.593 "claimed": true, 00:16:48.593 "claim_type": "exclusive_write", 00:16:48.593 "zoned": false, 00:16:48.593 "supported_io_types": { 00:16:48.593 "read": true, 00:16:48.594 "write": true, 00:16:48.594 "unmap": true, 00:16:48.594 "flush": true, 00:16:48.594 "reset": true, 00:16:48.594 "nvme_admin": false, 00:16:48.594 "nvme_io": false, 00:16:48.594 "nvme_io_md": false, 00:16:48.594 "write_zeroes": true, 00:16:48.594 "zcopy": true, 00:16:48.594 "get_zone_info": false, 00:16:48.594 "zone_management": false, 00:16:48.594 "zone_append": false, 00:16:48.594 "compare": false, 00:16:48.594 "compare_and_write": false, 00:16:48.594 "abort": true, 00:16:48.594 "seek_hole": false, 00:16:48.594 "seek_data": false, 00:16:48.594 "copy": true, 00:16:48.594 "nvme_iov_md": false 00:16:48.594 }, 00:16:48.594 "memory_domains": [ 00:16:48.594 { 00:16:48.594 "dma_device_id": "system", 
00:16:48.594 "dma_device_type": 1 00:16:48.594 }, 00:16:48.594 { 00:16:48.594 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:48.594 "dma_device_type": 2 00:16:48.594 } 00:16:48.594 ], 00:16:48.594 "driver_specific": {} 00:16:48.594 } 00:16:48.594 ] 00:16:48.595 05:45:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:16:48.595 05:45:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:16:48.595 05:45:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:16:48.595 05:45:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:16:48.595 05:45:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:48.595 05:45:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:48.595 05:45:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:48.595 05:45:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:48.595 05:45:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:48.595 05:45:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:48.595 05:45:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:48.595 05:45:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:48.595 05:45:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:48.595 05:45:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:48.595 05:45:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:48.595 05:45:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:48.595 "name": "Existed_Raid", 00:16:48.595 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:48.595 "strip_size_kb": 64, 00:16:48.595 "state": "configuring", 00:16:48.595 "raid_level": "concat", 00:16:48.595 "superblock": false, 00:16:48.595 "num_base_bdevs": 3, 00:16:48.595 "num_base_bdevs_discovered": 2, 00:16:48.595 "num_base_bdevs_operational": 3, 00:16:48.596 "base_bdevs_list": [ 00:16:48.596 { 00:16:48.596 "name": "BaseBdev1", 00:16:48.596 "uuid": "cf5759e3-4275-4668-ad7a-186a5bf802dc", 00:16:48.596 "is_configured": true, 00:16:48.596 "data_offset": 0, 00:16:48.596 "data_size": 65536 00:16:48.596 }, 00:16:48.596 { 00:16:48.596 "name": "BaseBdev2", 00:16:48.596 "uuid": "43c7e974-8bd4-4d74-9e99-e873ec8d7553", 00:16:48.596 "is_configured": true, 00:16:48.596 "data_offset": 0, 00:16:48.596 "data_size": 65536 00:16:48.596 }, 00:16:48.596 { 00:16:48.596 "name": "BaseBdev3", 00:16:48.596 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:48.596 "is_configured": false, 00:16:48.596 "data_offset": 0, 00:16:48.596 "data_size": 0 00:16:48.596 } 00:16:48.596 ] 00:16:48.596 }' 00:16:48.596 05:45:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:48.596 05:45:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:49.176 05:45:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:16:49.435 [2024-07-26 05:45:04.193935] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:16:49.435 [2024-07-26 05:45:04.193972] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x24c7400 00:16:49.435 
[2024-07-26 05:45:04.193981] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 196608, blocklen 512 00:16:49.435 [2024-07-26 05:45:04.194228] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x24c6ef0 00:16:49.435 [2024-07-26 05:45:04.194347] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x24c7400 00:16:49.435 [2024-07-26 05:45:04.194356] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x24c7400 00:16:49.435 [2024-07-26 05:45:04.194518] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:49.435 BaseBdev3 00:16:49.435 05:45:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:16:49.435 05:45:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:16:49.435 05:45:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:49.435 05:45:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:16:49.435 05:45:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:49.435 05:45:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:49.435 05:45:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:49.692 05:45:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:16:49.951 [ 00:16:49.951 { 00:16:49.951 "name": "BaseBdev3", 00:16:49.951 "aliases": [ 00:16:49.951 "084eeab2-2ac3-4168-aa35-13ee43ccd3da" 00:16:49.951 ], 00:16:49.951 "product_name": "Malloc disk", 00:16:49.951 "block_size": 512, 00:16:49.951 
"num_blocks": 65536, 00:16:49.951 "uuid": "084eeab2-2ac3-4168-aa35-13ee43ccd3da", 00:16:49.951 "assigned_rate_limits": { 00:16:49.951 "rw_ios_per_sec": 0, 00:16:49.951 "rw_mbytes_per_sec": 0, 00:16:49.951 "r_mbytes_per_sec": 0, 00:16:49.951 "w_mbytes_per_sec": 0 00:16:49.951 }, 00:16:49.951 "claimed": true, 00:16:49.951 "claim_type": "exclusive_write", 00:16:49.951 "zoned": false, 00:16:49.951 "supported_io_types": { 00:16:49.951 "read": true, 00:16:49.951 "write": true, 00:16:49.951 "unmap": true, 00:16:49.951 "flush": true, 00:16:49.951 "reset": true, 00:16:49.951 "nvme_admin": false, 00:16:49.951 "nvme_io": false, 00:16:49.951 "nvme_io_md": false, 00:16:49.951 "write_zeroes": true, 00:16:49.951 "zcopy": true, 00:16:49.951 "get_zone_info": false, 00:16:49.951 "zone_management": false, 00:16:49.951 "zone_append": false, 00:16:49.951 "compare": false, 00:16:49.951 "compare_and_write": false, 00:16:49.951 "abort": true, 00:16:49.951 "seek_hole": false, 00:16:49.951 "seek_data": false, 00:16:49.951 "copy": true, 00:16:49.951 "nvme_iov_md": false 00:16:49.951 }, 00:16:49.951 "memory_domains": [ 00:16:49.951 { 00:16:49.951 "dma_device_id": "system", 00:16:49.951 "dma_device_type": 1 00:16:49.951 }, 00:16:49.951 { 00:16:49.951 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:49.951 "dma_device_type": 2 00:16:49.951 } 00:16:49.951 ], 00:16:49.951 "driver_specific": {} 00:16:49.951 } 00:16:49.951 ] 00:16:49.951 05:45:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:16:49.951 05:45:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:16:49.951 05:45:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:16:49.951 05:45:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 3 00:16:49.951 05:45:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 
00:16:49.951 05:45:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:49.951 05:45:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:49.951 05:45:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:49.951 05:45:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:49.951 05:45:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:49.951 05:45:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:49.951 05:45:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:49.951 05:45:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:49.952 05:45:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:49.952 05:45:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:50.210 05:45:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:50.210 "name": "Existed_Raid", 00:16:50.210 "uuid": "d084254d-a76e-4e1e-8bad-c9221d379f47", 00:16:50.210 "strip_size_kb": 64, 00:16:50.210 "state": "online", 00:16:50.210 "raid_level": "concat", 00:16:50.210 "superblock": false, 00:16:50.210 "num_base_bdevs": 3, 00:16:50.210 "num_base_bdevs_discovered": 3, 00:16:50.210 "num_base_bdevs_operational": 3, 00:16:50.210 "base_bdevs_list": [ 00:16:50.210 { 00:16:50.210 "name": "BaseBdev1", 00:16:50.210 "uuid": "cf5759e3-4275-4668-ad7a-186a5bf802dc", 00:16:50.210 "is_configured": true, 00:16:50.210 "data_offset": 0, 00:16:50.210 "data_size": 65536 00:16:50.210 }, 00:16:50.210 { 00:16:50.210 "name": "BaseBdev2", 
00:16:50.210 "uuid": "43c7e974-8bd4-4d74-9e99-e873ec8d7553", 00:16:50.210 "is_configured": true, 00:16:50.210 "data_offset": 0, 00:16:50.210 "data_size": 65536 00:16:50.210 }, 00:16:50.210 { 00:16:50.210 "name": "BaseBdev3", 00:16:50.210 "uuid": "084eeab2-2ac3-4168-aa35-13ee43ccd3da", 00:16:50.210 "is_configured": true, 00:16:50.210 "data_offset": 0, 00:16:50.210 "data_size": 65536 00:16:50.210 } 00:16:50.210 ] 00:16:50.210 }' 00:16:50.210 05:45:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:50.210 05:45:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:50.775 05:45:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:16:50.775 05:45:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:16:50.775 05:45:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:16:50.775 05:45:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:16:50.776 05:45:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:16:50.776 05:45:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:16:50.776 05:45:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:16:50.776 05:45:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:16:51.033 [2024-07-26 05:45:05.726311] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:51.033 05:45:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:16:51.033 "name": "Existed_Raid", 00:16:51.033 "aliases": [ 00:16:51.033 "d084254d-a76e-4e1e-8bad-c9221d379f47" 00:16:51.033 ], 00:16:51.033 "product_name": 
"Raid Volume", 00:16:51.033 "block_size": 512, 00:16:51.033 "num_blocks": 196608, 00:16:51.033 "uuid": "d084254d-a76e-4e1e-8bad-c9221d379f47", 00:16:51.033 "assigned_rate_limits": { 00:16:51.033 "rw_ios_per_sec": 0, 00:16:51.033 "rw_mbytes_per_sec": 0, 00:16:51.033 "r_mbytes_per_sec": 0, 00:16:51.033 "w_mbytes_per_sec": 0 00:16:51.033 }, 00:16:51.033 "claimed": false, 00:16:51.033 "zoned": false, 00:16:51.033 "supported_io_types": { 00:16:51.033 "read": true, 00:16:51.033 "write": true, 00:16:51.033 "unmap": true, 00:16:51.033 "flush": true, 00:16:51.033 "reset": true, 00:16:51.033 "nvme_admin": false, 00:16:51.033 "nvme_io": false, 00:16:51.033 "nvme_io_md": false, 00:16:51.033 "write_zeroes": true, 00:16:51.033 "zcopy": false, 00:16:51.033 "get_zone_info": false, 00:16:51.033 "zone_management": false, 00:16:51.033 "zone_append": false, 00:16:51.033 "compare": false, 00:16:51.034 "compare_and_write": false, 00:16:51.034 "abort": false, 00:16:51.034 "seek_hole": false, 00:16:51.034 "seek_data": false, 00:16:51.034 "copy": false, 00:16:51.034 "nvme_iov_md": false 00:16:51.034 }, 00:16:51.034 "memory_domains": [ 00:16:51.034 { 00:16:51.034 "dma_device_id": "system", 00:16:51.034 "dma_device_type": 1 00:16:51.034 }, 00:16:51.034 { 00:16:51.034 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:51.034 "dma_device_type": 2 00:16:51.034 }, 00:16:51.034 { 00:16:51.034 "dma_device_id": "system", 00:16:51.034 "dma_device_type": 1 00:16:51.034 }, 00:16:51.034 { 00:16:51.034 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:51.034 "dma_device_type": 2 00:16:51.034 }, 00:16:51.034 { 00:16:51.034 "dma_device_id": "system", 00:16:51.034 "dma_device_type": 1 00:16:51.034 }, 00:16:51.034 { 00:16:51.034 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:51.034 "dma_device_type": 2 00:16:51.034 } 00:16:51.034 ], 00:16:51.034 "driver_specific": { 00:16:51.034 "raid": { 00:16:51.034 "uuid": "d084254d-a76e-4e1e-8bad-c9221d379f47", 00:16:51.034 "strip_size_kb": 64, 00:16:51.034 "state": 
"online", 00:16:51.034 "raid_level": "concat", 00:16:51.034 "superblock": false, 00:16:51.034 "num_base_bdevs": 3, 00:16:51.034 "num_base_bdevs_discovered": 3, 00:16:51.034 "num_base_bdevs_operational": 3, 00:16:51.034 "base_bdevs_list": [ 00:16:51.034 { 00:16:51.034 "name": "BaseBdev1", 00:16:51.034 "uuid": "cf5759e3-4275-4668-ad7a-186a5bf802dc", 00:16:51.034 "is_configured": true, 00:16:51.034 "data_offset": 0, 00:16:51.034 "data_size": 65536 00:16:51.034 }, 00:16:51.034 { 00:16:51.034 "name": "BaseBdev2", 00:16:51.034 "uuid": "43c7e974-8bd4-4d74-9e99-e873ec8d7553", 00:16:51.034 "is_configured": true, 00:16:51.034 "data_offset": 0, 00:16:51.034 "data_size": 65536 00:16:51.034 }, 00:16:51.034 { 00:16:51.034 "name": "BaseBdev3", 00:16:51.034 "uuid": "084eeab2-2ac3-4168-aa35-13ee43ccd3da", 00:16:51.034 "is_configured": true, 00:16:51.034 "data_offset": 0, 00:16:51.034 "data_size": 65536 00:16:51.034 } 00:16:51.034 ] 00:16:51.034 } 00:16:51.034 } 00:16:51.034 }' 00:16:51.034 05:45:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:16:51.034 05:45:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:16:51.034 BaseBdev2 00:16:51.034 BaseBdev3' 00:16:51.034 05:45:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:51.034 05:45:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:16:51.034 05:45:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:51.292 05:45:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:51.292 "name": "BaseBdev1", 00:16:51.292 "aliases": [ 00:16:51.292 "cf5759e3-4275-4668-ad7a-186a5bf802dc" 00:16:51.292 ], 00:16:51.292 "product_name": "Malloc 
disk", 00:16:51.292 "block_size": 512, 00:16:51.292 "num_blocks": 65536, 00:16:51.292 "uuid": "cf5759e3-4275-4668-ad7a-186a5bf802dc", 00:16:51.292 "assigned_rate_limits": { 00:16:51.292 "rw_ios_per_sec": 0, 00:16:51.292 "rw_mbytes_per_sec": 0, 00:16:51.292 "r_mbytes_per_sec": 0, 00:16:51.292 "w_mbytes_per_sec": 0 00:16:51.292 }, 00:16:51.292 "claimed": true, 00:16:51.292 "claim_type": "exclusive_write", 00:16:51.292 "zoned": false, 00:16:51.292 "supported_io_types": { 00:16:51.292 "read": true, 00:16:51.292 "write": true, 00:16:51.292 "unmap": true, 00:16:51.292 "flush": true, 00:16:51.293 "reset": true, 00:16:51.293 "nvme_admin": false, 00:16:51.293 "nvme_io": false, 00:16:51.293 "nvme_io_md": false, 00:16:51.293 "write_zeroes": true, 00:16:51.293 "zcopy": true, 00:16:51.293 "get_zone_info": false, 00:16:51.293 "zone_management": false, 00:16:51.293 "zone_append": false, 00:16:51.293 "compare": false, 00:16:51.293 "compare_and_write": false, 00:16:51.293 "abort": true, 00:16:51.293 "seek_hole": false, 00:16:51.293 "seek_data": false, 00:16:51.293 "copy": true, 00:16:51.293 "nvme_iov_md": false 00:16:51.293 }, 00:16:51.293 "memory_domains": [ 00:16:51.293 { 00:16:51.293 "dma_device_id": "system", 00:16:51.293 "dma_device_type": 1 00:16:51.293 }, 00:16:51.293 { 00:16:51.293 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:51.293 "dma_device_type": 2 00:16:51.293 } 00:16:51.293 ], 00:16:51.293 "driver_specific": {} 00:16:51.293 }' 00:16:51.293 05:45:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:51.293 05:45:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:51.293 05:45:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:51.293 05:45:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:51.293 05:45:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:51.551 05:45:06 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:51.551 05:45:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:51.551 05:45:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:51.551 05:45:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:51.551 05:45:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:51.551 05:45:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:51.551 05:45:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:51.551 05:45:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:51.551 05:45:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:16:51.551 05:45:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:51.809 05:45:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:51.809 "name": "BaseBdev2", 00:16:51.809 "aliases": [ 00:16:51.809 "43c7e974-8bd4-4d74-9e99-e873ec8d7553" 00:16:51.809 ], 00:16:51.809 "product_name": "Malloc disk", 00:16:51.809 "block_size": 512, 00:16:51.809 "num_blocks": 65536, 00:16:51.809 "uuid": "43c7e974-8bd4-4d74-9e99-e873ec8d7553", 00:16:51.809 "assigned_rate_limits": { 00:16:51.809 "rw_ios_per_sec": 0, 00:16:51.809 "rw_mbytes_per_sec": 0, 00:16:51.809 "r_mbytes_per_sec": 0, 00:16:51.809 "w_mbytes_per_sec": 0 00:16:51.809 }, 00:16:51.809 "claimed": true, 00:16:51.809 "claim_type": "exclusive_write", 00:16:51.809 "zoned": false, 00:16:51.809 "supported_io_types": { 00:16:51.809 "read": true, 00:16:51.809 "write": true, 00:16:51.809 "unmap": true, 00:16:51.809 "flush": true, 00:16:51.809 "reset": 
true, 00:16:51.809 "nvme_admin": false, 00:16:51.809 "nvme_io": false, 00:16:51.809 "nvme_io_md": false, 00:16:51.809 "write_zeroes": true, 00:16:51.809 "zcopy": true, 00:16:51.809 "get_zone_info": false, 00:16:51.809 "zone_management": false, 00:16:51.809 "zone_append": false, 00:16:51.809 "compare": false, 00:16:51.809 "compare_and_write": false, 00:16:51.809 "abort": true, 00:16:51.809 "seek_hole": false, 00:16:51.809 "seek_data": false, 00:16:51.809 "copy": true, 00:16:51.809 "nvme_iov_md": false 00:16:51.809 }, 00:16:51.809 "memory_domains": [ 00:16:51.809 { 00:16:51.809 "dma_device_id": "system", 00:16:51.809 "dma_device_type": 1 00:16:51.809 }, 00:16:51.809 { 00:16:51.809 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:51.809 "dma_device_type": 2 00:16:51.809 } 00:16:51.809 ], 00:16:51.809 "driver_specific": {} 00:16:51.809 }' 00:16:51.809 05:45:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:51.809 05:45:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:51.809 05:45:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:51.809 05:45:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:52.067 05:45:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:52.067 05:45:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:52.067 05:45:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:52.067 05:45:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:52.067 05:45:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:52.067 05:45:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:52.067 05:45:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:52.067 05:45:06 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:52.067 05:45:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:52.068 05:45:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:16:52.068 05:45:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:52.326 05:45:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:52.326 "name": "BaseBdev3", 00:16:52.326 "aliases": [ 00:16:52.326 "084eeab2-2ac3-4168-aa35-13ee43ccd3da" 00:16:52.326 ], 00:16:52.326 "product_name": "Malloc disk", 00:16:52.326 "block_size": 512, 00:16:52.326 "num_blocks": 65536, 00:16:52.326 "uuid": "084eeab2-2ac3-4168-aa35-13ee43ccd3da", 00:16:52.326 "assigned_rate_limits": { 00:16:52.326 "rw_ios_per_sec": 0, 00:16:52.326 "rw_mbytes_per_sec": 0, 00:16:52.326 "r_mbytes_per_sec": 0, 00:16:52.326 "w_mbytes_per_sec": 0 00:16:52.326 }, 00:16:52.326 "claimed": true, 00:16:52.326 "claim_type": "exclusive_write", 00:16:52.326 "zoned": false, 00:16:52.326 "supported_io_types": { 00:16:52.326 "read": true, 00:16:52.326 "write": true, 00:16:52.326 "unmap": true, 00:16:52.326 "flush": true, 00:16:52.326 "reset": true, 00:16:52.326 "nvme_admin": false, 00:16:52.326 "nvme_io": false, 00:16:52.326 "nvme_io_md": false, 00:16:52.326 "write_zeroes": true, 00:16:52.326 "zcopy": true, 00:16:52.326 "get_zone_info": false, 00:16:52.326 "zone_management": false, 00:16:52.326 "zone_append": false, 00:16:52.326 "compare": false, 00:16:52.326 "compare_and_write": false, 00:16:52.326 "abort": true, 00:16:52.326 "seek_hole": false, 00:16:52.326 "seek_data": false, 00:16:52.326 "copy": true, 00:16:52.326 "nvme_iov_md": false 00:16:52.326 }, 00:16:52.326 "memory_domains": [ 00:16:52.326 { 00:16:52.326 "dma_device_id": "system", 00:16:52.326 
"dma_device_type": 1 00:16:52.326 }, 00:16:52.326 { 00:16:52.326 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:52.326 "dma_device_type": 2 00:16:52.326 } 00:16:52.326 ], 00:16:52.326 "driver_specific": {} 00:16:52.326 }' 00:16:52.326 05:45:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:52.585 05:45:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:52.585 05:45:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:52.585 05:45:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:52.585 05:45:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:52.585 05:45:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:52.585 05:45:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:52.585 05:45:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:52.585 05:45:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:52.585 05:45:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:52.843 05:45:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:52.843 05:45:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:52.843 05:45:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:16:53.101 [2024-07-26 05:45:07.791539] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:16:53.101 [2024-07-26 05:45:07.791567] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:16:53.101 [2024-07-26 05:45:07.791610] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: 
raid_bdev_destruct 00:16:53.101 05:45:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:16:53.101 05:45:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:16:53.101 05:45:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:16:53.101 05:45:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:16:53.101 05:45:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:16:53.101 05:45:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 2 00:16:53.101 05:45:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:53.101 05:45:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:16:53.101 05:45:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:53.101 05:45:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:53.102 05:45:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:16:53.102 05:45:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:53.102 05:45:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:53.102 05:45:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:53.102 05:45:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:53.102 05:45:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:53.102 05:45:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | 
select(.name == "Existed_Raid")' 00:16:53.360 05:45:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:53.360 "name": "Existed_Raid", 00:16:53.360 "uuid": "d084254d-a76e-4e1e-8bad-c9221d379f47", 00:16:53.360 "strip_size_kb": 64, 00:16:53.360 "state": "offline", 00:16:53.360 "raid_level": "concat", 00:16:53.360 "superblock": false, 00:16:53.360 "num_base_bdevs": 3, 00:16:53.360 "num_base_bdevs_discovered": 2, 00:16:53.360 "num_base_bdevs_operational": 2, 00:16:53.360 "base_bdevs_list": [ 00:16:53.360 { 00:16:53.360 "name": null, 00:16:53.360 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:53.360 "is_configured": false, 00:16:53.360 "data_offset": 0, 00:16:53.360 "data_size": 65536 00:16:53.360 }, 00:16:53.360 { 00:16:53.360 "name": "BaseBdev2", 00:16:53.360 "uuid": "43c7e974-8bd4-4d74-9e99-e873ec8d7553", 00:16:53.360 "is_configured": true, 00:16:53.360 "data_offset": 0, 00:16:53.360 "data_size": 65536 00:16:53.360 }, 00:16:53.360 { 00:16:53.360 "name": "BaseBdev3", 00:16:53.360 "uuid": "084eeab2-2ac3-4168-aa35-13ee43ccd3da", 00:16:53.360 "is_configured": true, 00:16:53.360 "data_offset": 0, 00:16:53.360 "data_size": 65536 00:16:53.360 } 00:16:53.360 ] 00:16:53.360 }' 00:16:53.360 05:45:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:53.360 05:45:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:53.926 05:45:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:16:53.926 05:45:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:16:53.926 05:45:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:16:53.926 05:45:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:54.184 05:45:08 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:16:54.184 05:45:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:16:54.184 05:45:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:16:54.751 [2024-07-26 05:45:09.384776] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:16:54.751 05:45:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:16:54.751 05:45:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:16:54.751 05:45:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:54.751 05:45:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:16:55.009 05:45:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:16:55.009 05:45:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:16:55.009 05:45:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:16:55.009 [2024-07-26 05:45:09.896686] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:16:55.009 [2024-07-26 05:45:09.896738] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x24c7400 name Existed_Raid, state offline 00:16:55.267 05:45:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:16:55.268 05:45:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:16:55.268 05:45:09 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:55.268 05:45:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:16:55.268 05:45:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:16:55.526 05:45:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:16:55.526 05:45:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:16:55.526 05:45:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:16:55.526 05:45:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:16:55.526 05:45:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:16:55.526 BaseBdev2 00:16:55.526 05:45:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:16:55.526 05:45:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:16:55.526 05:45:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:55.526 05:45:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:16:55.526 05:45:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:55.526 05:45:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:55.526 05:45:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:55.784 05:45:10 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:16:56.042 [ 00:16:56.042 { 00:16:56.042 "name": "BaseBdev2", 00:16:56.042 "aliases": [ 00:16:56.042 "1eae77d6-2e5b-4c1b-9091-0a2aec88a9dc" 00:16:56.042 ], 00:16:56.042 "product_name": "Malloc disk", 00:16:56.042 "block_size": 512, 00:16:56.042 "num_blocks": 65536, 00:16:56.042 "uuid": "1eae77d6-2e5b-4c1b-9091-0a2aec88a9dc", 00:16:56.042 "assigned_rate_limits": { 00:16:56.042 "rw_ios_per_sec": 0, 00:16:56.042 "rw_mbytes_per_sec": 0, 00:16:56.042 "r_mbytes_per_sec": 0, 00:16:56.042 "w_mbytes_per_sec": 0 00:16:56.042 }, 00:16:56.042 "claimed": false, 00:16:56.042 "zoned": false, 00:16:56.042 "supported_io_types": { 00:16:56.042 "read": true, 00:16:56.042 "write": true, 00:16:56.042 "unmap": true, 00:16:56.042 "flush": true, 00:16:56.042 "reset": true, 00:16:56.042 "nvme_admin": false, 00:16:56.042 "nvme_io": false, 00:16:56.042 "nvme_io_md": false, 00:16:56.042 "write_zeroes": true, 00:16:56.042 "zcopy": true, 00:16:56.042 "get_zone_info": false, 00:16:56.042 "zone_management": false, 00:16:56.042 "zone_append": false, 00:16:56.042 "compare": false, 00:16:56.042 "compare_and_write": false, 00:16:56.042 "abort": true, 00:16:56.042 "seek_hole": false, 00:16:56.042 "seek_data": false, 00:16:56.042 "copy": true, 00:16:56.042 "nvme_iov_md": false 00:16:56.042 }, 00:16:56.042 "memory_domains": [ 00:16:56.042 { 00:16:56.042 "dma_device_id": "system", 00:16:56.042 "dma_device_type": 1 00:16:56.042 }, 00:16:56.042 { 00:16:56.042 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:56.042 "dma_device_type": 2 00:16:56.042 } 00:16:56.042 ], 00:16:56.042 "driver_specific": {} 00:16:56.042 } 00:16:56.042 ] 00:16:56.042 05:45:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:16:56.042 05:45:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:16:56.042 05:45:10 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:16:56.042 05:45:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:16:56.301 BaseBdev3 00:16:56.301 05:45:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:16:56.301 05:45:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:16:56.301 05:45:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:56.301 05:45:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:16:56.301 05:45:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:56.301 05:45:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:56.301 05:45:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:56.560 05:45:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:16:56.818 [ 00:16:56.818 { 00:16:56.818 "name": "BaseBdev3", 00:16:56.818 "aliases": [ 00:16:56.818 "b41512ee-f0ba-4832-a0d2-57edd880c658" 00:16:56.818 ], 00:16:56.818 "product_name": "Malloc disk", 00:16:56.818 "block_size": 512, 00:16:56.818 "num_blocks": 65536, 00:16:56.818 "uuid": "b41512ee-f0ba-4832-a0d2-57edd880c658", 00:16:56.818 "assigned_rate_limits": { 00:16:56.818 "rw_ios_per_sec": 0, 00:16:56.818 "rw_mbytes_per_sec": 0, 00:16:56.818 "r_mbytes_per_sec": 0, 00:16:56.818 "w_mbytes_per_sec": 0 00:16:56.818 }, 00:16:56.818 "claimed": false, 00:16:56.818 
"zoned": false, 00:16:56.818 "supported_io_types": { 00:16:56.818 "read": true, 00:16:56.818 "write": true, 00:16:56.818 "unmap": true, 00:16:56.818 "flush": true, 00:16:56.818 "reset": true, 00:16:56.818 "nvme_admin": false, 00:16:56.818 "nvme_io": false, 00:16:56.818 "nvme_io_md": false, 00:16:56.818 "write_zeroes": true, 00:16:56.818 "zcopy": true, 00:16:56.818 "get_zone_info": false, 00:16:56.818 "zone_management": false, 00:16:56.818 "zone_append": false, 00:16:56.818 "compare": false, 00:16:56.818 "compare_and_write": false, 00:16:56.818 "abort": true, 00:16:56.818 "seek_hole": false, 00:16:56.818 "seek_data": false, 00:16:56.818 "copy": true, 00:16:56.818 "nvme_iov_md": false 00:16:56.818 }, 00:16:56.818 "memory_domains": [ 00:16:56.818 { 00:16:56.818 "dma_device_id": "system", 00:16:56.818 "dma_device_type": 1 00:16:56.818 }, 00:16:56.818 { 00:16:56.818 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:56.818 "dma_device_type": 2 00:16:56.818 } 00:16:56.818 ], 00:16:56.818 "driver_specific": {} 00:16:56.818 } 00:16:56.818 ] 00:16:56.818 05:45:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:16:56.818 05:45:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:16:56.818 05:45:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:16:56.818 05:45:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:16:57.076 [2024-07-26 05:45:11.874752] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:16:57.076 [2024-07-26 05:45:11.874799] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:16:57.076 [2024-07-26 05:45:11.874818] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is 
claimed 00:16:57.076 [2024-07-26 05:45:11.876191] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:16:57.076 05:45:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:16:57.076 05:45:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:57.076 05:45:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:57.076 05:45:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:57.076 05:45:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:57.076 05:45:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:57.076 05:45:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:57.076 05:45:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:57.076 05:45:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:57.076 05:45:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:57.076 05:45:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:57.076 05:45:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:57.334 05:45:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:57.334 "name": "Existed_Raid", 00:16:57.334 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:57.334 "strip_size_kb": 64, 00:16:57.334 "state": "configuring", 00:16:57.334 "raid_level": "concat", 00:16:57.334 "superblock": false, 00:16:57.334 
"num_base_bdevs": 3, 00:16:57.334 "num_base_bdevs_discovered": 2, 00:16:57.334 "num_base_bdevs_operational": 3, 00:16:57.334 "base_bdevs_list": [ 00:16:57.334 { 00:16:57.334 "name": "BaseBdev1", 00:16:57.334 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:57.334 "is_configured": false, 00:16:57.334 "data_offset": 0, 00:16:57.334 "data_size": 0 00:16:57.334 }, 00:16:57.334 { 00:16:57.334 "name": "BaseBdev2", 00:16:57.334 "uuid": "1eae77d6-2e5b-4c1b-9091-0a2aec88a9dc", 00:16:57.334 "is_configured": true, 00:16:57.334 "data_offset": 0, 00:16:57.334 "data_size": 65536 00:16:57.334 }, 00:16:57.334 { 00:16:57.334 "name": "BaseBdev3", 00:16:57.334 "uuid": "b41512ee-f0ba-4832-a0d2-57edd880c658", 00:16:57.334 "is_configured": true, 00:16:57.334 "data_offset": 0, 00:16:57.334 "data_size": 65536 00:16:57.334 } 00:16:57.334 ] 00:16:57.334 }' 00:16:57.334 05:45:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:57.334 05:45:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:57.912 05:45:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:16:58.191 [2024-07-26 05:45:12.901461] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:16:58.191 05:45:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:16:58.191 05:45:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:58.191 05:45:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:58.191 05:45:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:58.191 05:45:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:58.191 05:45:12 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:58.191 05:45:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:58.191 05:45:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:58.191 05:45:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:58.191 05:45:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:58.191 05:45:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:58.191 05:45:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:58.449 05:45:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:58.449 "name": "Existed_Raid", 00:16:58.449 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:58.449 "strip_size_kb": 64, 00:16:58.449 "state": "configuring", 00:16:58.449 "raid_level": "concat", 00:16:58.449 "superblock": false, 00:16:58.449 "num_base_bdevs": 3, 00:16:58.449 "num_base_bdevs_discovered": 1, 00:16:58.449 "num_base_bdevs_operational": 3, 00:16:58.449 "base_bdevs_list": [ 00:16:58.449 { 00:16:58.449 "name": "BaseBdev1", 00:16:58.449 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:58.449 "is_configured": false, 00:16:58.449 "data_offset": 0, 00:16:58.449 "data_size": 0 00:16:58.449 }, 00:16:58.449 { 00:16:58.449 "name": null, 00:16:58.449 "uuid": "1eae77d6-2e5b-4c1b-9091-0a2aec88a9dc", 00:16:58.449 "is_configured": false, 00:16:58.449 "data_offset": 0, 00:16:58.449 "data_size": 65536 00:16:58.449 }, 00:16:58.449 { 00:16:58.449 "name": "BaseBdev3", 00:16:58.449 "uuid": "b41512ee-f0ba-4832-a0d2-57edd880c658", 00:16:58.449 "is_configured": true, 00:16:58.449 "data_offset": 0, 
00:16:58.449 "data_size": 65536 00:16:58.449 } 00:16:58.449 ] 00:16:58.449 }' 00:16:58.449 05:45:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:58.449 05:45:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:59.037 05:45:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:59.037 05:45:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:16:59.294 05:45:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:16:59.294 05:45:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:16:59.553 [2024-07-26 05:45:14.256384] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:59.553 BaseBdev1 00:16:59.553 05:45:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:16:59.553 05:45:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:16:59.553 05:45:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:59.553 05:45:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:16:59.553 05:45:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:59.553 05:45:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:59.553 05:45:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:59.810 05:45:14 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:17:00.069 [ 00:17:00.069 { 00:17:00.069 "name": "BaseBdev1", 00:17:00.069 "aliases": [ 00:17:00.069 "85ceae7e-a9cc-442f-9d13-967631d2d83c" 00:17:00.069 ], 00:17:00.069 "product_name": "Malloc disk", 00:17:00.069 "block_size": 512, 00:17:00.069 "num_blocks": 65536, 00:17:00.069 "uuid": "85ceae7e-a9cc-442f-9d13-967631d2d83c", 00:17:00.069 "assigned_rate_limits": { 00:17:00.069 "rw_ios_per_sec": 0, 00:17:00.069 "rw_mbytes_per_sec": 0, 00:17:00.069 "r_mbytes_per_sec": 0, 00:17:00.069 "w_mbytes_per_sec": 0 00:17:00.069 }, 00:17:00.069 "claimed": true, 00:17:00.069 "claim_type": "exclusive_write", 00:17:00.069 "zoned": false, 00:17:00.069 "supported_io_types": { 00:17:00.069 "read": true, 00:17:00.069 "write": true, 00:17:00.069 "unmap": true, 00:17:00.069 "flush": true, 00:17:00.069 "reset": true, 00:17:00.069 "nvme_admin": false, 00:17:00.069 "nvme_io": false, 00:17:00.069 "nvme_io_md": false, 00:17:00.069 "write_zeroes": true, 00:17:00.069 "zcopy": true, 00:17:00.069 "get_zone_info": false, 00:17:00.069 "zone_management": false, 00:17:00.069 "zone_append": false, 00:17:00.069 "compare": false, 00:17:00.069 "compare_and_write": false, 00:17:00.069 "abort": true, 00:17:00.069 "seek_hole": false, 00:17:00.069 "seek_data": false, 00:17:00.069 "copy": true, 00:17:00.069 "nvme_iov_md": false 00:17:00.069 }, 00:17:00.069 "memory_domains": [ 00:17:00.069 { 00:17:00.069 "dma_device_id": "system", 00:17:00.069 "dma_device_type": 1 00:17:00.069 }, 00:17:00.069 { 00:17:00.069 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:00.069 "dma_device_type": 2 00:17:00.069 } 00:17:00.069 ], 00:17:00.069 "driver_specific": {} 00:17:00.069 } 00:17:00.069 ] 00:17:00.069 05:45:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:17:00.069 05:45:14 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:17:00.069 05:45:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:00.069 05:45:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:00.069 05:45:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:17:00.069 05:45:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:00.069 05:45:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:00.069 05:45:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:00.069 05:45:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:00.069 05:45:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:00.069 05:45:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:00.069 05:45:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:00.069 05:45:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:00.328 05:45:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:00.328 "name": "Existed_Raid", 00:17:00.328 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:00.328 "strip_size_kb": 64, 00:17:00.328 "state": "configuring", 00:17:00.328 "raid_level": "concat", 00:17:00.328 "superblock": false, 00:17:00.328 "num_base_bdevs": 3, 00:17:00.328 "num_base_bdevs_discovered": 2, 00:17:00.328 "num_base_bdevs_operational": 3, 00:17:00.328 "base_bdevs_list": [ 00:17:00.328 { 
00:17:00.328 "name": "BaseBdev1", 00:17:00.328 "uuid": "85ceae7e-a9cc-442f-9d13-967631d2d83c", 00:17:00.328 "is_configured": true, 00:17:00.328 "data_offset": 0, 00:17:00.328 "data_size": 65536 00:17:00.328 }, 00:17:00.328 { 00:17:00.328 "name": null, 00:17:00.328 "uuid": "1eae77d6-2e5b-4c1b-9091-0a2aec88a9dc", 00:17:00.328 "is_configured": false, 00:17:00.328 "data_offset": 0, 00:17:00.328 "data_size": 65536 00:17:00.328 }, 00:17:00.328 { 00:17:00.328 "name": "BaseBdev3", 00:17:00.328 "uuid": "b41512ee-f0ba-4832-a0d2-57edd880c658", 00:17:00.328 "is_configured": true, 00:17:00.328 "data_offset": 0, 00:17:00.328 "data_size": 65536 00:17:00.328 } 00:17:00.328 ] 00:17:00.328 }' 00:17:00.328 05:45:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:00.328 05:45:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:00.893 05:45:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:00.893 05:45:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:17:01.151 05:45:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:17:01.151 05:45:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:17:01.409 [2024-07-26 05:45:16.117342] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:17:01.409 05:45:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:17:01.409 05:45:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:01.409 05:45:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 
-- # local expected_state=configuring 00:17:01.409 05:45:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:17:01.409 05:45:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:01.409 05:45:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:01.409 05:45:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:01.409 05:45:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:01.409 05:45:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:01.409 05:45:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:01.409 05:45:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:01.409 05:45:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:01.667 05:45:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:01.667 "name": "Existed_Raid", 00:17:01.667 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:01.667 "strip_size_kb": 64, 00:17:01.667 "state": "configuring", 00:17:01.667 "raid_level": "concat", 00:17:01.667 "superblock": false, 00:17:01.667 "num_base_bdevs": 3, 00:17:01.667 "num_base_bdevs_discovered": 1, 00:17:01.667 "num_base_bdevs_operational": 3, 00:17:01.667 "base_bdevs_list": [ 00:17:01.667 { 00:17:01.667 "name": "BaseBdev1", 00:17:01.667 "uuid": "85ceae7e-a9cc-442f-9d13-967631d2d83c", 00:17:01.667 "is_configured": true, 00:17:01.667 "data_offset": 0, 00:17:01.667 "data_size": 65536 00:17:01.667 }, 00:17:01.667 { 00:17:01.667 "name": null, 00:17:01.667 "uuid": "1eae77d6-2e5b-4c1b-9091-0a2aec88a9dc", 00:17:01.667 
"is_configured": false, 00:17:01.667 "data_offset": 0, 00:17:01.667 "data_size": 65536 00:17:01.667 }, 00:17:01.667 { 00:17:01.667 "name": null, 00:17:01.667 "uuid": "b41512ee-f0ba-4832-a0d2-57edd880c658", 00:17:01.667 "is_configured": false, 00:17:01.667 "data_offset": 0, 00:17:01.667 "data_size": 65536 00:17:01.667 } 00:17:01.667 ] 00:17:01.667 }' 00:17:01.667 05:45:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:01.667 05:45:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:02.233 05:45:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:02.233 05:45:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:17:02.491 05:45:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:17:02.491 05:45:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:17:02.749 [2024-07-26 05:45:17.452912] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:17:02.749 05:45:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:17:02.749 05:45:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:02.749 05:45:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:02.749 05:45:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:17:02.749 05:45:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:02.749 05:45:17 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:02.749 05:45:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:02.749 05:45:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:02.749 05:45:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:02.749 05:45:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:02.749 05:45:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:02.749 05:45:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:03.007 05:45:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:03.007 "name": "Existed_Raid", 00:17:03.007 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:03.007 "strip_size_kb": 64, 00:17:03.007 "state": "configuring", 00:17:03.007 "raid_level": "concat", 00:17:03.007 "superblock": false, 00:17:03.007 "num_base_bdevs": 3, 00:17:03.007 "num_base_bdevs_discovered": 2, 00:17:03.007 "num_base_bdevs_operational": 3, 00:17:03.007 "base_bdevs_list": [ 00:17:03.007 { 00:17:03.007 "name": "BaseBdev1", 00:17:03.007 "uuid": "85ceae7e-a9cc-442f-9d13-967631d2d83c", 00:17:03.007 "is_configured": true, 00:17:03.007 "data_offset": 0, 00:17:03.007 "data_size": 65536 00:17:03.007 }, 00:17:03.008 { 00:17:03.008 "name": null, 00:17:03.008 "uuid": "1eae77d6-2e5b-4c1b-9091-0a2aec88a9dc", 00:17:03.008 "is_configured": false, 00:17:03.008 "data_offset": 0, 00:17:03.008 "data_size": 65536 00:17:03.008 }, 00:17:03.008 { 00:17:03.008 "name": "BaseBdev3", 00:17:03.008 "uuid": "b41512ee-f0ba-4832-a0d2-57edd880c658", 00:17:03.008 "is_configured": true, 00:17:03.008 "data_offset": 0, 
00:17:03.008 "data_size": 65536 00:17:03.008 } 00:17:03.008 ] 00:17:03.008 }' 00:17:03.008 05:45:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:03.008 05:45:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:03.573 05:45:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:17:03.573 05:45:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:03.831 05:45:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:17:03.831 05:45:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:17:03.831 [2024-07-26 05:45:18.648066] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:17:03.831 05:45:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:17:03.831 05:45:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:03.831 05:45:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:03.831 05:45:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:17:03.831 05:45:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:03.831 05:45:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:03.831 05:45:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:03.831 05:45:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:03.831 
05:45:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:03.831 05:45:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:03.831 05:45:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:03.831 05:45:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:04.089 05:45:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:04.089 "name": "Existed_Raid", 00:17:04.089 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:04.089 "strip_size_kb": 64, 00:17:04.089 "state": "configuring", 00:17:04.089 "raid_level": "concat", 00:17:04.089 "superblock": false, 00:17:04.089 "num_base_bdevs": 3, 00:17:04.089 "num_base_bdevs_discovered": 1, 00:17:04.089 "num_base_bdevs_operational": 3, 00:17:04.089 "base_bdevs_list": [ 00:17:04.089 { 00:17:04.089 "name": null, 00:17:04.089 "uuid": "85ceae7e-a9cc-442f-9d13-967631d2d83c", 00:17:04.089 "is_configured": false, 00:17:04.089 "data_offset": 0, 00:17:04.089 "data_size": 65536 00:17:04.089 }, 00:17:04.089 { 00:17:04.089 "name": null, 00:17:04.089 "uuid": "1eae77d6-2e5b-4c1b-9091-0a2aec88a9dc", 00:17:04.089 "is_configured": false, 00:17:04.089 "data_offset": 0, 00:17:04.089 "data_size": 65536 00:17:04.089 }, 00:17:04.089 { 00:17:04.089 "name": "BaseBdev3", 00:17:04.089 "uuid": "b41512ee-f0ba-4832-a0d2-57edd880c658", 00:17:04.089 "is_configured": true, 00:17:04.089 "data_offset": 0, 00:17:04.089 "data_size": 65536 00:17:04.089 } 00:17:04.089 ] 00:17:04.089 }' 00:17:04.089 05:45:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:04.089 05:45:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:04.654 05:45:19 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:04.654 05:45:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:17:04.913 05:45:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:17:04.913 05:45:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:17:05.171 [2024-07-26 05:45:19.966007] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:17:05.171 05:45:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:17:05.171 05:45:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:05.171 05:45:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:05.171 05:45:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:17:05.171 05:45:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:05.171 05:45:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:05.171 05:45:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:05.171 05:45:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:05.171 05:45:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:05.171 05:45:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:05.171 05:45:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:05.171 05:45:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:05.429 05:45:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:05.429 "name": "Existed_Raid", 00:17:05.429 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:05.429 "strip_size_kb": 64, 00:17:05.429 "state": "configuring", 00:17:05.429 "raid_level": "concat", 00:17:05.429 "superblock": false, 00:17:05.429 "num_base_bdevs": 3, 00:17:05.429 "num_base_bdevs_discovered": 2, 00:17:05.429 "num_base_bdevs_operational": 3, 00:17:05.429 "base_bdevs_list": [ 00:17:05.429 { 00:17:05.429 "name": null, 00:17:05.429 "uuid": "85ceae7e-a9cc-442f-9d13-967631d2d83c", 00:17:05.429 "is_configured": false, 00:17:05.429 "data_offset": 0, 00:17:05.429 "data_size": 65536 00:17:05.429 }, 00:17:05.429 { 00:17:05.429 "name": "BaseBdev2", 00:17:05.429 "uuid": "1eae77d6-2e5b-4c1b-9091-0a2aec88a9dc", 00:17:05.429 "is_configured": true, 00:17:05.429 "data_offset": 0, 00:17:05.429 "data_size": 65536 00:17:05.429 }, 00:17:05.429 { 00:17:05.429 "name": "BaseBdev3", 00:17:05.429 "uuid": "b41512ee-f0ba-4832-a0d2-57edd880c658", 00:17:05.430 "is_configured": true, 00:17:05.430 "data_offset": 0, 00:17:05.430 "data_size": 65536 00:17:05.430 } 00:17:05.430 ] 00:17:05.430 }' 00:17:05.430 05:45:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:05.430 05:45:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:05.995 05:45:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:05.995 05:45:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:17:06.254 
05:45:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:17:06.254 05:45:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:17:06.254 05:45:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:06.513 05:45:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 85ceae7e-a9cc-442f-9d13-967631d2d83c 00:17:06.773 [2024-07-26 05:45:21.481517] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:17:06.773 [2024-07-26 05:45:21.481555] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x24c5450 00:17:06.773 [2024-07-26 05:45:21.481564] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 196608, blocklen 512 00:17:06.773 [2024-07-26 05:45:21.481765] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x24c6ed0 00:17:06.773 [2024-07-26 05:45:21.481880] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x24c5450 00:17:06.773 [2024-07-26 05:45:21.481891] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x24c5450 00:17:06.773 [2024-07-26 05:45:21.482055] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:06.773 NewBaseBdev 00:17:06.773 05:45:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:17:06.773 05:45:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:17:06.773 05:45:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:06.773 05:45:21 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@899 -- # local i 00:17:06.773 05:45:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:06.773 05:45:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:06.773 05:45:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:07.032 05:45:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:17:07.292 [ 00:17:07.292 { 00:17:07.292 "name": "NewBaseBdev", 00:17:07.292 "aliases": [ 00:17:07.292 "85ceae7e-a9cc-442f-9d13-967631d2d83c" 00:17:07.292 ], 00:17:07.292 "product_name": "Malloc disk", 00:17:07.292 "block_size": 512, 00:17:07.292 "num_blocks": 65536, 00:17:07.292 "uuid": "85ceae7e-a9cc-442f-9d13-967631d2d83c", 00:17:07.292 "assigned_rate_limits": { 00:17:07.292 "rw_ios_per_sec": 0, 00:17:07.292 "rw_mbytes_per_sec": 0, 00:17:07.292 "r_mbytes_per_sec": 0, 00:17:07.292 "w_mbytes_per_sec": 0 00:17:07.292 }, 00:17:07.292 "claimed": true, 00:17:07.292 "claim_type": "exclusive_write", 00:17:07.292 "zoned": false, 00:17:07.292 "supported_io_types": { 00:17:07.292 "read": true, 00:17:07.292 "write": true, 00:17:07.292 "unmap": true, 00:17:07.292 "flush": true, 00:17:07.292 "reset": true, 00:17:07.292 "nvme_admin": false, 00:17:07.292 "nvme_io": false, 00:17:07.292 "nvme_io_md": false, 00:17:07.292 "write_zeroes": true, 00:17:07.292 "zcopy": true, 00:17:07.292 "get_zone_info": false, 00:17:07.292 "zone_management": false, 00:17:07.292 "zone_append": false, 00:17:07.292 "compare": false, 00:17:07.292 "compare_and_write": false, 00:17:07.292 "abort": true, 00:17:07.292 "seek_hole": false, 00:17:07.292 "seek_data": false, 00:17:07.292 "copy": true, 00:17:07.292 "nvme_iov_md": 
false 00:17:07.292 }, 00:17:07.292 "memory_domains": [ 00:17:07.292 { 00:17:07.292 "dma_device_id": "system", 00:17:07.292 "dma_device_type": 1 00:17:07.292 }, 00:17:07.292 { 00:17:07.292 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:07.292 "dma_device_type": 2 00:17:07.292 } 00:17:07.292 ], 00:17:07.292 "driver_specific": {} 00:17:07.292 } 00:17:07.292 ] 00:17:07.292 05:45:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:17:07.292 05:45:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online concat 64 3 00:17:07.292 05:45:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:07.292 05:45:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:07.292 05:45:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:17:07.292 05:45:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:07.292 05:45:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:07.292 05:45:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:07.292 05:45:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:07.292 05:45:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:07.292 05:45:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:07.292 05:45:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:07.292 05:45:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:07.292 05:45:22 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:07.292 "name": "Existed_Raid", 00:17:07.292 "uuid": "754f9ef3-c269-4ed7-8442-b2d045d38dff", 00:17:07.292 "strip_size_kb": 64, 00:17:07.292 "state": "online", 00:17:07.292 "raid_level": "concat", 00:17:07.292 "superblock": false, 00:17:07.292 "num_base_bdevs": 3, 00:17:07.292 "num_base_bdevs_discovered": 3, 00:17:07.292 "num_base_bdevs_operational": 3, 00:17:07.292 "base_bdevs_list": [ 00:17:07.292 { 00:17:07.292 "name": "NewBaseBdev", 00:17:07.292 "uuid": "85ceae7e-a9cc-442f-9d13-967631d2d83c", 00:17:07.292 "is_configured": true, 00:17:07.292 "data_offset": 0, 00:17:07.292 "data_size": 65536 00:17:07.292 }, 00:17:07.292 { 00:17:07.292 "name": "BaseBdev2", 00:17:07.292 "uuid": "1eae77d6-2e5b-4c1b-9091-0a2aec88a9dc", 00:17:07.292 "is_configured": true, 00:17:07.292 "data_offset": 0, 00:17:07.292 "data_size": 65536 00:17:07.292 }, 00:17:07.292 { 00:17:07.292 "name": "BaseBdev3", 00:17:07.292 "uuid": "b41512ee-f0ba-4832-a0d2-57edd880c658", 00:17:07.292 "is_configured": true, 00:17:07.292 "data_offset": 0, 00:17:07.292 "data_size": 65536 00:17:07.292 } 00:17:07.292 ] 00:17:07.292 }' 00:17:07.292 05:45:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:07.292 05:45:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:07.860 05:45:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:17:07.860 05:45:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:17:07.860 05:45:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:17:07.860 05:45:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:17:07.860 05:45:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:17:07.860 05:45:22 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:17:07.860 05:45:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:17:07.860 05:45:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:17:08.119 [2024-07-26 05:45:22.973790] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:08.119 05:45:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:17:08.119 "name": "Existed_Raid", 00:17:08.119 "aliases": [ 00:17:08.119 "754f9ef3-c269-4ed7-8442-b2d045d38dff" 00:17:08.119 ], 00:17:08.119 "product_name": "Raid Volume", 00:17:08.119 "block_size": 512, 00:17:08.119 "num_blocks": 196608, 00:17:08.119 "uuid": "754f9ef3-c269-4ed7-8442-b2d045d38dff", 00:17:08.119 "assigned_rate_limits": { 00:17:08.119 "rw_ios_per_sec": 0, 00:17:08.119 "rw_mbytes_per_sec": 0, 00:17:08.119 "r_mbytes_per_sec": 0, 00:17:08.119 "w_mbytes_per_sec": 0 00:17:08.119 }, 00:17:08.119 "claimed": false, 00:17:08.119 "zoned": false, 00:17:08.119 "supported_io_types": { 00:17:08.119 "read": true, 00:17:08.119 "write": true, 00:17:08.119 "unmap": true, 00:17:08.119 "flush": true, 00:17:08.119 "reset": true, 00:17:08.119 "nvme_admin": false, 00:17:08.119 "nvme_io": false, 00:17:08.119 "nvme_io_md": false, 00:17:08.119 "write_zeroes": true, 00:17:08.119 "zcopy": false, 00:17:08.119 "get_zone_info": false, 00:17:08.119 "zone_management": false, 00:17:08.119 "zone_append": false, 00:17:08.119 "compare": false, 00:17:08.119 "compare_and_write": false, 00:17:08.119 "abort": false, 00:17:08.119 "seek_hole": false, 00:17:08.119 "seek_data": false, 00:17:08.119 "copy": false, 00:17:08.119 "nvme_iov_md": false 00:17:08.119 }, 00:17:08.119 "memory_domains": [ 00:17:08.119 { 00:17:08.119 "dma_device_id": "system", 00:17:08.119 "dma_device_type": 1 00:17:08.119 }, 
00:17:08.119 { 00:17:08.119 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:08.119 "dma_device_type": 2 00:17:08.119 }, 00:17:08.119 { 00:17:08.119 "dma_device_id": "system", 00:17:08.119 "dma_device_type": 1 00:17:08.119 }, 00:17:08.119 { 00:17:08.119 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:08.119 "dma_device_type": 2 00:17:08.119 }, 00:17:08.119 { 00:17:08.119 "dma_device_id": "system", 00:17:08.119 "dma_device_type": 1 00:17:08.119 }, 00:17:08.119 { 00:17:08.119 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:08.119 "dma_device_type": 2 00:17:08.119 } 00:17:08.119 ], 00:17:08.119 "driver_specific": { 00:17:08.119 "raid": { 00:17:08.119 "uuid": "754f9ef3-c269-4ed7-8442-b2d045d38dff", 00:17:08.119 "strip_size_kb": 64, 00:17:08.119 "state": "online", 00:17:08.119 "raid_level": "concat", 00:17:08.119 "superblock": false, 00:17:08.119 "num_base_bdevs": 3, 00:17:08.119 "num_base_bdevs_discovered": 3, 00:17:08.119 "num_base_bdevs_operational": 3, 00:17:08.119 "base_bdevs_list": [ 00:17:08.119 { 00:17:08.119 "name": "NewBaseBdev", 00:17:08.119 "uuid": "85ceae7e-a9cc-442f-9d13-967631d2d83c", 00:17:08.119 "is_configured": true, 00:17:08.119 "data_offset": 0, 00:17:08.119 "data_size": 65536 00:17:08.119 }, 00:17:08.119 { 00:17:08.119 "name": "BaseBdev2", 00:17:08.119 "uuid": "1eae77d6-2e5b-4c1b-9091-0a2aec88a9dc", 00:17:08.119 "is_configured": true, 00:17:08.119 "data_offset": 0, 00:17:08.119 "data_size": 65536 00:17:08.119 }, 00:17:08.119 { 00:17:08.119 "name": "BaseBdev3", 00:17:08.119 "uuid": "b41512ee-f0ba-4832-a0d2-57edd880c658", 00:17:08.119 "is_configured": true, 00:17:08.120 "data_offset": 0, 00:17:08.120 "data_size": 65536 00:17:08.120 } 00:17:08.120 ] 00:17:08.120 } 00:17:08.120 } 00:17:08.120 }' 00:17:08.120 05:45:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:17:08.379 05:45:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # 
base_bdev_names='NewBaseBdev 00:17:08.379 BaseBdev2 00:17:08.379 BaseBdev3' 00:17:08.379 05:45:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:08.379 05:45:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:17:08.379 05:45:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:08.654 05:45:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:08.654 "name": "NewBaseBdev", 00:17:08.654 "aliases": [ 00:17:08.654 "85ceae7e-a9cc-442f-9d13-967631d2d83c" 00:17:08.654 ], 00:17:08.654 "product_name": "Malloc disk", 00:17:08.654 "block_size": 512, 00:17:08.654 "num_blocks": 65536, 00:17:08.654 "uuid": "85ceae7e-a9cc-442f-9d13-967631d2d83c", 00:17:08.654 "assigned_rate_limits": { 00:17:08.654 "rw_ios_per_sec": 0, 00:17:08.654 "rw_mbytes_per_sec": 0, 00:17:08.654 "r_mbytes_per_sec": 0, 00:17:08.654 "w_mbytes_per_sec": 0 00:17:08.654 }, 00:17:08.654 "claimed": true, 00:17:08.654 "claim_type": "exclusive_write", 00:17:08.654 "zoned": false, 00:17:08.654 "supported_io_types": { 00:17:08.654 "read": true, 00:17:08.654 "write": true, 00:17:08.654 "unmap": true, 00:17:08.654 "flush": true, 00:17:08.654 "reset": true, 00:17:08.654 "nvme_admin": false, 00:17:08.654 "nvme_io": false, 00:17:08.654 "nvme_io_md": false, 00:17:08.654 "write_zeroes": true, 00:17:08.654 "zcopy": true, 00:17:08.654 "get_zone_info": false, 00:17:08.654 "zone_management": false, 00:17:08.654 "zone_append": false, 00:17:08.654 "compare": false, 00:17:08.654 "compare_and_write": false, 00:17:08.654 "abort": true, 00:17:08.654 "seek_hole": false, 00:17:08.654 "seek_data": false, 00:17:08.654 "copy": true, 00:17:08.654 "nvme_iov_md": false 00:17:08.654 }, 00:17:08.654 "memory_domains": [ 00:17:08.654 { 00:17:08.654 "dma_device_id": "system", 00:17:08.654 
"dma_device_type": 1 00:17:08.654 }, 00:17:08.654 { 00:17:08.654 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:08.654 "dma_device_type": 2 00:17:08.654 } 00:17:08.654 ], 00:17:08.654 "driver_specific": {} 00:17:08.654 }' 00:17:08.654 05:45:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:08.654 05:45:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:08.654 05:45:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:08.654 05:45:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:08.654 05:45:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:08.654 05:45:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:08.654 05:45:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:08.654 05:45:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:08.915 05:45:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:08.915 05:45:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:08.915 05:45:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:08.915 05:45:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:08.915 05:45:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:08.915 05:45:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:17:08.915 05:45:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:09.174 05:45:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:09.174 "name": 
"BaseBdev2", 00:17:09.174 "aliases": [ 00:17:09.174 "1eae77d6-2e5b-4c1b-9091-0a2aec88a9dc" 00:17:09.174 ], 00:17:09.174 "product_name": "Malloc disk", 00:17:09.174 "block_size": 512, 00:17:09.174 "num_blocks": 65536, 00:17:09.174 "uuid": "1eae77d6-2e5b-4c1b-9091-0a2aec88a9dc", 00:17:09.174 "assigned_rate_limits": { 00:17:09.174 "rw_ios_per_sec": 0, 00:17:09.174 "rw_mbytes_per_sec": 0, 00:17:09.174 "r_mbytes_per_sec": 0, 00:17:09.174 "w_mbytes_per_sec": 0 00:17:09.174 }, 00:17:09.174 "claimed": true, 00:17:09.174 "claim_type": "exclusive_write", 00:17:09.174 "zoned": false, 00:17:09.174 "supported_io_types": { 00:17:09.174 "read": true, 00:17:09.174 "write": true, 00:17:09.174 "unmap": true, 00:17:09.174 "flush": true, 00:17:09.174 "reset": true, 00:17:09.174 "nvme_admin": false, 00:17:09.174 "nvme_io": false, 00:17:09.174 "nvme_io_md": false, 00:17:09.174 "write_zeroes": true, 00:17:09.174 "zcopy": true, 00:17:09.174 "get_zone_info": false, 00:17:09.174 "zone_management": false, 00:17:09.174 "zone_append": false, 00:17:09.174 "compare": false, 00:17:09.174 "compare_and_write": false, 00:17:09.174 "abort": true, 00:17:09.174 "seek_hole": false, 00:17:09.174 "seek_data": false, 00:17:09.174 "copy": true, 00:17:09.174 "nvme_iov_md": false 00:17:09.174 }, 00:17:09.174 "memory_domains": [ 00:17:09.174 { 00:17:09.174 "dma_device_id": "system", 00:17:09.174 "dma_device_type": 1 00:17:09.174 }, 00:17:09.174 { 00:17:09.174 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:09.174 "dma_device_type": 2 00:17:09.174 } 00:17:09.174 ], 00:17:09.174 "driver_specific": {} 00:17:09.174 }' 00:17:09.174 05:45:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:09.174 05:45:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:09.174 05:45:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:09.174 05:45:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq 
.md_size 00:17:09.433 05:45:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:09.433 05:45:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:09.433 05:45:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:09.433 05:45:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:09.433 05:45:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:09.433 05:45:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:09.433 05:45:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:09.433 05:45:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:09.433 05:45:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:09.433 05:45:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:17:09.433 05:45:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:09.693 05:45:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:09.693 "name": "BaseBdev3", 00:17:09.693 "aliases": [ 00:17:09.693 "b41512ee-f0ba-4832-a0d2-57edd880c658" 00:17:09.693 ], 00:17:09.693 "product_name": "Malloc disk", 00:17:09.693 "block_size": 512, 00:17:09.693 "num_blocks": 65536, 00:17:09.693 "uuid": "b41512ee-f0ba-4832-a0d2-57edd880c658", 00:17:09.693 "assigned_rate_limits": { 00:17:09.693 "rw_ios_per_sec": 0, 00:17:09.693 "rw_mbytes_per_sec": 0, 00:17:09.693 "r_mbytes_per_sec": 0, 00:17:09.693 "w_mbytes_per_sec": 0 00:17:09.693 }, 00:17:09.693 "claimed": true, 00:17:09.693 "claim_type": "exclusive_write", 00:17:09.693 "zoned": false, 00:17:09.693 "supported_io_types": { 
00:17:09.693 "read": true, 00:17:09.693 "write": true, 00:17:09.693 "unmap": true, 00:17:09.693 "flush": true, 00:17:09.693 "reset": true, 00:17:09.693 "nvme_admin": false, 00:17:09.693 "nvme_io": false, 00:17:09.693 "nvme_io_md": false, 00:17:09.693 "write_zeroes": true, 00:17:09.693 "zcopy": true, 00:17:09.693 "get_zone_info": false, 00:17:09.693 "zone_management": false, 00:17:09.693 "zone_append": false, 00:17:09.693 "compare": false, 00:17:09.693 "compare_and_write": false, 00:17:09.693 "abort": true, 00:17:09.693 "seek_hole": false, 00:17:09.693 "seek_data": false, 00:17:09.693 "copy": true, 00:17:09.693 "nvme_iov_md": false 00:17:09.693 }, 00:17:09.693 "memory_domains": [ 00:17:09.693 { 00:17:09.693 "dma_device_id": "system", 00:17:09.693 "dma_device_type": 1 00:17:09.693 }, 00:17:09.693 { 00:17:09.693 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:09.693 "dma_device_type": 2 00:17:09.693 } 00:17:09.693 ], 00:17:09.693 "driver_specific": {} 00:17:09.693 }' 00:17:09.693 05:45:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:09.693 05:45:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:09.693 05:45:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:09.693 05:45:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:09.952 05:45:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:09.952 05:45:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:09.952 05:45:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:09.952 05:45:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:09.952 05:45:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:09.952 05:45:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq 
.dif_type 00:17:09.952 05:45:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:10.211 05:45:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:10.211 05:45:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:17:10.211 [2024-07-26 05:45:25.091119] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:17:10.211 [2024-07-26 05:45:25.091152] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:10.211 [2024-07-26 05:45:25.091210] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:10.211 [2024-07-26 05:45:25.091264] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:17:10.211 [2024-07-26 05:45:25.091275] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x24c5450 name Existed_Raid, state offline 00:17:10.211 05:45:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 1160454 00:17:10.211 05:45:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 1160454 ']' 00:17:10.211 05:45:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 1160454 00:17:10.211 05:45:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:17:10.211 05:45:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:17:10.211 05:45:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1160454 00:17:10.470 05:45:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:17:10.470 05:45:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' 
reactor_0 = sudo ']' 00:17:10.470 05:45:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1160454' 00:17:10.470 killing process with pid 1160454 00:17:10.470 05:45:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 1160454 00:17:10.470 [2024-07-26 05:45:25.160330] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:17:10.470 05:45:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 1160454 00:17:10.470 [2024-07-26 05:45:25.190037] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:17:10.730 05:45:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:17:10.730 00:17:10.730 real 0m28.704s 00:17:10.730 user 0m52.823s 00:17:10.730 sys 0m5.049s 00:17:10.730 05:45:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:17:10.730 05:45:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:10.730 ************************************ 00:17:10.730 END TEST raid_state_function_test 00:17:10.730 ************************************ 00:17:10.730 05:45:25 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:17:10.730 05:45:25 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test concat 3 true 00:17:10.730 05:45:25 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:17:10.730 05:45:25 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:17:10.730 05:45:25 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:17:10.730 ************************************ 00:17:10.730 START TEST raid_state_function_test_sb 00:17:10.730 ************************************ 00:17:10.730 05:45:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test concat 3 true 00:17:10.730 05:45:25 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:17:10.730 05:45:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:17:10.730 05:45:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:17:10.730 05:45:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:17:10.730 05:45:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:17:10.730 05:45:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:10.730 05:45:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:17:10.730 05:45:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:17:10.730 05:45:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:10.730 05:45:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:17:10.730 05:45:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:17:10.730 05:45:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:10.730 05:45:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:17:10.730 05:45:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:17:10.730 05:45:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:10.730 05:45:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:17:10.730 05:45:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:17:10.730 05:45:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:17:10.730 05:45:25 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@226 -- # local strip_size 00:17:10.730 05:45:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:17:10.730 05:45:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:17:10.730 05:45:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:17:10.730 05:45:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:17:10.730 05:45:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:17:10.730 05:45:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:17:10.730 05:45:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:17:10.730 05:45:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=1165251 00:17:10.730 05:45:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1165251' 00:17:10.730 Process raid pid: 1165251 00:17:10.730 05:45:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:17:10.730 05:45:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 1165251 /var/tmp/spdk-raid.sock 00:17:10.730 05:45:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 -- # '[' -z 1165251 ']' 00:17:10.730 05:45:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:17:10.730 05:45:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:17:10.730 05:45:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain 
socket /var/tmp/spdk-raid.sock...' 00:17:10.730 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:17:10.730 05:45:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:17:10.730 05:45:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:10.730 [2024-07-26 05:45:25.576608] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 00:17:10.730 [2024-07-26 05:45:25.576685] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:17:10.989 [2024-07-26 05:45:25.707242] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:17:10.989 [2024-07-26 05:45:25.809421] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:17:10.989 [2024-07-26 05:45:25.872261] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:10.989 [2024-07-26 05:45:25.872288] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:11.925 05:45:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:17:11.925 05:45:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:17:11.925 05:45:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:17:12.183 [2024-07-26 05:45:26.997528] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:17:12.183 [2024-07-26 05:45:26.997570] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:17:12.183 [2024-07-26 05:45:26.997581] 
bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:17:12.183 [2024-07-26 05:45:26.997593] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:17:12.183 [2024-07-26 05:45:26.997601] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:17:12.183 [2024-07-26 05:45:26.997613] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:17:12.183 05:45:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:17:12.183 05:45:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:12.183 05:45:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:12.183 05:45:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:17:12.183 05:45:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:12.183 05:45:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:12.183 05:45:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:12.183 05:45:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:12.183 05:45:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:12.183 05:45:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:12.183 05:45:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:12.183 05:45:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == 
"Existed_Raid")' 00:17:12.442 05:45:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:12.442 "name": "Existed_Raid", 00:17:12.442 "uuid": "a934f268-a95a-4786-8d98-a47431f47d8b", 00:17:12.442 "strip_size_kb": 64, 00:17:12.442 "state": "configuring", 00:17:12.442 "raid_level": "concat", 00:17:12.442 "superblock": true, 00:17:12.442 "num_base_bdevs": 3, 00:17:12.442 "num_base_bdevs_discovered": 0, 00:17:12.442 "num_base_bdevs_operational": 3, 00:17:12.442 "base_bdevs_list": [ 00:17:12.442 { 00:17:12.442 "name": "BaseBdev1", 00:17:12.442 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:12.442 "is_configured": false, 00:17:12.442 "data_offset": 0, 00:17:12.442 "data_size": 0 00:17:12.442 }, 00:17:12.442 { 00:17:12.442 "name": "BaseBdev2", 00:17:12.442 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:12.442 "is_configured": false, 00:17:12.442 "data_offset": 0, 00:17:12.442 "data_size": 0 00:17:12.442 }, 00:17:12.442 { 00:17:12.442 "name": "BaseBdev3", 00:17:12.442 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:12.442 "is_configured": false, 00:17:12.442 "data_offset": 0, 00:17:12.442 "data_size": 0 00:17:12.442 } 00:17:12.442 ] 00:17:12.442 }' 00:17:12.442 05:45:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:12.442 05:45:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:13.008 05:45:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:17:13.266 [2024-07-26 05:45:28.104302] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:17:13.266 [2024-07-26 05:45:28.104336] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1075a80 name Existed_Raid, state configuring 00:17:13.266 05:45:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:17:13.524 [2024-07-26 05:45:28.348966] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:17:13.525 [2024-07-26 05:45:28.348993] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:17:13.525 [2024-07-26 05:45:28.349002] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:17:13.525 [2024-07-26 05:45:28.349014] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:17:13.525 [2024-07-26 05:45:28.349022] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:17:13.525 [2024-07-26 05:45:28.349033] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:17:13.525 05:45:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:17:13.783 [2024-07-26 05:45:28.603557] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:17:13.783 BaseBdev1 00:17:13.783 05:45:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:17:13.783 05:45:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:17:13.783 05:45:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:13.783 05:45:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:17:13.783 05:45:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:13.783 05:45:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # 
bdev_timeout=2000 00:17:13.783 05:45:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:14.041 05:45:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:17:14.300 [ 00:17:14.300 { 00:17:14.300 "name": "BaseBdev1", 00:17:14.300 "aliases": [ 00:17:14.300 "5ed0c555-b16b-41be-8342-21ad2b20071f" 00:17:14.300 ], 00:17:14.300 "product_name": "Malloc disk", 00:17:14.300 "block_size": 512, 00:17:14.300 "num_blocks": 65536, 00:17:14.300 "uuid": "5ed0c555-b16b-41be-8342-21ad2b20071f", 00:17:14.300 "assigned_rate_limits": { 00:17:14.300 "rw_ios_per_sec": 0, 00:17:14.300 "rw_mbytes_per_sec": 0, 00:17:14.300 "r_mbytes_per_sec": 0, 00:17:14.300 "w_mbytes_per_sec": 0 00:17:14.300 }, 00:17:14.300 "claimed": true, 00:17:14.300 "claim_type": "exclusive_write", 00:17:14.300 "zoned": false, 00:17:14.300 "supported_io_types": { 00:17:14.300 "read": true, 00:17:14.300 "write": true, 00:17:14.300 "unmap": true, 00:17:14.300 "flush": true, 00:17:14.300 "reset": true, 00:17:14.300 "nvme_admin": false, 00:17:14.300 "nvme_io": false, 00:17:14.300 "nvme_io_md": false, 00:17:14.300 "write_zeroes": true, 00:17:14.300 "zcopy": true, 00:17:14.300 "get_zone_info": false, 00:17:14.300 "zone_management": false, 00:17:14.300 "zone_append": false, 00:17:14.300 "compare": false, 00:17:14.300 "compare_and_write": false, 00:17:14.300 "abort": true, 00:17:14.300 "seek_hole": false, 00:17:14.300 "seek_data": false, 00:17:14.300 "copy": true, 00:17:14.300 "nvme_iov_md": false 00:17:14.300 }, 00:17:14.300 "memory_domains": [ 00:17:14.300 { 00:17:14.300 "dma_device_id": "system", 00:17:14.300 "dma_device_type": 1 00:17:14.300 }, 00:17:14.300 { 00:17:14.300 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:14.300 
"dma_device_type": 2 00:17:14.300 } 00:17:14.300 ], 00:17:14.300 "driver_specific": {} 00:17:14.300 } 00:17:14.300 ] 00:17:14.300 05:45:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:17:14.300 05:45:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:17:14.300 05:45:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:14.300 05:45:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:14.300 05:45:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:17:14.300 05:45:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:14.300 05:45:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:14.300 05:45:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:14.300 05:45:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:14.300 05:45:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:14.300 05:45:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:14.300 05:45:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:14.300 05:45:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:14.558 05:45:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:14.558 "name": "Existed_Raid", 00:17:14.558 "uuid": "e340f084-f87d-4e66-8ffb-e66885c62774", 00:17:14.558 "strip_size_kb": 64, 
00:17:14.558 "state": "configuring", 00:17:14.558 "raid_level": "concat", 00:17:14.558 "superblock": true, 00:17:14.558 "num_base_bdevs": 3, 00:17:14.558 "num_base_bdevs_discovered": 1, 00:17:14.558 "num_base_bdevs_operational": 3, 00:17:14.558 "base_bdevs_list": [ 00:17:14.558 { 00:17:14.558 "name": "BaseBdev1", 00:17:14.558 "uuid": "5ed0c555-b16b-41be-8342-21ad2b20071f", 00:17:14.558 "is_configured": true, 00:17:14.558 "data_offset": 2048, 00:17:14.558 "data_size": 63488 00:17:14.558 }, 00:17:14.558 { 00:17:14.558 "name": "BaseBdev2", 00:17:14.558 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:14.558 "is_configured": false, 00:17:14.558 "data_offset": 0, 00:17:14.558 "data_size": 0 00:17:14.558 }, 00:17:14.558 { 00:17:14.558 "name": "BaseBdev3", 00:17:14.558 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:14.558 "is_configured": false, 00:17:14.558 "data_offset": 0, 00:17:14.558 "data_size": 0 00:17:14.558 } 00:17:14.558 ] 00:17:14.558 }' 00:17:14.558 05:45:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:14.558 05:45:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:15.160 05:45:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:17:15.423 [2024-07-26 05:45:30.171724] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:17:15.423 [2024-07-26 05:45:30.171770] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1075310 name Existed_Raid, state configuring 00:17:15.423 05:45:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:17:15.681 [2024-07-26 05:45:30.420425] 
bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:17:15.681 [2024-07-26 05:45:30.421876] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:17:15.681 [2024-07-26 05:45:30.421908] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:17:15.681 [2024-07-26 05:45:30.421918] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:17:15.682 [2024-07-26 05:45:30.421929] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:17:15.682 05:45:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:17:15.682 05:45:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:17:15.682 05:45:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:17:15.682 05:45:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:15.682 05:45:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:15.682 05:45:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:17:15.682 05:45:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:15.682 05:45:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:15.682 05:45:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:15.682 05:45:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:15.682 05:45:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:15.682 05:45:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 
-- # local tmp 00:17:15.682 05:45:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:15.682 05:45:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:15.940 05:45:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:15.940 "name": "Existed_Raid", 00:17:15.940 "uuid": "9c21a187-02e8-480b-b243-fe7926fa5f1c", 00:17:15.940 "strip_size_kb": 64, 00:17:15.940 "state": "configuring", 00:17:15.940 "raid_level": "concat", 00:17:15.940 "superblock": true, 00:17:15.940 "num_base_bdevs": 3, 00:17:15.940 "num_base_bdevs_discovered": 1, 00:17:15.940 "num_base_bdevs_operational": 3, 00:17:15.940 "base_bdevs_list": [ 00:17:15.940 { 00:17:15.940 "name": "BaseBdev1", 00:17:15.940 "uuid": "5ed0c555-b16b-41be-8342-21ad2b20071f", 00:17:15.940 "is_configured": true, 00:17:15.940 "data_offset": 2048, 00:17:15.940 "data_size": 63488 00:17:15.940 }, 00:17:15.940 { 00:17:15.940 "name": "BaseBdev2", 00:17:15.940 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:15.940 "is_configured": false, 00:17:15.940 "data_offset": 0, 00:17:15.940 "data_size": 0 00:17:15.940 }, 00:17:15.940 { 00:17:15.940 "name": "BaseBdev3", 00:17:15.940 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:15.940 "is_configured": false, 00:17:15.940 "data_offset": 0, 00:17:15.940 "data_size": 0 00:17:15.940 } 00:17:15.940 ] 00:17:15.940 }' 00:17:15.940 05:45:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:15.940 05:45:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:16.505 05:45:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 
00:17:16.763 [2024-07-26 05:45:31.534859] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:17:16.763 BaseBdev2 00:17:16.763 05:45:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:17:16.763 05:45:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:17:16.763 05:45:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:16.763 05:45:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:17:16.763 05:45:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:16.763 05:45:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:16.763 05:45:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:17.021 05:45:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:17:17.021 [ 00:17:17.021 { 00:17:17.021 "name": "BaseBdev2", 00:17:17.021 "aliases": [ 00:17:17.021 "fc9be932-fdf7-40b1-a4fd-20eb3d0a67ee" 00:17:17.021 ], 00:17:17.021 "product_name": "Malloc disk", 00:17:17.021 "block_size": 512, 00:17:17.021 "num_blocks": 65536, 00:17:17.021 "uuid": "fc9be932-fdf7-40b1-a4fd-20eb3d0a67ee", 00:17:17.021 "assigned_rate_limits": { 00:17:17.021 "rw_ios_per_sec": 0, 00:17:17.021 "rw_mbytes_per_sec": 0, 00:17:17.021 "r_mbytes_per_sec": 0, 00:17:17.021 "w_mbytes_per_sec": 0 00:17:17.021 }, 00:17:17.021 "claimed": true, 00:17:17.021 "claim_type": "exclusive_write", 00:17:17.021 "zoned": false, 00:17:17.021 "supported_io_types": { 00:17:17.021 "read": true, 00:17:17.021 "write": true, 
00:17:17.021 "unmap": true, 00:17:17.021 "flush": true, 00:17:17.021 "reset": true, 00:17:17.021 "nvme_admin": false, 00:17:17.021 "nvme_io": false, 00:17:17.021 "nvme_io_md": false, 00:17:17.021 "write_zeroes": true, 00:17:17.021 "zcopy": true, 00:17:17.021 "get_zone_info": false, 00:17:17.021 "zone_management": false, 00:17:17.021 "zone_append": false, 00:17:17.021 "compare": false, 00:17:17.021 "compare_and_write": false, 00:17:17.021 "abort": true, 00:17:17.021 "seek_hole": false, 00:17:17.021 "seek_data": false, 00:17:17.021 "copy": true, 00:17:17.021 "nvme_iov_md": false 00:17:17.021 }, 00:17:17.021 "memory_domains": [ 00:17:17.021 { 00:17:17.021 "dma_device_id": "system", 00:17:17.021 "dma_device_type": 1 00:17:17.021 }, 00:17:17.021 { 00:17:17.021 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:17.021 "dma_device_type": 2 00:17:17.021 } 00:17:17.021 ], 00:17:17.021 "driver_specific": {} 00:17:17.021 } 00:17:17.021 ] 00:17:17.021 05:45:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:17:17.021 05:45:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:17:17.021 05:45:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:17:17.021 05:45:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:17:17.021 05:45:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:17.021 05:45:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:17.021 05:45:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:17:17.021 05:45:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:17.021 05:45:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local 
num_base_bdevs_operational=3 00:17:17.021 05:45:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:17.022 05:45:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:17.022 05:45:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:17.022 05:45:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:17.022 05:45:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:17.022 05:45:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:17.279 05:45:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:17.279 "name": "Existed_Raid", 00:17:17.279 "uuid": "9c21a187-02e8-480b-b243-fe7926fa5f1c", 00:17:17.279 "strip_size_kb": 64, 00:17:17.279 "state": "configuring", 00:17:17.279 "raid_level": "concat", 00:17:17.279 "superblock": true, 00:17:17.279 "num_base_bdevs": 3, 00:17:17.279 "num_base_bdevs_discovered": 2, 00:17:17.279 "num_base_bdevs_operational": 3, 00:17:17.279 "base_bdevs_list": [ 00:17:17.279 { 00:17:17.279 "name": "BaseBdev1", 00:17:17.279 "uuid": "5ed0c555-b16b-41be-8342-21ad2b20071f", 00:17:17.279 "is_configured": true, 00:17:17.279 "data_offset": 2048, 00:17:17.279 "data_size": 63488 00:17:17.279 }, 00:17:17.279 { 00:17:17.280 "name": "BaseBdev2", 00:17:17.280 "uuid": "fc9be932-fdf7-40b1-a4fd-20eb3d0a67ee", 00:17:17.280 "is_configured": true, 00:17:17.280 "data_offset": 2048, 00:17:17.280 "data_size": 63488 00:17:17.280 }, 00:17:17.280 { 00:17:17.280 "name": "BaseBdev3", 00:17:17.280 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:17.280 "is_configured": false, 00:17:17.280 "data_offset": 0, 00:17:17.280 "data_size": 0 00:17:17.280 } 
00:17:17.280 ] 00:17:17.280 }' 00:17:17.280 05:45:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:17.280 05:45:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:18.213 05:45:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:17:18.778 [2024-07-26 05:45:33.468408] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:17:18.778 [2024-07-26 05:45:33.468589] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1076400 00:17:18.778 [2024-07-26 05:45:33.468603] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:17:18.778 [2024-07-26 05:45:33.468785] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1075ef0 00:17:18.778 [2024-07-26 05:45:33.468904] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1076400 00:17:18.778 [2024-07-26 05:45:33.468914] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1076400 00:17:18.778 [2024-07-26 05:45:33.469005] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:18.778 BaseBdev3 00:17:18.778 05:45:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:17:18.778 05:45:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:17:18.778 05:45:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:18.778 05:45:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:17:18.778 05:45:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:18.778 05:45:33 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:18.778 05:45:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:19.344 05:45:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:17:19.344 [ 00:17:19.344 { 00:17:19.344 "name": "BaseBdev3", 00:17:19.344 "aliases": [ 00:17:19.344 "2691aec3-4893-47e9-978d-8dc1d3bc2afa" 00:17:19.344 ], 00:17:19.344 "product_name": "Malloc disk", 00:17:19.344 "block_size": 512, 00:17:19.344 "num_blocks": 65536, 00:17:19.344 "uuid": "2691aec3-4893-47e9-978d-8dc1d3bc2afa", 00:17:19.344 "assigned_rate_limits": { 00:17:19.344 "rw_ios_per_sec": 0, 00:17:19.344 "rw_mbytes_per_sec": 0, 00:17:19.344 "r_mbytes_per_sec": 0, 00:17:19.344 "w_mbytes_per_sec": 0 00:17:19.344 }, 00:17:19.344 "claimed": true, 00:17:19.344 "claim_type": "exclusive_write", 00:17:19.344 "zoned": false, 00:17:19.344 "supported_io_types": { 00:17:19.344 "read": true, 00:17:19.344 "write": true, 00:17:19.344 "unmap": true, 00:17:19.344 "flush": true, 00:17:19.344 "reset": true, 00:17:19.344 "nvme_admin": false, 00:17:19.344 "nvme_io": false, 00:17:19.344 "nvme_io_md": false, 00:17:19.344 "write_zeroes": true, 00:17:19.344 "zcopy": true, 00:17:19.344 "get_zone_info": false, 00:17:19.344 "zone_management": false, 00:17:19.344 "zone_append": false, 00:17:19.344 "compare": false, 00:17:19.344 "compare_and_write": false, 00:17:19.344 "abort": true, 00:17:19.344 "seek_hole": false, 00:17:19.344 "seek_data": false, 00:17:19.344 "copy": true, 00:17:19.344 "nvme_iov_md": false 00:17:19.344 }, 00:17:19.344 "memory_domains": [ 00:17:19.344 { 00:17:19.344 "dma_device_id": "system", 00:17:19.344 "dma_device_type": 1 00:17:19.344 }, 00:17:19.344 { 00:17:19.344 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:17:19.344 "dma_device_type": 2 00:17:19.344 } 00:17:19.344 ], 00:17:19.344 "driver_specific": {} 00:17:19.344 } 00:17:19.344 ] 00:17:19.602 05:45:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:17:19.602 05:45:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:17:19.602 05:45:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:17:19.602 05:45:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 3 00:17:19.602 05:45:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:19.602 05:45:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:19.602 05:45:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:17:19.602 05:45:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:19.602 05:45:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:19.602 05:45:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:19.602 05:45:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:19.602 05:45:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:19.602 05:45:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:19.602 05:45:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:19.602 05:45:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 
00:17:19.602 05:45:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:19.602 "name": "Existed_Raid", 00:17:19.602 "uuid": "9c21a187-02e8-480b-b243-fe7926fa5f1c", 00:17:19.602 "strip_size_kb": 64, 00:17:19.602 "state": "online", 00:17:19.602 "raid_level": "concat", 00:17:19.602 "superblock": true, 00:17:19.602 "num_base_bdevs": 3, 00:17:19.602 "num_base_bdevs_discovered": 3, 00:17:19.602 "num_base_bdevs_operational": 3, 00:17:19.602 "base_bdevs_list": [ 00:17:19.602 { 00:17:19.602 "name": "BaseBdev1", 00:17:19.602 "uuid": "5ed0c555-b16b-41be-8342-21ad2b20071f", 00:17:19.602 "is_configured": true, 00:17:19.602 "data_offset": 2048, 00:17:19.602 "data_size": 63488 00:17:19.602 }, 00:17:19.602 { 00:17:19.602 "name": "BaseBdev2", 00:17:19.602 "uuid": "fc9be932-fdf7-40b1-a4fd-20eb3d0a67ee", 00:17:19.602 "is_configured": true, 00:17:19.602 "data_offset": 2048, 00:17:19.602 "data_size": 63488 00:17:19.602 }, 00:17:19.602 { 00:17:19.602 "name": "BaseBdev3", 00:17:19.602 "uuid": "2691aec3-4893-47e9-978d-8dc1d3bc2afa", 00:17:19.602 "is_configured": true, 00:17:19.602 "data_offset": 2048, 00:17:19.602 "data_size": 63488 00:17:19.602 } 00:17:19.602 ] 00:17:19.602 }' 00:17:19.602 05:45:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:19.602 05:45:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:20.168 05:45:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:17:20.168 05:45:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:17:20.168 05:45:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:17:20.168 05:45:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:17:20.168 05:45:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local 
base_bdev_names 00:17:20.168 05:45:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:17:20.168 05:45:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:17:20.168 05:45:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:17:20.426 [2024-07-26 05:45:35.209339] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:20.426 05:45:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:17:20.426 "name": "Existed_Raid", 00:17:20.426 "aliases": [ 00:17:20.426 "9c21a187-02e8-480b-b243-fe7926fa5f1c" 00:17:20.426 ], 00:17:20.426 "product_name": "Raid Volume", 00:17:20.426 "block_size": 512, 00:17:20.426 "num_blocks": 190464, 00:17:20.426 "uuid": "9c21a187-02e8-480b-b243-fe7926fa5f1c", 00:17:20.426 "assigned_rate_limits": { 00:17:20.426 "rw_ios_per_sec": 0, 00:17:20.426 "rw_mbytes_per_sec": 0, 00:17:20.426 "r_mbytes_per_sec": 0, 00:17:20.426 "w_mbytes_per_sec": 0 00:17:20.426 }, 00:17:20.426 "claimed": false, 00:17:20.426 "zoned": false, 00:17:20.426 "supported_io_types": { 00:17:20.426 "read": true, 00:17:20.426 "write": true, 00:17:20.426 "unmap": true, 00:17:20.426 "flush": true, 00:17:20.426 "reset": true, 00:17:20.426 "nvme_admin": false, 00:17:20.426 "nvme_io": false, 00:17:20.426 "nvme_io_md": false, 00:17:20.426 "write_zeroes": true, 00:17:20.426 "zcopy": false, 00:17:20.426 "get_zone_info": false, 00:17:20.426 "zone_management": false, 00:17:20.426 "zone_append": false, 00:17:20.426 "compare": false, 00:17:20.426 "compare_and_write": false, 00:17:20.426 "abort": false, 00:17:20.426 "seek_hole": false, 00:17:20.426 "seek_data": false, 00:17:20.426 "copy": false, 00:17:20.426 "nvme_iov_md": false 00:17:20.426 }, 00:17:20.426 "memory_domains": [ 00:17:20.426 { 00:17:20.426 "dma_device_id": "system", 
00:17:20.426 "dma_device_type": 1 00:17:20.426 }, 00:17:20.426 { 00:17:20.426 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:20.426 "dma_device_type": 2 00:17:20.426 }, 00:17:20.426 { 00:17:20.426 "dma_device_id": "system", 00:17:20.426 "dma_device_type": 1 00:17:20.426 }, 00:17:20.426 { 00:17:20.426 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:20.426 "dma_device_type": 2 00:17:20.426 }, 00:17:20.426 { 00:17:20.426 "dma_device_id": "system", 00:17:20.426 "dma_device_type": 1 00:17:20.426 }, 00:17:20.426 { 00:17:20.426 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:20.426 "dma_device_type": 2 00:17:20.426 } 00:17:20.426 ], 00:17:20.426 "driver_specific": { 00:17:20.426 "raid": { 00:17:20.426 "uuid": "9c21a187-02e8-480b-b243-fe7926fa5f1c", 00:17:20.426 "strip_size_kb": 64, 00:17:20.426 "state": "online", 00:17:20.426 "raid_level": "concat", 00:17:20.426 "superblock": true, 00:17:20.426 "num_base_bdevs": 3, 00:17:20.426 "num_base_bdevs_discovered": 3, 00:17:20.426 "num_base_bdevs_operational": 3, 00:17:20.426 "base_bdevs_list": [ 00:17:20.426 { 00:17:20.426 "name": "BaseBdev1", 00:17:20.426 "uuid": "5ed0c555-b16b-41be-8342-21ad2b20071f", 00:17:20.426 "is_configured": true, 00:17:20.426 "data_offset": 2048, 00:17:20.426 "data_size": 63488 00:17:20.426 }, 00:17:20.426 { 00:17:20.426 "name": "BaseBdev2", 00:17:20.426 "uuid": "fc9be932-fdf7-40b1-a4fd-20eb3d0a67ee", 00:17:20.426 "is_configured": true, 00:17:20.426 "data_offset": 2048, 00:17:20.426 "data_size": 63488 00:17:20.426 }, 00:17:20.426 { 00:17:20.426 "name": "BaseBdev3", 00:17:20.426 "uuid": "2691aec3-4893-47e9-978d-8dc1d3bc2afa", 00:17:20.426 "is_configured": true, 00:17:20.426 "data_offset": 2048, 00:17:20.426 "data_size": 63488 00:17:20.426 } 00:17:20.426 ] 00:17:20.426 } 00:17:20.426 } 00:17:20.426 }' 00:17:20.426 05:45:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:17:20.426 05:45:35 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:17:20.426 BaseBdev2 00:17:20.426 BaseBdev3' 00:17:20.426 05:45:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:20.426 05:45:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:17:20.426 05:45:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:20.684 05:45:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:20.684 "name": "BaseBdev1", 00:17:20.684 "aliases": [ 00:17:20.684 "5ed0c555-b16b-41be-8342-21ad2b20071f" 00:17:20.684 ], 00:17:20.684 "product_name": "Malloc disk", 00:17:20.684 "block_size": 512, 00:17:20.684 "num_blocks": 65536, 00:17:20.684 "uuid": "5ed0c555-b16b-41be-8342-21ad2b20071f", 00:17:20.684 "assigned_rate_limits": { 00:17:20.684 "rw_ios_per_sec": 0, 00:17:20.684 "rw_mbytes_per_sec": 0, 00:17:20.684 "r_mbytes_per_sec": 0, 00:17:20.684 "w_mbytes_per_sec": 0 00:17:20.684 }, 00:17:20.684 "claimed": true, 00:17:20.684 "claim_type": "exclusive_write", 00:17:20.684 "zoned": false, 00:17:20.684 "supported_io_types": { 00:17:20.684 "read": true, 00:17:20.684 "write": true, 00:17:20.684 "unmap": true, 00:17:20.684 "flush": true, 00:17:20.684 "reset": true, 00:17:20.684 "nvme_admin": false, 00:17:20.684 "nvme_io": false, 00:17:20.684 "nvme_io_md": false, 00:17:20.684 "write_zeroes": true, 00:17:20.684 "zcopy": true, 00:17:20.684 "get_zone_info": false, 00:17:20.684 "zone_management": false, 00:17:20.685 "zone_append": false, 00:17:20.685 "compare": false, 00:17:20.685 "compare_and_write": false, 00:17:20.685 "abort": true, 00:17:20.685 "seek_hole": false, 00:17:20.685 "seek_data": false, 00:17:20.685 "copy": true, 00:17:20.685 "nvme_iov_md": false 00:17:20.685 }, 00:17:20.685 "memory_domains": 
[ 00:17:20.685 { 00:17:20.685 "dma_device_id": "system", 00:17:20.685 "dma_device_type": 1 00:17:20.685 }, 00:17:20.685 { 00:17:20.685 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:20.685 "dma_device_type": 2 00:17:20.685 } 00:17:20.685 ], 00:17:20.685 "driver_specific": {} 00:17:20.685 }' 00:17:20.685 05:45:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:20.685 05:45:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:20.943 05:45:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:20.943 05:45:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:20.943 05:45:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:20.943 05:45:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:20.943 05:45:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:20.943 05:45:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:20.943 05:45:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:20.943 05:45:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:20.943 05:45:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:21.201 05:45:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:21.201 05:45:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:21.201 05:45:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:17:21.201 05:45:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 
00:17:21.459 05:45:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:21.459 "name": "BaseBdev2", 00:17:21.459 "aliases": [ 00:17:21.459 "fc9be932-fdf7-40b1-a4fd-20eb3d0a67ee" 00:17:21.459 ], 00:17:21.459 "product_name": "Malloc disk", 00:17:21.459 "block_size": 512, 00:17:21.459 "num_blocks": 65536, 00:17:21.459 "uuid": "fc9be932-fdf7-40b1-a4fd-20eb3d0a67ee", 00:17:21.459 "assigned_rate_limits": { 00:17:21.459 "rw_ios_per_sec": 0, 00:17:21.459 "rw_mbytes_per_sec": 0, 00:17:21.459 "r_mbytes_per_sec": 0, 00:17:21.459 "w_mbytes_per_sec": 0 00:17:21.459 }, 00:17:21.459 "claimed": true, 00:17:21.459 "claim_type": "exclusive_write", 00:17:21.459 "zoned": false, 00:17:21.459 "supported_io_types": { 00:17:21.459 "read": true, 00:17:21.459 "write": true, 00:17:21.459 "unmap": true, 00:17:21.459 "flush": true, 00:17:21.459 "reset": true, 00:17:21.459 "nvme_admin": false, 00:17:21.459 "nvme_io": false, 00:17:21.459 "nvme_io_md": false, 00:17:21.459 "write_zeroes": true, 00:17:21.459 "zcopy": true, 00:17:21.459 "get_zone_info": false, 00:17:21.459 "zone_management": false, 00:17:21.459 "zone_append": false, 00:17:21.459 "compare": false, 00:17:21.459 "compare_and_write": false, 00:17:21.459 "abort": true, 00:17:21.459 "seek_hole": false, 00:17:21.459 "seek_data": false, 00:17:21.459 "copy": true, 00:17:21.459 "nvme_iov_md": false 00:17:21.459 }, 00:17:21.459 "memory_domains": [ 00:17:21.459 { 00:17:21.459 "dma_device_id": "system", 00:17:21.459 "dma_device_type": 1 00:17:21.459 }, 00:17:21.459 { 00:17:21.459 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:21.459 "dma_device_type": 2 00:17:21.459 } 00:17:21.459 ], 00:17:21.459 "driver_specific": {} 00:17:21.459 }' 00:17:21.459 05:45:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:21.459 05:45:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:21.459 05:45:36 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:21.459 05:45:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:21.459 05:45:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:21.459 05:45:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:21.459 05:45:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:21.459 05:45:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:21.717 05:45:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:21.717 05:45:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:21.717 05:45:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:21.717 05:45:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:21.717 05:45:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:21.717 05:45:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:21.717 05:45:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:17:21.975 05:45:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:21.975 "name": "BaseBdev3", 00:17:21.975 "aliases": [ 00:17:21.975 "2691aec3-4893-47e9-978d-8dc1d3bc2afa" 00:17:21.975 ], 00:17:21.975 "product_name": "Malloc disk", 00:17:21.975 "block_size": 512, 00:17:21.975 "num_blocks": 65536, 00:17:21.975 "uuid": "2691aec3-4893-47e9-978d-8dc1d3bc2afa", 00:17:21.975 "assigned_rate_limits": { 00:17:21.975 "rw_ios_per_sec": 0, 00:17:21.975 "rw_mbytes_per_sec": 0, 00:17:21.975 "r_mbytes_per_sec": 0, 00:17:21.975 
"w_mbytes_per_sec": 0 00:17:21.975 }, 00:17:21.975 "claimed": true, 00:17:21.975 "claim_type": "exclusive_write", 00:17:21.975 "zoned": false, 00:17:21.975 "supported_io_types": { 00:17:21.975 "read": true, 00:17:21.975 "write": true, 00:17:21.975 "unmap": true, 00:17:21.975 "flush": true, 00:17:21.975 "reset": true, 00:17:21.975 "nvme_admin": false, 00:17:21.975 "nvme_io": false, 00:17:21.975 "nvme_io_md": false, 00:17:21.975 "write_zeroes": true, 00:17:21.975 "zcopy": true, 00:17:21.975 "get_zone_info": false, 00:17:21.975 "zone_management": false, 00:17:21.975 "zone_append": false, 00:17:21.975 "compare": false, 00:17:21.975 "compare_and_write": false, 00:17:21.975 "abort": true, 00:17:21.975 "seek_hole": false, 00:17:21.975 "seek_data": false, 00:17:21.975 "copy": true, 00:17:21.975 "nvme_iov_md": false 00:17:21.975 }, 00:17:21.975 "memory_domains": [ 00:17:21.975 { 00:17:21.975 "dma_device_id": "system", 00:17:21.975 "dma_device_type": 1 00:17:21.975 }, 00:17:21.975 { 00:17:21.975 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:21.975 "dma_device_type": 2 00:17:21.975 } 00:17:21.975 ], 00:17:21.975 "driver_specific": {} 00:17:21.975 }' 00:17:21.975 05:45:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:21.975 05:45:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:21.975 05:45:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:21.975 05:45:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:21.975 05:45:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:21.975 05:45:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:21.975 05:45:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:21.975 05:45:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq 
.md_interleave 00:17:22.233 05:45:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:22.233 05:45:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:22.233 05:45:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:22.233 05:45:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:22.233 05:45:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:17:22.492 [2024-07-26 05:45:37.198370] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:17:22.492 [2024-07-26 05:45:37.198397] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:22.492 [2024-07-26 05:45:37.198437] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:22.492 05:45:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:17:22.492 05:45:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:17:22.492 05:45:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:17:22.492 05:45:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:17:22.492 05:45:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:17:22.492 05:45:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 2 00:17:22.492 05:45:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:22.492 05:45:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:17:22.492 05:45:37 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:17:22.492 05:45:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:22.492 05:45:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:17:22.492 05:45:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:22.492 05:45:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:22.492 05:45:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:22.492 05:45:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:22.492 05:45:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:22.492 05:45:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:22.750 05:45:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:22.750 "name": "Existed_Raid", 00:17:22.750 "uuid": "9c21a187-02e8-480b-b243-fe7926fa5f1c", 00:17:22.750 "strip_size_kb": 64, 00:17:22.750 "state": "offline", 00:17:22.750 "raid_level": "concat", 00:17:22.750 "superblock": true, 00:17:22.750 "num_base_bdevs": 3, 00:17:22.750 "num_base_bdevs_discovered": 2, 00:17:22.750 "num_base_bdevs_operational": 2, 00:17:22.750 "base_bdevs_list": [ 00:17:22.750 { 00:17:22.750 "name": null, 00:17:22.750 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:22.750 "is_configured": false, 00:17:22.750 "data_offset": 2048, 00:17:22.750 "data_size": 63488 00:17:22.750 }, 00:17:22.750 { 00:17:22.750 "name": "BaseBdev2", 00:17:22.750 "uuid": "fc9be932-fdf7-40b1-a4fd-20eb3d0a67ee", 00:17:22.750 "is_configured": true, 00:17:22.750 "data_offset": 2048, 00:17:22.750 "data_size": 
63488 00:17:22.750 }, 00:17:22.750 { 00:17:22.750 "name": "BaseBdev3", 00:17:22.750 "uuid": "2691aec3-4893-47e9-978d-8dc1d3bc2afa", 00:17:22.750 "is_configured": true, 00:17:22.750 "data_offset": 2048, 00:17:22.750 "data_size": 63488 00:17:22.750 } 00:17:22.750 ] 00:17:22.750 }' 00:17:22.750 05:45:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:22.750 05:45:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:23.316 05:45:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:17:23.316 05:45:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:17:23.316 05:45:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:23.316 05:45:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:17:23.316 05:45:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:17:23.316 05:45:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:17:23.316 05:45:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:17:23.575 [2024-07-26 05:45:38.442692] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:17:23.575 05:45:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:17:23.575 05:45:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:17:23.575 05:45:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 
00:17:23.575 05:45:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:17:23.834 05:45:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:17:23.834 05:45:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:17:23.834 05:45:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:17:24.092 [2024-07-26 05:45:38.942610] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:17:24.092 [2024-07-26 05:45:38.942663] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1076400 name Existed_Raid, state offline 00:17:24.092 05:45:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:17:24.092 05:45:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:17:24.092 05:45:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:24.092 05:45:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:17:24.350 05:45:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:17:24.350 05:45:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:17:24.350 05:45:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:17:24.350 05:45:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:17:24.350 05:45:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:17:24.350 05:45:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:17:24.608 BaseBdev2 00:17:24.608 05:45:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:17:24.608 05:45:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:17:24.608 05:45:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:24.608 05:45:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:17:24.608 05:45:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:24.608 05:45:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:24.608 05:45:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:24.866 05:45:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:17:25.125 [ 00:17:25.125 { 00:17:25.125 "name": "BaseBdev2", 00:17:25.125 "aliases": [ 00:17:25.125 "bb91ee0a-1399-40fc-8901-8738f21f671f" 00:17:25.125 ], 00:17:25.125 "product_name": "Malloc disk", 00:17:25.125 "block_size": 512, 00:17:25.125 "num_blocks": 65536, 00:17:25.125 "uuid": "bb91ee0a-1399-40fc-8901-8738f21f671f", 00:17:25.125 "assigned_rate_limits": { 00:17:25.125 "rw_ios_per_sec": 0, 00:17:25.125 "rw_mbytes_per_sec": 0, 00:17:25.125 "r_mbytes_per_sec": 0, 00:17:25.125 "w_mbytes_per_sec": 0 00:17:25.125 }, 00:17:25.125 "claimed": false, 00:17:25.125 "zoned": false, 00:17:25.125 "supported_io_types": { 00:17:25.125 "read": true, 00:17:25.125 "write": true, 00:17:25.125 "unmap": true, 00:17:25.125 "flush": 
true, 00:17:25.125 "reset": true, 00:17:25.125 "nvme_admin": false, 00:17:25.125 "nvme_io": false, 00:17:25.125 "nvme_io_md": false, 00:17:25.125 "write_zeroes": true, 00:17:25.125 "zcopy": true, 00:17:25.125 "get_zone_info": false, 00:17:25.125 "zone_management": false, 00:17:25.125 "zone_append": false, 00:17:25.125 "compare": false, 00:17:25.125 "compare_and_write": false, 00:17:25.125 "abort": true, 00:17:25.125 "seek_hole": false, 00:17:25.125 "seek_data": false, 00:17:25.125 "copy": true, 00:17:25.125 "nvme_iov_md": false 00:17:25.125 }, 00:17:25.125 "memory_domains": [ 00:17:25.125 { 00:17:25.125 "dma_device_id": "system", 00:17:25.125 "dma_device_type": 1 00:17:25.125 }, 00:17:25.125 { 00:17:25.125 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:25.125 "dma_device_type": 2 00:17:25.125 } 00:17:25.125 ], 00:17:25.125 "driver_specific": {} 00:17:25.125 } 00:17:25.125 ] 00:17:25.125 05:45:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:17:25.125 05:45:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:17:25.125 05:45:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:17:25.125 05:45:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:17:25.384 BaseBdev3 00:17:25.384 05:45:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:17:25.384 05:45:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:17:25.384 05:45:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:25.384 05:45:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:17:25.384 05:45:40 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:25.384 05:45:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:25.384 05:45:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:25.642 05:45:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:17:25.900 [ 00:17:25.900 { 00:17:25.900 "name": "BaseBdev3", 00:17:25.900 "aliases": [ 00:17:25.900 "0dd12748-6350-4233-8a5d-8ede825beebb" 00:17:25.900 ], 00:17:25.900 "product_name": "Malloc disk", 00:17:25.900 "block_size": 512, 00:17:25.900 "num_blocks": 65536, 00:17:25.900 "uuid": "0dd12748-6350-4233-8a5d-8ede825beebb", 00:17:25.900 "assigned_rate_limits": { 00:17:25.900 "rw_ios_per_sec": 0, 00:17:25.900 "rw_mbytes_per_sec": 0, 00:17:25.900 "r_mbytes_per_sec": 0, 00:17:25.900 "w_mbytes_per_sec": 0 00:17:25.900 }, 00:17:25.900 "claimed": false, 00:17:25.900 "zoned": false, 00:17:25.900 "supported_io_types": { 00:17:25.900 "read": true, 00:17:25.900 "write": true, 00:17:25.900 "unmap": true, 00:17:25.900 "flush": true, 00:17:25.900 "reset": true, 00:17:25.900 "nvme_admin": false, 00:17:25.900 "nvme_io": false, 00:17:25.900 "nvme_io_md": false, 00:17:25.900 "write_zeroes": true, 00:17:25.900 "zcopy": true, 00:17:25.900 "get_zone_info": false, 00:17:25.900 "zone_management": false, 00:17:25.900 "zone_append": false, 00:17:25.900 "compare": false, 00:17:25.900 "compare_and_write": false, 00:17:25.900 "abort": true, 00:17:25.900 "seek_hole": false, 00:17:25.900 "seek_data": false, 00:17:25.900 "copy": true, 00:17:25.900 "nvme_iov_md": false 00:17:25.900 }, 00:17:25.900 "memory_domains": [ 00:17:25.900 { 00:17:25.900 "dma_device_id": "system", 00:17:25.900 "dma_device_type": 1 
00:17:25.900 }, 00:17:25.900 { 00:17:25.900 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:25.900 "dma_device_type": 2 00:17:25.900 } 00:17:25.900 ], 00:17:25.900 "driver_specific": {} 00:17:25.900 } 00:17:25.900 ] 00:17:25.900 05:45:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:17:25.900 05:45:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:17:25.900 05:45:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:17:25.900 05:45:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:17:26.159 [2024-07-26 05:45:40.893177] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:17:26.159 [2024-07-26 05:45:40.893222] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:17:26.159 [2024-07-26 05:45:40.893239] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:17:26.159 [2024-07-26 05:45:40.894630] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:17:26.159 05:45:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:17:26.159 05:45:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:26.159 05:45:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:26.159 05:45:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:17:26.159 05:45:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:26.159 05:45:40 bdev_raid.raid_state_function_test_sb 
-- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:26.159 05:45:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:26.159 05:45:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:26.159 05:45:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:26.159 05:45:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:26.159 05:45:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:26.159 05:45:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:26.417 05:45:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:26.417 "name": "Existed_Raid", 00:17:26.417 "uuid": "20c3f438-2bd7-4609-b2b5-eb7662d872ea", 00:17:26.417 "strip_size_kb": 64, 00:17:26.417 "state": "configuring", 00:17:26.417 "raid_level": "concat", 00:17:26.417 "superblock": true, 00:17:26.417 "num_base_bdevs": 3, 00:17:26.417 "num_base_bdevs_discovered": 2, 00:17:26.417 "num_base_bdevs_operational": 3, 00:17:26.417 "base_bdevs_list": [ 00:17:26.417 { 00:17:26.417 "name": "BaseBdev1", 00:17:26.417 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:26.417 "is_configured": false, 00:17:26.417 "data_offset": 0, 00:17:26.417 "data_size": 0 00:17:26.417 }, 00:17:26.417 { 00:17:26.418 "name": "BaseBdev2", 00:17:26.418 "uuid": "bb91ee0a-1399-40fc-8901-8738f21f671f", 00:17:26.418 "is_configured": true, 00:17:26.418 "data_offset": 2048, 00:17:26.418 "data_size": 63488 00:17:26.418 }, 00:17:26.418 { 00:17:26.418 "name": "BaseBdev3", 00:17:26.418 "uuid": "0dd12748-6350-4233-8a5d-8ede825beebb", 00:17:26.418 "is_configured": true, 00:17:26.418 "data_offset": 2048, 00:17:26.418 
"data_size": 63488 00:17:26.418 } 00:17:26.418 ] 00:17:26.418 }' 00:17:26.418 05:45:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:26.418 05:45:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:27.352 05:45:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:17:27.610 [2024-07-26 05:45:42.268786] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:17:27.610 05:45:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:17:27.610 05:45:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:27.610 05:45:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:27.610 05:45:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:17:27.610 05:45:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:27.610 05:45:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:27.610 05:45:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:27.610 05:45:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:27.610 05:45:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:27.610 05:45:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:27.610 05:45:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 
00:17:27.610 05:45:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:27.610 05:45:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:27.610 "name": "Existed_Raid", 00:17:27.610 "uuid": "20c3f438-2bd7-4609-b2b5-eb7662d872ea", 00:17:27.610 "strip_size_kb": 64, 00:17:27.610 "state": "configuring", 00:17:27.610 "raid_level": "concat", 00:17:27.610 "superblock": true, 00:17:27.610 "num_base_bdevs": 3, 00:17:27.610 "num_base_bdevs_discovered": 1, 00:17:27.610 "num_base_bdevs_operational": 3, 00:17:27.610 "base_bdevs_list": [ 00:17:27.610 { 00:17:27.610 "name": "BaseBdev1", 00:17:27.610 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:27.610 "is_configured": false, 00:17:27.610 "data_offset": 0, 00:17:27.610 "data_size": 0 00:17:27.610 }, 00:17:27.610 { 00:17:27.610 "name": null, 00:17:27.610 "uuid": "bb91ee0a-1399-40fc-8901-8738f21f671f", 00:17:27.610 "is_configured": false, 00:17:27.610 "data_offset": 2048, 00:17:27.610 "data_size": 63488 00:17:27.610 }, 00:17:27.610 { 00:17:27.610 "name": "BaseBdev3", 00:17:27.610 "uuid": "0dd12748-6350-4233-8a5d-8ede825beebb", 00:17:27.610 "is_configured": true, 00:17:27.610 "data_offset": 2048, 00:17:27.610 "data_size": 63488 00:17:27.610 } 00:17:27.610 ] 00:17:27.610 }' 00:17:27.610 05:45:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:27.610 05:45:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:28.176 05:45:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:28.176 05:45:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:17:28.434 05:45:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == 
\f\a\l\s\e ]] 00:17:28.434 05:45:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:17:28.691 [2024-07-26 05:45:43.551719] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:17:28.691 BaseBdev1 00:17:28.691 05:45:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:17:28.691 05:45:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:17:28.691 05:45:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:28.691 05:45:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:17:28.691 05:45:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:28.691 05:45:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:28.691 05:45:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:28.950 05:45:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:17:29.208 [ 00:17:29.208 { 00:17:29.208 "name": "BaseBdev1", 00:17:29.208 "aliases": [ 00:17:29.208 "19380e45-56e9-476f-b0b9-1e5b717feca1" 00:17:29.208 ], 00:17:29.208 "product_name": "Malloc disk", 00:17:29.208 "block_size": 512, 00:17:29.208 "num_blocks": 65536, 00:17:29.208 "uuid": "19380e45-56e9-476f-b0b9-1e5b717feca1", 00:17:29.208 "assigned_rate_limits": { 00:17:29.208 "rw_ios_per_sec": 0, 00:17:29.208 "rw_mbytes_per_sec": 0, 00:17:29.208 "r_mbytes_per_sec": 0, 00:17:29.208 
"w_mbytes_per_sec": 0 00:17:29.208 }, 00:17:29.208 "claimed": true, 00:17:29.208 "claim_type": "exclusive_write", 00:17:29.208 "zoned": false, 00:17:29.208 "supported_io_types": { 00:17:29.208 "read": true, 00:17:29.208 "write": true, 00:17:29.208 "unmap": true, 00:17:29.208 "flush": true, 00:17:29.208 "reset": true, 00:17:29.208 "nvme_admin": false, 00:17:29.208 "nvme_io": false, 00:17:29.208 "nvme_io_md": false, 00:17:29.208 "write_zeroes": true, 00:17:29.208 "zcopy": true, 00:17:29.208 "get_zone_info": false, 00:17:29.208 "zone_management": false, 00:17:29.208 "zone_append": false, 00:17:29.208 "compare": false, 00:17:29.208 "compare_and_write": false, 00:17:29.208 "abort": true, 00:17:29.208 "seek_hole": false, 00:17:29.208 "seek_data": false, 00:17:29.208 "copy": true, 00:17:29.209 "nvme_iov_md": false 00:17:29.209 }, 00:17:29.209 "memory_domains": [ 00:17:29.209 { 00:17:29.209 "dma_device_id": "system", 00:17:29.209 "dma_device_type": 1 00:17:29.209 }, 00:17:29.209 { 00:17:29.209 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:29.209 "dma_device_type": 2 00:17:29.209 } 00:17:29.209 ], 00:17:29.209 "driver_specific": {} 00:17:29.209 } 00:17:29.209 ] 00:17:29.209 05:45:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:17:29.209 05:45:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:17:29.209 05:45:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:29.209 05:45:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:29.209 05:45:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:17:29.209 05:45:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:29.209 05:45:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local 
num_base_bdevs_operational=3 00:17:29.209 05:45:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:29.209 05:45:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:29.209 05:45:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:29.209 05:45:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:29.209 05:45:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:29.209 05:45:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:29.468 05:45:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:29.468 "name": "Existed_Raid", 00:17:29.468 "uuid": "20c3f438-2bd7-4609-b2b5-eb7662d872ea", 00:17:29.468 "strip_size_kb": 64, 00:17:29.468 "state": "configuring", 00:17:29.468 "raid_level": "concat", 00:17:29.468 "superblock": true, 00:17:29.468 "num_base_bdevs": 3, 00:17:29.468 "num_base_bdevs_discovered": 2, 00:17:29.468 "num_base_bdevs_operational": 3, 00:17:29.468 "base_bdevs_list": [ 00:17:29.468 { 00:17:29.468 "name": "BaseBdev1", 00:17:29.468 "uuid": "19380e45-56e9-476f-b0b9-1e5b717feca1", 00:17:29.468 "is_configured": true, 00:17:29.468 "data_offset": 2048, 00:17:29.468 "data_size": 63488 00:17:29.468 }, 00:17:29.468 { 00:17:29.468 "name": null, 00:17:29.468 "uuid": "bb91ee0a-1399-40fc-8901-8738f21f671f", 00:17:29.468 "is_configured": false, 00:17:29.468 "data_offset": 2048, 00:17:29.468 "data_size": 63488 00:17:29.468 }, 00:17:29.468 { 00:17:29.468 "name": "BaseBdev3", 00:17:29.468 "uuid": "0dd12748-6350-4233-8a5d-8ede825beebb", 00:17:29.468 "is_configured": true, 00:17:29.468 "data_offset": 2048, 00:17:29.468 "data_size": 63488 00:17:29.468 } 
00:17:29.468 ] 00:17:29.468 }' 00:17:29.468 05:45:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:29.468 05:45:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:30.405 05:45:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:30.405 05:45:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:17:30.405 05:45:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:17:30.405 05:45:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:17:30.665 [2024-07-26 05:45:45.428713] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:17:30.665 05:45:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:17:30.666 05:45:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:30.666 05:45:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:30.666 05:45:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:17:30.666 05:45:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:30.666 05:45:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:30.666 05:45:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:30.666 05:45:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:30.666 
05:45:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:30.666 05:45:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:30.666 05:45:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:30.666 05:45:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:30.924 05:45:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:30.924 "name": "Existed_Raid", 00:17:30.924 "uuid": "20c3f438-2bd7-4609-b2b5-eb7662d872ea", 00:17:30.924 "strip_size_kb": 64, 00:17:30.924 "state": "configuring", 00:17:30.924 "raid_level": "concat", 00:17:30.924 "superblock": true, 00:17:30.924 "num_base_bdevs": 3, 00:17:30.924 "num_base_bdevs_discovered": 1, 00:17:30.924 "num_base_bdevs_operational": 3, 00:17:30.924 "base_bdevs_list": [ 00:17:30.924 { 00:17:30.924 "name": "BaseBdev1", 00:17:30.924 "uuid": "19380e45-56e9-476f-b0b9-1e5b717feca1", 00:17:30.924 "is_configured": true, 00:17:30.924 "data_offset": 2048, 00:17:30.924 "data_size": 63488 00:17:30.924 }, 00:17:30.924 { 00:17:30.924 "name": null, 00:17:30.924 "uuid": "bb91ee0a-1399-40fc-8901-8738f21f671f", 00:17:30.924 "is_configured": false, 00:17:30.924 "data_offset": 2048, 00:17:30.924 "data_size": 63488 00:17:30.924 }, 00:17:30.924 { 00:17:30.925 "name": null, 00:17:30.925 "uuid": "0dd12748-6350-4233-8a5d-8ede825beebb", 00:17:30.925 "is_configured": false, 00:17:30.925 "data_offset": 2048, 00:17:30.925 "data_size": 63488 00:17:30.925 } 00:17:30.925 ] 00:17:30.925 }' 00:17:30.925 05:45:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:30.925 05:45:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:31.493 05:45:46 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:31.493 05:45:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:17:31.780 05:45:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:17:31.780 05:45:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:17:32.346 [2024-07-26 05:45:47.000901] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:17:32.346 05:45:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:17:32.346 05:45:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:32.346 05:45:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:32.346 05:45:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:17:32.346 05:45:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:32.346 05:45:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:32.346 05:45:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:32.346 05:45:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:32.346 05:45:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:32.346 05:45:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:32.346 05:45:47 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:32.346 05:45:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:32.604 05:45:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:32.604 "name": "Existed_Raid", 00:17:32.604 "uuid": "20c3f438-2bd7-4609-b2b5-eb7662d872ea", 00:17:32.604 "strip_size_kb": 64, 00:17:32.604 "state": "configuring", 00:17:32.604 "raid_level": "concat", 00:17:32.604 "superblock": true, 00:17:32.604 "num_base_bdevs": 3, 00:17:32.604 "num_base_bdevs_discovered": 2, 00:17:32.604 "num_base_bdevs_operational": 3, 00:17:32.604 "base_bdevs_list": [ 00:17:32.604 { 00:17:32.604 "name": "BaseBdev1", 00:17:32.604 "uuid": "19380e45-56e9-476f-b0b9-1e5b717feca1", 00:17:32.604 "is_configured": true, 00:17:32.604 "data_offset": 2048, 00:17:32.604 "data_size": 63488 00:17:32.604 }, 00:17:32.604 { 00:17:32.604 "name": null, 00:17:32.604 "uuid": "bb91ee0a-1399-40fc-8901-8738f21f671f", 00:17:32.604 "is_configured": false, 00:17:32.604 "data_offset": 2048, 00:17:32.604 "data_size": 63488 00:17:32.604 }, 00:17:32.604 { 00:17:32.604 "name": "BaseBdev3", 00:17:32.604 "uuid": "0dd12748-6350-4233-8a5d-8ede825beebb", 00:17:32.604 "is_configured": true, 00:17:32.604 "data_offset": 2048, 00:17:32.604 "data_size": 63488 00:17:32.604 } 00:17:32.604 ] 00:17:32.604 }' 00:17:32.604 05:45:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:32.604 05:45:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:33.171 05:45:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:33.171 05:45:47 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:17:33.429 05:45:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:17:33.429 05:45:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:17:33.429 [2024-07-26 05:45:48.336450] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:17:33.688 05:45:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:17:33.688 05:45:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:33.688 05:45:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:33.688 05:45:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:17:33.688 05:45:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:33.688 05:45:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:33.688 05:45:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:33.688 05:45:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:33.688 05:45:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:33.688 05:45:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:33.688 05:45:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:33.688 05:45:48 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:34.255 05:45:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:34.255 "name": "Existed_Raid", 00:17:34.255 "uuid": "20c3f438-2bd7-4609-b2b5-eb7662d872ea", 00:17:34.255 "strip_size_kb": 64, 00:17:34.255 "state": "configuring", 00:17:34.255 "raid_level": "concat", 00:17:34.255 "superblock": true, 00:17:34.255 "num_base_bdevs": 3, 00:17:34.255 "num_base_bdevs_discovered": 1, 00:17:34.255 "num_base_bdevs_operational": 3, 00:17:34.255 "base_bdevs_list": [ 00:17:34.255 { 00:17:34.255 "name": null, 00:17:34.255 "uuid": "19380e45-56e9-476f-b0b9-1e5b717feca1", 00:17:34.255 "is_configured": false, 00:17:34.255 "data_offset": 2048, 00:17:34.255 "data_size": 63488 00:17:34.255 }, 00:17:34.255 { 00:17:34.255 "name": null, 00:17:34.255 "uuid": "bb91ee0a-1399-40fc-8901-8738f21f671f", 00:17:34.255 "is_configured": false, 00:17:34.255 "data_offset": 2048, 00:17:34.255 "data_size": 63488 00:17:34.255 }, 00:17:34.255 { 00:17:34.255 "name": "BaseBdev3", 00:17:34.255 "uuid": "0dd12748-6350-4233-8a5d-8ede825beebb", 00:17:34.255 "is_configured": true, 00:17:34.255 "data_offset": 2048, 00:17:34.255 "data_size": 63488 00:17:34.255 } 00:17:34.255 ] 00:17:34.255 }' 00:17:34.255 05:45:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:34.255 05:45:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:34.821 05:45:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:34.821 05:45:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:17:35.080 05:45:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:17:35.080 05:45:49 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:17:35.080 [2024-07-26 05:45:49.952928] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:17:35.080 05:45:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:17:35.080 05:45:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:35.080 05:45:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:35.080 05:45:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:17:35.080 05:45:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:35.080 05:45:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:35.080 05:45:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:35.080 05:45:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:35.080 05:45:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:35.080 05:45:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:35.080 05:45:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:35.080 05:45:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:35.647 05:45:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:35.647 "name": 
"Existed_Raid", 00:17:35.647 "uuid": "20c3f438-2bd7-4609-b2b5-eb7662d872ea", 00:17:35.647 "strip_size_kb": 64, 00:17:35.647 "state": "configuring", 00:17:35.647 "raid_level": "concat", 00:17:35.647 "superblock": true, 00:17:35.647 "num_base_bdevs": 3, 00:17:35.647 "num_base_bdevs_discovered": 2, 00:17:35.647 "num_base_bdevs_operational": 3, 00:17:35.647 "base_bdevs_list": [ 00:17:35.647 { 00:17:35.647 "name": null, 00:17:35.647 "uuid": "19380e45-56e9-476f-b0b9-1e5b717feca1", 00:17:35.647 "is_configured": false, 00:17:35.647 "data_offset": 2048, 00:17:35.647 "data_size": 63488 00:17:35.647 }, 00:17:35.647 { 00:17:35.647 "name": "BaseBdev2", 00:17:35.647 "uuid": "bb91ee0a-1399-40fc-8901-8738f21f671f", 00:17:35.647 "is_configured": true, 00:17:35.647 "data_offset": 2048, 00:17:35.647 "data_size": 63488 00:17:35.647 }, 00:17:35.647 { 00:17:35.647 "name": "BaseBdev3", 00:17:35.647 "uuid": "0dd12748-6350-4233-8a5d-8ede825beebb", 00:17:35.647 "is_configured": true, 00:17:35.647 "data_offset": 2048, 00:17:35.647 "data_size": 63488 00:17:35.647 } 00:17:35.647 ] 00:17:35.647 }' 00:17:35.647 05:45:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:35.647 05:45:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:36.215 05:45:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:36.215 05:45:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:17:36.473 05:45:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:17:36.473 05:45:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:36.473 05:45:51 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:17:36.731 05:45:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 19380e45-56e9-476f-b0b9-1e5b717feca1 00:17:36.989 [2024-07-26 05:45:51.646026] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:17:36.989 [2024-07-26 05:45:51.646183] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1074f50 00:17:36.989 [2024-07-26 05:45:51.646196] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:17:36.989 [2024-07-26 05:45:51.646375] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xd7b940 00:17:36.989 [2024-07-26 05:45:51.646490] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1074f50 00:17:36.989 [2024-07-26 05:45:51.646500] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1074f50 00:17:36.989 [2024-07-26 05:45:51.646591] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:36.989 NewBaseBdev 00:17:36.989 05:45:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:17:36.989 05:45:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:17:36.989 05:45:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:36.989 05:45:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:17:36.989 05:45:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:36.989 05:45:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:36.989 05:45:51 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:37.555 05:45:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:17:37.555 [ 00:17:37.555 { 00:17:37.555 "name": "NewBaseBdev", 00:17:37.555 "aliases": [ 00:17:37.555 "19380e45-56e9-476f-b0b9-1e5b717feca1" 00:17:37.555 ], 00:17:37.555 "product_name": "Malloc disk", 00:17:37.555 "block_size": 512, 00:17:37.555 "num_blocks": 65536, 00:17:37.555 "uuid": "19380e45-56e9-476f-b0b9-1e5b717feca1", 00:17:37.555 "assigned_rate_limits": { 00:17:37.555 "rw_ios_per_sec": 0, 00:17:37.555 "rw_mbytes_per_sec": 0, 00:17:37.555 "r_mbytes_per_sec": 0, 00:17:37.555 "w_mbytes_per_sec": 0 00:17:37.555 }, 00:17:37.555 "claimed": true, 00:17:37.555 "claim_type": "exclusive_write", 00:17:37.555 "zoned": false, 00:17:37.555 "supported_io_types": { 00:17:37.555 "read": true, 00:17:37.555 "write": true, 00:17:37.555 "unmap": true, 00:17:37.555 "flush": true, 00:17:37.555 "reset": true, 00:17:37.555 "nvme_admin": false, 00:17:37.555 "nvme_io": false, 00:17:37.555 "nvme_io_md": false, 00:17:37.555 "write_zeroes": true, 00:17:37.555 "zcopy": true, 00:17:37.555 "get_zone_info": false, 00:17:37.555 "zone_management": false, 00:17:37.555 "zone_append": false, 00:17:37.555 "compare": false, 00:17:37.555 "compare_and_write": false, 00:17:37.555 "abort": true, 00:17:37.555 "seek_hole": false, 00:17:37.555 "seek_data": false, 00:17:37.555 "copy": true, 00:17:37.555 "nvme_iov_md": false 00:17:37.555 }, 00:17:37.555 "memory_domains": [ 00:17:37.555 { 00:17:37.555 "dma_device_id": "system", 00:17:37.555 "dma_device_type": 1 00:17:37.555 }, 00:17:37.555 { 00:17:37.555 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:37.555 "dma_device_type": 2 00:17:37.555 } 
00:17:37.555 ], 00:17:37.555 "driver_specific": {} 00:17:37.555 } 00:17:37.555 ] 00:17:37.555 05:45:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:17:37.555 05:45:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online concat 64 3 00:17:37.555 05:45:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:37.555 05:45:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:37.555 05:45:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:17:37.555 05:45:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:37.555 05:45:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:37.555 05:45:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:37.555 05:45:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:37.555 05:45:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:37.555 05:45:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:37.555 05:45:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:37.555 05:45:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:37.813 05:45:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:37.813 "name": "Existed_Raid", 00:17:37.813 "uuid": "20c3f438-2bd7-4609-b2b5-eb7662d872ea", 00:17:37.813 "strip_size_kb": 64, 00:17:37.813 "state": "online", 00:17:37.813 
"raid_level": "concat", 00:17:37.813 "superblock": true, 00:17:37.813 "num_base_bdevs": 3, 00:17:37.813 "num_base_bdevs_discovered": 3, 00:17:37.813 "num_base_bdevs_operational": 3, 00:17:37.813 "base_bdevs_list": [ 00:17:37.813 { 00:17:37.813 "name": "NewBaseBdev", 00:17:37.813 "uuid": "19380e45-56e9-476f-b0b9-1e5b717feca1", 00:17:37.813 "is_configured": true, 00:17:37.813 "data_offset": 2048, 00:17:37.813 "data_size": 63488 00:17:37.813 }, 00:17:37.813 { 00:17:37.813 "name": "BaseBdev2", 00:17:37.813 "uuid": "bb91ee0a-1399-40fc-8901-8738f21f671f", 00:17:37.813 "is_configured": true, 00:17:37.813 "data_offset": 2048, 00:17:37.813 "data_size": 63488 00:17:37.813 }, 00:17:37.813 { 00:17:37.813 "name": "BaseBdev3", 00:17:37.813 "uuid": "0dd12748-6350-4233-8a5d-8ede825beebb", 00:17:37.813 "is_configured": true, 00:17:37.813 "data_offset": 2048, 00:17:37.813 "data_size": 63488 00:17:37.813 } 00:17:37.813 ] 00:17:37.813 }' 00:17:37.813 05:45:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:37.813 05:45:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:38.380 05:45:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:17:38.380 05:45:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:17:38.380 05:45:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:17:38.380 05:45:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:17:38.380 05:45:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:17:38.380 05:45:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:17:38.380 05:45:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:17:38.380 05:45:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:17:38.636 [2024-07-26 05:45:53.415053] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:38.636 05:45:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:17:38.636 "name": "Existed_Raid", 00:17:38.636 "aliases": [ 00:17:38.636 "20c3f438-2bd7-4609-b2b5-eb7662d872ea" 00:17:38.636 ], 00:17:38.636 "product_name": "Raid Volume", 00:17:38.636 "block_size": 512, 00:17:38.636 "num_blocks": 190464, 00:17:38.636 "uuid": "20c3f438-2bd7-4609-b2b5-eb7662d872ea", 00:17:38.636 "assigned_rate_limits": { 00:17:38.636 "rw_ios_per_sec": 0, 00:17:38.636 "rw_mbytes_per_sec": 0, 00:17:38.636 "r_mbytes_per_sec": 0, 00:17:38.636 "w_mbytes_per_sec": 0 00:17:38.636 }, 00:17:38.636 "claimed": false, 00:17:38.636 "zoned": false, 00:17:38.636 "supported_io_types": { 00:17:38.636 "read": true, 00:17:38.636 "write": true, 00:17:38.636 "unmap": true, 00:17:38.636 "flush": true, 00:17:38.636 "reset": true, 00:17:38.636 "nvme_admin": false, 00:17:38.636 "nvme_io": false, 00:17:38.636 "nvme_io_md": false, 00:17:38.636 "write_zeroes": true, 00:17:38.636 "zcopy": false, 00:17:38.636 "get_zone_info": false, 00:17:38.636 "zone_management": false, 00:17:38.636 "zone_append": false, 00:17:38.636 "compare": false, 00:17:38.636 "compare_and_write": false, 00:17:38.636 "abort": false, 00:17:38.636 "seek_hole": false, 00:17:38.636 "seek_data": false, 00:17:38.636 "copy": false, 00:17:38.636 "nvme_iov_md": false 00:17:38.636 }, 00:17:38.636 "memory_domains": [ 00:17:38.636 { 00:17:38.636 "dma_device_id": "system", 00:17:38.636 "dma_device_type": 1 00:17:38.636 }, 00:17:38.636 { 00:17:38.636 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:38.636 "dma_device_type": 2 00:17:38.636 }, 00:17:38.636 { 00:17:38.636 "dma_device_id": "system", 00:17:38.636 "dma_device_type": 1 00:17:38.636 
}, 00:17:38.636 { 00:17:38.636 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:38.636 "dma_device_type": 2 00:17:38.636 }, 00:17:38.636 { 00:17:38.636 "dma_device_id": "system", 00:17:38.636 "dma_device_type": 1 00:17:38.636 }, 00:17:38.636 { 00:17:38.636 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:38.636 "dma_device_type": 2 00:17:38.636 } 00:17:38.636 ], 00:17:38.636 "driver_specific": { 00:17:38.636 "raid": { 00:17:38.636 "uuid": "20c3f438-2bd7-4609-b2b5-eb7662d872ea", 00:17:38.636 "strip_size_kb": 64, 00:17:38.637 "state": "online", 00:17:38.637 "raid_level": "concat", 00:17:38.637 "superblock": true, 00:17:38.637 "num_base_bdevs": 3, 00:17:38.637 "num_base_bdevs_discovered": 3, 00:17:38.637 "num_base_bdevs_operational": 3, 00:17:38.637 "base_bdevs_list": [ 00:17:38.637 { 00:17:38.637 "name": "NewBaseBdev", 00:17:38.637 "uuid": "19380e45-56e9-476f-b0b9-1e5b717feca1", 00:17:38.637 "is_configured": true, 00:17:38.637 "data_offset": 2048, 00:17:38.637 "data_size": 63488 00:17:38.637 }, 00:17:38.637 { 00:17:38.637 "name": "BaseBdev2", 00:17:38.637 "uuid": "bb91ee0a-1399-40fc-8901-8738f21f671f", 00:17:38.637 "is_configured": true, 00:17:38.637 "data_offset": 2048, 00:17:38.637 "data_size": 63488 00:17:38.637 }, 00:17:38.637 { 00:17:38.637 "name": "BaseBdev3", 00:17:38.637 "uuid": "0dd12748-6350-4233-8a5d-8ede825beebb", 00:17:38.637 "is_configured": true, 00:17:38.637 "data_offset": 2048, 00:17:38.637 "data_size": 63488 00:17:38.637 } 00:17:38.637 ] 00:17:38.637 } 00:17:38.637 } 00:17:38.637 }' 00:17:38.637 05:45:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:17:38.637 05:45:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:17:38.637 BaseBdev2 00:17:38.637 BaseBdev3' 00:17:38.637 05:45:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:38.637 
05:45:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:17:38.637 05:45:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:38.894 05:45:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:38.894 "name": "NewBaseBdev", 00:17:38.894 "aliases": [ 00:17:38.894 "19380e45-56e9-476f-b0b9-1e5b717feca1" 00:17:38.894 ], 00:17:38.894 "product_name": "Malloc disk", 00:17:38.894 "block_size": 512, 00:17:38.894 "num_blocks": 65536, 00:17:38.894 "uuid": "19380e45-56e9-476f-b0b9-1e5b717feca1", 00:17:38.894 "assigned_rate_limits": { 00:17:38.894 "rw_ios_per_sec": 0, 00:17:38.894 "rw_mbytes_per_sec": 0, 00:17:38.894 "r_mbytes_per_sec": 0, 00:17:38.894 "w_mbytes_per_sec": 0 00:17:38.894 }, 00:17:38.894 "claimed": true, 00:17:38.894 "claim_type": "exclusive_write", 00:17:38.894 "zoned": false, 00:17:38.894 "supported_io_types": { 00:17:38.894 "read": true, 00:17:38.894 "write": true, 00:17:38.894 "unmap": true, 00:17:38.894 "flush": true, 00:17:38.894 "reset": true, 00:17:38.894 "nvme_admin": false, 00:17:38.894 "nvme_io": false, 00:17:38.894 "nvme_io_md": false, 00:17:38.894 "write_zeroes": true, 00:17:38.894 "zcopy": true, 00:17:38.894 "get_zone_info": false, 00:17:38.894 "zone_management": false, 00:17:38.894 "zone_append": false, 00:17:38.894 "compare": false, 00:17:38.894 "compare_and_write": false, 00:17:38.894 "abort": true, 00:17:38.894 "seek_hole": false, 00:17:38.894 "seek_data": false, 00:17:38.894 "copy": true, 00:17:38.894 "nvme_iov_md": false 00:17:38.894 }, 00:17:38.894 "memory_domains": [ 00:17:38.894 { 00:17:38.894 "dma_device_id": "system", 00:17:38.894 "dma_device_type": 1 00:17:38.894 }, 00:17:38.894 { 00:17:38.894 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:38.894 "dma_device_type": 2 00:17:38.894 } 00:17:38.894 ], 00:17:38.894 
"driver_specific": {} 00:17:38.894 }' 00:17:38.894 05:45:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:38.894 05:45:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:38.894 05:45:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:38.894 05:45:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:38.894 05:45:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:39.151 05:45:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:39.151 05:45:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:39.151 05:45:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:39.151 05:45:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:39.151 05:45:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:39.151 05:45:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:39.151 05:45:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:39.151 05:45:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:39.151 05:45:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:17:39.151 05:45:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:39.410 05:45:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:39.410 "name": "BaseBdev2", 00:17:39.410 "aliases": [ 00:17:39.410 "bb91ee0a-1399-40fc-8901-8738f21f671f" 00:17:39.410 ], 00:17:39.410 "product_name": 
"Malloc disk", 00:17:39.410 "block_size": 512, 00:17:39.410 "num_blocks": 65536, 00:17:39.410 "uuid": "bb91ee0a-1399-40fc-8901-8738f21f671f", 00:17:39.410 "assigned_rate_limits": { 00:17:39.410 "rw_ios_per_sec": 0, 00:17:39.410 "rw_mbytes_per_sec": 0, 00:17:39.410 "r_mbytes_per_sec": 0, 00:17:39.410 "w_mbytes_per_sec": 0 00:17:39.410 }, 00:17:39.410 "claimed": true, 00:17:39.410 "claim_type": "exclusive_write", 00:17:39.410 "zoned": false, 00:17:39.410 "supported_io_types": { 00:17:39.410 "read": true, 00:17:39.410 "write": true, 00:17:39.410 "unmap": true, 00:17:39.410 "flush": true, 00:17:39.410 "reset": true, 00:17:39.410 "nvme_admin": false, 00:17:39.410 "nvme_io": false, 00:17:39.410 "nvme_io_md": false, 00:17:39.410 "write_zeroes": true, 00:17:39.410 "zcopy": true, 00:17:39.410 "get_zone_info": false, 00:17:39.410 "zone_management": false, 00:17:39.410 "zone_append": false, 00:17:39.410 "compare": false, 00:17:39.410 "compare_and_write": false, 00:17:39.410 "abort": true, 00:17:39.410 "seek_hole": false, 00:17:39.410 "seek_data": false, 00:17:39.410 "copy": true, 00:17:39.410 "nvme_iov_md": false 00:17:39.410 }, 00:17:39.410 "memory_domains": [ 00:17:39.410 { 00:17:39.410 "dma_device_id": "system", 00:17:39.410 "dma_device_type": 1 00:17:39.410 }, 00:17:39.410 { 00:17:39.410 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:39.410 "dma_device_type": 2 00:17:39.410 } 00:17:39.410 ], 00:17:39.410 "driver_specific": {} 00:17:39.410 }' 00:17:39.410 05:45:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:39.410 05:45:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:39.668 05:45:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:39.668 05:45:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:39.668 05:45:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:39.668 
05:45:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:39.668 05:45:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:39.668 05:45:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:39.668 05:45:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:39.668 05:45:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:39.927 05:45:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:39.927 05:45:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:39.927 05:45:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:39.927 05:45:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:17:39.927 05:45:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:40.184 05:45:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:40.184 "name": "BaseBdev3", 00:17:40.184 "aliases": [ 00:17:40.184 "0dd12748-6350-4233-8a5d-8ede825beebb" 00:17:40.184 ], 00:17:40.184 "product_name": "Malloc disk", 00:17:40.184 "block_size": 512, 00:17:40.184 "num_blocks": 65536, 00:17:40.184 "uuid": "0dd12748-6350-4233-8a5d-8ede825beebb", 00:17:40.184 "assigned_rate_limits": { 00:17:40.184 "rw_ios_per_sec": 0, 00:17:40.184 "rw_mbytes_per_sec": 0, 00:17:40.184 "r_mbytes_per_sec": 0, 00:17:40.184 "w_mbytes_per_sec": 0 00:17:40.184 }, 00:17:40.184 "claimed": true, 00:17:40.184 "claim_type": "exclusive_write", 00:17:40.185 "zoned": false, 00:17:40.185 "supported_io_types": { 00:17:40.185 "read": true, 00:17:40.185 "write": true, 00:17:40.185 "unmap": true, 
00:17:40.185 "flush": true, 00:17:40.185 "reset": true, 00:17:40.185 "nvme_admin": false, 00:17:40.185 "nvme_io": false, 00:17:40.185 "nvme_io_md": false, 00:17:40.185 "write_zeroes": true, 00:17:40.185 "zcopy": true, 00:17:40.185 "get_zone_info": false, 00:17:40.185 "zone_management": false, 00:17:40.185 "zone_append": false, 00:17:40.185 "compare": false, 00:17:40.185 "compare_and_write": false, 00:17:40.185 "abort": true, 00:17:40.185 "seek_hole": false, 00:17:40.185 "seek_data": false, 00:17:40.185 "copy": true, 00:17:40.185 "nvme_iov_md": false 00:17:40.185 }, 00:17:40.185 "memory_domains": [ 00:17:40.185 { 00:17:40.185 "dma_device_id": "system", 00:17:40.185 "dma_device_type": 1 00:17:40.185 }, 00:17:40.185 { 00:17:40.185 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:40.185 "dma_device_type": 2 00:17:40.185 } 00:17:40.185 ], 00:17:40.185 "driver_specific": {} 00:17:40.185 }' 00:17:40.185 05:45:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:40.185 05:45:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:40.185 05:45:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:40.185 05:45:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:40.185 05:45:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:40.185 05:45:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:40.185 05:45:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:40.185 05:45:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:40.443 05:45:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:40.443 05:45:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:40.443 05:45:55 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:40.443 05:45:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:40.443 05:45:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:17:40.702 [2024-07-26 05:45:55.428095] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:17:40.702 [2024-07-26 05:45:55.428126] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:40.702 [2024-07-26 05:45:55.428184] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:40.702 [2024-07-26 05:45:55.428236] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:17:40.702 [2024-07-26 05:45:55.428248] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1074f50 name Existed_Raid, state offline 00:17:40.702 05:45:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 1165251 00:17:40.702 05:45:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 1165251 ']' 00:17:40.702 05:45:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # kill -0 1165251 00:17:40.702 05:45:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:17:40.702 05:45:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:17:40.702 05:45:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1165251 00:17:40.702 05:45:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:17:40.702 05:45:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' 
reactor_0 = sudo ']' 00:17:40.702 05:45:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1165251' 00:17:40.702 killing process with pid 1165251 00:17:40.702 05:45:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 1165251 00:17:40.702 [2024-07-26 05:45:55.501079] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:17:40.702 05:45:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 1165251 00:17:40.702 [2024-07-26 05:45:55.531119] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:17:40.961 05:45:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:17:40.961 00:17:40.961 real 0m30.251s 00:17:40.961 user 0m55.641s 00:17:40.961 sys 0m5.283s 00:17:40.961 05:45:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:17:40.961 05:45:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:40.961 ************************************ 00:17:40.961 END TEST raid_state_function_test_sb 00:17:40.961 ************************************ 00:17:40.961 05:45:55 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:17:40.961 05:45:55 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test concat 3 00:17:40.961 05:45:55 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:17:40.961 05:45:55 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:17:40.961 05:45:55 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:17:40.961 ************************************ 00:17:40.961 START TEST raid_superblock_test 00:17:40.961 ************************************ 00:17:40.961 05:45:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test concat 3 00:17:40.961 05:45:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # 
local raid_level=concat 00:17:40.961 05:45:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=3 00:17:40.961 05:45:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:17:40.961 05:45:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:17:40.961 05:45:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:17:40.961 05:45:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:17:40.961 05:45:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:17:40.961 05:45:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:17:40.961 05:45:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:17:40.961 05:45:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:17:40.961 05:45:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:17:40.961 05:45:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:17:40.961 05:45:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:17:40.961 05:45:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' concat '!=' raid1 ']' 00:17:40.961 05:45:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # strip_size=64 00:17:40.961 05:45:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size_create_arg='-z 64' 00:17:40.961 05:45:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=1169726 00:17:40.961 05:45:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 1169726 /var/tmp/spdk-raid.sock 00:17:40.961 05:45:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r 
/var/tmp/spdk-raid.sock -L bdev_raid 00:17:40.961 05:45:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 1169726 ']' 00:17:40.961 05:45:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:17:40.961 05:45:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:17:40.961 05:45:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:17:40.961 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:17:40.961 05:45:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:17:40.961 05:45:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:41.220 [2024-07-26 05:45:55.908797] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 
00:17:41.220 [2024-07-26 05:45:55.908858] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1169726 ] 00:17:41.220 [2024-07-26 05:45:56.025927] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:17:41.478 [2024-07-26 05:45:56.128324] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:17:41.478 [2024-07-26 05:45:56.191039] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:41.478 [2024-07-26 05:45:56.191072] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:42.045 05:45:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:17:42.045 05:45:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:17:42.045 05:45:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:17:42.045 05:45:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:17:42.045 05:45:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:17:42.045 05:45:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:17:42.045 05:45:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:17:42.045 05:45:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:17:42.045 05:45:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:17:42.045 05:45:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:17:42.045 05:45:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:17:42.303 malloc1 00:17:42.303 05:45:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:17:42.560 [2024-07-26 05:45:57.331065] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:17:42.560 [2024-07-26 05:45:57.331118] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:42.560 [2024-07-26 05:45:57.331138] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1425570 00:17:42.560 [2024-07-26 05:45:57.331150] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:42.560 [2024-07-26 05:45:57.332757] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:42.560 [2024-07-26 05:45:57.332786] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:17:42.560 pt1 00:17:42.560 05:45:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:17:42.560 05:45:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:17:42.560 05:45:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:17:42.560 05:45:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:17:42.560 05:45:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:17:42.560 05:45:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:17:42.560 05:45:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:17:42.560 05:45:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:17:42.561 05:45:57 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:17:42.818 malloc2 00:17:42.818 05:45:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:17:43.076 [2024-07-26 05:45:57.769118] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:17:43.076 [2024-07-26 05:45:57.769162] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:43.076 [2024-07-26 05:45:57.769178] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1426970 00:17:43.076 [2024-07-26 05:45:57.769191] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:43.076 [2024-07-26 05:45:57.770634] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:43.076 [2024-07-26 05:45:57.770668] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:17:43.076 pt2 00:17:43.076 05:45:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:17:43.076 05:45:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:17:43.076 05:45:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc3 00:17:43.076 05:45:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt3 00:17:43.077 05:45:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:17:43.077 05:45:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:17:43.077 05:45:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:17:43.077 05:45:57 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:17:43.077 05:45:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:17:43.334 malloc3 00:17:43.334 05:45:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:17:43.592 [2024-07-26 05:45:58.267022] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:17:43.592 [2024-07-26 05:45:58.267067] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:43.592 [2024-07-26 05:45:58.267084] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x15bd340 00:17:43.592 [2024-07-26 05:45:58.267097] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:43.592 [2024-07-26 05:45:58.268469] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:43.593 [2024-07-26 05:45:58.268496] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:17:43.593 pt3 00:17:43.593 05:45:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:17:43.593 05:45:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:17:43.593 05:45:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'pt1 pt2 pt3' -n raid_bdev1 -s 00:17:43.850 [2024-07-26 05:45:58.511703] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:17:43.850 [2024-07-26 05:45:58.512927] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 
00:17:43.850 [2024-07-26 05:45:58.512982] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:17:43.850 [2024-07-26 05:45:58.513129] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x141dea0 00:17:43.850 [2024-07-26 05:45:58.513140] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:17:43.850 [2024-07-26 05:45:58.513334] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1425240 00:17:43.850 [2024-07-26 05:45:58.513472] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x141dea0 00:17:43.850 [2024-07-26 05:45:58.513482] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x141dea0 00:17:43.850 [2024-07-26 05:45:58.513572] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:43.850 05:45:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:17:43.850 05:45:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:43.850 05:45:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:43.850 05:45:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:17:43.850 05:45:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:43.850 05:45:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:43.850 05:45:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:43.850 05:45:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:43.850 05:45:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:43.850 05:45:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:43.850 
05:45:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:43.850 05:45:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:44.108 05:45:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:44.108 "name": "raid_bdev1", 00:17:44.108 "uuid": "053894e1-98d0-43aa-9840-88070edc11d6", 00:17:44.108 "strip_size_kb": 64, 00:17:44.108 "state": "online", 00:17:44.108 "raid_level": "concat", 00:17:44.108 "superblock": true, 00:17:44.108 "num_base_bdevs": 3, 00:17:44.108 "num_base_bdevs_discovered": 3, 00:17:44.108 "num_base_bdevs_operational": 3, 00:17:44.108 "base_bdevs_list": [ 00:17:44.108 { 00:17:44.108 "name": "pt1", 00:17:44.108 "uuid": "00000000-0000-0000-0000-000000000001", 00:17:44.108 "is_configured": true, 00:17:44.108 "data_offset": 2048, 00:17:44.108 "data_size": 63488 00:17:44.108 }, 00:17:44.108 { 00:17:44.108 "name": "pt2", 00:17:44.108 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:44.108 "is_configured": true, 00:17:44.108 "data_offset": 2048, 00:17:44.108 "data_size": 63488 00:17:44.108 }, 00:17:44.108 { 00:17:44.108 "name": "pt3", 00:17:44.109 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:44.109 "is_configured": true, 00:17:44.109 "data_offset": 2048, 00:17:44.109 "data_size": 63488 00:17:44.109 } 00:17:44.109 ] 00:17:44.109 }' 00:17:44.109 05:45:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:44.109 05:45:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:44.674 05:45:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:17:44.674 05:45:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:17:44.674 05:45:59 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:17:44.674 05:45:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:17:44.674 05:45:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:17:44.674 05:45:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:17:44.674 05:45:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:17:44.674 05:45:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:17:44.931 [2024-07-26 05:45:59.582794] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:44.932 05:45:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:17:44.932 "name": "raid_bdev1", 00:17:44.932 "aliases": [ 00:17:44.932 "053894e1-98d0-43aa-9840-88070edc11d6" 00:17:44.932 ], 00:17:44.932 "product_name": "Raid Volume", 00:17:44.932 "block_size": 512, 00:17:44.932 "num_blocks": 190464, 00:17:44.932 "uuid": "053894e1-98d0-43aa-9840-88070edc11d6", 00:17:44.932 "assigned_rate_limits": { 00:17:44.932 "rw_ios_per_sec": 0, 00:17:44.932 "rw_mbytes_per_sec": 0, 00:17:44.932 "r_mbytes_per_sec": 0, 00:17:44.932 "w_mbytes_per_sec": 0 00:17:44.932 }, 00:17:44.932 "claimed": false, 00:17:44.932 "zoned": false, 00:17:44.932 "supported_io_types": { 00:17:44.932 "read": true, 00:17:44.932 "write": true, 00:17:44.932 "unmap": true, 00:17:44.932 "flush": true, 00:17:44.932 "reset": true, 00:17:44.932 "nvme_admin": false, 00:17:44.932 "nvme_io": false, 00:17:44.932 "nvme_io_md": false, 00:17:44.932 "write_zeroes": true, 00:17:44.932 "zcopy": false, 00:17:44.932 "get_zone_info": false, 00:17:44.932 "zone_management": false, 00:17:44.932 "zone_append": false, 00:17:44.932 "compare": false, 00:17:44.932 "compare_and_write": false, 00:17:44.932 "abort": false, 00:17:44.932 
"seek_hole": false, 00:17:44.932 "seek_data": false, 00:17:44.932 "copy": false, 00:17:44.932 "nvme_iov_md": false 00:17:44.932 }, 00:17:44.932 "memory_domains": [ 00:17:44.932 { 00:17:44.932 "dma_device_id": "system", 00:17:44.932 "dma_device_type": 1 00:17:44.932 }, 00:17:44.932 { 00:17:44.932 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:44.932 "dma_device_type": 2 00:17:44.932 }, 00:17:44.932 { 00:17:44.932 "dma_device_id": "system", 00:17:44.932 "dma_device_type": 1 00:17:44.932 }, 00:17:44.932 { 00:17:44.932 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:44.932 "dma_device_type": 2 00:17:44.932 }, 00:17:44.932 { 00:17:44.932 "dma_device_id": "system", 00:17:44.932 "dma_device_type": 1 00:17:44.932 }, 00:17:44.932 { 00:17:44.932 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:44.932 "dma_device_type": 2 00:17:44.932 } 00:17:44.932 ], 00:17:44.932 "driver_specific": { 00:17:44.932 "raid": { 00:17:44.932 "uuid": "053894e1-98d0-43aa-9840-88070edc11d6", 00:17:44.932 "strip_size_kb": 64, 00:17:44.932 "state": "online", 00:17:44.932 "raid_level": "concat", 00:17:44.932 "superblock": true, 00:17:44.932 "num_base_bdevs": 3, 00:17:44.932 "num_base_bdevs_discovered": 3, 00:17:44.932 "num_base_bdevs_operational": 3, 00:17:44.932 "base_bdevs_list": [ 00:17:44.932 { 00:17:44.932 "name": "pt1", 00:17:44.932 "uuid": "00000000-0000-0000-0000-000000000001", 00:17:44.932 "is_configured": true, 00:17:44.932 "data_offset": 2048, 00:17:44.932 "data_size": 63488 00:17:44.932 }, 00:17:44.932 { 00:17:44.932 "name": "pt2", 00:17:44.932 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:44.932 "is_configured": true, 00:17:44.932 "data_offset": 2048, 00:17:44.932 "data_size": 63488 00:17:44.932 }, 00:17:44.932 { 00:17:44.932 "name": "pt3", 00:17:44.932 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:44.932 "is_configured": true, 00:17:44.932 "data_offset": 2048, 00:17:44.932 "data_size": 63488 00:17:44.932 } 00:17:44.932 ] 00:17:44.932 } 00:17:44.932 } 00:17:44.932 }' 
00:17:44.932 05:45:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:17:44.932 05:45:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:17:44.932 pt2 00:17:44.932 pt3' 00:17:44.932 05:45:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:44.932 05:45:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:17:44.932 05:45:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:45.189 05:45:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:45.189 "name": "pt1", 00:17:45.189 "aliases": [ 00:17:45.189 "00000000-0000-0000-0000-000000000001" 00:17:45.189 ], 00:17:45.189 "product_name": "passthru", 00:17:45.189 "block_size": 512, 00:17:45.189 "num_blocks": 65536, 00:17:45.189 "uuid": "00000000-0000-0000-0000-000000000001", 00:17:45.189 "assigned_rate_limits": { 00:17:45.189 "rw_ios_per_sec": 0, 00:17:45.189 "rw_mbytes_per_sec": 0, 00:17:45.189 "r_mbytes_per_sec": 0, 00:17:45.189 "w_mbytes_per_sec": 0 00:17:45.189 }, 00:17:45.189 "claimed": true, 00:17:45.189 "claim_type": "exclusive_write", 00:17:45.189 "zoned": false, 00:17:45.189 "supported_io_types": { 00:17:45.189 "read": true, 00:17:45.189 "write": true, 00:17:45.189 "unmap": true, 00:17:45.189 "flush": true, 00:17:45.189 "reset": true, 00:17:45.189 "nvme_admin": false, 00:17:45.189 "nvme_io": false, 00:17:45.189 "nvme_io_md": false, 00:17:45.189 "write_zeroes": true, 00:17:45.189 "zcopy": true, 00:17:45.189 "get_zone_info": false, 00:17:45.189 "zone_management": false, 00:17:45.190 "zone_append": false, 00:17:45.190 "compare": false, 00:17:45.190 "compare_and_write": false, 00:17:45.190 "abort": true, 00:17:45.190 "seek_hole": false, 00:17:45.190 
"seek_data": false, 00:17:45.190 "copy": true, 00:17:45.190 "nvme_iov_md": false 00:17:45.190 }, 00:17:45.190 "memory_domains": [ 00:17:45.190 { 00:17:45.190 "dma_device_id": "system", 00:17:45.190 "dma_device_type": 1 00:17:45.190 }, 00:17:45.190 { 00:17:45.190 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:45.190 "dma_device_type": 2 00:17:45.190 } 00:17:45.190 ], 00:17:45.190 "driver_specific": { 00:17:45.190 "passthru": { 00:17:45.190 "name": "pt1", 00:17:45.190 "base_bdev_name": "malloc1" 00:17:45.190 } 00:17:45.190 } 00:17:45.190 }' 00:17:45.190 05:45:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:45.190 05:45:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:45.190 05:45:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:45.190 05:45:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:45.190 05:46:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:45.190 05:46:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:45.190 05:46:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:45.190 05:46:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:45.447 05:46:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:45.447 05:46:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:45.447 05:46:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:45.447 05:46:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:45.447 05:46:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:45.447 05:46:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:17:45.447 05:46:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:45.705 05:46:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:45.705 "name": "pt2", 00:17:45.705 "aliases": [ 00:17:45.705 "00000000-0000-0000-0000-000000000002" 00:17:45.705 ], 00:17:45.705 "product_name": "passthru", 00:17:45.705 "block_size": 512, 00:17:45.705 "num_blocks": 65536, 00:17:45.705 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:45.705 "assigned_rate_limits": { 00:17:45.705 "rw_ios_per_sec": 0, 00:17:45.705 "rw_mbytes_per_sec": 0, 00:17:45.705 "r_mbytes_per_sec": 0, 00:17:45.705 "w_mbytes_per_sec": 0 00:17:45.705 }, 00:17:45.705 "claimed": true, 00:17:45.705 "claim_type": "exclusive_write", 00:17:45.705 "zoned": false, 00:17:45.705 "supported_io_types": { 00:17:45.705 "read": true, 00:17:45.705 "write": true, 00:17:45.705 "unmap": true, 00:17:45.705 "flush": true, 00:17:45.705 "reset": true, 00:17:45.705 "nvme_admin": false, 00:17:45.705 "nvme_io": false, 00:17:45.705 "nvme_io_md": false, 00:17:45.705 "write_zeroes": true, 00:17:45.705 "zcopy": true, 00:17:45.705 "get_zone_info": false, 00:17:45.705 "zone_management": false, 00:17:45.705 "zone_append": false, 00:17:45.705 "compare": false, 00:17:45.705 "compare_and_write": false, 00:17:45.705 "abort": true, 00:17:45.705 "seek_hole": false, 00:17:45.705 "seek_data": false, 00:17:45.705 "copy": true, 00:17:45.705 "nvme_iov_md": false 00:17:45.705 }, 00:17:45.705 "memory_domains": [ 00:17:45.705 { 00:17:45.705 "dma_device_id": "system", 00:17:45.705 "dma_device_type": 1 00:17:45.705 }, 00:17:45.706 { 00:17:45.706 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:45.706 "dma_device_type": 2 00:17:45.706 } 00:17:45.706 ], 00:17:45.706 "driver_specific": { 00:17:45.706 "passthru": { 00:17:45.706 "name": "pt2", 00:17:45.706 "base_bdev_name": "malloc2" 00:17:45.706 } 00:17:45.706 } 00:17:45.706 }' 00:17:45.706 05:46:00 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:45.706 05:46:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:45.706 05:46:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:45.706 05:46:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:45.706 05:46:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:45.963 05:46:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:45.963 05:46:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:45.963 05:46:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:45.963 05:46:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:45.963 05:46:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:45.963 05:46:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:45.963 05:46:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:45.964 05:46:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:45.964 05:46:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:45.964 05:46:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:17:46.221 05:46:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:46.221 "name": "pt3", 00:17:46.221 "aliases": [ 00:17:46.221 "00000000-0000-0000-0000-000000000003" 00:17:46.221 ], 00:17:46.221 "product_name": "passthru", 00:17:46.221 "block_size": 512, 00:17:46.221 "num_blocks": 65536, 00:17:46.221 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:46.221 "assigned_rate_limits": { 
00:17:46.221 "rw_ios_per_sec": 0, 00:17:46.221 "rw_mbytes_per_sec": 0, 00:17:46.221 "r_mbytes_per_sec": 0, 00:17:46.221 "w_mbytes_per_sec": 0 00:17:46.221 }, 00:17:46.221 "claimed": true, 00:17:46.221 "claim_type": "exclusive_write", 00:17:46.221 "zoned": false, 00:17:46.221 "supported_io_types": { 00:17:46.221 "read": true, 00:17:46.221 "write": true, 00:17:46.221 "unmap": true, 00:17:46.221 "flush": true, 00:17:46.221 "reset": true, 00:17:46.221 "nvme_admin": false, 00:17:46.221 "nvme_io": false, 00:17:46.221 "nvme_io_md": false, 00:17:46.221 "write_zeroes": true, 00:17:46.221 "zcopy": true, 00:17:46.221 "get_zone_info": false, 00:17:46.221 "zone_management": false, 00:17:46.221 "zone_append": false, 00:17:46.221 "compare": false, 00:17:46.221 "compare_and_write": false, 00:17:46.221 "abort": true, 00:17:46.221 "seek_hole": false, 00:17:46.221 "seek_data": false, 00:17:46.221 "copy": true, 00:17:46.221 "nvme_iov_md": false 00:17:46.221 }, 00:17:46.221 "memory_domains": [ 00:17:46.221 { 00:17:46.221 "dma_device_id": "system", 00:17:46.221 "dma_device_type": 1 00:17:46.221 }, 00:17:46.221 { 00:17:46.221 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:46.221 "dma_device_type": 2 00:17:46.221 } 00:17:46.221 ], 00:17:46.221 "driver_specific": { 00:17:46.221 "passthru": { 00:17:46.221 "name": "pt3", 00:17:46.221 "base_bdev_name": "malloc3" 00:17:46.221 } 00:17:46.221 } 00:17:46.221 }' 00:17:46.221 05:46:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:46.221 05:46:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:46.479 05:46:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:46.479 05:46:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:46.479 05:46:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:46.479 05:46:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 
00:17:46.479 05:46:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:46.479 05:46:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:46.479 05:46:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:46.479 05:46:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:46.479 05:46:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:46.736 05:46:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:46.736 05:46:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:17:46.736 05:46:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:17:46.993 [2024-07-26 05:46:01.652260] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:46.993 05:46:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=053894e1-98d0-43aa-9840-88070edc11d6 00:17:46.993 05:46:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z 053894e1-98d0-43aa-9840-88070edc11d6 ']' 00:17:46.993 05:46:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:17:46.993 [2024-07-26 05:46:01.900626] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:17:46.993 [2024-07-26 05:46:01.900656] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:46.993 [2024-07-26 05:46:01.900711] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:46.993 [2024-07-26 05:46:01.900765] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free 
all in destruct 00:17:46.993 [2024-07-26 05:46:01.900777] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x141dea0 name raid_bdev1, state offline 00:17:47.252 05:46:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:47.252 05:46:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:17:47.510 05:46:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:17:47.510 05:46:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:17:47.510 05:46:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:17:47.510 05:46:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:17:47.510 05:46:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:17:47.510 05:46:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:17:47.769 05:46:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:17:47.769 05:46:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:17:48.027 05:46:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:17:48.027 05:46:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:17:48.321 05:46:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' 
false == true ']' 00:17:48.321 05:46:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:17:48.321 05:46:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:17:48.321 05:46:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:17:48.321 05:46:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:17:48.321 05:46:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:17:48.321 05:46:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:17:48.321 05:46:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:17:48.321 05:46:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:17:48.321 05:46:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:17:48.321 05:46:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:17:48.321 05:46:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:17:48.321 05:46:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:17:48.579 [2024-07-26 05:46:03.376476] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:17:48.579 [2024-07-26 05:46:03.377865] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:17:48.579 [2024-07-26 05:46:03.377909] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:17:48.579 [2024-07-26 05:46:03.377954] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:17:48.579 [2024-07-26 05:46:03.377995] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:17:48.579 [2024-07-26 05:46:03.378018] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:17:48.579 [2024-07-26 05:46:03.378037] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:17:48.579 [2024-07-26 05:46:03.378047] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x15c8ff0 name raid_bdev1, state configuring 00:17:48.580 request: 00:17:48.580 { 00:17:48.580 "name": "raid_bdev1", 00:17:48.580 "raid_level": "concat", 00:17:48.580 "base_bdevs": [ 00:17:48.580 "malloc1", 00:17:48.580 "malloc2", 00:17:48.580 "malloc3" 00:17:48.580 ], 00:17:48.580 "strip_size_kb": 64, 00:17:48.580 "superblock": false, 00:17:48.580 "method": "bdev_raid_create", 00:17:48.580 "req_id": 1 00:17:48.580 } 00:17:48.580 Got JSON-RPC error response 00:17:48.580 response: 00:17:48.580 { 00:17:48.580 "code": -17, 00:17:48.580 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:17:48.580 } 00:17:48.580 05:46:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:17:48.580 05:46:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 
00:17:48.580 05:46:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:17:48.580 05:46:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:17:48.580 05:46:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:48.580 05:46:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:17:48.837 05:46:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:17:48.837 05:46:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:17:48.837 05:46:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:17:49.096 [2024-07-26 05:46:03.873723] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:17:49.096 [2024-07-26 05:46:03.873769] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:49.096 [2024-07-26 05:46:03.873790] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x14257a0 00:17:49.096 [2024-07-26 05:46:03.873804] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:49.096 [2024-07-26 05:46:03.875463] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:49.096 [2024-07-26 05:46:03.875493] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:17:49.096 [2024-07-26 05:46:03.875577] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:17:49.096 [2024-07-26 05:46:03.875604] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:17:49.096 pt1 00:17:49.096 05:46:03 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring concat 64 3 00:17:49.096 05:46:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:49.096 05:46:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:49.096 05:46:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:17:49.096 05:46:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:49.096 05:46:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:49.096 05:46:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:49.096 05:46:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:49.096 05:46:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:49.096 05:46:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:49.096 05:46:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:49.096 05:46:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:49.355 05:46:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:49.355 "name": "raid_bdev1", 00:17:49.355 "uuid": "053894e1-98d0-43aa-9840-88070edc11d6", 00:17:49.355 "strip_size_kb": 64, 00:17:49.355 "state": "configuring", 00:17:49.355 "raid_level": "concat", 00:17:49.355 "superblock": true, 00:17:49.355 "num_base_bdevs": 3, 00:17:49.355 "num_base_bdevs_discovered": 1, 00:17:49.355 "num_base_bdevs_operational": 3, 00:17:49.355 "base_bdevs_list": [ 00:17:49.355 { 00:17:49.355 "name": "pt1", 00:17:49.355 "uuid": "00000000-0000-0000-0000-000000000001", 00:17:49.355 
"is_configured": true, 00:17:49.355 "data_offset": 2048, 00:17:49.355 "data_size": 63488 00:17:49.355 }, 00:17:49.355 { 00:17:49.355 "name": null, 00:17:49.355 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:49.355 "is_configured": false, 00:17:49.355 "data_offset": 2048, 00:17:49.355 "data_size": 63488 00:17:49.355 }, 00:17:49.355 { 00:17:49.355 "name": null, 00:17:49.355 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:49.355 "is_configured": false, 00:17:49.355 "data_offset": 2048, 00:17:49.355 "data_size": 63488 00:17:49.355 } 00:17:49.355 ] 00:17:49.355 }' 00:17:49.355 05:46:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:49.355 05:46:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:49.921 05:46:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 3 -gt 2 ']' 00:17:49.921 05:46:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:17:50.180 [2024-07-26 05:46:04.956606] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:17:50.180 [2024-07-26 05:46:04.956663] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:50.180 [2024-07-26 05:46:04.956681] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x141cc70 00:17:50.180 [2024-07-26 05:46:04.956694] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:50.180 [2024-07-26 05:46:04.957055] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:50.180 [2024-07-26 05:46:04.957073] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:17:50.180 [2024-07-26 05:46:04.957144] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:17:50.180 [2024-07-26 
05:46:04.957164] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:17:50.180 pt2 00:17:50.180 05:46:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:17:50.439 [2024-07-26 05:46:05.197273] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:17:50.439 05:46:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # verify_raid_bdev_state raid_bdev1 configuring concat 64 3 00:17:50.439 05:46:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:50.439 05:46:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:50.439 05:46:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:17:50.439 05:46:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:50.439 05:46:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:50.439 05:46:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:50.439 05:46:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:50.439 05:46:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:50.439 05:46:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:50.439 05:46:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:50.439 05:46:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:50.698 05:46:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:50.698 "name": "raid_bdev1", 00:17:50.698 
"uuid": "053894e1-98d0-43aa-9840-88070edc11d6", 00:17:50.698 "strip_size_kb": 64, 00:17:50.698 "state": "configuring", 00:17:50.698 "raid_level": "concat", 00:17:50.698 "superblock": true, 00:17:50.698 "num_base_bdevs": 3, 00:17:50.698 "num_base_bdevs_discovered": 1, 00:17:50.698 "num_base_bdevs_operational": 3, 00:17:50.698 "base_bdevs_list": [ 00:17:50.698 { 00:17:50.698 "name": "pt1", 00:17:50.698 "uuid": "00000000-0000-0000-0000-000000000001", 00:17:50.698 "is_configured": true, 00:17:50.698 "data_offset": 2048, 00:17:50.698 "data_size": 63488 00:17:50.698 }, 00:17:50.698 { 00:17:50.698 "name": null, 00:17:50.698 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:50.698 "is_configured": false, 00:17:50.698 "data_offset": 2048, 00:17:50.698 "data_size": 63488 00:17:50.698 }, 00:17:50.698 { 00:17:50.698 "name": null, 00:17:50.698 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:50.698 "is_configured": false, 00:17:50.698 "data_offset": 2048, 00:17:50.698 "data_size": 63488 00:17:50.698 } 00:17:50.698 ] 00:17:50.698 }' 00:17:50.698 05:46:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:50.698 05:46:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:51.266 05:46:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:17:51.266 05:46:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:17:51.266 05:46:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:17:51.525 [2024-07-26 05:46:06.292157] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:17:51.525 [2024-07-26 05:46:06.292214] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:51.525 [2024-07-26 05:46:06.292237] vbdev_passthru.c: 
681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1425a10 00:17:51.525 [2024-07-26 05:46:06.292249] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:51.525 [2024-07-26 05:46:06.292604] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:51.525 [2024-07-26 05:46:06.292621] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:17:51.525 [2024-07-26 05:46:06.292697] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:17:51.525 [2024-07-26 05:46:06.292718] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:17:51.525 pt2 00:17:51.525 05:46:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:17:51.525 05:46:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:17:51.525 05:46:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:17:51.784 [2024-07-26 05:46:06.536807] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:17:51.784 [2024-07-26 05:46:06.536852] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:51.784 [2024-07-26 05:46:06.536869] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x15bf740 00:17:51.784 [2024-07-26 05:46:06.536882] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:51.784 [2024-07-26 05:46:06.537208] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:51.784 [2024-07-26 05:46:06.537226] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:17:51.784 [2024-07-26 05:46:06.537288] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:17:51.784 
[2024-07-26 05:46:06.537306] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:17:51.784 [2024-07-26 05:46:06.537415] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x15bfc00 00:17:51.784 [2024-07-26 05:46:06.537426] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:17:51.784 [2024-07-26 05:46:06.537596] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1424a40 00:17:51.784 [2024-07-26 05:46:06.537734] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x15bfc00 00:17:51.784 [2024-07-26 05:46:06.537744] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x15bfc00 00:17:51.784 [2024-07-26 05:46:06.537839] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:51.784 pt3 00:17:51.784 05:46:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:17:51.784 05:46:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:17:51.784 05:46:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:17:51.784 05:46:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:51.784 05:46:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:51.784 05:46:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:17:51.784 05:46:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:51.784 05:46:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:51.784 05:46:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:51.784 05:46:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:51.784 
05:46:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:51.784 05:46:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:51.784 05:46:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:51.784 05:46:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:52.043 05:46:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:52.043 "name": "raid_bdev1", 00:17:52.043 "uuid": "053894e1-98d0-43aa-9840-88070edc11d6", 00:17:52.043 "strip_size_kb": 64, 00:17:52.043 "state": "online", 00:17:52.043 "raid_level": "concat", 00:17:52.043 "superblock": true, 00:17:52.043 "num_base_bdevs": 3, 00:17:52.043 "num_base_bdevs_discovered": 3, 00:17:52.043 "num_base_bdevs_operational": 3, 00:17:52.043 "base_bdevs_list": [ 00:17:52.043 { 00:17:52.043 "name": "pt1", 00:17:52.043 "uuid": "00000000-0000-0000-0000-000000000001", 00:17:52.043 "is_configured": true, 00:17:52.043 "data_offset": 2048, 00:17:52.043 "data_size": 63488 00:17:52.043 }, 00:17:52.043 { 00:17:52.043 "name": "pt2", 00:17:52.043 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:52.043 "is_configured": true, 00:17:52.043 "data_offset": 2048, 00:17:52.043 "data_size": 63488 00:17:52.043 }, 00:17:52.043 { 00:17:52.043 "name": "pt3", 00:17:52.043 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:52.043 "is_configured": true, 00:17:52.043 "data_offset": 2048, 00:17:52.043 "data_size": 63488 00:17:52.043 } 00:17:52.043 ] 00:17:52.043 }' 00:17:52.043 05:46:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:52.043 05:46:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:52.610 05:46:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # 
verify_raid_bdev_properties raid_bdev1 00:17:52.610 05:46:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:17:52.610 05:46:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:17:52.610 05:46:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:17:52.610 05:46:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:17:52.610 05:46:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:17:52.610 05:46:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:17:52.610 05:46:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:17:52.870 [2024-07-26 05:46:07.559798] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:52.870 05:46:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:17:52.870 "name": "raid_bdev1", 00:17:52.870 "aliases": [ 00:17:52.870 "053894e1-98d0-43aa-9840-88070edc11d6" 00:17:52.870 ], 00:17:52.870 "product_name": "Raid Volume", 00:17:52.870 "block_size": 512, 00:17:52.870 "num_blocks": 190464, 00:17:52.870 "uuid": "053894e1-98d0-43aa-9840-88070edc11d6", 00:17:52.870 "assigned_rate_limits": { 00:17:52.870 "rw_ios_per_sec": 0, 00:17:52.870 "rw_mbytes_per_sec": 0, 00:17:52.870 "r_mbytes_per_sec": 0, 00:17:52.870 "w_mbytes_per_sec": 0 00:17:52.870 }, 00:17:52.870 "claimed": false, 00:17:52.870 "zoned": false, 00:17:52.870 "supported_io_types": { 00:17:52.870 "read": true, 00:17:52.870 "write": true, 00:17:52.870 "unmap": true, 00:17:52.870 "flush": true, 00:17:52.870 "reset": true, 00:17:52.870 "nvme_admin": false, 00:17:52.870 "nvme_io": false, 00:17:52.870 "nvme_io_md": false, 00:17:52.870 "write_zeroes": true, 00:17:52.870 "zcopy": false, 00:17:52.870 
"get_zone_info": false, 00:17:52.870 "zone_management": false, 00:17:52.870 "zone_append": false, 00:17:52.870 "compare": false, 00:17:52.870 "compare_and_write": false, 00:17:52.870 "abort": false, 00:17:52.870 "seek_hole": false, 00:17:52.870 "seek_data": false, 00:17:52.870 "copy": false, 00:17:52.870 "nvme_iov_md": false 00:17:52.870 }, 00:17:52.870 "memory_domains": [ 00:17:52.870 { 00:17:52.870 "dma_device_id": "system", 00:17:52.870 "dma_device_type": 1 00:17:52.870 }, 00:17:52.870 { 00:17:52.870 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:52.870 "dma_device_type": 2 00:17:52.870 }, 00:17:52.870 { 00:17:52.870 "dma_device_id": "system", 00:17:52.870 "dma_device_type": 1 00:17:52.870 }, 00:17:52.870 { 00:17:52.870 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:52.870 "dma_device_type": 2 00:17:52.870 }, 00:17:52.870 { 00:17:52.870 "dma_device_id": "system", 00:17:52.870 "dma_device_type": 1 00:17:52.870 }, 00:17:52.870 { 00:17:52.870 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:52.870 "dma_device_type": 2 00:17:52.870 } 00:17:52.870 ], 00:17:52.870 "driver_specific": { 00:17:52.870 "raid": { 00:17:52.870 "uuid": "053894e1-98d0-43aa-9840-88070edc11d6", 00:17:52.870 "strip_size_kb": 64, 00:17:52.870 "state": "online", 00:17:52.870 "raid_level": "concat", 00:17:52.870 "superblock": true, 00:17:52.870 "num_base_bdevs": 3, 00:17:52.870 "num_base_bdevs_discovered": 3, 00:17:52.870 "num_base_bdevs_operational": 3, 00:17:52.870 "base_bdevs_list": [ 00:17:52.870 { 00:17:52.870 "name": "pt1", 00:17:52.870 "uuid": "00000000-0000-0000-0000-000000000001", 00:17:52.870 "is_configured": true, 00:17:52.870 "data_offset": 2048, 00:17:52.870 "data_size": 63488 00:17:52.870 }, 00:17:52.870 { 00:17:52.870 "name": "pt2", 00:17:52.870 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:52.870 "is_configured": true, 00:17:52.870 "data_offset": 2048, 00:17:52.870 "data_size": 63488 00:17:52.870 }, 00:17:52.870 { 00:17:52.870 "name": "pt3", 00:17:52.870 "uuid": 
"00000000-0000-0000-0000-000000000003", 00:17:52.870 "is_configured": true, 00:17:52.870 "data_offset": 2048, 00:17:52.870 "data_size": 63488 00:17:52.870 } 00:17:52.870 ] 00:17:52.870 } 00:17:52.870 } 00:17:52.870 }' 00:17:52.870 05:46:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:17:52.870 05:46:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:17:52.870 pt2 00:17:52.870 pt3' 00:17:52.870 05:46:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:52.870 05:46:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:17:52.870 05:46:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:53.129 05:46:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:53.129 "name": "pt1", 00:17:53.129 "aliases": [ 00:17:53.129 "00000000-0000-0000-0000-000000000001" 00:17:53.129 ], 00:17:53.129 "product_name": "passthru", 00:17:53.129 "block_size": 512, 00:17:53.129 "num_blocks": 65536, 00:17:53.129 "uuid": "00000000-0000-0000-0000-000000000001", 00:17:53.129 "assigned_rate_limits": { 00:17:53.129 "rw_ios_per_sec": 0, 00:17:53.129 "rw_mbytes_per_sec": 0, 00:17:53.129 "r_mbytes_per_sec": 0, 00:17:53.129 "w_mbytes_per_sec": 0 00:17:53.129 }, 00:17:53.129 "claimed": true, 00:17:53.129 "claim_type": "exclusive_write", 00:17:53.129 "zoned": false, 00:17:53.129 "supported_io_types": { 00:17:53.129 "read": true, 00:17:53.129 "write": true, 00:17:53.129 "unmap": true, 00:17:53.129 "flush": true, 00:17:53.129 "reset": true, 00:17:53.129 "nvme_admin": false, 00:17:53.129 "nvme_io": false, 00:17:53.129 "nvme_io_md": false, 00:17:53.129 "write_zeroes": true, 00:17:53.129 "zcopy": true, 00:17:53.129 "get_zone_info": false, 
00:17:53.129 "zone_management": false, 00:17:53.129 "zone_append": false, 00:17:53.129 "compare": false, 00:17:53.129 "compare_and_write": false, 00:17:53.129 "abort": true, 00:17:53.129 "seek_hole": false, 00:17:53.129 "seek_data": false, 00:17:53.129 "copy": true, 00:17:53.129 "nvme_iov_md": false 00:17:53.129 }, 00:17:53.129 "memory_domains": [ 00:17:53.129 { 00:17:53.129 "dma_device_id": "system", 00:17:53.129 "dma_device_type": 1 00:17:53.129 }, 00:17:53.129 { 00:17:53.129 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:53.129 "dma_device_type": 2 00:17:53.129 } 00:17:53.129 ], 00:17:53.129 "driver_specific": { 00:17:53.129 "passthru": { 00:17:53.129 "name": "pt1", 00:17:53.129 "base_bdev_name": "malloc1" 00:17:53.129 } 00:17:53.129 } 00:17:53.129 }' 00:17:53.129 05:46:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:53.129 05:46:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:53.129 05:46:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:53.129 05:46:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:53.129 05:46:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:53.389 05:46:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:53.389 05:46:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:53.389 05:46:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:53.389 05:46:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:53.389 05:46:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:53.389 05:46:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:53.389 05:46:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:53.389 05:46:08 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:53.389 05:46:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:17:53.389 05:46:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:53.648 05:46:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:53.648 "name": "pt2", 00:17:53.648 "aliases": [ 00:17:53.648 "00000000-0000-0000-0000-000000000002" 00:17:53.648 ], 00:17:53.648 "product_name": "passthru", 00:17:53.648 "block_size": 512, 00:17:53.648 "num_blocks": 65536, 00:17:53.648 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:53.648 "assigned_rate_limits": { 00:17:53.648 "rw_ios_per_sec": 0, 00:17:53.648 "rw_mbytes_per_sec": 0, 00:17:53.648 "r_mbytes_per_sec": 0, 00:17:53.648 "w_mbytes_per_sec": 0 00:17:53.648 }, 00:17:53.648 "claimed": true, 00:17:53.648 "claim_type": "exclusive_write", 00:17:53.648 "zoned": false, 00:17:53.648 "supported_io_types": { 00:17:53.648 "read": true, 00:17:53.648 "write": true, 00:17:53.648 "unmap": true, 00:17:53.648 "flush": true, 00:17:53.648 "reset": true, 00:17:53.648 "nvme_admin": false, 00:17:53.648 "nvme_io": false, 00:17:53.648 "nvme_io_md": false, 00:17:53.648 "write_zeroes": true, 00:17:53.648 "zcopy": true, 00:17:53.648 "get_zone_info": false, 00:17:53.648 "zone_management": false, 00:17:53.648 "zone_append": false, 00:17:53.648 "compare": false, 00:17:53.648 "compare_and_write": false, 00:17:53.648 "abort": true, 00:17:53.648 "seek_hole": false, 00:17:53.648 "seek_data": false, 00:17:53.648 "copy": true, 00:17:53.648 "nvme_iov_md": false 00:17:53.648 }, 00:17:53.648 "memory_domains": [ 00:17:53.648 { 00:17:53.648 "dma_device_id": "system", 00:17:53.648 "dma_device_type": 1 00:17:53.648 }, 00:17:53.648 { 00:17:53.648 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:53.648 
"dma_device_type": 2 00:17:53.648 } 00:17:53.648 ], 00:17:53.648 "driver_specific": { 00:17:53.648 "passthru": { 00:17:53.648 "name": "pt2", 00:17:53.648 "base_bdev_name": "malloc2" 00:17:53.648 } 00:17:53.648 } 00:17:53.648 }' 00:17:53.648 05:46:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:53.648 05:46:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:53.648 05:46:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:53.648 05:46:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:53.906 05:46:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:53.906 05:46:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:53.906 05:46:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:53.906 05:46:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:53.906 05:46:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:53.906 05:46:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:53.906 05:46:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:53.906 05:46:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:53.906 05:46:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:53.906 05:46:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:17:53.906 05:46:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:54.165 05:46:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:54.165 "name": "pt3", 00:17:54.165 "aliases": [ 00:17:54.165 
"00000000-0000-0000-0000-000000000003" 00:17:54.165 ], 00:17:54.165 "product_name": "passthru", 00:17:54.165 "block_size": 512, 00:17:54.165 "num_blocks": 65536, 00:17:54.165 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:54.165 "assigned_rate_limits": { 00:17:54.165 "rw_ios_per_sec": 0, 00:17:54.165 "rw_mbytes_per_sec": 0, 00:17:54.165 "r_mbytes_per_sec": 0, 00:17:54.165 "w_mbytes_per_sec": 0 00:17:54.165 }, 00:17:54.165 "claimed": true, 00:17:54.165 "claim_type": "exclusive_write", 00:17:54.165 "zoned": false, 00:17:54.165 "supported_io_types": { 00:17:54.165 "read": true, 00:17:54.165 "write": true, 00:17:54.165 "unmap": true, 00:17:54.165 "flush": true, 00:17:54.165 "reset": true, 00:17:54.165 "nvme_admin": false, 00:17:54.165 "nvme_io": false, 00:17:54.165 "nvme_io_md": false, 00:17:54.165 "write_zeroes": true, 00:17:54.165 "zcopy": true, 00:17:54.165 "get_zone_info": false, 00:17:54.165 "zone_management": false, 00:17:54.165 "zone_append": false, 00:17:54.165 "compare": false, 00:17:54.165 "compare_and_write": false, 00:17:54.165 "abort": true, 00:17:54.165 "seek_hole": false, 00:17:54.165 "seek_data": false, 00:17:54.165 "copy": true, 00:17:54.165 "nvme_iov_md": false 00:17:54.165 }, 00:17:54.165 "memory_domains": [ 00:17:54.165 { 00:17:54.165 "dma_device_id": "system", 00:17:54.165 "dma_device_type": 1 00:17:54.165 }, 00:17:54.165 { 00:17:54.165 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:54.165 "dma_device_type": 2 00:17:54.165 } 00:17:54.165 ], 00:17:54.165 "driver_specific": { 00:17:54.165 "passthru": { 00:17:54.165 "name": "pt3", 00:17:54.165 "base_bdev_name": "malloc3" 00:17:54.165 } 00:17:54.165 } 00:17:54.165 }' 00:17:54.165 05:46:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:54.424 05:46:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:54.424 05:46:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:54.424 05:46:09 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:54.424 05:46:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:54.424 05:46:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:54.424 05:46:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:54.424 05:46:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:54.424 05:46:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:54.424 05:46:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:54.682 05:46:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:54.682 05:46:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:54.682 05:46:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:17:54.682 05:46:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:17:54.941 [2024-07-26 05:46:09.609299] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:54.941 05:46:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' 053894e1-98d0-43aa-9840-88070edc11d6 '!=' 053894e1-98d0-43aa-9840-88070edc11d6 ']' 00:17:54.941 05:46:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy concat 00:17:54.941 05:46:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:17:54.941 05:46:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:17:54.941 05:46:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 1169726 00:17:54.941 05:46:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 1169726 ']' 00:17:54.941 05:46:09 
bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 1169726 00:17:54.941 05:46:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:17:54.941 05:46:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:17:54.941 05:46:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1169726 00:17:54.941 05:46:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:17:54.941 05:46:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:17:54.941 05:46:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1169726' 00:17:54.941 killing process with pid 1169726 00:17:54.941 05:46:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 1169726 00:17:54.941 [2024-07-26 05:46:09.681423] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:17:54.941 [2024-07-26 05:46:09.681478] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:54.941 [2024-07-26 05:46:09.681539] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:17:54.941 [2024-07-26 05:46:09.681551] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x15bfc00 name raid_bdev1, state offline 00:17:54.941 05:46:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 1169726 00:17:54.941 [2024-07-26 05:46:09.707746] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:17:55.200 05:46:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:17:55.200 00:17:55.200 real 0m14.061s 00:17:55.200 user 0m25.212s 00:17:55.200 sys 0m2.649s 00:17:55.200 05:46:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:17:55.200 05:46:09 
bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:55.200 ************************************ 00:17:55.200 END TEST raid_superblock_test 00:17:55.200 ************************************ 00:17:55.200 05:46:09 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:17:55.200 05:46:09 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test concat 3 read 00:17:55.200 05:46:09 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:17:55.200 05:46:09 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:17:55.200 05:46:09 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:17:55.200 ************************************ 00:17:55.200 START TEST raid_read_error_test 00:17:55.200 ************************************ 00:17:55.200 05:46:09 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test concat 3 read 00:17:55.200 05:46:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=concat 00:17:55.200 05:46:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=3 00:17:55.200 05:46:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:17:55.200 05:46:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:17:55.200 05:46:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:17:55.200 05:46:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:17:55.200 05:46:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:17:55.200 05:46:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:17:55.200 05:46:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:17:55.200 05:46:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:17:55.200 05:46:10 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:17:55.200 05:46:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:17:55.200 05:46:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:17:55.200 05:46:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:17:55.200 05:46:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:17:55.200 05:46:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:17:55.200 05:46:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:17:55.200 05:46:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:17:55.200 05:46:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:17:55.200 05:46:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:17:55.200 05:46:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:17:55.200 05:46:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' concat '!=' raid1 ']' 00:17:55.200 05:46:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:17:55.200 05:46:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:17:55.200 05:46:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:17:55.200 05:46:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.R8qgGngSVD 00:17:55.200 05:46:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=1171932 00:17:55.200 05:46:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 1171932 /var/tmp/spdk-raid.sock 00:17:55.200 05:46:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:17:55.200 05:46:10 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 1171932 ']' 00:17:55.200 05:46:10 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:17:55.200 05:46:10 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:17:55.200 05:46:10 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:17:55.200 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:17:55.200 05:46:10 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:17:55.200 05:46:10 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:17:55.200 [2024-07-26 05:46:10.075378] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 
00:17:55.200 [2024-07-26 05:46:10.075445] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1171932 ] 00:17:55.458 [2024-07-26 05:46:10.204884] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:17:55.458 [2024-07-26 05:46:10.309822] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:17:55.716 [2024-07-26 05:46:10.372107] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:55.716 [2024-07-26 05:46:10.372138] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:55.716 05:46:10 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:17:55.716 05:46:10 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:17:55.716 05:46:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:17:55.716 05:46:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:17:55.974 BaseBdev1_malloc 00:17:55.974 05:46:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:17:56.232 true 00:17:56.232 05:46:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:17:56.490 [2024-07-26 05:46:11.258515] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:17:56.490 [2024-07-26 05:46:11.258564] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev 
opened 00:17:56.490 [2024-07-26 05:46:11.258586] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x21940d0 00:17:56.490 [2024-07-26 05:46:11.258601] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:56.490 [2024-07-26 05:46:11.260498] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:56.490 [2024-07-26 05:46:11.260529] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:17:56.490 BaseBdev1 00:17:56.490 05:46:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:17:56.490 05:46:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:17:56.748 BaseBdev2_malloc 00:17:56.748 05:46:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:17:57.006 true 00:17:57.006 05:46:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:17:57.264 [2024-07-26 05:46:11.980985] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:17:57.264 [2024-07-26 05:46:11.981026] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:57.264 [2024-07-26 05:46:11.981045] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2198910 00:17:57.264 [2024-07-26 05:46:11.981058] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:57.264 [2024-07-26 05:46:11.982587] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:57.264 [2024-07-26 05:46:11.982614] 
vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:17:57.264 BaseBdev2 00:17:57.264 05:46:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:17:57.264 05:46:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:17:57.522 BaseBdev3_malloc 00:17:57.522 05:46:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:17:57.780 true 00:17:57.780 05:46:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:17:58.039 [2024-07-26 05:46:12.696705] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:17:58.039 [2024-07-26 05:46:12.696748] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:58.039 [2024-07-26 05:46:12.696769] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x219abd0 00:17:58.039 [2024-07-26 05:46:12.696782] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:58.039 [2024-07-26 05:46:12.698369] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:58.039 [2024-07-26 05:46:12.698397] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:17:58.039 BaseBdev3 00:17:58.039 05:46:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:17:58.039 [2024-07-26 05:46:12.941385] 
bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:17:58.039 [2024-07-26 05:46:12.942744] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:17:58.039 [2024-07-26 05:46:12.942814] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:17:58.039 [2024-07-26 05:46:12.943026] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x219c280 00:17:58.039 [2024-07-26 05:46:12.943038] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:17:58.039 [2024-07-26 05:46:12.943238] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x219be20 00:17:58.039 [2024-07-26 05:46:12.943387] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x219c280 00:17:58.039 [2024-07-26 05:46:12.943397] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x219c280 00:17:58.039 [2024-07-26 05:46:12.943501] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:58.297 05:46:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:17:58.297 05:46:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:58.297 05:46:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:58.297 05:46:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:17:58.297 05:46:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:58.297 05:46:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:58.297 05:46:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:58.297 05:46:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 
00:17:58.297 05:46:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:58.297 05:46:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:58.297 05:46:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:58.297 05:46:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:58.555 05:46:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:58.555 "name": "raid_bdev1", 00:17:58.555 "uuid": "cd3a59c3-fc17-4684-956d-fdd765ca449d", 00:17:58.555 "strip_size_kb": 64, 00:17:58.555 "state": "online", 00:17:58.555 "raid_level": "concat", 00:17:58.555 "superblock": true, 00:17:58.555 "num_base_bdevs": 3, 00:17:58.555 "num_base_bdevs_discovered": 3, 00:17:58.555 "num_base_bdevs_operational": 3, 00:17:58.555 "base_bdevs_list": [ 00:17:58.555 { 00:17:58.555 "name": "BaseBdev1", 00:17:58.555 "uuid": "74dfbe14-3e32-5c2b-80d3-b3a1e1221e21", 00:17:58.555 "is_configured": true, 00:17:58.555 "data_offset": 2048, 00:17:58.555 "data_size": 63488 00:17:58.555 }, 00:17:58.555 { 00:17:58.555 "name": "BaseBdev2", 00:17:58.555 "uuid": "95513020-54d6-58c5-a2ec-c010bc42fa80", 00:17:58.555 "is_configured": true, 00:17:58.555 "data_offset": 2048, 00:17:58.555 "data_size": 63488 00:17:58.555 }, 00:17:58.555 { 00:17:58.555 "name": "BaseBdev3", 00:17:58.555 "uuid": "b2ddb2d3-a000-5499-b8c2-db24613dab6d", 00:17:58.555 "is_configured": true, 00:17:58.555 "data_offset": 2048, 00:17:58.555 "data_size": 63488 00:17:58.555 } 00:17:58.555 ] 00:17:58.555 }' 00:17:58.555 05:46:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:58.555 05:46:13 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:17:59.121 05:46:13 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@824 -- # sleep 1 00:17:59.121 05:46:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:17:59.121 [2024-07-26 05:46:13.912227] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1fea4d0 00:18:00.055 05:46:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:18:00.313 05:46:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:18:00.313 05:46:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ concat = \r\a\i\d\1 ]] 00:18:00.313 05:46:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=3 00:18:00.313 05:46:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:18:00.313 05:46:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:00.313 05:46:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:00.313 05:46:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:18:00.313 05:46:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:00.313 05:46:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:18:00.313 05:46:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:00.313 05:46:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:00.313 05:46:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:00.313 05:46:15 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@124 -- # local tmp 00:18:00.313 05:46:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:00.313 05:46:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:00.571 05:46:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:00.571 "name": "raid_bdev1", 00:18:00.571 "uuid": "cd3a59c3-fc17-4684-956d-fdd765ca449d", 00:18:00.571 "strip_size_kb": 64, 00:18:00.571 "state": "online", 00:18:00.571 "raid_level": "concat", 00:18:00.571 "superblock": true, 00:18:00.571 "num_base_bdevs": 3, 00:18:00.571 "num_base_bdevs_discovered": 3, 00:18:00.571 "num_base_bdevs_operational": 3, 00:18:00.571 "base_bdevs_list": [ 00:18:00.571 { 00:18:00.571 "name": "BaseBdev1", 00:18:00.571 "uuid": "74dfbe14-3e32-5c2b-80d3-b3a1e1221e21", 00:18:00.571 "is_configured": true, 00:18:00.571 "data_offset": 2048, 00:18:00.571 "data_size": 63488 00:18:00.571 }, 00:18:00.571 { 00:18:00.571 "name": "BaseBdev2", 00:18:00.571 "uuid": "95513020-54d6-58c5-a2ec-c010bc42fa80", 00:18:00.571 "is_configured": true, 00:18:00.571 "data_offset": 2048, 00:18:00.571 "data_size": 63488 00:18:00.571 }, 00:18:00.571 { 00:18:00.571 "name": "BaseBdev3", 00:18:00.571 "uuid": "b2ddb2d3-a000-5499-b8c2-db24613dab6d", 00:18:00.571 "is_configured": true, 00:18:00.571 "data_offset": 2048, 00:18:00.571 "data_size": 63488 00:18:00.571 } 00:18:00.571 ] 00:18:00.571 }' 00:18:00.571 05:46:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:00.571 05:46:15 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:18:01.138 05:46:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:18:01.396 [2024-07-26 
05:46:16.113102] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:18:01.396 [2024-07-26 05:46:16.113140] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:18:01.396 [2024-07-26 05:46:16.116292] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:18:01.396 [2024-07-26 05:46:16.116330] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:01.396 [2024-07-26 05:46:16.116365] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:18:01.396 [2024-07-26 05:46:16.116376] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x219c280 name raid_bdev1, state offline 00:18:01.396 0 00:18:01.396 05:46:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 1171932 00:18:01.396 05:46:16 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 1171932 ']' 00:18:01.396 05:46:16 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 1171932 00:18:01.396 05:46:16 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:18:01.396 05:46:16 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:18:01.396 05:46:16 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1171932 00:18:01.396 05:46:16 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:18:01.396 05:46:16 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:18:01.396 05:46:16 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1171932' 00:18:01.396 killing process with pid 1171932 00:18:01.396 05:46:16 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 1171932 00:18:01.396 [2024-07-26 05:46:16.184769] 
bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:18:01.396 05:46:16 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 1171932 00:18:01.396 [2024-07-26 05:46:16.205944] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:18:01.655 05:46:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.R8qgGngSVD 00:18:01.655 05:46:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:18:01.655 05:46:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:18:01.655 05:46:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.46 00:18:01.655 05:46:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy concat 00:18:01.655 05:46:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:18:01.655 05:46:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:18:01.655 05:46:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.46 != \0\.\0\0 ]] 00:18:01.655 00:18:01.655 real 0m6.452s 00:18:01.655 user 0m10.511s 00:18:01.655 sys 0m1.235s 00:18:01.655 05:46:16 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:18:01.655 05:46:16 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:18:01.655 ************************************ 00:18:01.655 END TEST raid_read_error_test 00:18:01.655 ************************************ 00:18:01.655 05:46:16 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:18:01.655 05:46:16 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test concat 3 write 00:18:01.655 05:46:16 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:18:01.655 05:46:16 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:18:01.655 05:46:16 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:18:01.655 
************************************ 00:18:01.655 START TEST raid_write_error_test 00:18:01.655 ************************************ 00:18:01.655 05:46:16 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test concat 3 write 00:18:01.655 05:46:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=concat 00:18:01.655 05:46:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=3 00:18:01.655 05:46:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:18:01.655 05:46:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:18:01.655 05:46:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:18:01.655 05:46:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:18:01.655 05:46:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:18:01.655 05:46:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:18:01.655 05:46:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:18:01.655 05:46:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:18:01.655 05:46:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:18:01.655 05:46:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:18:01.655 05:46:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:18:01.655 05:46:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:18:01.655 05:46:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:18:01.655 05:46:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:18:01.655 05:46:16 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:18:01.655 05:46:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:18:01.655 05:46:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:18:01.655 05:46:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:18:01.655 05:46:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:18:01.655 05:46:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' concat '!=' raid1 ']' 00:18:01.655 05:46:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:18:01.655 05:46:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:18:01.655 05:46:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:18:01.655 05:46:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.vPbooVlGwI 00:18:01.655 05:46:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=1172904 00:18:01.655 05:46:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 1172904 /var/tmp/spdk-raid.sock 00:18:01.655 05:46:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:18:01.655 05:46:16 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 1172904 ']' 00:18:01.655 05:46:16 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:18:01.655 05:46:16 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:18:01.655 05:46:16 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain 
socket /var/tmp/spdk-raid.sock...' 00:18:01.655 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:18:01.655 05:46:16 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:18:01.655 05:46:16 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:18:01.913 [2024-07-26 05:46:16.607213] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 00:18:01.913 [2024-07-26 05:46:16.607263] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1172904 ] 00:18:01.913 [2024-07-26 05:46:16.717592] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:18:01.913 [2024-07-26 05:46:16.821420] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:18:02.171 [2024-07-26 05:46:16.880974] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:02.171 [2024-07-26 05:46:16.881030] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:02.738 05:46:17 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:18:02.738 05:46:17 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:18:02.738 05:46:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:18:02.738 05:46:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:18:02.996 BaseBdev1_malloc 00:18:02.996 05:46:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:18:03.254 
true 00:18:03.254 05:46:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:18:03.511 [2024-07-26 05:46:18.269242] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:18:03.511 [2024-07-26 05:46:18.269287] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:03.511 [2024-07-26 05:46:18.269308] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x131a0d0 00:18:03.511 [2024-07-26 05:46:18.269325] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:03.511 [2024-07-26 05:46:18.271059] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:03.511 [2024-07-26 05:46:18.271089] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:18:03.511 BaseBdev1 00:18:03.511 05:46:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:18:03.511 05:46:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:18:03.767 BaseBdev2_malloc 00:18:03.767 05:46:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:18:04.024 true 00:18:04.024 05:46:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:18:04.282 [2024-07-26 05:46:19.004035] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:18:04.282 [2024-07-26 05:46:19.004079] 
vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:04.282 [2024-07-26 05:46:19.004101] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x131e910 00:18:04.282 [2024-07-26 05:46:19.004114] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:04.282 [2024-07-26 05:46:19.005600] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:04.282 [2024-07-26 05:46:19.005629] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:18:04.282 BaseBdev2 00:18:04.282 05:46:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:18:04.282 05:46:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:18:04.539 BaseBdev3_malloc 00:18:04.539 05:46:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:18:04.829 true 00:18:04.829 05:46:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:18:05.087 [2024-07-26 05:46:19.746594] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:18:05.087 [2024-07-26 05:46:19.746643] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:05.087 [2024-07-26 05:46:19.746662] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1320bd0 00:18:05.087 [2024-07-26 05:46:19.746675] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:05.087 [2024-07-26 05:46:19.748062] vbdev_passthru.c: 709:vbdev_passthru_register: 
*NOTICE*: pt_bdev registered 00:18:05.087 [2024-07-26 05:46:19.748090] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:18:05.087 BaseBdev3 00:18:05.087 05:46:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:18:05.087 [2024-07-26 05:46:19.991280] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:18:05.087 [2024-07-26 05:46:19.992482] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:18:05.087 [2024-07-26 05:46:19.992551] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:18:05.087 [2024-07-26 05:46:19.992762] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1322280 00:18:05.087 [2024-07-26 05:46:19.992774] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:18:05.087 [2024-07-26 05:46:19.992956] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1321e20 00:18:05.087 [2024-07-26 05:46:19.993097] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1322280 00:18:05.087 [2024-07-26 05:46:19.993112] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1322280 00:18:05.087 [2024-07-26 05:46:19.993207] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:05.346 05:46:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:18:05.346 05:46:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:05.346 05:46:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:05.346 05:46:20 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:18:05.346 05:46:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:05.346 05:46:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:18:05.346 05:46:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:05.346 05:46:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:05.346 05:46:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:05.346 05:46:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:05.346 05:46:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:05.346 05:46:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:05.604 05:46:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:05.604 "name": "raid_bdev1", 00:18:05.604 "uuid": "ae9e617f-ccb3-4e24-90b7-8bfb799f8245", 00:18:05.604 "strip_size_kb": 64, 00:18:05.604 "state": "online", 00:18:05.604 "raid_level": "concat", 00:18:05.604 "superblock": true, 00:18:05.604 "num_base_bdevs": 3, 00:18:05.604 "num_base_bdevs_discovered": 3, 00:18:05.604 "num_base_bdevs_operational": 3, 00:18:05.604 "base_bdevs_list": [ 00:18:05.604 { 00:18:05.604 "name": "BaseBdev1", 00:18:05.604 "uuid": "9591746d-c015-5542-a80c-b3085fd10242", 00:18:05.604 "is_configured": true, 00:18:05.604 "data_offset": 2048, 00:18:05.604 "data_size": 63488 00:18:05.604 }, 00:18:05.604 { 00:18:05.604 "name": "BaseBdev2", 00:18:05.604 "uuid": "db0fe498-788d-5067-a1e8-353205708f9b", 00:18:05.604 "is_configured": true, 00:18:05.604 "data_offset": 2048, 00:18:05.604 "data_size": 63488 00:18:05.604 }, 00:18:05.604 { 00:18:05.604 
"name": "BaseBdev3", 00:18:05.604 "uuid": "990edfca-26cf-5a6a-b706-41ca429b12c6", 00:18:05.604 "is_configured": true, 00:18:05.604 "data_offset": 2048, 00:18:05.604 "data_size": 63488 00:18:05.604 } 00:18:05.604 ] 00:18:05.604 }' 00:18:05.604 05:46:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:05.604 05:46:20 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:18:06.170 05:46:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:18:06.170 05:46:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:18:06.170 [2024-07-26 05:46:20.970273] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x11704d0 00:18:07.109 05:46:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:18:07.367 05:46:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:18:07.367 05:46:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ concat = \r\a\i\d\1 ]] 00:18:07.367 05:46:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=3 00:18:07.367 05:46:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:18:07.367 05:46:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:07.367 05:46:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:07.367 05:46:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:18:07.367 05:46:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 
00:18:07.367 05:46:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:18:07.367 05:46:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:07.367 05:46:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:07.367 05:46:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:07.367 05:46:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:07.367 05:46:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:07.367 05:46:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:07.626 05:46:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:07.626 "name": "raid_bdev1", 00:18:07.626 "uuid": "ae9e617f-ccb3-4e24-90b7-8bfb799f8245", 00:18:07.626 "strip_size_kb": 64, 00:18:07.626 "state": "online", 00:18:07.626 "raid_level": "concat", 00:18:07.626 "superblock": true, 00:18:07.626 "num_base_bdevs": 3, 00:18:07.626 "num_base_bdevs_discovered": 3, 00:18:07.626 "num_base_bdevs_operational": 3, 00:18:07.626 "base_bdevs_list": [ 00:18:07.626 { 00:18:07.626 "name": "BaseBdev1", 00:18:07.626 "uuid": "9591746d-c015-5542-a80c-b3085fd10242", 00:18:07.626 "is_configured": true, 00:18:07.626 "data_offset": 2048, 00:18:07.626 "data_size": 63488 00:18:07.626 }, 00:18:07.626 { 00:18:07.626 "name": "BaseBdev2", 00:18:07.626 "uuid": "db0fe498-788d-5067-a1e8-353205708f9b", 00:18:07.626 "is_configured": true, 00:18:07.626 "data_offset": 2048, 00:18:07.626 "data_size": 63488 00:18:07.626 }, 00:18:07.626 { 00:18:07.626 "name": "BaseBdev3", 00:18:07.626 "uuid": "990edfca-26cf-5a6a-b706-41ca429b12c6", 00:18:07.626 "is_configured": true, 00:18:07.626 "data_offset": 2048, 
00:18:07.626 "data_size": 63488 00:18:07.626 } 00:18:07.626 ] 00:18:07.626 }' 00:18:07.626 05:46:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:07.626 05:46:22 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:18:08.192 05:46:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:18:08.451 [2024-07-26 05:46:23.110176] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:18:08.451 [2024-07-26 05:46:23.110208] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:18:08.451 [2024-07-26 05:46:23.113386] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:18:08.451 [2024-07-26 05:46:23.113427] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:08.451 [2024-07-26 05:46:23.113461] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:18:08.451 [2024-07-26 05:46:23.113473] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1322280 name raid_bdev1, state offline 00:18:08.451 0 00:18:08.451 05:46:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 1172904 00:18:08.451 05:46:23 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 1172904 ']' 00:18:08.451 05:46:23 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 1172904 00:18:08.451 05:46:23 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:18:08.451 05:46:23 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:18:08.451 05:46:23 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1172904 00:18:08.451 05:46:23 bdev_raid.raid_write_error_test -- 
common/autotest_common.sh@954 -- # process_name=reactor_0 00:18:08.451 05:46:23 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:18:08.451 05:46:23 bdev_raid.raid_write_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1172904' 00:18:08.451 killing process with pid 1172904 00:18:08.451 05:46:23 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 1172904 00:18:08.451 [2024-07-26 05:46:23.178174] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:18:08.451 05:46:23 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 1172904 00:18:08.451 [2024-07-26 05:46:23.199096] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:18:08.709 05:46:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.vPbooVlGwI 00:18:08.709 05:46:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:18:08.709 05:46:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:18:08.709 05:46:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.47 00:18:08.709 05:46:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy concat 00:18:08.709 05:46:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:18:08.709 05:46:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:18:08.709 05:46:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.47 != \0\.\0\0 ]] 00:18:08.709 00:18:08.709 real 0m6.895s 00:18:08.709 user 0m10.934s 00:18:08.709 sys 0m1.191s 00:18:08.709 05:46:23 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:18:08.709 05:46:23 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:18:08.709 ************************************ 00:18:08.709 END TEST raid_write_error_test 
00:18:08.709 ************************************ 00:18:08.709 05:46:23 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:18:08.709 05:46:23 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:18:08.709 05:46:23 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test raid1 3 false 00:18:08.709 05:46:23 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:18:08.709 05:46:23 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:18:08.709 05:46:23 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:18:08.709 ************************************ 00:18:08.709 START TEST raid_state_function_test 00:18:08.709 ************************************ 00:18:08.709 05:46:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test raid1 3 false 00:18:08.709 05:46:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:18:08.709 05:46:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:18:08.709 05:46:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:18:08.709 05:46:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:18:08.709 05:46:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:18:08.709 05:46:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:18:08.709 05:46:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:18:08.709 05:46:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:18:08.709 05:46:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:18:08.709 05:46:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:18:08.709 05:46:23 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:18:08.709 05:46:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:18:08.709 05:46:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:18:08.709 05:46:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:18:08.709 05:46:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:18:08.709 05:46:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:18:08.709 05:46:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:18:08.709 05:46:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:18:08.709 05:46:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:18:08.709 05:46:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:18:08.709 05:46:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:18:08.709 05:46:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:18:08.709 05:46:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:18:08.709 05:46:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:18:08.709 05:46:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:18:08.709 05:46:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=1173887 00:18:08.709 05:46:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1173887' 00:18:08.709 Process raid pid: 1173887 00:18:08.709 05:46:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 1173887 
/var/tmp/spdk-raid.sock 00:18:08.709 05:46:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 1173887 ']' 00:18:08.709 05:46:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:18:08.709 05:46:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:18:08.709 05:46:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:18:08.709 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:18:08.709 05:46:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:18:08.709 05:46:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:08.709 05:46:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:18:08.709 [2024-07-26 05:46:23.571436] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 
00:18:08.709 [2024-07-26 05:46:23.571499] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:18:08.967 [2024-07-26 05:46:23.702287] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:18:08.967 [2024-07-26 05:46:23.804766] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:18:08.967 [2024-07-26 05:46:23.863402] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:08.967 [2024-07-26 05:46:23.863438] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:09.902 05:46:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:18:09.902 05:46:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:18:09.902 05:46:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:18:10.161 [2024-07-26 05:46:24.980681] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:18:10.161 [2024-07-26 05:46:24.980721] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:18:10.161 [2024-07-26 05:46:24.980732] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:18:10.161 [2024-07-26 05:46:24.980745] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:18:10.161 [2024-07-26 05:46:24.980753] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:18:10.161 [2024-07-26 05:46:24.980765] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:18:10.161 05:46:24 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:18:10.161 05:46:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:10.161 05:46:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:10.161 05:46:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:10.161 05:46:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:10.161 05:46:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:18:10.161 05:46:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:10.161 05:46:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:10.161 05:46:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:10.161 05:46:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:10.161 05:46:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:10.161 05:46:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:10.418 05:46:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:10.418 "name": "Existed_Raid", 00:18:10.418 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:10.418 "strip_size_kb": 0, 00:18:10.418 "state": "configuring", 00:18:10.418 "raid_level": "raid1", 00:18:10.418 "superblock": false, 00:18:10.418 "num_base_bdevs": 3, 00:18:10.418 "num_base_bdevs_discovered": 0, 00:18:10.418 "num_base_bdevs_operational": 3, 00:18:10.418 "base_bdevs_list": [ 00:18:10.418 { 00:18:10.418 
"name": "BaseBdev1", 00:18:10.418 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:10.418 "is_configured": false, 00:18:10.418 "data_offset": 0, 00:18:10.418 "data_size": 0 00:18:10.418 }, 00:18:10.418 { 00:18:10.418 "name": "BaseBdev2", 00:18:10.418 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:10.419 "is_configured": false, 00:18:10.419 "data_offset": 0, 00:18:10.419 "data_size": 0 00:18:10.419 }, 00:18:10.419 { 00:18:10.419 "name": "BaseBdev3", 00:18:10.419 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:10.419 "is_configured": false, 00:18:10.419 "data_offset": 0, 00:18:10.419 "data_size": 0 00:18:10.419 } 00:18:10.419 ] 00:18:10.419 }' 00:18:10.419 05:46:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:10.419 05:46:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:10.985 05:46:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:18:11.244 [2024-07-26 05:46:26.067427] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:18:11.244 [2024-07-26 05:46:26.067460] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x113fa80 name Existed_Raid, state configuring 00:18:11.244 05:46:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:18:11.502 [2024-07-26 05:46:26.308071] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:18:11.502 [2024-07-26 05:46:26.308101] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:18:11.502 [2024-07-26 05:46:26.308110] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 
00:18:11.502 [2024-07-26 05:46:26.308122] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:18:11.502 [2024-07-26 05:46:26.308130] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:18:11.502 [2024-07-26 05:46:26.308141] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:18:11.502 05:46:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:18:11.760 [2024-07-26 05:46:26.551830] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:18:11.760 BaseBdev1 00:18:11.760 05:46:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:18:11.760 05:46:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:18:11.760 05:46:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:11.760 05:46:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:18:11.760 05:46:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:11.760 05:46:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:18:11.760 05:46:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:12.019 05:46:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:18:12.277 [ 00:18:12.277 { 00:18:12.278 "name": "BaseBdev1", 00:18:12.278 "aliases": [ 00:18:12.278 "b4efe16c-e474-4765-906d-af11bfaebb09" 
00:18:12.278 ], 00:18:12.278 "product_name": "Malloc disk", 00:18:12.278 "block_size": 512, 00:18:12.278 "num_blocks": 65536, 00:18:12.278 "uuid": "b4efe16c-e474-4765-906d-af11bfaebb09", 00:18:12.278 "assigned_rate_limits": { 00:18:12.278 "rw_ios_per_sec": 0, 00:18:12.278 "rw_mbytes_per_sec": 0, 00:18:12.278 "r_mbytes_per_sec": 0, 00:18:12.278 "w_mbytes_per_sec": 0 00:18:12.278 }, 00:18:12.278 "claimed": true, 00:18:12.278 "claim_type": "exclusive_write", 00:18:12.278 "zoned": false, 00:18:12.278 "supported_io_types": { 00:18:12.278 "read": true, 00:18:12.278 "write": true, 00:18:12.278 "unmap": true, 00:18:12.278 "flush": true, 00:18:12.278 "reset": true, 00:18:12.278 "nvme_admin": false, 00:18:12.278 "nvme_io": false, 00:18:12.278 "nvme_io_md": false, 00:18:12.278 "write_zeroes": true, 00:18:12.278 "zcopy": true, 00:18:12.278 "get_zone_info": false, 00:18:12.278 "zone_management": false, 00:18:12.278 "zone_append": false, 00:18:12.278 "compare": false, 00:18:12.278 "compare_and_write": false, 00:18:12.278 "abort": true, 00:18:12.278 "seek_hole": false, 00:18:12.278 "seek_data": false, 00:18:12.278 "copy": true, 00:18:12.278 "nvme_iov_md": false 00:18:12.278 }, 00:18:12.278 "memory_domains": [ 00:18:12.278 { 00:18:12.278 "dma_device_id": "system", 00:18:12.278 "dma_device_type": 1 00:18:12.278 }, 00:18:12.278 { 00:18:12.278 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:12.278 "dma_device_type": 2 00:18:12.278 } 00:18:12.278 ], 00:18:12.278 "driver_specific": {} 00:18:12.278 } 00:18:12.278 ] 00:18:12.278 05:46:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:18:12.278 05:46:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:18:12.278 05:46:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:12.278 05:46:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local 
expected_state=configuring 00:18:12.278 05:46:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:12.278 05:46:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:12.278 05:46:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:18:12.278 05:46:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:12.278 05:46:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:12.278 05:46:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:12.278 05:46:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:12.278 05:46:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:12.278 05:46:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:12.536 05:46:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:12.536 "name": "Existed_Raid", 00:18:12.536 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:12.536 "strip_size_kb": 0, 00:18:12.536 "state": "configuring", 00:18:12.536 "raid_level": "raid1", 00:18:12.536 "superblock": false, 00:18:12.536 "num_base_bdevs": 3, 00:18:12.536 "num_base_bdevs_discovered": 1, 00:18:12.536 "num_base_bdevs_operational": 3, 00:18:12.536 "base_bdevs_list": [ 00:18:12.536 { 00:18:12.536 "name": "BaseBdev1", 00:18:12.536 "uuid": "b4efe16c-e474-4765-906d-af11bfaebb09", 00:18:12.536 "is_configured": true, 00:18:12.536 "data_offset": 0, 00:18:12.536 "data_size": 65536 00:18:12.536 }, 00:18:12.536 { 00:18:12.536 "name": "BaseBdev2", 00:18:12.536 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:12.536 "is_configured": 
false, 00:18:12.536 "data_offset": 0, 00:18:12.536 "data_size": 0 00:18:12.536 }, 00:18:12.536 { 00:18:12.536 "name": "BaseBdev3", 00:18:12.536 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:12.536 "is_configured": false, 00:18:12.536 "data_offset": 0, 00:18:12.536 "data_size": 0 00:18:12.536 } 00:18:12.536 ] 00:18:12.536 }' 00:18:12.536 05:46:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:12.536 05:46:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:13.103 05:46:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:18:13.362 [2024-07-26 05:46:28.039769] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:18:13.362 [2024-07-26 05:46:28.039808] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x113f310 name Existed_Raid, state configuring 00:18:13.362 05:46:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:18:13.362 [2024-07-26 05:46:28.216259] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:18:13.362 [2024-07-26 05:46:28.217696] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:18:13.362 [2024-07-26 05:46:28.217727] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:18:13.362 [2024-07-26 05:46:28.217737] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:18:13.362 [2024-07-26 05:46:28.217749] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:18:13.362 05:46:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 
-- # (( i = 1 )) 00:18:13.362 05:46:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:18:13.362 05:46:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:18:13.362 05:46:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:13.362 05:46:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:13.362 05:46:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:13.362 05:46:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:13.362 05:46:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:18:13.362 05:46:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:13.362 05:46:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:13.362 05:46:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:13.362 05:46:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:13.362 05:46:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:13.362 05:46:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:13.620 05:46:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:13.620 "name": "Existed_Raid", 00:18:13.620 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:13.620 "strip_size_kb": 0, 00:18:13.620 "state": "configuring", 00:18:13.620 "raid_level": "raid1", 00:18:13.620 "superblock": false, 00:18:13.620 "num_base_bdevs": 3, 
00:18:13.620 "num_base_bdevs_discovered": 1, 00:18:13.620 "num_base_bdevs_operational": 3, 00:18:13.620 "base_bdevs_list": [ 00:18:13.620 { 00:18:13.620 "name": "BaseBdev1", 00:18:13.620 "uuid": "b4efe16c-e474-4765-906d-af11bfaebb09", 00:18:13.620 "is_configured": true, 00:18:13.620 "data_offset": 0, 00:18:13.620 "data_size": 65536 00:18:13.620 }, 00:18:13.620 { 00:18:13.620 "name": "BaseBdev2", 00:18:13.620 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:13.620 "is_configured": false, 00:18:13.620 "data_offset": 0, 00:18:13.620 "data_size": 0 00:18:13.620 }, 00:18:13.620 { 00:18:13.620 "name": "BaseBdev3", 00:18:13.620 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:13.620 "is_configured": false, 00:18:13.620 "data_offset": 0, 00:18:13.620 "data_size": 0 00:18:13.620 } 00:18:13.620 ] 00:18:13.620 }' 00:18:13.620 05:46:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:13.621 05:46:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:14.188 05:46:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:18:14.446 [2024-07-26 05:46:29.246326] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:18:14.446 BaseBdev2 00:18:14.446 05:46:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:18:14.446 05:46:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:18:14.446 05:46:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:14.446 05:46:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:18:14.446 05:46:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:14.446 05:46:29 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:18:14.446 05:46:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:14.705 05:46:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:18:14.705 [ 00:18:14.705 { 00:18:14.705 "name": "BaseBdev2", 00:18:14.705 "aliases": [ 00:18:14.705 "6f499cea-33c3-4f92-a95f-ca3b03409908" 00:18:14.705 ], 00:18:14.705 "product_name": "Malloc disk", 00:18:14.705 "block_size": 512, 00:18:14.705 "num_blocks": 65536, 00:18:14.705 "uuid": "6f499cea-33c3-4f92-a95f-ca3b03409908", 00:18:14.705 "assigned_rate_limits": { 00:18:14.705 "rw_ios_per_sec": 0, 00:18:14.705 "rw_mbytes_per_sec": 0, 00:18:14.705 "r_mbytes_per_sec": 0, 00:18:14.705 "w_mbytes_per_sec": 0 00:18:14.705 }, 00:18:14.705 "claimed": true, 00:18:14.705 "claim_type": "exclusive_write", 00:18:14.705 "zoned": false, 00:18:14.705 "supported_io_types": { 00:18:14.705 "read": true, 00:18:14.705 "write": true, 00:18:14.705 "unmap": true, 00:18:14.705 "flush": true, 00:18:14.705 "reset": true, 00:18:14.705 "nvme_admin": false, 00:18:14.705 "nvme_io": false, 00:18:14.705 "nvme_io_md": false, 00:18:14.705 "write_zeroes": true, 00:18:14.705 "zcopy": true, 00:18:14.705 "get_zone_info": false, 00:18:14.705 "zone_management": false, 00:18:14.705 "zone_append": false, 00:18:14.705 "compare": false, 00:18:14.705 "compare_and_write": false, 00:18:14.705 "abort": true, 00:18:14.705 "seek_hole": false, 00:18:14.705 "seek_data": false, 00:18:14.705 "copy": true, 00:18:14.705 "nvme_iov_md": false 00:18:14.705 }, 00:18:14.705 "memory_domains": [ 00:18:14.705 { 00:18:14.705 "dma_device_id": "system", 00:18:14.705 "dma_device_type": 1 00:18:14.705 }, 00:18:14.705 { 
00:18:14.705 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:14.705 "dma_device_type": 2 00:18:14.705 } 00:18:14.705 ], 00:18:14.705 "driver_specific": {} 00:18:14.705 } 00:18:14.705 ] 00:18:14.705 05:46:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:18:14.705 05:46:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:18:14.705 05:46:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:18:14.705 05:46:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:18:14.705 05:46:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:14.705 05:46:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:14.705 05:46:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:14.705 05:46:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:14.705 05:46:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:18:14.705 05:46:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:14.705 05:46:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:14.705 05:46:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:14.705 05:46:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:14.963 05:46:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:14.963 05:46:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 
00:18:14.963 05:46:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:14.963 "name": "Existed_Raid", 00:18:14.963 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:14.963 "strip_size_kb": 0, 00:18:14.963 "state": "configuring", 00:18:14.963 "raid_level": "raid1", 00:18:14.963 "superblock": false, 00:18:14.963 "num_base_bdevs": 3, 00:18:14.963 "num_base_bdevs_discovered": 2, 00:18:14.963 "num_base_bdevs_operational": 3, 00:18:14.963 "base_bdevs_list": [ 00:18:14.963 { 00:18:14.963 "name": "BaseBdev1", 00:18:14.963 "uuid": "b4efe16c-e474-4765-906d-af11bfaebb09", 00:18:14.963 "is_configured": true, 00:18:14.963 "data_offset": 0, 00:18:14.963 "data_size": 65536 00:18:14.963 }, 00:18:14.963 { 00:18:14.963 "name": "BaseBdev2", 00:18:14.963 "uuid": "6f499cea-33c3-4f92-a95f-ca3b03409908", 00:18:14.963 "is_configured": true, 00:18:14.963 "data_offset": 0, 00:18:14.963 "data_size": 65536 00:18:14.963 }, 00:18:14.963 { 00:18:14.963 "name": "BaseBdev3", 00:18:14.963 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:14.963 "is_configured": false, 00:18:14.963 "data_offset": 0, 00:18:14.963 "data_size": 0 00:18:14.963 } 00:18:14.963 ] 00:18:14.963 }' 00:18:14.963 05:46:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:14.963 05:46:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:15.897 05:46:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:18:15.897 [2024-07-26 05:46:30.701570] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:18:15.897 [2024-07-26 05:46:30.701608] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1140400 00:18:15.897 [2024-07-26 05:46:30.701617] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 
00:18:15.897 [2024-07-26 05:46:30.701874] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x113fef0 00:18:15.897 [2024-07-26 05:46:30.701996] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1140400 00:18:15.897 [2024-07-26 05:46:30.702007] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1140400 00:18:15.897 [2024-07-26 05:46:30.702168] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:15.897 BaseBdev3 00:18:15.897 05:46:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:18:15.897 05:46:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:18:15.897 05:46:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:15.897 05:46:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:18:15.897 05:46:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:15.897 05:46:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:18:15.897 05:46:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:16.155 05:46:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:18:16.155 [ 00:18:16.155 { 00:18:16.155 "name": "BaseBdev3", 00:18:16.155 "aliases": [ 00:18:16.155 "da88df85-b0e6-4de2-85ff-17915fb61947" 00:18:16.155 ], 00:18:16.155 "product_name": "Malloc disk", 00:18:16.155 "block_size": 512, 00:18:16.155 "num_blocks": 65536, 00:18:16.155 "uuid": "da88df85-b0e6-4de2-85ff-17915fb61947", 00:18:16.155 "assigned_rate_limits": { 
00:18:16.155 "rw_ios_per_sec": 0, 00:18:16.155 "rw_mbytes_per_sec": 0, 00:18:16.155 "r_mbytes_per_sec": 0, 00:18:16.155 "w_mbytes_per_sec": 0 00:18:16.155 }, 00:18:16.155 "claimed": true, 00:18:16.155 "claim_type": "exclusive_write", 00:18:16.155 "zoned": false, 00:18:16.155 "supported_io_types": { 00:18:16.155 "read": true, 00:18:16.155 "write": true, 00:18:16.155 "unmap": true, 00:18:16.155 "flush": true, 00:18:16.155 "reset": true, 00:18:16.156 "nvme_admin": false, 00:18:16.156 "nvme_io": false, 00:18:16.156 "nvme_io_md": false, 00:18:16.156 "write_zeroes": true, 00:18:16.156 "zcopy": true, 00:18:16.156 "get_zone_info": false, 00:18:16.156 "zone_management": false, 00:18:16.156 "zone_append": false, 00:18:16.156 "compare": false, 00:18:16.156 "compare_and_write": false, 00:18:16.156 "abort": true, 00:18:16.156 "seek_hole": false, 00:18:16.156 "seek_data": false, 00:18:16.156 "copy": true, 00:18:16.156 "nvme_iov_md": false 00:18:16.156 }, 00:18:16.156 "memory_domains": [ 00:18:16.156 { 00:18:16.156 "dma_device_id": "system", 00:18:16.156 "dma_device_type": 1 00:18:16.156 }, 00:18:16.156 { 00:18:16.156 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:16.156 "dma_device_type": 2 00:18:16.156 } 00:18:16.156 ], 00:18:16.156 "driver_specific": {} 00:18:16.156 } 00:18:16.156 ] 00:18:16.414 05:46:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:18:16.414 05:46:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:18:16.414 05:46:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:18:16.414 05:46:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:18:16.414 05:46:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:16.414 05:46:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:16.414 
05:46:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:16.414 05:46:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:16.414 05:46:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:18:16.414 05:46:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:16.414 05:46:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:16.414 05:46:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:16.414 05:46:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:16.414 05:46:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:16.414 05:46:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:16.414 05:46:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:16.414 "name": "Existed_Raid", 00:18:16.414 "uuid": "b328a36f-5f5f-4780-8a9d-9f98867529be", 00:18:16.414 "strip_size_kb": 0, 00:18:16.414 "state": "online", 00:18:16.414 "raid_level": "raid1", 00:18:16.414 "superblock": false, 00:18:16.414 "num_base_bdevs": 3, 00:18:16.414 "num_base_bdevs_discovered": 3, 00:18:16.414 "num_base_bdevs_operational": 3, 00:18:16.414 "base_bdevs_list": [ 00:18:16.414 { 00:18:16.414 "name": "BaseBdev1", 00:18:16.414 "uuid": "b4efe16c-e474-4765-906d-af11bfaebb09", 00:18:16.414 "is_configured": true, 00:18:16.414 "data_offset": 0, 00:18:16.414 "data_size": 65536 00:18:16.414 }, 00:18:16.414 { 00:18:16.414 "name": "BaseBdev2", 00:18:16.414 "uuid": "6f499cea-33c3-4f92-a95f-ca3b03409908", 00:18:16.414 "is_configured": true, 00:18:16.414 "data_offset": 0, 
00:18:16.414 "data_size": 65536 00:18:16.414 }, 00:18:16.414 { 00:18:16.414 "name": "BaseBdev3", 00:18:16.414 "uuid": "da88df85-b0e6-4de2-85ff-17915fb61947", 00:18:16.414 "is_configured": true, 00:18:16.414 "data_offset": 0, 00:18:16.414 "data_size": 65536 00:18:16.414 } 00:18:16.414 ] 00:18:16.414 }' 00:18:16.414 05:46:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:16.414 05:46:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:17.348 05:46:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:18:17.348 05:46:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:18:17.348 05:46:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:18:17.348 05:46:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:18:17.348 05:46:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:18:17.348 05:46:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:18:17.348 05:46:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:18:17.348 05:46:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:18:17.348 [2024-07-26 05:46:32.141697] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:18:17.348 05:46:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:18:17.348 "name": "Existed_Raid", 00:18:17.348 "aliases": [ 00:18:17.348 "b328a36f-5f5f-4780-8a9d-9f98867529be" 00:18:17.348 ], 00:18:17.348 "product_name": "Raid Volume", 00:18:17.348 "block_size": 512, 00:18:17.348 "num_blocks": 65536, 00:18:17.348 "uuid": 
"b328a36f-5f5f-4780-8a9d-9f98867529be", 00:18:17.348 "assigned_rate_limits": { 00:18:17.348 "rw_ios_per_sec": 0, 00:18:17.348 "rw_mbytes_per_sec": 0, 00:18:17.348 "r_mbytes_per_sec": 0, 00:18:17.348 "w_mbytes_per_sec": 0 00:18:17.348 }, 00:18:17.348 "claimed": false, 00:18:17.348 "zoned": false, 00:18:17.348 "supported_io_types": { 00:18:17.348 "read": true, 00:18:17.348 "write": true, 00:18:17.348 "unmap": false, 00:18:17.349 "flush": false, 00:18:17.349 "reset": true, 00:18:17.349 "nvme_admin": false, 00:18:17.349 "nvme_io": false, 00:18:17.349 "nvme_io_md": false, 00:18:17.349 "write_zeroes": true, 00:18:17.349 "zcopy": false, 00:18:17.349 "get_zone_info": false, 00:18:17.349 "zone_management": false, 00:18:17.349 "zone_append": false, 00:18:17.349 "compare": false, 00:18:17.349 "compare_and_write": false, 00:18:17.349 "abort": false, 00:18:17.349 "seek_hole": false, 00:18:17.349 "seek_data": false, 00:18:17.349 "copy": false, 00:18:17.349 "nvme_iov_md": false 00:18:17.349 }, 00:18:17.349 "memory_domains": [ 00:18:17.349 { 00:18:17.349 "dma_device_id": "system", 00:18:17.349 "dma_device_type": 1 00:18:17.349 }, 00:18:17.349 { 00:18:17.349 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:17.349 "dma_device_type": 2 00:18:17.349 }, 00:18:17.349 { 00:18:17.349 "dma_device_id": "system", 00:18:17.349 "dma_device_type": 1 00:18:17.349 }, 00:18:17.349 { 00:18:17.349 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:17.349 "dma_device_type": 2 00:18:17.349 }, 00:18:17.349 { 00:18:17.349 "dma_device_id": "system", 00:18:17.349 "dma_device_type": 1 00:18:17.349 }, 00:18:17.349 { 00:18:17.349 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:17.349 "dma_device_type": 2 00:18:17.349 } 00:18:17.349 ], 00:18:17.349 "driver_specific": { 00:18:17.349 "raid": { 00:18:17.349 "uuid": "b328a36f-5f5f-4780-8a9d-9f98867529be", 00:18:17.349 "strip_size_kb": 0, 00:18:17.349 "state": "online", 00:18:17.349 "raid_level": "raid1", 00:18:17.349 "superblock": false, 00:18:17.349 
"num_base_bdevs": 3, 00:18:17.349 "num_base_bdevs_discovered": 3, 00:18:17.349 "num_base_bdevs_operational": 3, 00:18:17.349 "base_bdevs_list": [ 00:18:17.349 { 00:18:17.349 "name": "BaseBdev1", 00:18:17.349 "uuid": "b4efe16c-e474-4765-906d-af11bfaebb09", 00:18:17.349 "is_configured": true, 00:18:17.349 "data_offset": 0, 00:18:17.349 "data_size": 65536 00:18:17.349 }, 00:18:17.349 { 00:18:17.349 "name": "BaseBdev2", 00:18:17.349 "uuid": "6f499cea-33c3-4f92-a95f-ca3b03409908", 00:18:17.349 "is_configured": true, 00:18:17.349 "data_offset": 0, 00:18:17.349 "data_size": 65536 00:18:17.349 }, 00:18:17.349 { 00:18:17.349 "name": "BaseBdev3", 00:18:17.349 "uuid": "da88df85-b0e6-4de2-85ff-17915fb61947", 00:18:17.349 "is_configured": true, 00:18:17.349 "data_offset": 0, 00:18:17.349 "data_size": 65536 00:18:17.349 } 00:18:17.349 ] 00:18:17.349 } 00:18:17.349 } 00:18:17.349 }' 00:18:17.349 05:46:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:18:17.349 05:46:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:18:17.349 BaseBdev2 00:18:17.349 BaseBdev3' 00:18:17.349 05:46:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:17.349 05:46:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:18:17.349 05:46:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:17.606 05:46:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:17.606 "name": "BaseBdev1", 00:18:17.606 "aliases": [ 00:18:17.606 "b4efe16c-e474-4765-906d-af11bfaebb09" 00:18:17.606 ], 00:18:17.606 "product_name": "Malloc disk", 00:18:17.606 "block_size": 512, 00:18:17.606 "num_blocks": 65536, 00:18:17.606 "uuid": 
"b4efe16c-e474-4765-906d-af11bfaebb09", 00:18:17.606 "assigned_rate_limits": { 00:18:17.606 "rw_ios_per_sec": 0, 00:18:17.606 "rw_mbytes_per_sec": 0, 00:18:17.606 "r_mbytes_per_sec": 0, 00:18:17.606 "w_mbytes_per_sec": 0 00:18:17.606 }, 00:18:17.606 "claimed": true, 00:18:17.606 "claim_type": "exclusive_write", 00:18:17.606 "zoned": false, 00:18:17.606 "supported_io_types": { 00:18:17.606 "read": true, 00:18:17.606 "write": true, 00:18:17.606 "unmap": true, 00:18:17.606 "flush": true, 00:18:17.606 "reset": true, 00:18:17.606 "nvme_admin": false, 00:18:17.606 "nvme_io": false, 00:18:17.606 "nvme_io_md": false, 00:18:17.606 "write_zeroes": true, 00:18:17.606 "zcopy": true, 00:18:17.606 "get_zone_info": false, 00:18:17.606 "zone_management": false, 00:18:17.606 "zone_append": false, 00:18:17.606 "compare": false, 00:18:17.606 "compare_and_write": false, 00:18:17.606 "abort": true, 00:18:17.606 "seek_hole": false, 00:18:17.606 "seek_data": false, 00:18:17.606 "copy": true, 00:18:17.606 "nvme_iov_md": false 00:18:17.606 }, 00:18:17.606 "memory_domains": [ 00:18:17.606 { 00:18:17.606 "dma_device_id": "system", 00:18:17.606 "dma_device_type": 1 00:18:17.606 }, 00:18:17.606 { 00:18:17.607 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:17.607 "dma_device_type": 2 00:18:17.607 } 00:18:17.607 ], 00:18:17.607 "driver_specific": {} 00:18:17.607 }' 00:18:17.607 05:46:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:17.607 05:46:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:17.865 05:46:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:17.865 05:46:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:17.865 05:46:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:17.865 05:46:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:17.865 05:46:32 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:17.865 05:46:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:17.865 05:46:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:17.865 05:46:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:18.123 05:46:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:18.123 05:46:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:18.123 05:46:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:18.123 05:46:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:18.123 05:46:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:18:18.382 05:46:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:18.382 "name": "BaseBdev2", 00:18:18.382 "aliases": [ 00:18:18.382 "6f499cea-33c3-4f92-a95f-ca3b03409908" 00:18:18.382 ], 00:18:18.382 "product_name": "Malloc disk", 00:18:18.382 "block_size": 512, 00:18:18.382 "num_blocks": 65536, 00:18:18.382 "uuid": "6f499cea-33c3-4f92-a95f-ca3b03409908", 00:18:18.382 "assigned_rate_limits": { 00:18:18.382 "rw_ios_per_sec": 0, 00:18:18.382 "rw_mbytes_per_sec": 0, 00:18:18.382 "r_mbytes_per_sec": 0, 00:18:18.382 "w_mbytes_per_sec": 0 00:18:18.382 }, 00:18:18.382 "claimed": true, 00:18:18.382 "claim_type": "exclusive_write", 00:18:18.382 "zoned": false, 00:18:18.382 "supported_io_types": { 00:18:18.382 "read": true, 00:18:18.382 "write": true, 00:18:18.382 "unmap": true, 00:18:18.382 "flush": true, 00:18:18.382 "reset": true, 00:18:18.382 "nvme_admin": false, 00:18:18.382 "nvme_io": false, 00:18:18.382 "nvme_io_md": false, 
00:18:18.382 "write_zeroes": true, 00:18:18.382 "zcopy": true, 00:18:18.382 "get_zone_info": false, 00:18:18.382 "zone_management": false, 00:18:18.382 "zone_append": false, 00:18:18.382 "compare": false, 00:18:18.382 "compare_and_write": false, 00:18:18.382 "abort": true, 00:18:18.382 "seek_hole": false, 00:18:18.382 "seek_data": false, 00:18:18.382 "copy": true, 00:18:18.382 "nvme_iov_md": false 00:18:18.382 }, 00:18:18.382 "memory_domains": [ 00:18:18.382 { 00:18:18.382 "dma_device_id": "system", 00:18:18.382 "dma_device_type": 1 00:18:18.382 }, 00:18:18.382 { 00:18:18.382 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:18.382 "dma_device_type": 2 00:18:18.382 } 00:18:18.382 ], 00:18:18.382 "driver_specific": {} 00:18:18.382 }' 00:18:18.382 05:46:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:18.382 05:46:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:18.382 05:46:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:18.382 05:46:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:18.382 05:46:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:18.382 05:46:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:18.382 05:46:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:18.661 05:46:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:18.661 05:46:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:18.661 05:46:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:18.661 05:46:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:18.661 05:46:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:18.661 05:46:33 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:18.661 05:46:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:18:18.661 05:46:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:18.929 05:46:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:18.929 "name": "BaseBdev3", 00:18:18.929 "aliases": [ 00:18:18.929 "da88df85-b0e6-4de2-85ff-17915fb61947" 00:18:18.929 ], 00:18:18.929 "product_name": "Malloc disk", 00:18:18.929 "block_size": 512, 00:18:18.929 "num_blocks": 65536, 00:18:18.929 "uuid": "da88df85-b0e6-4de2-85ff-17915fb61947", 00:18:18.929 "assigned_rate_limits": { 00:18:18.929 "rw_ios_per_sec": 0, 00:18:18.929 "rw_mbytes_per_sec": 0, 00:18:18.929 "r_mbytes_per_sec": 0, 00:18:18.929 "w_mbytes_per_sec": 0 00:18:18.929 }, 00:18:18.929 "claimed": true, 00:18:18.929 "claim_type": "exclusive_write", 00:18:18.929 "zoned": false, 00:18:18.929 "supported_io_types": { 00:18:18.929 "read": true, 00:18:18.929 "write": true, 00:18:18.929 "unmap": true, 00:18:18.930 "flush": true, 00:18:18.930 "reset": true, 00:18:18.930 "nvme_admin": false, 00:18:18.930 "nvme_io": false, 00:18:18.930 "nvme_io_md": false, 00:18:18.930 "write_zeroes": true, 00:18:18.930 "zcopy": true, 00:18:18.930 "get_zone_info": false, 00:18:18.930 "zone_management": false, 00:18:18.930 "zone_append": false, 00:18:18.930 "compare": false, 00:18:18.930 "compare_and_write": false, 00:18:18.930 "abort": true, 00:18:18.930 "seek_hole": false, 00:18:18.930 "seek_data": false, 00:18:18.930 "copy": true, 00:18:18.930 "nvme_iov_md": false 00:18:18.930 }, 00:18:18.930 "memory_domains": [ 00:18:18.930 { 00:18:18.930 "dma_device_id": "system", 00:18:18.930 "dma_device_type": 1 00:18:18.930 }, 00:18:18.930 { 00:18:18.930 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:18:18.930 "dma_device_type": 2 00:18:18.930 } 00:18:18.930 ], 00:18:18.930 "driver_specific": {} 00:18:18.930 }' 00:18:18.930 05:46:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:18.930 05:46:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:18.930 05:46:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:18.930 05:46:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:18.930 05:46:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:19.188 05:46:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:19.188 05:46:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:19.188 05:46:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:19.188 05:46:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:19.188 05:46:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:19.188 05:46:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:19.188 05:46:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:19.188 05:46:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:18:19.446 [2024-07-26 05:46:34.235009] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:18:19.446 05:46:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:18:19.446 05:46:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:18:19.446 05:46:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 
in 00:18:19.446 05:46:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@214 -- # return 0 00:18:19.446 05:46:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:18:19.446 05:46:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:18:19.446 05:46:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:19.446 05:46:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:19.446 05:46:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:19.446 05:46:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:19.446 05:46:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:18:19.446 05:46:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:19.446 05:46:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:19.446 05:46:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:19.446 05:46:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:19.446 05:46:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:19.446 05:46:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:19.704 05:46:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:19.704 "name": "Existed_Raid", 00:18:19.704 "uuid": "b328a36f-5f5f-4780-8a9d-9f98867529be", 00:18:19.704 "strip_size_kb": 0, 00:18:19.704 "state": "online", 00:18:19.704 "raid_level": "raid1", 
00:18:19.704 "superblock": false, 00:18:19.704 "num_base_bdevs": 3, 00:18:19.704 "num_base_bdevs_discovered": 2, 00:18:19.704 "num_base_bdevs_operational": 2, 00:18:19.704 "base_bdevs_list": [ 00:18:19.704 { 00:18:19.704 "name": null, 00:18:19.704 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:19.704 "is_configured": false, 00:18:19.704 "data_offset": 0, 00:18:19.704 "data_size": 65536 00:18:19.704 }, 00:18:19.704 { 00:18:19.704 "name": "BaseBdev2", 00:18:19.704 "uuid": "6f499cea-33c3-4f92-a95f-ca3b03409908", 00:18:19.704 "is_configured": true, 00:18:19.704 "data_offset": 0, 00:18:19.704 "data_size": 65536 00:18:19.704 }, 00:18:19.704 { 00:18:19.704 "name": "BaseBdev3", 00:18:19.704 "uuid": "da88df85-b0e6-4de2-85ff-17915fb61947", 00:18:19.704 "is_configured": true, 00:18:19.704 "data_offset": 0, 00:18:19.704 "data_size": 65536 00:18:19.704 } 00:18:19.704 ] 00:18:19.704 }' 00:18:19.704 05:46:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:19.704 05:46:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:20.271 05:46:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:18:20.271 05:46:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:18:20.271 05:46:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:20.271 05:46:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:18:20.529 05:46:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:18:20.529 05:46:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:18:20.529 05:46:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:18:21.095 [2024-07-26 05:46:35.812183] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:18:21.095 05:46:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:18:21.095 05:46:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:18:21.095 05:46:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:21.095 05:46:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:18:21.354 05:46:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:18:21.354 05:46:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:18:21.354 05:46:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:18:21.924 [2024-07-26 05:46:36.576786] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:18:21.924 [2024-07-26 05:46:36.576873] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:18:21.924 [2024-07-26 05:46:36.589591] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:18:21.924 [2024-07-26 05:46:36.589626] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:18:21.924 [2024-07-26 05:46:36.589645] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1140400 name Existed_Raid, state offline 00:18:21.924 05:46:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:18:21.924 05:46:36 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:18:21.924 05:46:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:21.924 05:46:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:18:22.250 05:46:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:18:22.250 05:46:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:18:22.250 05:46:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:18:22.250 05:46:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:18:22.250 05:46:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:18:22.250 05:46:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:18:22.509 BaseBdev2 00:18:22.509 05:46:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:18:22.509 05:46:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:18:22.509 05:46:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:22.509 05:46:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:18:22.509 05:46:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:22.509 05:46:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:18:22.509 05:46:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:22.767 05:46:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:18:23.333 [ 00:18:23.333 { 00:18:23.333 "name": "BaseBdev2", 00:18:23.333 "aliases": [ 00:18:23.333 "3381a1ba-6558-4e0b-8f02-43805cae4d38" 00:18:23.333 ], 00:18:23.333 "product_name": "Malloc disk", 00:18:23.333 "block_size": 512, 00:18:23.333 "num_blocks": 65536, 00:18:23.333 "uuid": "3381a1ba-6558-4e0b-8f02-43805cae4d38", 00:18:23.333 "assigned_rate_limits": { 00:18:23.333 "rw_ios_per_sec": 0, 00:18:23.333 "rw_mbytes_per_sec": 0, 00:18:23.333 "r_mbytes_per_sec": 0, 00:18:23.333 "w_mbytes_per_sec": 0 00:18:23.333 }, 00:18:23.333 "claimed": false, 00:18:23.333 "zoned": false, 00:18:23.333 "supported_io_types": { 00:18:23.333 "read": true, 00:18:23.333 "write": true, 00:18:23.333 "unmap": true, 00:18:23.333 "flush": true, 00:18:23.333 "reset": true, 00:18:23.333 "nvme_admin": false, 00:18:23.333 "nvme_io": false, 00:18:23.333 "nvme_io_md": false, 00:18:23.333 "write_zeroes": true, 00:18:23.333 "zcopy": true, 00:18:23.333 "get_zone_info": false, 00:18:23.333 "zone_management": false, 00:18:23.333 "zone_append": false, 00:18:23.333 "compare": false, 00:18:23.333 "compare_and_write": false, 00:18:23.333 "abort": true, 00:18:23.333 "seek_hole": false, 00:18:23.333 "seek_data": false, 00:18:23.333 "copy": true, 00:18:23.333 "nvme_iov_md": false 00:18:23.333 }, 00:18:23.333 "memory_domains": [ 00:18:23.333 { 00:18:23.333 "dma_device_id": "system", 00:18:23.333 "dma_device_type": 1 00:18:23.333 }, 00:18:23.333 { 00:18:23.333 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:23.333 "dma_device_type": 2 00:18:23.333 } 00:18:23.333 ], 00:18:23.333 "driver_specific": {} 00:18:23.333 } 00:18:23.333 ] 00:18:23.333 05:46:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:18:23.333 
05:46:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:18:23.333 05:46:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:18:23.333 05:46:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:18:23.592 BaseBdev3 00:18:23.592 05:46:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:18:23.592 05:46:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:18:23.592 05:46:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:23.592 05:46:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:18:23.592 05:46:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:23.592 05:46:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:18:23.592 05:46:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:24.159 05:46:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:18:24.418 [ 00:18:24.418 { 00:18:24.418 "name": "BaseBdev3", 00:18:24.418 "aliases": [ 00:18:24.418 "8974b646-345b-4af8-86db-8c3465523b23" 00:18:24.418 ], 00:18:24.418 "product_name": "Malloc disk", 00:18:24.418 "block_size": 512, 00:18:24.418 "num_blocks": 65536, 00:18:24.418 "uuid": "8974b646-345b-4af8-86db-8c3465523b23", 00:18:24.418 "assigned_rate_limits": { 00:18:24.418 "rw_ios_per_sec": 0, 00:18:24.418 "rw_mbytes_per_sec": 0, 00:18:24.418 
"r_mbytes_per_sec": 0, 00:18:24.418 "w_mbytes_per_sec": 0 00:18:24.418 }, 00:18:24.418 "claimed": false, 00:18:24.418 "zoned": false, 00:18:24.418 "supported_io_types": { 00:18:24.418 "read": true, 00:18:24.418 "write": true, 00:18:24.418 "unmap": true, 00:18:24.418 "flush": true, 00:18:24.418 "reset": true, 00:18:24.418 "nvme_admin": false, 00:18:24.418 "nvme_io": false, 00:18:24.418 "nvme_io_md": false, 00:18:24.418 "write_zeroes": true, 00:18:24.418 "zcopy": true, 00:18:24.418 "get_zone_info": false, 00:18:24.418 "zone_management": false, 00:18:24.418 "zone_append": false, 00:18:24.418 "compare": false, 00:18:24.418 "compare_and_write": false, 00:18:24.418 "abort": true, 00:18:24.418 "seek_hole": false, 00:18:24.418 "seek_data": false, 00:18:24.418 "copy": true, 00:18:24.418 "nvme_iov_md": false 00:18:24.418 }, 00:18:24.418 "memory_domains": [ 00:18:24.418 { 00:18:24.418 "dma_device_id": "system", 00:18:24.418 "dma_device_type": 1 00:18:24.418 }, 00:18:24.418 { 00:18:24.418 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:24.418 "dma_device_type": 2 00:18:24.418 } 00:18:24.418 ], 00:18:24.418 "driver_specific": {} 00:18:24.418 } 00:18:24.418 ] 00:18:24.418 05:46:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:18:24.418 05:46:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:18:24.418 05:46:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:18:24.418 05:46:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:18:24.985 [2024-07-26 05:46:39.623166] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:18:24.985 [2024-07-26 05:46:39.623205] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 
00:18:24.985 [2024-07-26 05:46:39.623224] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:18:24.985 [2024-07-26 05:46:39.624538] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:18:24.985 05:46:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:18:24.985 05:46:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:24.985 05:46:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:24.985 05:46:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:24.985 05:46:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:24.985 05:46:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:18:24.985 05:46:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:24.985 05:46:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:24.985 05:46:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:24.985 05:46:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:24.985 05:46:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:24.985 05:46:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:25.243 05:46:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:25.243 "name": "Existed_Raid", 00:18:25.243 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:25.243 "strip_size_kb": 0, 00:18:25.243 "state": 
"configuring", 00:18:25.243 "raid_level": "raid1", 00:18:25.243 "superblock": false, 00:18:25.243 "num_base_bdevs": 3, 00:18:25.243 "num_base_bdevs_discovered": 2, 00:18:25.243 "num_base_bdevs_operational": 3, 00:18:25.243 "base_bdevs_list": [ 00:18:25.243 { 00:18:25.243 "name": "BaseBdev1", 00:18:25.243 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:25.243 "is_configured": false, 00:18:25.243 "data_offset": 0, 00:18:25.243 "data_size": 0 00:18:25.243 }, 00:18:25.243 { 00:18:25.243 "name": "BaseBdev2", 00:18:25.243 "uuid": "3381a1ba-6558-4e0b-8f02-43805cae4d38", 00:18:25.243 "is_configured": true, 00:18:25.243 "data_offset": 0, 00:18:25.243 "data_size": 65536 00:18:25.243 }, 00:18:25.243 { 00:18:25.243 "name": "BaseBdev3", 00:18:25.243 "uuid": "8974b646-345b-4af8-86db-8c3465523b23", 00:18:25.243 "is_configured": true, 00:18:25.243 "data_offset": 0, 00:18:25.243 "data_size": 65536 00:18:25.243 } 00:18:25.243 ] 00:18:25.243 }' 00:18:25.243 05:46:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:25.243 05:46:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:25.810 05:46:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:18:25.810 [2024-07-26 05:46:40.661905] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:18:25.810 05:46:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:18:25.810 05:46:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:25.810 05:46:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:25.810 05:46:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:25.810 05:46:40 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:25.810 05:46:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:18:25.810 05:46:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:25.810 05:46:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:25.810 05:46:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:25.810 05:46:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:25.810 05:46:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:25.810 05:46:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:26.068 05:46:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:26.068 "name": "Existed_Raid", 00:18:26.068 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:26.068 "strip_size_kb": 0, 00:18:26.068 "state": "configuring", 00:18:26.068 "raid_level": "raid1", 00:18:26.068 "superblock": false, 00:18:26.068 "num_base_bdevs": 3, 00:18:26.068 "num_base_bdevs_discovered": 1, 00:18:26.068 "num_base_bdevs_operational": 3, 00:18:26.068 "base_bdevs_list": [ 00:18:26.068 { 00:18:26.068 "name": "BaseBdev1", 00:18:26.068 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:26.068 "is_configured": false, 00:18:26.068 "data_offset": 0, 00:18:26.068 "data_size": 0 00:18:26.068 }, 00:18:26.068 { 00:18:26.068 "name": null, 00:18:26.068 "uuid": "3381a1ba-6558-4e0b-8f02-43805cae4d38", 00:18:26.068 "is_configured": false, 00:18:26.068 "data_offset": 0, 00:18:26.068 "data_size": 65536 00:18:26.068 }, 00:18:26.068 { 00:18:26.068 "name": "BaseBdev3", 00:18:26.068 "uuid": 
"8974b646-345b-4af8-86db-8c3465523b23", 00:18:26.068 "is_configured": true, 00:18:26.068 "data_offset": 0, 00:18:26.068 "data_size": 65536 00:18:26.068 } 00:18:26.068 ] 00:18:26.068 }' 00:18:26.068 05:46:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:26.068 05:46:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:26.634 05:46:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:26.634 05:46:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:18:26.892 05:46:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:18:26.892 05:46:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:18:27.151 [2024-07-26 05:46:41.936666] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:18:27.151 BaseBdev1 00:18:27.151 05:46:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:18:27.151 05:46:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:18:27.151 05:46:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:27.151 05:46:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:18:27.151 05:46:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:27.151 05:46:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:18:27.151 05:46:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:27.409 05:46:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:18:27.667 [ 00:18:27.667 { 00:18:27.667 "name": "BaseBdev1", 00:18:27.667 "aliases": [ 00:18:27.667 "e3f290ae-785c-4f6d-97a3-831b33a2e21e" 00:18:27.667 ], 00:18:27.667 "product_name": "Malloc disk", 00:18:27.667 "block_size": 512, 00:18:27.667 "num_blocks": 65536, 00:18:27.667 "uuid": "e3f290ae-785c-4f6d-97a3-831b33a2e21e", 00:18:27.667 "assigned_rate_limits": { 00:18:27.667 "rw_ios_per_sec": 0, 00:18:27.667 "rw_mbytes_per_sec": 0, 00:18:27.667 "r_mbytes_per_sec": 0, 00:18:27.667 "w_mbytes_per_sec": 0 00:18:27.667 }, 00:18:27.667 "claimed": true, 00:18:27.667 "claim_type": "exclusive_write", 00:18:27.667 "zoned": false, 00:18:27.667 "supported_io_types": { 00:18:27.667 "read": true, 00:18:27.667 "write": true, 00:18:27.667 "unmap": true, 00:18:27.667 "flush": true, 00:18:27.667 "reset": true, 00:18:27.667 "nvme_admin": false, 00:18:27.667 "nvme_io": false, 00:18:27.667 "nvme_io_md": false, 00:18:27.667 "write_zeroes": true, 00:18:27.667 "zcopy": true, 00:18:27.667 "get_zone_info": false, 00:18:27.667 "zone_management": false, 00:18:27.667 "zone_append": false, 00:18:27.667 "compare": false, 00:18:27.667 "compare_and_write": false, 00:18:27.667 "abort": true, 00:18:27.667 "seek_hole": false, 00:18:27.667 "seek_data": false, 00:18:27.667 "copy": true, 00:18:27.667 "nvme_iov_md": false 00:18:27.667 }, 00:18:27.667 "memory_domains": [ 00:18:27.667 { 00:18:27.667 "dma_device_id": "system", 00:18:27.667 "dma_device_type": 1 00:18:27.667 }, 00:18:27.667 { 00:18:27.667 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:27.667 "dma_device_type": 2 00:18:27.667 } 00:18:27.667 ], 00:18:27.667 "driver_specific": {} 00:18:27.667 } 00:18:27.667 ] 
00:18:27.667 05:46:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:18:27.667 05:46:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:18:27.667 05:46:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:27.667 05:46:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:27.668 05:46:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:27.668 05:46:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:27.668 05:46:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:18:27.668 05:46:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:27.668 05:46:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:27.668 05:46:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:27.668 05:46:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:27.668 05:46:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:27.668 05:46:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:27.926 05:46:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:27.926 "name": "Existed_Raid", 00:18:27.926 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:27.926 "strip_size_kb": 0, 00:18:27.926 "state": "configuring", 00:18:27.926 "raid_level": "raid1", 00:18:27.926 "superblock": false, 00:18:27.926 "num_base_bdevs": 3, 00:18:27.926 
"num_base_bdevs_discovered": 2, 00:18:27.926 "num_base_bdevs_operational": 3, 00:18:27.926 "base_bdevs_list": [ 00:18:27.926 { 00:18:27.926 "name": "BaseBdev1", 00:18:27.926 "uuid": "e3f290ae-785c-4f6d-97a3-831b33a2e21e", 00:18:27.926 "is_configured": true, 00:18:27.926 "data_offset": 0, 00:18:27.926 "data_size": 65536 00:18:27.926 }, 00:18:27.926 { 00:18:27.926 "name": null, 00:18:27.926 "uuid": "3381a1ba-6558-4e0b-8f02-43805cae4d38", 00:18:27.926 "is_configured": false, 00:18:27.926 "data_offset": 0, 00:18:27.926 "data_size": 65536 00:18:27.926 }, 00:18:27.926 { 00:18:27.926 "name": "BaseBdev3", 00:18:27.926 "uuid": "8974b646-345b-4af8-86db-8c3465523b23", 00:18:27.926 "is_configured": true, 00:18:27.926 "data_offset": 0, 00:18:27.926 "data_size": 65536 00:18:27.926 } 00:18:27.926 ] 00:18:27.926 }' 00:18:27.926 05:46:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:27.926 05:46:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:28.493 05:46:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:28.493 05:46:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:18:28.751 05:46:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:18:28.751 05:46:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:18:29.010 [2024-07-26 05:46:43.773559] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:18:29.010 05:46:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:18:29.010 05:46:43 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:29.010 05:46:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:29.010 05:46:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:29.010 05:46:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:29.010 05:46:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:18:29.010 05:46:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:29.010 05:46:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:29.010 05:46:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:29.010 05:46:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:29.010 05:46:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:29.010 05:46:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:29.268 05:46:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:29.268 "name": "Existed_Raid", 00:18:29.268 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:29.268 "strip_size_kb": 0, 00:18:29.268 "state": "configuring", 00:18:29.268 "raid_level": "raid1", 00:18:29.268 "superblock": false, 00:18:29.268 "num_base_bdevs": 3, 00:18:29.268 "num_base_bdevs_discovered": 1, 00:18:29.268 "num_base_bdevs_operational": 3, 00:18:29.268 "base_bdevs_list": [ 00:18:29.268 { 00:18:29.268 "name": "BaseBdev1", 00:18:29.268 "uuid": "e3f290ae-785c-4f6d-97a3-831b33a2e21e", 00:18:29.268 "is_configured": true, 00:18:29.268 "data_offset": 0, 00:18:29.268 "data_size": 65536 
00:18:29.268 }, 00:18:29.268 { 00:18:29.268 "name": null, 00:18:29.268 "uuid": "3381a1ba-6558-4e0b-8f02-43805cae4d38", 00:18:29.268 "is_configured": false, 00:18:29.268 "data_offset": 0, 00:18:29.268 "data_size": 65536 00:18:29.268 }, 00:18:29.268 { 00:18:29.268 "name": null, 00:18:29.268 "uuid": "8974b646-345b-4af8-86db-8c3465523b23", 00:18:29.268 "is_configured": false, 00:18:29.268 "data_offset": 0, 00:18:29.268 "data_size": 65536 00:18:29.268 } 00:18:29.268 ] 00:18:29.268 }' 00:18:29.268 05:46:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:29.268 05:46:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:29.835 05:46:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:29.835 05:46:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:18:30.094 05:46:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:18:30.094 05:46:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:18:30.353 [2024-07-26 05:46:45.109119] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:18:30.353 05:46:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:18:30.353 05:46:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:30.353 05:46:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:30.353 05:46:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:30.353 05:46:45 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:30.353 05:46:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:18:30.353 05:46:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:30.353 05:46:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:30.353 05:46:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:30.353 05:46:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:30.353 05:46:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:30.353 05:46:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:30.611 05:46:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:30.611 "name": "Existed_Raid", 00:18:30.611 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:30.611 "strip_size_kb": 0, 00:18:30.611 "state": "configuring", 00:18:30.611 "raid_level": "raid1", 00:18:30.612 "superblock": false, 00:18:30.612 "num_base_bdevs": 3, 00:18:30.612 "num_base_bdevs_discovered": 2, 00:18:30.612 "num_base_bdevs_operational": 3, 00:18:30.612 "base_bdevs_list": [ 00:18:30.612 { 00:18:30.612 "name": "BaseBdev1", 00:18:30.612 "uuid": "e3f290ae-785c-4f6d-97a3-831b33a2e21e", 00:18:30.612 "is_configured": true, 00:18:30.612 "data_offset": 0, 00:18:30.612 "data_size": 65536 00:18:30.612 }, 00:18:30.612 { 00:18:30.612 "name": null, 00:18:30.612 "uuid": "3381a1ba-6558-4e0b-8f02-43805cae4d38", 00:18:30.612 "is_configured": false, 00:18:30.612 "data_offset": 0, 00:18:30.612 "data_size": 65536 00:18:30.612 }, 00:18:30.612 { 00:18:30.612 "name": "BaseBdev3", 00:18:30.612 "uuid": 
"8974b646-345b-4af8-86db-8c3465523b23", 00:18:30.612 "is_configured": true, 00:18:30.612 "data_offset": 0, 00:18:30.612 "data_size": 65536 00:18:30.612 } 00:18:30.612 ] 00:18:30.612 }' 00:18:30.612 05:46:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:30.612 05:46:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:31.177 05:46:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:18:31.177 05:46:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:31.434 05:46:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:18:31.434 05:46:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:18:31.692 [2024-07-26 05:46:46.452715] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:18:31.692 05:46:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:18:31.692 05:46:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:31.692 05:46:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:31.692 05:46:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:31.692 05:46:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:31.692 05:46:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:18:31.692 05:46:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:31.693 05:46:46 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:31.693 05:46:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:31.693 05:46:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:31.693 05:46:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:31.693 05:46:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:31.951 05:46:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:31.951 "name": "Existed_Raid", 00:18:31.951 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:31.951 "strip_size_kb": 0, 00:18:31.951 "state": "configuring", 00:18:31.951 "raid_level": "raid1", 00:18:31.951 "superblock": false, 00:18:31.951 "num_base_bdevs": 3, 00:18:31.951 "num_base_bdevs_discovered": 1, 00:18:31.951 "num_base_bdevs_operational": 3, 00:18:31.951 "base_bdevs_list": [ 00:18:31.951 { 00:18:31.951 "name": null, 00:18:31.951 "uuid": "e3f290ae-785c-4f6d-97a3-831b33a2e21e", 00:18:31.951 "is_configured": false, 00:18:31.951 "data_offset": 0, 00:18:31.951 "data_size": 65536 00:18:31.951 }, 00:18:31.951 { 00:18:31.951 "name": null, 00:18:31.951 "uuid": "3381a1ba-6558-4e0b-8f02-43805cae4d38", 00:18:31.951 "is_configured": false, 00:18:31.951 "data_offset": 0, 00:18:31.951 "data_size": 65536 00:18:31.951 }, 00:18:31.951 { 00:18:31.951 "name": "BaseBdev3", 00:18:31.951 "uuid": "8974b646-345b-4af8-86db-8c3465523b23", 00:18:31.951 "is_configured": true, 00:18:31.951 "data_offset": 0, 00:18:31.951 "data_size": 65536 00:18:31.951 } 00:18:31.951 ] 00:18:31.951 }' 00:18:31.951 05:46:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:31.951 05:46:46 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@10 -- # set +x 00:18:32.517 05:46:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:32.517 05:46:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:18:32.776 05:46:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:18:32.776 05:46:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:18:33.035 [2024-07-26 05:46:47.800565] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:18:33.035 05:46:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:18:33.035 05:46:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:33.035 05:46:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:33.035 05:46:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:33.035 05:46:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:33.035 05:46:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:18:33.035 05:46:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:33.035 05:46:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:33.035 05:46:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:33.035 05:46:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 
00:18:33.035 05:46:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:33.035 05:46:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:33.293 05:46:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:33.293 "name": "Existed_Raid", 00:18:33.293 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:33.293 "strip_size_kb": 0, 00:18:33.293 "state": "configuring", 00:18:33.293 "raid_level": "raid1", 00:18:33.293 "superblock": false, 00:18:33.294 "num_base_bdevs": 3, 00:18:33.294 "num_base_bdevs_discovered": 2, 00:18:33.294 "num_base_bdevs_operational": 3, 00:18:33.294 "base_bdevs_list": [ 00:18:33.294 { 00:18:33.294 "name": null, 00:18:33.294 "uuid": "e3f290ae-785c-4f6d-97a3-831b33a2e21e", 00:18:33.294 "is_configured": false, 00:18:33.294 "data_offset": 0, 00:18:33.294 "data_size": 65536 00:18:33.294 }, 00:18:33.294 { 00:18:33.294 "name": "BaseBdev2", 00:18:33.294 "uuid": "3381a1ba-6558-4e0b-8f02-43805cae4d38", 00:18:33.294 "is_configured": true, 00:18:33.294 "data_offset": 0, 00:18:33.294 "data_size": 65536 00:18:33.294 }, 00:18:33.294 { 00:18:33.294 "name": "BaseBdev3", 00:18:33.294 "uuid": "8974b646-345b-4af8-86db-8c3465523b23", 00:18:33.294 "is_configured": true, 00:18:33.294 "data_offset": 0, 00:18:33.294 "data_size": 65536 00:18:33.294 } 00:18:33.294 ] 00:18:33.294 }' 00:18:33.294 05:46:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:33.294 05:46:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:33.861 05:46:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:33.861 05:46:48 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:18:34.119 05:46:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:18:34.119 05:46:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:34.119 05:46:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:18:34.377 05:46:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u e3f290ae-785c-4f6d-97a3-831b33a2e21e 00:18:34.636 [2024-07-26 05:46:49.356017] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:18:34.636 [2024-07-26 05:46:49.356057] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1143e40 00:18:34.636 [2024-07-26 05:46:49.356066] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:18:34.636 [2024-07-26 05:46:49.356254] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1140e60 00:18:34.636 [2024-07-26 05:46:49.356373] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1143e40 00:18:34.636 [2024-07-26 05:46:49.356383] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1143e40 00:18:34.636 [2024-07-26 05:46:49.356545] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:34.636 NewBaseBdev 00:18:34.636 05:46:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:18:34.636 05:46:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:18:34.636 05:46:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- 
# local bdev_timeout= 00:18:34.636 05:46:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:18:34.636 05:46:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:34.636 05:46:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:18:34.636 05:46:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:34.894 05:46:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:18:35.152 [ 00:18:35.152 { 00:18:35.152 "name": "NewBaseBdev", 00:18:35.152 "aliases": [ 00:18:35.152 "e3f290ae-785c-4f6d-97a3-831b33a2e21e" 00:18:35.152 ], 00:18:35.152 "product_name": "Malloc disk", 00:18:35.152 "block_size": 512, 00:18:35.152 "num_blocks": 65536, 00:18:35.152 "uuid": "e3f290ae-785c-4f6d-97a3-831b33a2e21e", 00:18:35.152 "assigned_rate_limits": { 00:18:35.152 "rw_ios_per_sec": 0, 00:18:35.152 "rw_mbytes_per_sec": 0, 00:18:35.152 "r_mbytes_per_sec": 0, 00:18:35.152 "w_mbytes_per_sec": 0 00:18:35.152 }, 00:18:35.152 "claimed": true, 00:18:35.152 "claim_type": "exclusive_write", 00:18:35.152 "zoned": false, 00:18:35.152 "supported_io_types": { 00:18:35.152 "read": true, 00:18:35.152 "write": true, 00:18:35.152 "unmap": true, 00:18:35.152 "flush": true, 00:18:35.152 "reset": true, 00:18:35.152 "nvme_admin": false, 00:18:35.152 "nvme_io": false, 00:18:35.152 "nvme_io_md": false, 00:18:35.152 "write_zeroes": true, 00:18:35.152 "zcopy": true, 00:18:35.152 "get_zone_info": false, 00:18:35.152 "zone_management": false, 00:18:35.152 "zone_append": false, 00:18:35.152 "compare": false, 00:18:35.152 "compare_and_write": false, 00:18:35.152 "abort": true, 00:18:35.152 "seek_hole": false, 
00:18:35.152 "seek_data": false, 00:18:35.152 "copy": true, 00:18:35.152 "nvme_iov_md": false 00:18:35.152 }, 00:18:35.152 "memory_domains": [ 00:18:35.152 { 00:18:35.152 "dma_device_id": "system", 00:18:35.152 "dma_device_type": 1 00:18:35.152 }, 00:18:35.152 { 00:18:35.152 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:35.152 "dma_device_type": 2 00:18:35.152 } 00:18:35.152 ], 00:18:35.152 "driver_specific": {} 00:18:35.152 } 00:18:35.152 ] 00:18:35.152 05:46:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:18:35.152 05:46:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:18:35.152 05:46:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:35.152 05:46:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:35.152 05:46:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:35.152 05:46:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:35.152 05:46:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:18:35.152 05:46:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:35.152 05:46:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:35.152 05:46:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:35.152 05:46:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:35.152 05:46:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:35.152 05:46:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | 
select(.name == "Existed_Raid")' 00:18:35.410 05:46:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:35.410 "name": "Existed_Raid", 00:18:35.410 "uuid": "ef779bfc-d221-41d3-9366-ba1af717bb53", 00:18:35.410 "strip_size_kb": 0, 00:18:35.410 "state": "online", 00:18:35.410 "raid_level": "raid1", 00:18:35.410 "superblock": false, 00:18:35.410 "num_base_bdevs": 3, 00:18:35.410 "num_base_bdevs_discovered": 3, 00:18:35.410 "num_base_bdevs_operational": 3, 00:18:35.410 "base_bdevs_list": [ 00:18:35.410 { 00:18:35.410 "name": "NewBaseBdev", 00:18:35.410 "uuid": "e3f290ae-785c-4f6d-97a3-831b33a2e21e", 00:18:35.410 "is_configured": true, 00:18:35.410 "data_offset": 0, 00:18:35.410 "data_size": 65536 00:18:35.410 }, 00:18:35.410 { 00:18:35.410 "name": "BaseBdev2", 00:18:35.410 "uuid": "3381a1ba-6558-4e0b-8f02-43805cae4d38", 00:18:35.410 "is_configured": true, 00:18:35.410 "data_offset": 0, 00:18:35.410 "data_size": 65536 00:18:35.410 }, 00:18:35.410 { 00:18:35.410 "name": "BaseBdev3", 00:18:35.410 "uuid": "8974b646-345b-4af8-86db-8c3465523b23", 00:18:35.410 "is_configured": true, 00:18:35.410 "data_offset": 0, 00:18:35.410 "data_size": 65536 00:18:35.410 } 00:18:35.410 ] 00:18:35.410 }' 00:18:35.410 05:46:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:35.410 05:46:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:35.976 05:46:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:18:35.976 05:46:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:18:35.976 05:46:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:18:35.976 05:46:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:18:35.976 05:46:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local 
base_bdev_names 00:18:35.976 05:46:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:18:35.976 05:46:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:18:35.976 05:46:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:18:36.233 [2024-07-26 05:46:50.968671] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:18:36.233 05:46:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:18:36.233 "name": "Existed_Raid", 00:18:36.233 "aliases": [ 00:18:36.233 "ef779bfc-d221-41d3-9366-ba1af717bb53" 00:18:36.233 ], 00:18:36.233 "product_name": "Raid Volume", 00:18:36.233 "block_size": 512, 00:18:36.233 "num_blocks": 65536, 00:18:36.233 "uuid": "ef779bfc-d221-41d3-9366-ba1af717bb53", 00:18:36.233 "assigned_rate_limits": { 00:18:36.233 "rw_ios_per_sec": 0, 00:18:36.233 "rw_mbytes_per_sec": 0, 00:18:36.233 "r_mbytes_per_sec": 0, 00:18:36.233 "w_mbytes_per_sec": 0 00:18:36.233 }, 00:18:36.233 "claimed": false, 00:18:36.233 "zoned": false, 00:18:36.233 "supported_io_types": { 00:18:36.233 "read": true, 00:18:36.233 "write": true, 00:18:36.233 "unmap": false, 00:18:36.233 "flush": false, 00:18:36.233 "reset": true, 00:18:36.233 "nvme_admin": false, 00:18:36.233 "nvme_io": false, 00:18:36.233 "nvme_io_md": false, 00:18:36.233 "write_zeroes": true, 00:18:36.233 "zcopy": false, 00:18:36.233 "get_zone_info": false, 00:18:36.233 "zone_management": false, 00:18:36.233 "zone_append": false, 00:18:36.233 "compare": false, 00:18:36.233 "compare_and_write": false, 00:18:36.233 "abort": false, 00:18:36.233 "seek_hole": false, 00:18:36.233 "seek_data": false, 00:18:36.233 "copy": false, 00:18:36.233 "nvme_iov_md": false 00:18:36.233 }, 00:18:36.233 "memory_domains": [ 00:18:36.233 { 00:18:36.233 "dma_device_id": "system", 
00:18:36.233 "dma_device_type": 1 00:18:36.233 }, 00:18:36.233 { 00:18:36.233 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:36.233 "dma_device_type": 2 00:18:36.233 }, 00:18:36.233 { 00:18:36.233 "dma_device_id": "system", 00:18:36.233 "dma_device_type": 1 00:18:36.233 }, 00:18:36.233 { 00:18:36.233 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:36.233 "dma_device_type": 2 00:18:36.234 }, 00:18:36.234 { 00:18:36.234 "dma_device_id": "system", 00:18:36.234 "dma_device_type": 1 00:18:36.234 }, 00:18:36.234 { 00:18:36.234 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:36.234 "dma_device_type": 2 00:18:36.234 } 00:18:36.234 ], 00:18:36.234 "driver_specific": { 00:18:36.234 "raid": { 00:18:36.234 "uuid": "ef779bfc-d221-41d3-9366-ba1af717bb53", 00:18:36.234 "strip_size_kb": 0, 00:18:36.234 "state": "online", 00:18:36.234 "raid_level": "raid1", 00:18:36.234 "superblock": false, 00:18:36.234 "num_base_bdevs": 3, 00:18:36.234 "num_base_bdevs_discovered": 3, 00:18:36.234 "num_base_bdevs_operational": 3, 00:18:36.234 "base_bdevs_list": [ 00:18:36.234 { 00:18:36.234 "name": "NewBaseBdev", 00:18:36.234 "uuid": "e3f290ae-785c-4f6d-97a3-831b33a2e21e", 00:18:36.234 "is_configured": true, 00:18:36.234 "data_offset": 0, 00:18:36.234 "data_size": 65536 00:18:36.234 }, 00:18:36.234 { 00:18:36.234 "name": "BaseBdev2", 00:18:36.234 "uuid": "3381a1ba-6558-4e0b-8f02-43805cae4d38", 00:18:36.234 "is_configured": true, 00:18:36.234 "data_offset": 0, 00:18:36.234 "data_size": 65536 00:18:36.234 }, 00:18:36.234 { 00:18:36.234 "name": "BaseBdev3", 00:18:36.234 "uuid": "8974b646-345b-4af8-86db-8c3465523b23", 00:18:36.234 "is_configured": true, 00:18:36.234 "data_offset": 0, 00:18:36.234 "data_size": 65536 00:18:36.234 } 00:18:36.234 ] 00:18:36.234 } 00:18:36.234 } 00:18:36.234 }' 00:18:36.234 05:46:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:18:36.234 05:46:51 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:18:36.234 BaseBdev2 00:18:36.234 BaseBdev3' 00:18:36.234 05:46:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:36.234 05:46:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:18:36.234 05:46:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:36.492 05:46:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:36.492 "name": "NewBaseBdev", 00:18:36.492 "aliases": [ 00:18:36.492 "e3f290ae-785c-4f6d-97a3-831b33a2e21e" 00:18:36.492 ], 00:18:36.492 "product_name": "Malloc disk", 00:18:36.492 "block_size": 512, 00:18:36.492 "num_blocks": 65536, 00:18:36.492 "uuid": "e3f290ae-785c-4f6d-97a3-831b33a2e21e", 00:18:36.492 "assigned_rate_limits": { 00:18:36.492 "rw_ios_per_sec": 0, 00:18:36.492 "rw_mbytes_per_sec": 0, 00:18:36.492 "r_mbytes_per_sec": 0, 00:18:36.492 "w_mbytes_per_sec": 0 00:18:36.492 }, 00:18:36.492 "claimed": true, 00:18:36.492 "claim_type": "exclusive_write", 00:18:36.492 "zoned": false, 00:18:36.492 "supported_io_types": { 00:18:36.492 "read": true, 00:18:36.492 "write": true, 00:18:36.492 "unmap": true, 00:18:36.492 "flush": true, 00:18:36.492 "reset": true, 00:18:36.492 "nvme_admin": false, 00:18:36.492 "nvme_io": false, 00:18:36.492 "nvme_io_md": false, 00:18:36.492 "write_zeroes": true, 00:18:36.492 "zcopy": true, 00:18:36.492 "get_zone_info": false, 00:18:36.492 "zone_management": false, 00:18:36.492 "zone_append": false, 00:18:36.492 "compare": false, 00:18:36.492 "compare_and_write": false, 00:18:36.492 "abort": true, 00:18:36.492 "seek_hole": false, 00:18:36.492 "seek_data": false, 00:18:36.492 "copy": true, 00:18:36.492 "nvme_iov_md": false 00:18:36.492 }, 00:18:36.492 "memory_domains": [ 
00:18:36.492 { 00:18:36.492 "dma_device_id": "system", 00:18:36.492 "dma_device_type": 1 00:18:36.492 }, 00:18:36.492 { 00:18:36.492 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:36.492 "dma_device_type": 2 00:18:36.492 } 00:18:36.492 ], 00:18:36.492 "driver_specific": {} 00:18:36.492 }' 00:18:36.492 05:46:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:36.492 05:46:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:36.492 05:46:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:36.492 05:46:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:36.749 05:46:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:36.749 05:46:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:36.749 05:46:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:36.749 05:46:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:36.749 05:46:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:36.749 05:46:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:36.749 05:46:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:36.749 05:46:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:36.749 05:46:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:36.750 05:46:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:18:36.750 05:46:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:37.007 05:46:51 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:37.007 "name": "BaseBdev2", 00:18:37.007 "aliases": [ 00:18:37.007 "3381a1ba-6558-4e0b-8f02-43805cae4d38" 00:18:37.007 ], 00:18:37.007 "product_name": "Malloc disk", 00:18:37.007 "block_size": 512, 00:18:37.007 "num_blocks": 65536, 00:18:37.007 "uuid": "3381a1ba-6558-4e0b-8f02-43805cae4d38", 00:18:37.007 "assigned_rate_limits": { 00:18:37.007 "rw_ios_per_sec": 0, 00:18:37.007 "rw_mbytes_per_sec": 0, 00:18:37.007 "r_mbytes_per_sec": 0, 00:18:37.007 "w_mbytes_per_sec": 0 00:18:37.007 }, 00:18:37.007 "claimed": true, 00:18:37.007 "claim_type": "exclusive_write", 00:18:37.007 "zoned": false, 00:18:37.007 "supported_io_types": { 00:18:37.007 "read": true, 00:18:37.007 "write": true, 00:18:37.007 "unmap": true, 00:18:37.007 "flush": true, 00:18:37.007 "reset": true, 00:18:37.007 "nvme_admin": false, 00:18:37.007 "nvme_io": false, 00:18:37.007 "nvme_io_md": false, 00:18:37.007 "write_zeroes": true, 00:18:37.007 "zcopy": true, 00:18:37.007 "get_zone_info": false, 00:18:37.007 "zone_management": false, 00:18:37.007 "zone_append": false, 00:18:37.007 "compare": false, 00:18:37.007 "compare_and_write": false, 00:18:37.007 "abort": true, 00:18:37.007 "seek_hole": false, 00:18:37.007 "seek_data": false, 00:18:37.007 "copy": true, 00:18:37.007 "nvme_iov_md": false 00:18:37.007 }, 00:18:37.007 "memory_domains": [ 00:18:37.007 { 00:18:37.008 "dma_device_id": "system", 00:18:37.008 "dma_device_type": 1 00:18:37.008 }, 00:18:37.008 { 00:18:37.008 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:37.008 "dma_device_type": 2 00:18:37.008 } 00:18:37.008 ], 00:18:37.008 "driver_specific": {} 00:18:37.008 }' 00:18:37.008 05:46:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:37.265 05:46:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:37.265 05:46:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:37.265 05:46:51 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:37.265 05:46:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:37.265 05:46:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:37.265 05:46:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:37.265 05:46:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:37.265 05:46:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:37.265 05:46:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:37.265 05:46:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:37.522 05:46:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:37.522 05:46:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:37.522 05:46:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:18:37.522 05:46:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:37.779 05:46:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:37.779 "name": "BaseBdev3", 00:18:37.779 "aliases": [ 00:18:37.779 "8974b646-345b-4af8-86db-8c3465523b23" 00:18:37.779 ], 00:18:37.779 "product_name": "Malloc disk", 00:18:37.779 "block_size": 512, 00:18:37.779 "num_blocks": 65536, 00:18:37.779 "uuid": "8974b646-345b-4af8-86db-8c3465523b23", 00:18:37.779 "assigned_rate_limits": { 00:18:37.779 "rw_ios_per_sec": 0, 00:18:37.779 "rw_mbytes_per_sec": 0, 00:18:37.779 "r_mbytes_per_sec": 0, 00:18:37.779 "w_mbytes_per_sec": 0 00:18:37.779 }, 00:18:37.779 "claimed": true, 00:18:37.779 "claim_type": "exclusive_write", 
00:18:37.779 "zoned": false, 00:18:37.780 "supported_io_types": { 00:18:37.780 "read": true, 00:18:37.780 "write": true, 00:18:37.780 "unmap": true, 00:18:37.780 "flush": true, 00:18:37.780 "reset": true, 00:18:37.780 "nvme_admin": false, 00:18:37.780 "nvme_io": false, 00:18:37.780 "nvme_io_md": false, 00:18:37.780 "write_zeroes": true, 00:18:37.780 "zcopy": true, 00:18:37.780 "get_zone_info": false, 00:18:37.780 "zone_management": false, 00:18:37.780 "zone_append": false, 00:18:37.780 "compare": false, 00:18:37.780 "compare_and_write": false, 00:18:37.780 "abort": true, 00:18:37.780 "seek_hole": false, 00:18:37.780 "seek_data": false, 00:18:37.780 "copy": true, 00:18:37.780 "nvme_iov_md": false 00:18:37.780 }, 00:18:37.780 "memory_domains": [ 00:18:37.780 { 00:18:37.780 "dma_device_id": "system", 00:18:37.780 "dma_device_type": 1 00:18:37.780 }, 00:18:37.780 { 00:18:37.780 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:37.780 "dma_device_type": 2 00:18:37.780 } 00:18:37.780 ], 00:18:37.780 "driver_specific": {} 00:18:37.780 }' 00:18:37.780 05:46:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:37.780 05:46:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:37.780 05:46:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:37.780 05:46:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:37.780 05:46:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:37.780 05:46:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:37.780 05:46:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:37.780 05:46:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:37.780 05:46:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:38.037 05:46:52 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:38.037 05:46:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:38.037 05:46:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:38.037 05:46:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:18:38.295 [2024-07-26 05:46:53.001742] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:18:38.295 [2024-07-26 05:46:53.001768] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:18:38.295 [2024-07-26 05:46:53.001820] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:18:38.295 [2024-07-26 05:46:53.002084] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:18:38.295 [2024-07-26 05:46:53.002096] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1143e40 name Existed_Raid, state offline 00:18:38.295 05:46:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 1173887 00:18:38.295 05:46:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 1173887 ']' 00:18:38.295 05:46:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 1173887 00:18:38.295 05:46:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:18:38.295 05:46:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:18:38.295 05:46:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1173887 00:18:38.295 05:46:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:18:38.295 05:46:53 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:18:38.295 05:46:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1173887' 00:18:38.295 killing process with pid 1173887 00:18:38.295 05:46:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 1173887 00:18:38.295 [2024-07-26 05:46:53.073911] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:18:38.295 05:46:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 1173887 00:18:38.295 [2024-07-26 05:46:53.101956] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:18:38.591 05:46:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:18:38.591 00:18:38.591 real 0m29.817s 00:18:38.591 user 0m54.779s 00:18:38.591 sys 0m5.288s 00:18:38.591 05:46:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:18:38.591 05:46:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:38.591 ************************************ 00:18:38.591 END TEST raid_state_function_test 00:18:38.591 ************************************ 00:18:38.591 05:46:53 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:18:38.591 05:46:53 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test raid1 3 true 00:18:38.591 05:46:53 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:18:38.591 05:46:53 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:18:38.591 05:46:53 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:18:38.591 ************************************ 00:18:38.591 START TEST raid_state_function_test_sb 00:18:38.591 ************************************ 00:18:38.591 05:46:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test 
raid1 3 true 00:18:38.591 05:46:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:18:38.591 05:46:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:18:38.591 05:46:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:18:38.591 05:46:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:18:38.591 05:46:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:18:38.591 05:46:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:18:38.591 05:46:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:18:38.591 05:46:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:18:38.591 05:46:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:18:38.591 05:46:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:18:38.591 05:46:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:18:38.591 05:46:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:18:38.591 05:46:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:18:38.591 05:46:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:18:38.591 05:46:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:18:38.591 05:46:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:18:38.591 05:46:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:18:38.591 05:46:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local 
raid_bdev_name=Existed_Raid 00:18:38.591 05:46:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:18:38.591 05:46:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:18:38.591 05:46:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:18:38.591 05:46:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:18:38.591 05:46:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:18:38.591 05:46:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:18:38.591 05:46:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:18:38.591 05:46:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:18:38.591 05:46:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=1178346 00:18:38.591 05:46:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1178346' 00:18:38.591 Process raid pid: 1178346 00:18:38.591 05:46:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 1178346 /var/tmp/spdk-raid.sock 00:18:38.591 05:46:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 -- # '[' -z 1178346 ']' 00:18:38.591 05:46:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:18:38.591 05:46:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:18:38.592 05:46:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 
00:18:38.592 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:18:38.592 05:46:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:18:38.592 05:46:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:38.592 [2024-07-26 05:46:53.453937] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 00:18:38.592 [2024-07-26 05:46:53.453982] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:18:38.857 [2024-07-26 05:46:53.569080] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:18:38.857 [2024-07-26 05:46:53.676476] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:18:38.857 [2024-07-26 05:46:53.741171] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:38.857 [2024-07-26 05:46:53.741200] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:39.790 05:46:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:18:39.790 05:46:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:18:39.791 05:46:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:18:39.791 [2024-07-26 05:46:54.636134] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:18:39.791 [2024-07-26 05:46:54.636172] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:18:39.791 [2024-07-26 05:46:54.636183] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to 
find bdev with name: BaseBdev2 00:18:39.791 [2024-07-26 05:46:54.636195] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:18:39.791 [2024-07-26 05:46:54.636203] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:18:39.791 [2024-07-26 05:46:54.636215] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:18:39.791 05:46:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:18:39.791 05:46:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:39.791 05:46:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:39.791 05:46:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:39.791 05:46:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:39.791 05:46:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:18:39.791 05:46:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:39.791 05:46:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:39.791 05:46:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:39.791 05:46:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:39.791 05:46:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:39.791 05:46:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:40.049 05:46:54 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:40.049 "name": "Existed_Raid", 00:18:40.049 "uuid": "5051c960-53fc-4f15-aff1-c77d0a0c9d5f", 00:18:40.049 "strip_size_kb": 0, 00:18:40.049 "state": "configuring", 00:18:40.049 "raid_level": "raid1", 00:18:40.049 "superblock": true, 00:18:40.049 "num_base_bdevs": 3, 00:18:40.049 "num_base_bdevs_discovered": 0, 00:18:40.049 "num_base_bdevs_operational": 3, 00:18:40.049 "base_bdevs_list": [ 00:18:40.049 { 00:18:40.049 "name": "BaseBdev1", 00:18:40.049 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:40.049 "is_configured": false, 00:18:40.049 "data_offset": 0, 00:18:40.049 "data_size": 0 00:18:40.049 }, 00:18:40.049 { 00:18:40.049 "name": "BaseBdev2", 00:18:40.049 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:40.049 "is_configured": false, 00:18:40.049 "data_offset": 0, 00:18:40.049 "data_size": 0 00:18:40.049 }, 00:18:40.049 { 00:18:40.049 "name": "BaseBdev3", 00:18:40.049 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:40.049 "is_configured": false, 00:18:40.049 "data_offset": 0, 00:18:40.049 "data_size": 0 00:18:40.049 } 00:18:40.049 ] 00:18:40.049 }' 00:18:40.049 05:46:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:40.049 05:46:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:40.614 05:46:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:18:40.872 [2024-07-26 05:46:55.722874] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:18:40.872 [2024-07-26 05:46:55.722902] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1e37a80 name Existed_Raid, state configuring 00:18:40.872 05:46:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:18:41.130 [2024-07-26 05:46:55.959519] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:18:41.130 [2024-07-26 05:46:55.959548] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:18:41.130 [2024-07-26 05:46:55.959558] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:18:41.130 [2024-07-26 05:46:55.959569] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:18:41.130 [2024-07-26 05:46:55.959577] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:18:41.130 [2024-07-26 05:46:55.959588] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:18:41.130 05:46:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:18:41.387 [2024-07-26 05:46:56.214030] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:18:41.387 BaseBdev1 00:18:41.387 05:46:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:18:41.387 05:46:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:18:41.387 05:46:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:41.387 05:46:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:18:41.387 05:46:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:41.387 05:46:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # 
bdev_timeout=2000 00:18:41.387 05:46:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:41.645 05:46:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:18:41.903 [ 00:18:41.903 { 00:18:41.903 "name": "BaseBdev1", 00:18:41.903 "aliases": [ 00:18:41.903 "cf8047f8-66b1-4f08-b889-fb777b787c36" 00:18:41.903 ], 00:18:41.903 "product_name": "Malloc disk", 00:18:41.903 "block_size": 512, 00:18:41.903 "num_blocks": 65536, 00:18:41.903 "uuid": "cf8047f8-66b1-4f08-b889-fb777b787c36", 00:18:41.903 "assigned_rate_limits": { 00:18:41.903 "rw_ios_per_sec": 0, 00:18:41.903 "rw_mbytes_per_sec": 0, 00:18:41.903 "r_mbytes_per_sec": 0, 00:18:41.903 "w_mbytes_per_sec": 0 00:18:41.903 }, 00:18:41.903 "claimed": true, 00:18:41.903 "claim_type": "exclusive_write", 00:18:41.903 "zoned": false, 00:18:41.903 "supported_io_types": { 00:18:41.903 "read": true, 00:18:41.903 "write": true, 00:18:41.903 "unmap": true, 00:18:41.903 "flush": true, 00:18:41.903 "reset": true, 00:18:41.903 "nvme_admin": false, 00:18:41.903 "nvme_io": false, 00:18:41.903 "nvme_io_md": false, 00:18:41.903 "write_zeroes": true, 00:18:41.903 "zcopy": true, 00:18:41.903 "get_zone_info": false, 00:18:41.903 "zone_management": false, 00:18:41.903 "zone_append": false, 00:18:41.903 "compare": false, 00:18:41.903 "compare_and_write": false, 00:18:41.903 "abort": true, 00:18:41.903 "seek_hole": false, 00:18:41.903 "seek_data": false, 00:18:41.903 "copy": true, 00:18:41.903 "nvme_iov_md": false 00:18:41.903 }, 00:18:41.903 "memory_domains": [ 00:18:41.903 { 00:18:41.903 "dma_device_id": "system", 00:18:41.903 "dma_device_type": 1 00:18:41.903 }, 00:18:41.903 { 00:18:41.903 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:41.903 
"dma_device_type": 2 00:18:41.903 } 00:18:41.903 ], 00:18:41.903 "driver_specific": {} 00:18:41.903 } 00:18:41.903 ] 00:18:41.903 05:46:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:18:41.903 05:46:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:18:41.903 05:46:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:41.903 05:46:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:41.903 05:46:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:41.903 05:46:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:41.903 05:46:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:18:41.903 05:46:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:41.903 05:46:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:41.903 05:46:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:41.903 05:46:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:41.903 05:46:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:41.903 05:46:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:42.160 05:46:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:42.160 "name": "Existed_Raid", 00:18:42.160 "uuid": "8861a300-8801-49c1-9171-6232de91c67b", 00:18:42.160 "strip_size_kb": 0, 
00:18:42.160 "state": "configuring", 00:18:42.160 "raid_level": "raid1", 00:18:42.160 "superblock": true, 00:18:42.160 "num_base_bdevs": 3, 00:18:42.160 "num_base_bdevs_discovered": 1, 00:18:42.160 "num_base_bdevs_operational": 3, 00:18:42.160 "base_bdevs_list": [ 00:18:42.160 { 00:18:42.160 "name": "BaseBdev1", 00:18:42.160 "uuid": "cf8047f8-66b1-4f08-b889-fb777b787c36", 00:18:42.160 "is_configured": true, 00:18:42.160 "data_offset": 2048, 00:18:42.160 "data_size": 63488 00:18:42.160 }, 00:18:42.160 { 00:18:42.160 "name": "BaseBdev2", 00:18:42.160 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:42.160 "is_configured": false, 00:18:42.160 "data_offset": 0, 00:18:42.160 "data_size": 0 00:18:42.160 }, 00:18:42.160 { 00:18:42.160 "name": "BaseBdev3", 00:18:42.160 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:42.160 "is_configured": false, 00:18:42.160 "data_offset": 0, 00:18:42.160 "data_size": 0 00:18:42.160 } 00:18:42.160 ] 00:18:42.160 }' 00:18:42.160 05:46:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:42.160 05:46:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:42.724 05:46:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:18:42.981 [2024-07-26 05:46:57.734057] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:18:42.981 [2024-07-26 05:46:57.734091] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1e37310 name Existed_Raid, state configuring 00:18:42.981 05:46:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:18:43.239 [2024-07-26 05:46:57.982754] 
bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:18:43.239 [2024-07-26 05:46:57.984185] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:18:43.239 [2024-07-26 05:46:57.984217] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:18:43.239 [2024-07-26 05:46:57.984227] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:18:43.239 [2024-07-26 05:46:57.984238] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:18:43.239 05:46:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:18:43.239 05:46:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:18:43.239 05:46:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:18:43.239 05:46:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:43.239 05:46:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:43.239 05:46:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:43.239 05:46:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:43.239 05:46:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:18:43.239 05:46:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:43.239 05:46:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:43.239 05:46:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:43.239 05:46:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # 
local tmp 00:18:43.239 05:46:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:43.239 05:46:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:43.497 05:46:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:43.497 "name": "Existed_Raid", 00:18:43.497 "uuid": "6632f8cc-e06e-4f22-87d2-1602dce23276", 00:18:43.497 "strip_size_kb": 0, 00:18:43.497 "state": "configuring", 00:18:43.497 "raid_level": "raid1", 00:18:43.497 "superblock": true, 00:18:43.497 "num_base_bdevs": 3, 00:18:43.497 "num_base_bdevs_discovered": 1, 00:18:43.497 "num_base_bdevs_operational": 3, 00:18:43.497 "base_bdevs_list": [ 00:18:43.497 { 00:18:43.497 "name": "BaseBdev1", 00:18:43.497 "uuid": "cf8047f8-66b1-4f08-b889-fb777b787c36", 00:18:43.497 "is_configured": true, 00:18:43.497 "data_offset": 2048, 00:18:43.497 "data_size": 63488 00:18:43.497 }, 00:18:43.497 { 00:18:43.497 "name": "BaseBdev2", 00:18:43.497 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:43.497 "is_configured": false, 00:18:43.497 "data_offset": 0, 00:18:43.497 "data_size": 0 00:18:43.497 }, 00:18:43.497 { 00:18:43.497 "name": "BaseBdev3", 00:18:43.497 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:43.497 "is_configured": false, 00:18:43.497 "data_offset": 0, 00:18:43.497 "data_size": 0 00:18:43.497 } 00:18:43.497 ] 00:18:43.497 }' 00:18:43.497 05:46:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:43.497 05:46:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:44.063 05:46:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:18:44.321 
[2024-07-26 05:46:59.012914] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:18:44.321 BaseBdev2 00:18:44.321 05:46:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:18:44.321 05:46:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:18:44.321 05:46:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:44.321 05:46:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:18:44.321 05:46:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:44.321 05:46:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:18:44.321 05:46:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:44.583 05:46:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:18:44.842 [ 00:18:44.842 { 00:18:44.842 "name": "BaseBdev2", 00:18:44.842 "aliases": [ 00:18:44.842 "da59d946-5830-4867-b30a-cc3460e33fd3" 00:18:44.842 ], 00:18:44.842 "product_name": "Malloc disk", 00:18:44.842 "block_size": 512, 00:18:44.842 "num_blocks": 65536, 00:18:44.842 "uuid": "da59d946-5830-4867-b30a-cc3460e33fd3", 00:18:44.842 "assigned_rate_limits": { 00:18:44.842 "rw_ios_per_sec": 0, 00:18:44.842 "rw_mbytes_per_sec": 0, 00:18:44.842 "r_mbytes_per_sec": 0, 00:18:44.842 "w_mbytes_per_sec": 0 00:18:44.842 }, 00:18:44.842 "claimed": true, 00:18:44.842 "claim_type": "exclusive_write", 00:18:44.842 "zoned": false, 00:18:44.842 "supported_io_types": { 00:18:44.842 "read": true, 00:18:44.842 "write": true, 00:18:44.842 "unmap": 
true, 00:18:44.842 "flush": true, 00:18:44.842 "reset": true, 00:18:44.842 "nvme_admin": false, 00:18:44.842 "nvme_io": false, 00:18:44.842 "nvme_io_md": false, 00:18:44.842 "write_zeroes": true, 00:18:44.842 "zcopy": true, 00:18:44.842 "get_zone_info": false, 00:18:44.842 "zone_management": false, 00:18:44.842 "zone_append": false, 00:18:44.842 "compare": false, 00:18:44.842 "compare_and_write": false, 00:18:44.842 "abort": true, 00:18:44.842 "seek_hole": false, 00:18:44.842 "seek_data": false, 00:18:44.842 "copy": true, 00:18:44.842 "nvme_iov_md": false 00:18:44.842 }, 00:18:44.842 "memory_domains": [ 00:18:44.842 { 00:18:44.842 "dma_device_id": "system", 00:18:44.842 "dma_device_type": 1 00:18:44.842 }, 00:18:44.842 { 00:18:44.842 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:44.842 "dma_device_type": 2 00:18:44.842 } 00:18:44.842 ], 00:18:44.842 "driver_specific": {} 00:18:44.842 } 00:18:44.842 ] 00:18:44.842 05:46:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:18:44.842 05:46:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:18:44.842 05:46:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:18:44.842 05:46:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:18:44.842 05:46:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:44.842 05:46:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:44.842 05:46:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:44.842 05:46:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:44.842 05:46:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:18:44.842 
05:46:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:44.842 05:46:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:44.842 05:46:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:44.842 05:46:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:44.842 05:46:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:44.842 05:46:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:45.101 05:46:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:45.101 "name": "Existed_Raid", 00:18:45.101 "uuid": "6632f8cc-e06e-4f22-87d2-1602dce23276", 00:18:45.101 "strip_size_kb": 0, 00:18:45.101 "state": "configuring", 00:18:45.101 "raid_level": "raid1", 00:18:45.101 "superblock": true, 00:18:45.101 "num_base_bdevs": 3, 00:18:45.101 "num_base_bdevs_discovered": 2, 00:18:45.101 "num_base_bdevs_operational": 3, 00:18:45.101 "base_bdevs_list": [ 00:18:45.101 { 00:18:45.101 "name": "BaseBdev1", 00:18:45.101 "uuid": "cf8047f8-66b1-4f08-b889-fb777b787c36", 00:18:45.101 "is_configured": true, 00:18:45.101 "data_offset": 2048, 00:18:45.101 "data_size": 63488 00:18:45.101 }, 00:18:45.101 { 00:18:45.101 "name": "BaseBdev2", 00:18:45.101 "uuid": "da59d946-5830-4867-b30a-cc3460e33fd3", 00:18:45.101 "is_configured": true, 00:18:45.101 "data_offset": 2048, 00:18:45.101 "data_size": 63488 00:18:45.101 }, 00:18:45.101 { 00:18:45.101 "name": "BaseBdev3", 00:18:45.101 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:45.101 "is_configured": false, 00:18:45.101 "data_offset": 0, 00:18:45.101 "data_size": 0 00:18:45.101 } 00:18:45.101 ] 00:18:45.101 }' 00:18:45.101 
05:46:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:45.101 05:46:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:45.669 05:47:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:18:45.927 [2024-07-26 05:47:00.604567] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:18:45.927 [2024-07-26 05:47:00.604730] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1e38400 00:18:45.927 [2024-07-26 05:47:00.604745] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:18:45.927 [2024-07-26 05:47:00.604917] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1e37ef0 00:18:45.927 [2024-07-26 05:47:00.605035] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1e38400 00:18:45.927 [2024-07-26 05:47:00.605045] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1e38400 00:18:45.927 [2024-07-26 05:47:00.605137] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:45.927 BaseBdev3 00:18:45.927 05:47:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:18:45.927 05:47:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:18:45.927 05:47:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:45.927 05:47:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:18:45.927 05:47:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:45.927 05:47:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # 
bdev_timeout=2000 00:18:45.927 05:47:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:46.184 05:47:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:18:46.441 [ 00:18:46.441 { 00:18:46.441 "name": "BaseBdev3", 00:18:46.441 "aliases": [ 00:18:46.441 "1d23e622-6e92-4de9-8703-dcbf8bcc8546" 00:18:46.441 ], 00:18:46.441 "product_name": "Malloc disk", 00:18:46.441 "block_size": 512, 00:18:46.441 "num_blocks": 65536, 00:18:46.441 "uuid": "1d23e622-6e92-4de9-8703-dcbf8bcc8546", 00:18:46.441 "assigned_rate_limits": { 00:18:46.441 "rw_ios_per_sec": 0, 00:18:46.441 "rw_mbytes_per_sec": 0, 00:18:46.441 "r_mbytes_per_sec": 0, 00:18:46.441 "w_mbytes_per_sec": 0 00:18:46.441 }, 00:18:46.441 "claimed": true, 00:18:46.441 "claim_type": "exclusive_write", 00:18:46.441 "zoned": false, 00:18:46.441 "supported_io_types": { 00:18:46.441 "read": true, 00:18:46.441 "write": true, 00:18:46.441 "unmap": true, 00:18:46.441 "flush": true, 00:18:46.441 "reset": true, 00:18:46.441 "nvme_admin": false, 00:18:46.441 "nvme_io": false, 00:18:46.441 "nvme_io_md": false, 00:18:46.441 "write_zeroes": true, 00:18:46.441 "zcopy": true, 00:18:46.441 "get_zone_info": false, 00:18:46.441 "zone_management": false, 00:18:46.441 "zone_append": false, 00:18:46.441 "compare": false, 00:18:46.441 "compare_and_write": false, 00:18:46.441 "abort": true, 00:18:46.441 "seek_hole": false, 00:18:46.441 "seek_data": false, 00:18:46.441 "copy": true, 00:18:46.441 "nvme_iov_md": false 00:18:46.441 }, 00:18:46.441 "memory_domains": [ 00:18:46.441 { 00:18:46.441 "dma_device_id": "system", 00:18:46.441 "dma_device_type": 1 00:18:46.441 }, 00:18:46.441 { 00:18:46.441 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:46.441 
"dma_device_type": 2 00:18:46.441 } 00:18:46.441 ], 00:18:46.441 "driver_specific": {} 00:18:46.441 } 00:18:46.441 ] 00:18:46.441 05:47:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:18:46.441 05:47:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:18:46.441 05:47:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:18:46.441 05:47:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:18:46.441 05:47:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:46.441 05:47:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:46.441 05:47:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:46.441 05:47:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:46.441 05:47:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:18:46.441 05:47:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:46.441 05:47:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:46.441 05:47:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:46.441 05:47:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:46.442 05:47:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:46.442 05:47:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:46.700 05:47:01 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:46.700 "name": "Existed_Raid", 00:18:46.700 "uuid": "6632f8cc-e06e-4f22-87d2-1602dce23276", 00:18:46.700 "strip_size_kb": 0, 00:18:46.700 "state": "online", 00:18:46.700 "raid_level": "raid1", 00:18:46.700 "superblock": true, 00:18:46.700 "num_base_bdevs": 3, 00:18:46.700 "num_base_bdevs_discovered": 3, 00:18:46.700 "num_base_bdevs_operational": 3, 00:18:46.700 "base_bdevs_list": [ 00:18:46.700 { 00:18:46.700 "name": "BaseBdev1", 00:18:46.700 "uuid": "cf8047f8-66b1-4f08-b889-fb777b787c36", 00:18:46.700 "is_configured": true, 00:18:46.700 "data_offset": 2048, 00:18:46.700 "data_size": 63488 00:18:46.700 }, 00:18:46.700 { 00:18:46.700 "name": "BaseBdev2", 00:18:46.700 "uuid": "da59d946-5830-4867-b30a-cc3460e33fd3", 00:18:46.700 "is_configured": true, 00:18:46.700 "data_offset": 2048, 00:18:46.700 "data_size": 63488 00:18:46.700 }, 00:18:46.700 { 00:18:46.700 "name": "BaseBdev3", 00:18:46.700 "uuid": "1d23e622-6e92-4de9-8703-dcbf8bcc8546", 00:18:46.700 "is_configured": true, 00:18:46.700 "data_offset": 2048, 00:18:46.700 "data_size": 63488 00:18:46.700 } 00:18:46.700 ] 00:18:46.700 }' 00:18:46.700 05:47:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:46.700 05:47:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:47.267 05:47:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:18:47.267 05:47:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:18:47.267 05:47:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:18:47.267 05:47:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:18:47.267 05:47:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 
00:18:47.267 05:47:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:18:47.267 05:47:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:18:47.267 05:47:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:18:47.267 [2024-07-26 05:47:02.112873] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:18:47.267 05:47:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:18:47.267 "name": "Existed_Raid", 00:18:47.267 "aliases": [ 00:18:47.267 "6632f8cc-e06e-4f22-87d2-1602dce23276" 00:18:47.267 ], 00:18:47.267 "product_name": "Raid Volume", 00:18:47.267 "block_size": 512, 00:18:47.267 "num_blocks": 63488, 00:18:47.267 "uuid": "6632f8cc-e06e-4f22-87d2-1602dce23276", 00:18:47.267 "assigned_rate_limits": { 00:18:47.267 "rw_ios_per_sec": 0, 00:18:47.267 "rw_mbytes_per_sec": 0, 00:18:47.267 "r_mbytes_per_sec": 0, 00:18:47.267 "w_mbytes_per_sec": 0 00:18:47.267 }, 00:18:47.267 "claimed": false, 00:18:47.267 "zoned": false, 00:18:47.267 "supported_io_types": { 00:18:47.267 "read": true, 00:18:47.267 "write": true, 00:18:47.267 "unmap": false, 00:18:47.267 "flush": false, 00:18:47.267 "reset": true, 00:18:47.267 "nvme_admin": false, 00:18:47.267 "nvme_io": false, 00:18:47.267 "nvme_io_md": false, 00:18:47.267 "write_zeroes": true, 00:18:47.267 "zcopy": false, 00:18:47.268 "get_zone_info": false, 00:18:47.268 "zone_management": false, 00:18:47.268 "zone_append": false, 00:18:47.268 "compare": false, 00:18:47.268 "compare_and_write": false, 00:18:47.268 "abort": false, 00:18:47.268 "seek_hole": false, 00:18:47.268 "seek_data": false, 00:18:47.268 "copy": false, 00:18:47.268 "nvme_iov_md": false 00:18:47.268 }, 00:18:47.268 "memory_domains": [ 00:18:47.268 { 00:18:47.268 "dma_device_id": "system", 00:18:47.268 
"dma_device_type": 1 00:18:47.268 }, 00:18:47.268 { 00:18:47.268 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:47.268 "dma_device_type": 2 00:18:47.268 }, 00:18:47.268 { 00:18:47.268 "dma_device_id": "system", 00:18:47.268 "dma_device_type": 1 00:18:47.268 }, 00:18:47.268 { 00:18:47.268 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:47.268 "dma_device_type": 2 00:18:47.268 }, 00:18:47.268 { 00:18:47.268 "dma_device_id": "system", 00:18:47.268 "dma_device_type": 1 00:18:47.268 }, 00:18:47.268 { 00:18:47.268 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:47.268 "dma_device_type": 2 00:18:47.268 } 00:18:47.268 ], 00:18:47.268 "driver_specific": { 00:18:47.268 "raid": { 00:18:47.268 "uuid": "6632f8cc-e06e-4f22-87d2-1602dce23276", 00:18:47.268 "strip_size_kb": 0, 00:18:47.268 "state": "online", 00:18:47.268 "raid_level": "raid1", 00:18:47.268 "superblock": true, 00:18:47.268 "num_base_bdevs": 3, 00:18:47.268 "num_base_bdevs_discovered": 3, 00:18:47.268 "num_base_bdevs_operational": 3, 00:18:47.268 "base_bdevs_list": [ 00:18:47.268 { 00:18:47.268 "name": "BaseBdev1", 00:18:47.268 "uuid": "cf8047f8-66b1-4f08-b889-fb777b787c36", 00:18:47.268 "is_configured": true, 00:18:47.268 "data_offset": 2048, 00:18:47.268 "data_size": 63488 00:18:47.268 }, 00:18:47.268 { 00:18:47.268 "name": "BaseBdev2", 00:18:47.268 "uuid": "da59d946-5830-4867-b30a-cc3460e33fd3", 00:18:47.268 "is_configured": true, 00:18:47.268 "data_offset": 2048, 00:18:47.268 "data_size": 63488 00:18:47.268 }, 00:18:47.268 { 00:18:47.268 "name": "BaseBdev3", 00:18:47.268 "uuid": "1d23e622-6e92-4de9-8703-dcbf8bcc8546", 00:18:47.268 "is_configured": true, 00:18:47.268 "data_offset": 2048, 00:18:47.268 "data_size": 63488 00:18:47.268 } 00:18:47.268 ] 00:18:47.268 } 00:18:47.268 } 00:18:47.268 }' 00:18:47.268 05:47:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:18:47.527 05:47:02 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:18:47.527 BaseBdev2 00:18:47.527 BaseBdev3' 00:18:47.527 05:47:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:47.527 05:47:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:18:47.527 05:47:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:47.785 05:47:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:47.785 "name": "BaseBdev1", 00:18:47.785 "aliases": [ 00:18:47.785 "cf8047f8-66b1-4f08-b889-fb777b787c36" 00:18:47.785 ], 00:18:47.785 "product_name": "Malloc disk", 00:18:47.785 "block_size": 512, 00:18:47.785 "num_blocks": 65536, 00:18:47.785 "uuid": "cf8047f8-66b1-4f08-b889-fb777b787c36", 00:18:47.785 "assigned_rate_limits": { 00:18:47.785 "rw_ios_per_sec": 0, 00:18:47.785 "rw_mbytes_per_sec": 0, 00:18:47.785 "r_mbytes_per_sec": 0, 00:18:47.785 "w_mbytes_per_sec": 0 00:18:47.785 }, 00:18:47.785 "claimed": true, 00:18:47.785 "claim_type": "exclusive_write", 00:18:47.785 "zoned": false, 00:18:47.785 "supported_io_types": { 00:18:47.785 "read": true, 00:18:47.785 "write": true, 00:18:47.785 "unmap": true, 00:18:47.785 "flush": true, 00:18:47.785 "reset": true, 00:18:47.785 "nvme_admin": false, 00:18:47.785 "nvme_io": false, 00:18:47.785 "nvme_io_md": false, 00:18:47.785 "write_zeroes": true, 00:18:47.785 "zcopy": true, 00:18:47.785 "get_zone_info": false, 00:18:47.785 "zone_management": false, 00:18:47.785 "zone_append": false, 00:18:47.785 "compare": false, 00:18:47.785 "compare_and_write": false, 00:18:47.785 "abort": true, 00:18:47.785 "seek_hole": false, 00:18:47.785 "seek_data": false, 00:18:47.785 "copy": true, 00:18:47.785 "nvme_iov_md": false 00:18:47.785 }, 00:18:47.785 "memory_domains": 
[ 00:18:47.785 { 00:18:47.785 "dma_device_id": "system", 00:18:47.785 "dma_device_type": 1 00:18:47.785 }, 00:18:47.785 { 00:18:47.785 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:47.785 "dma_device_type": 2 00:18:47.785 } 00:18:47.785 ], 00:18:47.785 "driver_specific": {} 00:18:47.785 }' 00:18:47.785 05:47:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:47.785 05:47:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:47.785 05:47:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:47.785 05:47:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:47.785 05:47:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:47.785 05:47:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:47.785 05:47:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:47.785 05:47:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:48.044 05:47:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:48.044 05:47:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:48.044 05:47:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:48.044 05:47:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:48.044 05:47:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:48.044 05:47:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:18:48.044 05:47:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 
00:18:48.304 05:47:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:48.304 "name": "BaseBdev2", 00:18:48.304 "aliases": [ 00:18:48.304 "da59d946-5830-4867-b30a-cc3460e33fd3" 00:18:48.304 ], 00:18:48.304 "product_name": "Malloc disk", 00:18:48.304 "block_size": 512, 00:18:48.304 "num_blocks": 65536, 00:18:48.304 "uuid": "da59d946-5830-4867-b30a-cc3460e33fd3", 00:18:48.304 "assigned_rate_limits": { 00:18:48.304 "rw_ios_per_sec": 0, 00:18:48.304 "rw_mbytes_per_sec": 0, 00:18:48.304 "r_mbytes_per_sec": 0, 00:18:48.304 "w_mbytes_per_sec": 0 00:18:48.304 }, 00:18:48.304 "claimed": true, 00:18:48.304 "claim_type": "exclusive_write", 00:18:48.304 "zoned": false, 00:18:48.304 "supported_io_types": { 00:18:48.304 "read": true, 00:18:48.304 "write": true, 00:18:48.304 "unmap": true, 00:18:48.304 "flush": true, 00:18:48.304 "reset": true, 00:18:48.304 "nvme_admin": false, 00:18:48.304 "nvme_io": false, 00:18:48.304 "nvme_io_md": false, 00:18:48.304 "write_zeroes": true, 00:18:48.304 "zcopy": true, 00:18:48.304 "get_zone_info": false, 00:18:48.304 "zone_management": false, 00:18:48.304 "zone_append": false, 00:18:48.304 "compare": false, 00:18:48.304 "compare_and_write": false, 00:18:48.304 "abort": true, 00:18:48.304 "seek_hole": false, 00:18:48.304 "seek_data": false, 00:18:48.304 "copy": true, 00:18:48.304 "nvme_iov_md": false 00:18:48.304 }, 00:18:48.304 "memory_domains": [ 00:18:48.304 { 00:18:48.304 "dma_device_id": "system", 00:18:48.304 "dma_device_type": 1 00:18:48.304 }, 00:18:48.304 { 00:18:48.304 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:48.304 "dma_device_type": 2 00:18:48.304 } 00:18:48.304 ], 00:18:48.304 "driver_specific": {} 00:18:48.304 }' 00:18:48.304 05:47:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:48.304 05:47:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:48.304 05:47:03 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:48.304 05:47:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:48.563 05:47:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:48.563 05:47:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:48.563 05:47:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:48.563 05:47:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:48.563 05:47:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:48.563 05:47:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:48.563 05:47:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:48.563 05:47:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:48.563 05:47:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:48.563 05:47:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:18:48.563 05:47:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:48.820 05:47:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:48.820 "name": "BaseBdev3", 00:18:48.820 "aliases": [ 00:18:48.820 "1d23e622-6e92-4de9-8703-dcbf8bcc8546" 00:18:48.820 ], 00:18:48.820 "product_name": "Malloc disk", 00:18:48.820 "block_size": 512, 00:18:48.820 "num_blocks": 65536, 00:18:48.820 "uuid": "1d23e622-6e92-4de9-8703-dcbf8bcc8546", 00:18:48.820 "assigned_rate_limits": { 00:18:48.820 "rw_ios_per_sec": 0, 00:18:48.820 "rw_mbytes_per_sec": 0, 00:18:48.820 "r_mbytes_per_sec": 0, 00:18:48.820 
"w_mbytes_per_sec": 0 00:18:48.820 }, 00:18:48.820 "claimed": true, 00:18:48.820 "claim_type": "exclusive_write", 00:18:48.820 "zoned": false, 00:18:48.820 "supported_io_types": { 00:18:48.820 "read": true, 00:18:48.820 "write": true, 00:18:48.820 "unmap": true, 00:18:48.820 "flush": true, 00:18:48.820 "reset": true, 00:18:48.820 "nvme_admin": false, 00:18:48.820 "nvme_io": false, 00:18:48.820 "nvme_io_md": false, 00:18:48.820 "write_zeroes": true, 00:18:48.820 "zcopy": true, 00:18:48.820 "get_zone_info": false, 00:18:48.820 "zone_management": false, 00:18:48.820 "zone_append": false, 00:18:48.820 "compare": false, 00:18:48.820 "compare_and_write": false, 00:18:48.820 "abort": true, 00:18:48.820 "seek_hole": false, 00:18:48.820 "seek_data": false, 00:18:48.820 "copy": true, 00:18:48.820 "nvme_iov_md": false 00:18:48.820 }, 00:18:48.820 "memory_domains": [ 00:18:48.820 { 00:18:48.820 "dma_device_id": "system", 00:18:48.820 "dma_device_type": 1 00:18:48.820 }, 00:18:48.820 { 00:18:48.820 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:48.820 "dma_device_type": 2 00:18:48.820 } 00:18:48.820 ], 00:18:48.820 "driver_specific": {} 00:18:48.820 }' 00:18:48.820 05:47:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:48.820 05:47:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:49.078 05:47:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:49.078 05:47:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:49.078 05:47:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:49.078 05:47:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:49.078 05:47:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:49.078 05:47:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq 
.md_interleave 00:18:49.078 05:47:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:49.078 05:47:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:49.078 05:47:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:49.336 05:47:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:49.336 05:47:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:18:49.594 [2024-07-26 05:47:04.246259] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:18:49.594 05:47:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:18:49.594 05:47:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:18:49.594 05:47:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:18:49.594 05:47:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@214 -- # return 0 00:18:49.594 05:47:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:18:49.594 05:47:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:18:49.594 05:47:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:49.594 05:47:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:49.594 05:47:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:49.594 05:47:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:49.594 05:47:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local 
num_base_bdevs_operational=2 00:18:49.594 05:47:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:49.594 05:47:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:49.594 05:47:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:49.594 05:47:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:49.594 05:47:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:49.594 05:47:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:49.852 05:47:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:49.852 "name": "Existed_Raid", 00:18:49.852 "uuid": "6632f8cc-e06e-4f22-87d2-1602dce23276", 00:18:49.852 "strip_size_kb": 0, 00:18:49.852 "state": "online", 00:18:49.852 "raid_level": "raid1", 00:18:49.852 "superblock": true, 00:18:49.852 "num_base_bdevs": 3, 00:18:49.852 "num_base_bdevs_discovered": 2, 00:18:49.852 "num_base_bdevs_operational": 2, 00:18:49.852 "base_bdevs_list": [ 00:18:49.852 { 00:18:49.852 "name": null, 00:18:49.852 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:49.852 "is_configured": false, 00:18:49.852 "data_offset": 2048, 00:18:49.852 "data_size": 63488 00:18:49.852 }, 00:18:49.852 { 00:18:49.852 "name": "BaseBdev2", 00:18:49.852 "uuid": "da59d946-5830-4867-b30a-cc3460e33fd3", 00:18:49.852 "is_configured": true, 00:18:49.852 "data_offset": 2048, 00:18:49.852 "data_size": 63488 00:18:49.852 }, 00:18:49.852 { 00:18:49.852 "name": "BaseBdev3", 00:18:49.852 "uuid": "1d23e622-6e92-4de9-8703-dcbf8bcc8546", 00:18:49.852 "is_configured": true, 00:18:49.852 "data_offset": 2048, 00:18:49.852 "data_size": 63488 00:18:49.852 } 
00:18:49.852 ] 00:18:49.852 }' 00:18:49.852 05:47:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:49.852 05:47:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:50.419 05:47:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:18:50.419 05:47:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:18:50.419 05:47:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:18:50.419 05:47:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:50.678 05:47:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:18:50.678 05:47:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:18:50.678 05:47:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:18:50.678 [2024-07-26 05:47:05.490580] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:18:50.678 05:47:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:18:50.678 05:47:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:18:50.678 05:47:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:50.678 05:47:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:18:50.936 05:47:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:18:50.936 05:47:05 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:18:50.936 05:47:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:18:51.194 [2024-07-26 05:47:05.988398] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:18:51.194 [2024-07-26 05:47:05.988488] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:18:51.194 [2024-07-26 05:47:05.999248] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:18:51.194 [2024-07-26 05:47:05.999283] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:18:51.194 [2024-07-26 05:47:05.999295] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1e38400 name Existed_Raid, state offline 00:18:51.194 05:47:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:18:51.194 05:47:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:18:51.194 05:47:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:51.194 05:47:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:18:51.453 05:47:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:18:51.453 05:47:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:18:51.453 05:47:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:18:51.453 05:47:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:18:51.453 05:47:06 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:18:51.453 05:47:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:18:51.712 BaseBdev2 00:18:51.712 05:47:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:18:51.712 05:47:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:18:51.712 05:47:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:51.712 05:47:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:18:51.712 05:47:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:51.712 05:47:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:18:51.712 05:47:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:51.971 05:47:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:18:52.231 [ 00:18:52.231 { 00:18:52.231 "name": "BaseBdev2", 00:18:52.231 "aliases": [ 00:18:52.231 "65452d55-4dd3-44af-b1c3-6dd22379315f" 00:18:52.231 ], 00:18:52.231 "product_name": "Malloc disk", 00:18:52.231 "block_size": 512, 00:18:52.231 "num_blocks": 65536, 00:18:52.231 "uuid": "65452d55-4dd3-44af-b1c3-6dd22379315f", 00:18:52.231 "assigned_rate_limits": { 00:18:52.231 "rw_ios_per_sec": 0, 00:18:52.231 "rw_mbytes_per_sec": 0, 00:18:52.231 "r_mbytes_per_sec": 0, 00:18:52.231 "w_mbytes_per_sec": 0 00:18:52.231 }, 00:18:52.231 "claimed": false, 00:18:52.231 "zoned": false, 
00:18:52.231 "supported_io_types": { 00:18:52.231 "read": true, 00:18:52.231 "write": true, 00:18:52.231 "unmap": true, 00:18:52.231 "flush": true, 00:18:52.231 "reset": true, 00:18:52.231 "nvme_admin": false, 00:18:52.231 "nvme_io": false, 00:18:52.231 "nvme_io_md": false, 00:18:52.231 "write_zeroes": true, 00:18:52.231 "zcopy": true, 00:18:52.231 "get_zone_info": false, 00:18:52.231 "zone_management": false, 00:18:52.231 "zone_append": false, 00:18:52.231 "compare": false, 00:18:52.231 "compare_and_write": false, 00:18:52.231 "abort": true, 00:18:52.231 "seek_hole": false, 00:18:52.231 "seek_data": false, 00:18:52.231 "copy": true, 00:18:52.231 "nvme_iov_md": false 00:18:52.231 }, 00:18:52.231 "memory_domains": [ 00:18:52.231 { 00:18:52.231 "dma_device_id": "system", 00:18:52.231 "dma_device_type": 1 00:18:52.231 }, 00:18:52.231 { 00:18:52.231 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:52.231 "dma_device_type": 2 00:18:52.231 } 00:18:52.231 ], 00:18:52.231 "driver_specific": {} 00:18:52.231 } 00:18:52.231 ] 00:18:52.231 05:47:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:18:52.231 05:47:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:18:52.231 05:47:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:18:52.231 05:47:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:18:52.490 BaseBdev3 00:18:52.490 05:47:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:18:52.490 05:47:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:18:52.490 05:47:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:52.490 05:47:07 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:18:52.490 05:47:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:52.490 05:47:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:18:52.490 05:47:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:52.748 05:47:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:18:53.007 [ 00:18:53.007 { 00:18:53.007 "name": "BaseBdev3", 00:18:53.007 "aliases": [ 00:18:53.007 "db882b7f-5bb5-45f8-8c89-0d813419ccc6" 00:18:53.007 ], 00:18:53.007 "product_name": "Malloc disk", 00:18:53.007 "block_size": 512, 00:18:53.007 "num_blocks": 65536, 00:18:53.007 "uuid": "db882b7f-5bb5-45f8-8c89-0d813419ccc6", 00:18:53.007 "assigned_rate_limits": { 00:18:53.007 "rw_ios_per_sec": 0, 00:18:53.007 "rw_mbytes_per_sec": 0, 00:18:53.007 "r_mbytes_per_sec": 0, 00:18:53.007 "w_mbytes_per_sec": 0 00:18:53.007 }, 00:18:53.007 "claimed": false, 00:18:53.007 "zoned": false, 00:18:53.007 "supported_io_types": { 00:18:53.007 "read": true, 00:18:53.007 "write": true, 00:18:53.007 "unmap": true, 00:18:53.007 "flush": true, 00:18:53.007 "reset": true, 00:18:53.007 "nvme_admin": false, 00:18:53.007 "nvme_io": false, 00:18:53.007 "nvme_io_md": false, 00:18:53.007 "write_zeroes": true, 00:18:53.007 "zcopy": true, 00:18:53.007 "get_zone_info": false, 00:18:53.007 "zone_management": false, 00:18:53.007 "zone_append": false, 00:18:53.007 "compare": false, 00:18:53.007 "compare_and_write": false, 00:18:53.007 "abort": true, 00:18:53.007 "seek_hole": false, 00:18:53.007 "seek_data": false, 00:18:53.007 "copy": true, 00:18:53.007 "nvme_iov_md": 
false 00:18:53.007 }, 00:18:53.007 "memory_domains": [ 00:18:53.007 { 00:18:53.007 "dma_device_id": "system", 00:18:53.007 "dma_device_type": 1 00:18:53.007 }, 00:18:53.007 { 00:18:53.007 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:53.007 "dma_device_type": 2 00:18:53.007 } 00:18:53.007 ], 00:18:53.007 "driver_specific": {} 00:18:53.007 } 00:18:53.007 ] 00:18:53.007 05:47:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:18:53.007 05:47:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:18:53.007 05:47:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:18:53.007 05:47:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:18:53.267 [2024-07-26 05:47:07.937016] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:18:53.267 [2024-07-26 05:47:07.937055] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:18:53.267 [2024-07-26 05:47:07.937073] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:18:53.267 [2024-07-26 05:47:07.938433] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:18:53.267 05:47:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:18:53.267 05:47:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:53.267 05:47:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:53.267 05:47:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:53.267 05:47:07 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:53.267 05:47:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:18:53.267 05:47:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:53.267 05:47:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:53.267 05:47:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:53.267 05:47:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:53.267 05:47:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:53.267 05:47:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:53.527 05:47:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:53.527 "name": "Existed_Raid", 00:18:53.527 "uuid": "a4dd69d9-4db7-4646-acb8-8b8afc5e5ba8", 00:18:53.527 "strip_size_kb": 0, 00:18:53.527 "state": "configuring", 00:18:53.527 "raid_level": "raid1", 00:18:53.527 "superblock": true, 00:18:53.527 "num_base_bdevs": 3, 00:18:53.527 "num_base_bdevs_discovered": 2, 00:18:53.527 "num_base_bdevs_operational": 3, 00:18:53.527 "base_bdevs_list": [ 00:18:53.527 { 00:18:53.527 "name": "BaseBdev1", 00:18:53.527 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:53.527 "is_configured": false, 00:18:53.527 "data_offset": 0, 00:18:53.527 "data_size": 0 00:18:53.527 }, 00:18:53.527 { 00:18:53.527 "name": "BaseBdev2", 00:18:53.527 "uuid": "65452d55-4dd3-44af-b1c3-6dd22379315f", 00:18:53.527 "is_configured": true, 00:18:53.527 "data_offset": 2048, 00:18:53.527 "data_size": 63488 00:18:53.527 }, 00:18:53.527 { 00:18:53.527 "name": "BaseBdev3", 
00:18:53.527 "uuid": "db882b7f-5bb5-45f8-8c89-0d813419ccc6", 00:18:53.527 "is_configured": true, 00:18:53.527 "data_offset": 2048, 00:18:53.527 "data_size": 63488 00:18:53.527 } 00:18:53.527 ] 00:18:53.527 }' 00:18:53.527 05:47:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:53.527 05:47:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:54.097 05:47:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:18:54.357 [2024-07-26 05:47:09.043922] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:18:54.357 05:47:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:18:54.357 05:47:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:54.357 05:47:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:54.357 05:47:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:54.357 05:47:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:54.357 05:47:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:18:54.357 05:47:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:54.357 05:47:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:54.357 05:47:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:54.357 05:47:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:54.357 05:47:09 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:54.357 05:47:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:54.616 05:47:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:54.616 "name": "Existed_Raid", 00:18:54.616 "uuid": "a4dd69d9-4db7-4646-acb8-8b8afc5e5ba8", 00:18:54.616 "strip_size_kb": 0, 00:18:54.616 "state": "configuring", 00:18:54.616 "raid_level": "raid1", 00:18:54.617 "superblock": true, 00:18:54.617 "num_base_bdevs": 3, 00:18:54.617 "num_base_bdevs_discovered": 1, 00:18:54.617 "num_base_bdevs_operational": 3, 00:18:54.617 "base_bdevs_list": [ 00:18:54.617 { 00:18:54.617 "name": "BaseBdev1", 00:18:54.617 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:54.617 "is_configured": false, 00:18:54.617 "data_offset": 0, 00:18:54.617 "data_size": 0 00:18:54.617 }, 00:18:54.617 { 00:18:54.617 "name": null, 00:18:54.617 "uuid": "65452d55-4dd3-44af-b1c3-6dd22379315f", 00:18:54.617 "is_configured": false, 00:18:54.617 "data_offset": 2048, 00:18:54.617 "data_size": 63488 00:18:54.617 }, 00:18:54.617 { 00:18:54.617 "name": "BaseBdev3", 00:18:54.617 "uuid": "db882b7f-5bb5-45f8-8c89-0d813419ccc6", 00:18:54.617 "is_configured": true, 00:18:54.617 "data_offset": 2048, 00:18:54.617 "data_size": 63488 00:18:54.617 } 00:18:54.617 ] 00:18:54.617 }' 00:18:54.617 05:47:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:54.617 05:47:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:55.230 05:47:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:55.230 05:47:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq 
'.[0].base_bdevs_list[1].is_configured' 00:18:55.489 05:47:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:18:55.489 05:47:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:18:55.748 [2024-07-26 05:47:10.643508] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:18:55.748 BaseBdev1 00:18:56.008 05:47:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:18:56.008 05:47:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:18:56.008 05:47:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:56.008 05:47:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:18:56.008 05:47:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:56.008 05:47:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:18:56.008 05:47:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:56.008 05:47:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:18:56.269 [ 00:18:56.269 { 00:18:56.269 "name": "BaseBdev1", 00:18:56.269 "aliases": [ 00:18:56.269 "a9510d9c-900a-4556-b598-4152a7c37e1b" 00:18:56.269 ], 00:18:56.269 "product_name": "Malloc disk", 00:18:56.269 "block_size": 512, 00:18:56.269 "num_blocks": 65536, 00:18:56.269 "uuid": "a9510d9c-900a-4556-b598-4152a7c37e1b", 00:18:56.269 
"assigned_rate_limits": { 00:18:56.269 "rw_ios_per_sec": 0, 00:18:56.269 "rw_mbytes_per_sec": 0, 00:18:56.269 "r_mbytes_per_sec": 0, 00:18:56.269 "w_mbytes_per_sec": 0 00:18:56.269 }, 00:18:56.269 "claimed": true, 00:18:56.269 "claim_type": "exclusive_write", 00:18:56.269 "zoned": false, 00:18:56.269 "supported_io_types": { 00:18:56.269 "read": true, 00:18:56.269 "write": true, 00:18:56.269 "unmap": true, 00:18:56.269 "flush": true, 00:18:56.269 "reset": true, 00:18:56.269 "nvme_admin": false, 00:18:56.269 "nvme_io": false, 00:18:56.269 "nvme_io_md": false, 00:18:56.269 "write_zeroes": true, 00:18:56.269 "zcopy": true, 00:18:56.269 "get_zone_info": false, 00:18:56.269 "zone_management": false, 00:18:56.269 "zone_append": false, 00:18:56.269 "compare": false, 00:18:56.269 "compare_and_write": false, 00:18:56.269 "abort": true, 00:18:56.269 "seek_hole": false, 00:18:56.269 "seek_data": false, 00:18:56.269 "copy": true, 00:18:56.269 "nvme_iov_md": false 00:18:56.269 }, 00:18:56.269 "memory_domains": [ 00:18:56.269 { 00:18:56.269 "dma_device_id": "system", 00:18:56.269 "dma_device_type": 1 00:18:56.269 }, 00:18:56.269 { 00:18:56.269 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:56.269 "dma_device_type": 2 00:18:56.269 } 00:18:56.269 ], 00:18:56.269 "driver_specific": {} 00:18:56.269 } 00:18:56.269 ] 00:18:56.269 05:47:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:18:56.269 05:47:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:18:56.269 05:47:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:56.269 05:47:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:56.269 05:47:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:56.269 05:47:11 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:56.269 05:47:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:18:56.269 05:47:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:56.269 05:47:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:56.269 05:47:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:56.269 05:47:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:56.269 05:47:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:56.269 05:47:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:56.528 05:47:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:56.528 "name": "Existed_Raid", 00:18:56.528 "uuid": "a4dd69d9-4db7-4646-acb8-8b8afc5e5ba8", 00:18:56.528 "strip_size_kb": 0, 00:18:56.528 "state": "configuring", 00:18:56.528 "raid_level": "raid1", 00:18:56.528 "superblock": true, 00:18:56.528 "num_base_bdevs": 3, 00:18:56.528 "num_base_bdevs_discovered": 2, 00:18:56.528 "num_base_bdevs_operational": 3, 00:18:56.528 "base_bdevs_list": [ 00:18:56.528 { 00:18:56.528 "name": "BaseBdev1", 00:18:56.528 "uuid": "a9510d9c-900a-4556-b598-4152a7c37e1b", 00:18:56.528 "is_configured": true, 00:18:56.528 "data_offset": 2048, 00:18:56.528 "data_size": 63488 00:18:56.528 }, 00:18:56.528 { 00:18:56.528 "name": null, 00:18:56.528 "uuid": "65452d55-4dd3-44af-b1c3-6dd22379315f", 00:18:56.528 "is_configured": false, 00:18:56.528 "data_offset": 2048, 00:18:56.528 "data_size": 63488 00:18:56.528 }, 00:18:56.528 { 00:18:56.528 "name": "BaseBdev3", 00:18:56.528 "uuid": 
"db882b7f-5bb5-45f8-8c89-0d813419ccc6", 00:18:56.528 "is_configured": true, 00:18:56.528 "data_offset": 2048, 00:18:56.528 "data_size": 63488 00:18:56.528 } 00:18:56.528 ] 00:18:56.528 }' 00:18:56.528 05:47:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:56.528 05:47:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:57.466 05:47:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:57.466 05:47:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:18:57.466 05:47:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:18:57.466 05:47:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:18:58.035 [2024-07-26 05:47:12.729059] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:18:58.035 05:47:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:18:58.035 05:47:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:58.035 05:47:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:58.035 05:47:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:58.035 05:47:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:58.035 05:47:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:18:58.035 05:47:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 
-- # local raid_bdev_info 00:18:58.035 05:47:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:58.035 05:47:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:58.035 05:47:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:58.035 05:47:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:58.035 05:47:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:58.295 05:47:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:58.295 "name": "Existed_Raid", 00:18:58.295 "uuid": "a4dd69d9-4db7-4646-acb8-8b8afc5e5ba8", 00:18:58.295 "strip_size_kb": 0, 00:18:58.295 "state": "configuring", 00:18:58.295 "raid_level": "raid1", 00:18:58.295 "superblock": true, 00:18:58.295 "num_base_bdevs": 3, 00:18:58.295 "num_base_bdevs_discovered": 1, 00:18:58.295 "num_base_bdevs_operational": 3, 00:18:58.295 "base_bdevs_list": [ 00:18:58.295 { 00:18:58.295 "name": "BaseBdev1", 00:18:58.295 "uuid": "a9510d9c-900a-4556-b598-4152a7c37e1b", 00:18:58.295 "is_configured": true, 00:18:58.295 "data_offset": 2048, 00:18:58.295 "data_size": 63488 00:18:58.295 }, 00:18:58.295 { 00:18:58.295 "name": null, 00:18:58.295 "uuid": "65452d55-4dd3-44af-b1c3-6dd22379315f", 00:18:58.295 "is_configured": false, 00:18:58.295 "data_offset": 2048, 00:18:58.295 "data_size": 63488 00:18:58.295 }, 00:18:58.295 { 00:18:58.295 "name": null, 00:18:58.295 "uuid": "db882b7f-5bb5-45f8-8c89-0d813419ccc6", 00:18:58.295 "is_configured": false, 00:18:58.295 "data_offset": 2048, 00:18:58.295 "data_size": 63488 00:18:58.295 } 00:18:58.295 ] 00:18:58.295 }' 00:18:58.295 05:47:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # 
xtrace_disable 00:18:58.295 05:47:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:58.864 05:47:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:58.864 05:47:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:18:59.123 05:47:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:18:59.123 05:47:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:18:59.692 [2024-07-26 05:47:14.349372] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:18:59.692 05:47:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:18:59.692 05:47:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:59.692 05:47:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:59.692 05:47:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:59.692 05:47:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:59.692 05:47:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:18:59.692 05:47:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:59.692 05:47:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:59.692 05:47:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local 
num_base_bdevs_discovered 00:18:59.692 05:47:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:59.692 05:47:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:59.692 05:47:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:59.951 05:47:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:59.951 "name": "Existed_Raid", 00:18:59.951 "uuid": "a4dd69d9-4db7-4646-acb8-8b8afc5e5ba8", 00:18:59.951 "strip_size_kb": 0, 00:18:59.951 "state": "configuring", 00:18:59.951 "raid_level": "raid1", 00:18:59.951 "superblock": true, 00:18:59.951 "num_base_bdevs": 3, 00:18:59.951 "num_base_bdevs_discovered": 2, 00:18:59.951 "num_base_bdevs_operational": 3, 00:18:59.951 "base_bdevs_list": [ 00:18:59.951 { 00:18:59.951 "name": "BaseBdev1", 00:18:59.951 "uuid": "a9510d9c-900a-4556-b598-4152a7c37e1b", 00:18:59.951 "is_configured": true, 00:18:59.951 "data_offset": 2048, 00:18:59.951 "data_size": 63488 00:18:59.951 }, 00:18:59.951 { 00:18:59.951 "name": null, 00:18:59.951 "uuid": "65452d55-4dd3-44af-b1c3-6dd22379315f", 00:18:59.951 "is_configured": false, 00:18:59.951 "data_offset": 2048, 00:18:59.951 "data_size": 63488 00:18:59.951 }, 00:18:59.951 { 00:18:59.951 "name": "BaseBdev3", 00:18:59.951 "uuid": "db882b7f-5bb5-45f8-8c89-0d813419ccc6", 00:18:59.951 "is_configured": true, 00:18:59.951 "data_offset": 2048, 00:18:59.951 "data_size": 63488 00:18:59.951 } 00:18:59.951 ] 00:18:59.951 }' 00:18:59.951 05:47:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:59.951 05:47:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:00.520 05:47:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:00.520 05:47:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:19:00.520 05:47:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:19:00.520 05:47:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:19:00.779 [2024-07-26 05:47:15.644837] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:19:00.779 05:47:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:19:00.779 05:47:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:00.779 05:47:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:00.779 05:47:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:00.779 05:47:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:00.779 05:47:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:19:00.779 05:47:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:00.779 05:47:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:00.780 05:47:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:00.780 05:47:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:00.780 05:47:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:01.039 05:47:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:01.039 05:47:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:01.039 "name": "Existed_Raid", 00:19:01.039 "uuid": "a4dd69d9-4db7-4646-acb8-8b8afc5e5ba8", 00:19:01.039 "strip_size_kb": 0, 00:19:01.039 "state": "configuring", 00:19:01.039 "raid_level": "raid1", 00:19:01.039 "superblock": true, 00:19:01.039 "num_base_bdevs": 3, 00:19:01.039 "num_base_bdevs_discovered": 1, 00:19:01.039 "num_base_bdevs_operational": 3, 00:19:01.039 "base_bdevs_list": [ 00:19:01.039 { 00:19:01.039 "name": null, 00:19:01.039 "uuid": "a9510d9c-900a-4556-b598-4152a7c37e1b", 00:19:01.039 "is_configured": false, 00:19:01.039 "data_offset": 2048, 00:19:01.039 "data_size": 63488 00:19:01.039 }, 00:19:01.039 { 00:19:01.039 "name": null, 00:19:01.039 "uuid": "65452d55-4dd3-44af-b1c3-6dd22379315f", 00:19:01.039 "is_configured": false, 00:19:01.039 "data_offset": 2048, 00:19:01.039 "data_size": 63488 00:19:01.039 }, 00:19:01.039 { 00:19:01.039 "name": "BaseBdev3", 00:19:01.039 "uuid": "db882b7f-5bb5-45f8-8c89-0d813419ccc6", 00:19:01.039 "is_configured": true, 00:19:01.039 "data_offset": 2048, 00:19:01.039 "data_size": 63488 00:19:01.039 } 00:19:01.039 ] 00:19:01.039 }' 00:19:01.039 05:47:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:01.039 05:47:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:01.976 05:47:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:01.976 05:47:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq 
'.[0].base_bdevs_list[0].is_configured' 00:19:01.976 05:47:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:19:01.976 05:47:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:19:02.235 [2024-07-26 05:47:17.023002] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:19:02.235 05:47:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:19:02.235 05:47:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:02.235 05:47:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:02.235 05:47:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:02.235 05:47:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:02.235 05:47:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:19:02.235 05:47:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:02.235 05:47:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:02.235 05:47:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:02.235 05:47:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:02.235 05:47:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:02.235 05:47:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] 
| select(.name == "Existed_Raid")' 00:19:02.493 05:47:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:02.493 "name": "Existed_Raid", 00:19:02.493 "uuid": "a4dd69d9-4db7-4646-acb8-8b8afc5e5ba8", 00:19:02.493 "strip_size_kb": 0, 00:19:02.493 "state": "configuring", 00:19:02.493 "raid_level": "raid1", 00:19:02.493 "superblock": true, 00:19:02.493 "num_base_bdevs": 3, 00:19:02.493 "num_base_bdevs_discovered": 2, 00:19:02.493 "num_base_bdevs_operational": 3, 00:19:02.493 "base_bdevs_list": [ 00:19:02.493 { 00:19:02.493 "name": null, 00:19:02.493 "uuid": "a9510d9c-900a-4556-b598-4152a7c37e1b", 00:19:02.493 "is_configured": false, 00:19:02.493 "data_offset": 2048, 00:19:02.493 "data_size": 63488 00:19:02.493 }, 00:19:02.493 { 00:19:02.493 "name": "BaseBdev2", 00:19:02.493 "uuid": "65452d55-4dd3-44af-b1c3-6dd22379315f", 00:19:02.493 "is_configured": true, 00:19:02.493 "data_offset": 2048, 00:19:02.493 "data_size": 63488 00:19:02.493 }, 00:19:02.493 { 00:19:02.493 "name": "BaseBdev3", 00:19:02.493 "uuid": "db882b7f-5bb5-45f8-8c89-0d813419ccc6", 00:19:02.493 "is_configured": true, 00:19:02.493 "data_offset": 2048, 00:19:02.493 "data_size": 63488 00:19:02.493 } 00:19:02.493 ] 00:19:02.493 }' 00:19:02.493 05:47:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:02.493 05:47:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:03.061 05:47:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:03.061 05:47:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:19:03.319 05:47:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:19:03.319 05:47:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:03.319 05:47:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:19:03.578 05:47:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u a9510d9c-900a-4556-b598-4152a7c37e1b 00:19:03.836 [2024-07-26 05:47:18.618555] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:19:03.836 [2024-07-26 05:47:18.618714] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1e2e1b0 00:19:03.836 [2024-07-26 05:47:18.618728] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:19:03.837 [2024-07-26 05:47:18.618903] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1fea4f0 00:19:03.837 [2024-07-26 05:47:18.619021] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1e2e1b0 00:19:03.837 [2024-07-26 05:47:18.619031] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1e2e1b0 00:19:03.837 [2024-07-26 05:47:18.619125] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:03.837 NewBaseBdev 00:19:03.837 05:47:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:19:03.837 05:47:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:19:03.837 05:47:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:03.837 05:47:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:19:03.837 05:47:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:03.837 
05:47:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:03.837 05:47:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:04.095 05:47:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:19:04.355 [ 00:19:04.355 { 00:19:04.355 "name": "NewBaseBdev", 00:19:04.355 "aliases": [ 00:19:04.355 "a9510d9c-900a-4556-b598-4152a7c37e1b" 00:19:04.355 ], 00:19:04.355 "product_name": "Malloc disk", 00:19:04.355 "block_size": 512, 00:19:04.355 "num_blocks": 65536, 00:19:04.355 "uuid": "a9510d9c-900a-4556-b598-4152a7c37e1b", 00:19:04.355 "assigned_rate_limits": { 00:19:04.355 "rw_ios_per_sec": 0, 00:19:04.355 "rw_mbytes_per_sec": 0, 00:19:04.355 "r_mbytes_per_sec": 0, 00:19:04.355 "w_mbytes_per_sec": 0 00:19:04.355 }, 00:19:04.355 "claimed": true, 00:19:04.355 "claim_type": "exclusive_write", 00:19:04.355 "zoned": false, 00:19:04.355 "supported_io_types": { 00:19:04.355 "read": true, 00:19:04.355 "write": true, 00:19:04.355 "unmap": true, 00:19:04.355 "flush": true, 00:19:04.355 "reset": true, 00:19:04.355 "nvme_admin": false, 00:19:04.355 "nvme_io": false, 00:19:04.355 "nvme_io_md": false, 00:19:04.355 "write_zeroes": true, 00:19:04.355 "zcopy": true, 00:19:04.355 "get_zone_info": false, 00:19:04.355 "zone_management": false, 00:19:04.355 "zone_append": false, 00:19:04.355 "compare": false, 00:19:04.355 "compare_and_write": false, 00:19:04.355 "abort": true, 00:19:04.355 "seek_hole": false, 00:19:04.355 "seek_data": false, 00:19:04.355 "copy": true, 00:19:04.355 "nvme_iov_md": false 00:19:04.355 }, 00:19:04.355 "memory_domains": [ 00:19:04.355 { 00:19:04.355 "dma_device_id": "system", 00:19:04.355 "dma_device_type": 1 00:19:04.355 
}, 00:19:04.355 { 00:19:04.355 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:04.355 "dma_device_type": 2 00:19:04.355 } 00:19:04.355 ], 00:19:04.355 "driver_specific": {} 00:19:04.355 } 00:19:04.355 ] 00:19:04.355 05:47:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:19:04.355 05:47:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:19:04.355 05:47:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:04.355 05:47:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:04.355 05:47:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:04.355 05:47:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:04.355 05:47:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:19:04.355 05:47:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:04.355 05:47:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:04.355 05:47:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:04.355 05:47:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:04.355 05:47:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:04.355 05:47:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:04.614 05:47:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:04.614 "name": "Existed_Raid", 00:19:04.614 "uuid": 
"a4dd69d9-4db7-4646-acb8-8b8afc5e5ba8", 00:19:04.614 "strip_size_kb": 0, 00:19:04.614 "state": "online", 00:19:04.614 "raid_level": "raid1", 00:19:04.614 "superblock": true, 00:19:04.614 "num_base_bdevs": 3, 00:19:04.614 "num_base_bdevs_discovered": 3, 00:19:04.614 "num_base_bdevs_operational": 3, 00:19:04.614 "base_bdevs_list": [ 00:19:04.614 { 00:19:04.614 "name": "NewBaseBdev", 00:19:04.614 "uuid": "a9510d9c-900a-4556-b598-4152a7c37e1b", 00:19:04.614 "is_configured": true, 00:19:04.614 "data_offset": 2048, 00:19:04.614 "data_size": 63488 00:19:04.614 }, 00:19:04.614 { 00:19:04.614 "name": "BaseBdev2", 00:19:04.614 "uuid": "65452d55-4dd3-44af-b1c3-6dd22379315f", 00:19:04.614 "is_configured": true, 00:19:04.614 "data_offset": 2048, 00:19:04.614 "data_size": 63488 00:19:04.614 }, 00:19:04.614 { 00:19:04.614 "name": "BaseBdev3", 00:19:04.614 "uuid": "db882b7f-5bb5-45f8-8c89-0d813419ccc6", 00:19:04.614 "is_configured": true, 00:19:04.614 "data_offset": 2048, 00:19:04.614 "data_size": 63488 00:19:04.614 } 00:19:04.614 ] 00:19:04.614 }' 00:19:04.614 05:47:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:04.614 05:47:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:05.181 05:47:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:19:05.181 05:47:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:19:05.181 05:47:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:19:05.181 05:47:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:19:05.181 05:47:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:19:05.181 05:47:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:19:05.181 05:47:19 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:19:05.181 05:47:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:19:05.441 [2024-07-26 05:47:20.191052] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:19:05.441 05:47:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:19:05.441 "name": "Existed_Raid", 00:19:05.441 "aliases": [ 00:19:05.441 "a4dd69d9-4db7-4646-acb8-8b8afc5e5ba8" 00:19:05.441 ], 00:19:05.441 "product_name": "Raid Volume", 00:19:05.441 "block_size": 512, 00:19:05.441 "num_blocks": 63488, 00:19:05.441 "uuid": "a4dd69d9-4db7-4646-acb8-8b8afc5e5ba8", 00:19:05.441 "assigned_rate_limits": { 00:19:05.441 "rw_ios_per_sec": 0, 00:19:05.441 "rw_mbytes_per_sec": 0, 00:19:05.441 "r_mbytes_per_sec": 0, 00:19:05.441 "w_mbytes_per_sec": 0 00:19:05.441 }, 00:19:05.441 "claimed": false, 00:19:05.441 "zoned": false, 00:19:05.441 "supported_io_types": { 00:19:05.441 "read": true, 00:19:05.441 "write": true, 00:19:05.441 "unmap": false, 00:19:05.441 "flush": false, 00:19:05.441 "reset": true, 00:19:05.441 "nvme_admin": false, 00:19:05.441 "nvme_io": false, 00:19:05.441 "nvme_io_md": false, 00:19:05.441 "write_zeroes": true, 00:19:05.441 "zcopy": false, 00:19:05.441 "get_zone_info": false, 00:19:05.441 "zone_management": false, 00:19:05.441 "zone_append": false, 00:19:05.441 "compare": false, 00:19:05.441 "compare_and_write": false, 00:19:05.441 "abort": false, 00:19:05.441 "seek_hole": false, 00:19:05.441 "seek_data": false, 00:19:05.441 "copy": false, 00:19:05.441 "nvme_iov_md": false 00:19:05.441 }, 00:19:05.441 "memory_domains": [ 00:19:05.441 { 00:19:05.441 "dma_device_id": "system", 00:19:05.441 "dma_device_type": 1 00:19:05.441 }, 00:19:05.441 { 00:19:05.441 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:05.441 
"dma_device_type": 2 00:19:05.441 }, 00:19:05.441 { 00:19:05.441 "dma_device_id": "system", 00:19:05.441 "dma_device_type": 1 00:19:05.441 }, 00:19:05.441 { 00:19:05.441 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:05.441 "dma_device_type": 2 00:19:05.441 }, 00:19:05.441 { 00:19:05.441 "dma_device_id": "system", 00:19:05.441 "dma_device_type": 1 00:19:05.441 }, 00:19:05.441 { 00:19:05.441 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:05.441 "dma_device_type": 2 00:19:05.441 } 00:19:05.441 ], 00:19:05.441 "driver_specific": { 00:19:05.441 "raid": { 00:19:05.441 "uuid": "a4dd69d9-4db7-4646-acb8-8b8afc5e5ba8", 00:19:05.441 "strip_size_kb": 0, 00:19:05.441 "state": "online", 00:19:05.441 "raid_level": "raid1", 00:19:05.441 "superblock": true, 00:19:05.441 "num_base_bdevs": 3, 00:19:05.441 "num_base_bdevs_discovered": 3, 00:19:05.441 "num_base_bdevs_operational": 3, 00:19:05.441 "base_bdevs_list": [ 00:19:05.441 { 00:19:05.441 "name": "NewBaseBdev", 00:19:05.441 "uuid": "a9510d9c-900a-4556-b598-4152a7c37e1b", 00:19:05.441 "is_configured": true, 00:19:05.441 "data_offset": 2048, 00:19:05.441 "data_size": 63488 00:19:05.441 }, 00:19:05.441 { 00:19:05.441 "name": "BaseBdev2", 00:19:05.441 "uuid": "65452d55-4dd3-44af-b1c3-6dd22379315f", 00:19:05.441 "is_configured": true, 00:19:05.441 "data_offset": 2048, 00:19:05.441 "data_size": 63488 00:19:05.441 }, 00:19:05.441 { 00:19:05.441 "name": "BaseBdev3", 00:19:05.441 "uuid": "db882b7f-5bb5-45f8-8c89-0d813419ccc6", 00:19:05.441 "is_configured": true, 00:19:05.441 "data_offset": 2048, 00:19:05.441 "data_size": 63488 00:19:05.441 } 00:19:05.441 ] 00:19:05.441 } 00:19:05.441 } 00:19:05.441 }' 00:19:05.441 05:47:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:19:05.441 05:47:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:19:05.441 BaseBdev2 00:19:05.441 
BaseBdev3' 00:19:05.441 05:47:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:05.441 05:47:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:19:05.441 05:47:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:05.700 05:47:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:05.700 "name": "NewBaseBdev", 00:19:05.700 "aliases": [ 00:19:05.700 "a9510d9c-900a-4556-b598-4152a7c37e1b" 00:19:05.700 ], 00:19:05.700 "product_name": "Malloc disk", 00:19:05.700 "block_size": 512, 00:19:05.700 "num_blocks": 65536, 00:19:05.700 "uuid": "a9510d9c-900a-4556-b598-4152a7c37e1b", 00:19:05.700 "assigned_rate_limits": { 00:19:05.700 "rw_ios_per_sec": 0, 00:19:05.700 "rw_mbytes_per_sec": 0, 00:19:05.700 "r_mbytes_per_sec": 0, 00:19:05.700 "w_mbytes_per_sec": 0 00:19:05.700 }, 00:19:05.700 "claimed": true, 00:19:05.700 "claim_type": "exclusive_write", 00:19:05.700 "zoned": false, 00:19:05.700 "supported_io_types": { 00:19:05.700 "read": true, 00:19:05.700 "write": true, 00:19:05.700 "unmap": true, 00:19:05.700 "flush": true, 00:19:05.700 "reset": true, 00:19:05.700 "nvme_admin": false, 00:19:05.700 "nvme_io": false, 00:19:05.700 "nvme_io_md": false, 00:19:05.700 "write_zeroes": true, 00:19:05.700 "zcopy": true, 00:19:05.700 "get_zone_info": false, 00:19:05.700 "zone_management": false, 00:19:05.700 "zone_append": false, 00:19:05.700 "compare": false, 00:19:05.700 "compare_and_write": false, 00:19:05.700 "abort": true, 00:19:05.700 "seek_hole": false, 00:19:05.700 "seek_data": false, 00:19:05.700 "copy": true, 00:19:05.700 "nvme_iov_md": false 00:19:05.700 }, 00:19:05.700 "memory_domains": [ 00:19:05.700 { 00:19:05.700 "dma_device_id": "system", 00:19:05.700 "dma_device_type": 1 00:19:05.700 }, 00:19:05.700 { 
00:19:05.700 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:05.700 "dma_device_type": 2 00:19:05.700 } 00:19:05.700 ], 00:19:05.700 "driver_specific": {} 00:19:05.700 }' 00:19:05.700 05:47:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:05.700 05:47:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:05.700 05:47:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:05.700 05:47:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:05.958 05:47:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:05.958 05:47:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:05.958 05:47:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:05.958 05:47:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:05.958 05:47:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:05.958 05:47:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:05.958 05:47:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:05.958 05:47:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:05.958 05:47:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:05.958 05:47:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:19:06.217 05:47:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:06.217 05:47:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:06.217 "name": 
"BaseBdev2", 00:19:06.217 "aliases": [ 00:19:06.217 "65452d55-4dd3-44af-b1c3-6dd22379315f" 00:19:06.217 ], 00:19:06.217 "product_name": "Malloc disk", 00:19:06.217 "block_size": 512, 00:19:06.217 "num_blocks": 65536, 00:19:06.217 "uuid": "65452d55-4dd3-44af-b1c3-6dd22379315f", 00:19:06.217 "assigned_rate_limits": { 00:19:06.217 "rw_ios_per_sec": 0, 00:19:06.217 "rw_mbytes_per_sec": 0, 00:19:06.217 "r_mbytes_per_sec": 0, 00:19:06.217 "w_mbytes_per_sec": 0 00:19:06.217 }, 00:19:06.217 "claimed": true, 00:19:06.217 "claim_type": "exclusive_write", 00:19:06.217 "zoned": false, 00:19:06.217 "supported_io_types": { 00:19:06.217 "read": true, 00:19:06.217 "write": true, 00:19:06.217 "unmap": true, 00:19:06.217 "flush": true, 00:19:06.217 "reset": true, 00:19:06.217 "nvme_admin": false, 00:19:06.217 "nvme_io": false, 00:19:06.217 "nvme_io_md": false, 00:19:06.217 "write_zeroes": true, 00:19:06.217 "zcopy": true, 00:19:06.217 "get_zone_info": false, 00:19:06.217 "zone_management": false, 00:19:06.217 "zone_append": false, 00:19:06.217 "compare": false, 00:19:06.217 "compare_and_write": false, 00:19:06.217 "abort": true, 00:19:06.217 "seek_hole": false, 00:19:06.217 "seek_data": false, 00:19:06.217 "copy": true, 00:19:06.217 "nvme_iov_md": false 00:19:06.217 }, 00:19:06.217 "memory_domains": [ 00:19:06.217 { 00:19:06.217 "dma_device_id": "system", 00:19:06.217 "dma_device_type": 1 00:19:06.217 }, 00:19:06.217 { 00:19:06.217 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:06.217 "dma_device_type": 2 00:19:06.217 } 00:19:06.217 ], 00:19:06.217 "driver_specific": {} 00:19:06.217 }' 00:19:06.217 05:47:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:06.475 05:47:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:06.475 05:47:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:06.475 05:47:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 
-- # jq .md_size 00:19:06.475 05:47:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:06.475 05:47:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:06.475 05:47:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:06.475 05:47:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:06.475 05:47:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:06.475 05:47:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:06.734 05:47:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:06.734 05:47:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:06.734 05:47:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:06.734 05:47:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:19:06.734 05:47:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:06.992 05:47:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:06.992 "name": "BaseBdev3", 00:19:06.992 "aliases": [ 00:19:06.992 "db882b7f-5bb5-45f8-8c89-0d813419ccc6" 00:19:06.992 ], 00:19:06.992 "product_name": "Malloc disk", 00:19:06.992 "block_size": 512, 00:19:06.992 "num_blocks": 65536, 00:19:06.992 "uuid": "db882b7f-5bb5-45f8-8c89-0d813419ccc6", 00:19:06.992 "assigned_rate_limits": { 00:19:06.992 "rw_ios_per_sec": 0, 00:19:06.992 "rw_mbytes_per_sec": 0, 00:19:06.992 "r_mbytes_per_sec": 0, 00:19:06.992 "w_mbytes_per_sec": 0 00:19:06.992 }, 00:19:06.992 "claimed": true, 00:19:06.992 "claim_type": "exclusive_write", 00:19:06.992 "zoned": 
false, 00:19:06.992 "supported_io_types": { 00:19:06.992 "read": true, 00:19:06.992 "write": true, 00:19:06.992 "unmap": true, 00:19:06.992 "flush": true, 00:19:06.992 "reset": true, 00:19:06.992 "nvme_admin": false, 00:19:06.992 "nvme_io": false, 00:19:06.992 "nvme_io_md": false, 00:19:06.992 "write_zeroes": true, 00:19:06.992 "zcopy": true, 00:19:06.992 "get_zone_info": false, 00:19:06.993 "zone_management": false, 00:19:06.993 "zone_append": false, 00:19:06.993 "compare": false, 00:19:06.993 "compare_and_write": false, 00:19:06.993 "abort": true, 00:19:06.993 "seek_hole": false, 00:19:06.993 "seek_data": false, 00:19:06.993 "copy": true, 00:19:06.993 "nvme_iov_md": false 00:19:06.993 }, 00:19:06.993 "memory_domains": [ 00:19:06.993 { 00:19:06.993 "dma_device_id": "system", 00:19:06.993 "dma_device_type": 1 00:19:06.993 }, 00:19:06.993 { 00:19:06.993 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:06.993 "dma_device_type": 2 00:19:06.993 } 00:19:06.993 ], 00:19:06.993 "driver_specific": {} 00:19:06.993 }' 00:19:06.993 05:47:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:06.993 05:47:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:06.993 05:47:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:06.993 05:47:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:06.993 05:47:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:06.993 05:47:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:06.993 05:47:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:07.251 05:47:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:07.251 05:47:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:07.251 05:47:21 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:07.251 05:47:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:07.251 05:47:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:07.251 05:47:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:19:07.509 [2024-07-26 05:47:22.296363] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:19:07.509 [2024-07-26 05:47:22.296388] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:19:07.509 [2024-07-26 05:47:22.296435] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:19:07.509 [2024-07-26 05:47:22.296719] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:19:07.509 [2024-07-26 05:47:22.296732] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1e2e1b0 name Existed_Raid, state offline 00:19:07.509 05:47:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 1178346 00:19:07.509 05:47:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 1178346 ']' 00:19:07.509 05:47:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # kill -0 1178346 00:19:07.509 05:47:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:19:07.509 05:47:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:19:07.509 05:47:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1178346 00:19:07.509 05:47:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 
00:19:07.509 05:47:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:19:07.509 05:47:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1178346' 00:19:07.509 killing process with pid 1178346 00:19:07.509 05:47:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 1178346 00:19:07.509 [2024-07-26 05:47:22.365453] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:19:07.509 05:47:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 1178346 00:19:07.509 [2024-07-26 05:47:22.396034] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:19:07.768 05:47:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:19:07.768 00:19:07.768 real 0m29.210s 00:19:07.768 user 0m53.630s 00:19:07.768 sys 0m5.146s 00:19:07.768 05:47:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:19:07.768 05:47:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:07.768 ************************************ 00:19:07.768 END TEST raid_state_function_test_sb 00:19:07.768 ************************************ 00:19:07.768 05:47:22 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:19:07.768 05:47:22 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test raid1 3 00:19:07.768 05:47:22 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:19:07.768 05:47:22 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:19:07.768 05:47:22 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:19:08.026 ************************************ 00:19:08.026 START TEST raid_superblock_test 00:19:08.026 ************************************ 00:19:08.026 05:47:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # 
raid_superblock_test raid1 3 00:19:08.026 05:47:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=raid1 00:19:08.026 05:47:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=3 00:19:08.026 05:47:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:19:08.026 05:47:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:19:08.026 05:47:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:19:08.026 05:47:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:19:08.026 05:47:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:19:08.026 05:47:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:19:08.026 05:47:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:19:08.026 05:47:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:19:08.026 05:47:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:19:08.026 05:47:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:19:08.026 05:47:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:19:08.026 05:47:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' raid1 '!=' raid1 ']' 00:19:08.026 05:47:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@407 -- # strip_size=0 00:19:08.026 05:47:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:19:08.026 05:47:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=1182652 00:19:08.026 05:47:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 
1182652 /var/tmp/spdk-raid.sock 00:19:08.026 05:47:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 1182652 ']' 00:19:08.026 05:47:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:19:08.026 05:47:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:19:08.026 05:47:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:19:08.026 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:19:08.026 05:47:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:19:08.026 05:47:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:19:08.026 [2024-07-26 05:47:22.748799] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 
00:19:08.026 [2024-07-26 05:47:22.748861] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1182652 ] 00:19:08.026 [2024-07-26 05:47:22.879959] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:19:08.283 [2024-07-26 05:47:22.987947] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:19:08.283 [2024-07-26 05:47:23.058263] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:08.283 [2024-07-26 05:47:23.058301] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:08.859 05:47:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:19:08.859 05:47:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:19:08.859 05:47:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:19:08.859 05:47:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:19:08.859 05:47:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:19:08.859 05:47:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:19:08.859 05:47:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:19:08.859 05:47:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:19:08.859 05:47:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:19:08.859 05:47:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:19:08.859 05:47:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:19:09.117 malloc1 00:19:09.117 05:47:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:19:09.389 [2024-07-26 05:47:24.157725] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:19:09.389 [2024-07-26 05:47:24.157773] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:09.389 [2024-07-26 05:47:24.157794] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xdff570 00:19:09.389 [2024-07-26 05:47:24.157807] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:09.389 [2024-07-26 05:47:24.159517] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:09.389 [2024-07-26 05:47:24.159546] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:19:09.389 pt1 00:19:09.389 05:47:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:19:09.389 05:47:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:19:09.389 05:47:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:19:09.389 05:47:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:19:09.389 05:47:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:19:09.389 05:47:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:19:09.389 05:47:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:19:09.389 05:47:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:19:09.389 05:47:24 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:19:09.647 malloc2 00:19:09.647 05:47:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:19:09.906 [2024-07-26 05:47:24.661067] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:19:09.906 [2024-07-26 05:47:24.661111] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:09.906 [2024-07-26 05:47:24.661129] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xe00970 00:19:09.906 [2024-07-26 05:47:24.661142] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:09.906 [2024-07-26 05:47:24.662785] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:09.906 [2024-07-26 05:47:24.662814] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:19:09.906 pt2 00:19:09.906 05:47:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:19:09.906 05:47:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:19:09.906 05:47:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc3 00:19:09.906 05:47:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt3 00:19:09.906 05:47:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:19:09.906 05:47:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:19:09.906 05:47:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:19:09.906 05:47:24 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:19:09.906 05:47:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:19:10.164 malloc3 00:19:10.164 05:47:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:19:10.422 [2024-07-26 05:47:25.159741] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:19:10.422 [2024-07-26 05:47:25.159787] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:10.422 [2024-07-26 05:47:25.159805] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xf97340 00:19:10.422 [2024-07-26 05:47:25.159817] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:10.422 [2024-07-26 05:47:25.161363] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:10.422 [2024-07-26 05:47:25.161390] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:19:10.422 pt3 00:19:10.422 05:47:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:19:10.422 05:47:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:19:10.422 05:47:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2 pt3' -n raid_bdev1 -s 00:19:10.680 [2024-07-26 05:47:25.404398] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:19:10.680 [2024-07-26 05:47:25.405741] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 
00:19:10.680 [2024-07-26 05:47:25.405797] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:19:10.680 [2024-07-26 05:47:25.405950] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0xdf7ea0 00:19:10.680 [2024-07-26 05:47:25.405961] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:19:10.680 [2024-07-26 05:47:25.406159] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xdff240 00:19:10.680 [2024-07-26 05:47:25.406306] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xdf7ea0 00:19:10.680 [2024-07-26 05:47:25.406317] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xdf7ea0 00:19:10.680 [2024-07-26 05:47:25.406415] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:10.680 05:47:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:19:10.680 05:47:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:10.680 05:47:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:10.680 05:47:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:10.680 05:47:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:10.680 05:47:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:19:10.680 05:47:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:10.680 05:47:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:10.680 05:47:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:10.680 05:47:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:10.680 05:47:25 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:10.680 05:47:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:10.938 05:47:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:10.938 "name": "raid_bdev1", 00:19:10.938 "uuid": "79599b1f-f9e9-403c-9931-d46428dc5b80", 00:19:10.938 "strip_size_kb": 0, 00:19:10.938 "state": "online", 00:19:10.938 "raid_level": "raid1", 00:19:10.938 "superblock": true, 00:19:10.938 "num_base_bdevs": 3, 00:19:10.938 "num_base_bdevs_discovered": 3, 00:19:10.938 "num_base_bdevs_operational": 3, 00:19:10.938 "base_bdevs_list": [ 00:19:10.938 { 00:19:10.938 "name": "pt1", 00:19:10.938 "uuid": "00000000-0000-0000-0000-000000000001", 00:19:10.938 "is_configured": true, 00:19:10.938 "data_offset": 2048, 00:19:10.938 "data_size": 63488 00:19:10.938 }, 00:19:10.938 { 00:19:10.938 "name": "pt2", 00:19:10.938 "uuid": "00000000-0000-0000-0000-000000000002", 00:19:10.938 "is_configured": true, 00:19:10.938 "data_offset": 2048, 00:19:10.938 "data_size": 63488 00:19:10.938 }, 00:19:10.938 { 00:19:10.938 "name": "pt3", 00:19:10.938 "uuid": "00000000-0000-0000-0000-000000000003", 00:19:10.938 "is_configured": true, 00:19:10.938 "data_offset": 2048, 00:19:10.938 "data_size": 63488 00:19:10.938 } 00:19:10.938 ] 00:19:10.938 }' 00:19:10.938 05:47:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:10.938 05:47:25 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:19:11.538 05:47:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:19:11.538 05:47:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:19:11.538 05:47:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 
-- # local raid_bdev_info 00:19:11.538 05:47:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:19:11.538 05:47:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:19:11.538 05:47:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:19:11.538 05:47:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:19:11.538 05:47:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:19:11.797 [2024-07-26 05:47:26.471481] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:19:11.797 05:47:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:19:11.797 "name": "raid_bdev1", 00:19:11.797 "aliases": [ 00:19:11.797 "79599b1f-f9e9-403c-9931-d46428dc5b80" 00:19:11.797 ], 00:19:11.797 "product_name": "Raid Volume", 00:19:11.797 "block_size": 512, 00:19:11.797 "num_blocks": 63488, 00:19:11.797 "uuid": "79599b1f-f9e9-403c-9931-d46428dc5b80", 00:19:11.797 "assigned_rate_limits": { 00:19:11.797 "rw_ios_per_sec": 0, 00:19:11.797 "rw_mbytes_per_sec": 0, 00:19:11.797 "r_mbytes_per_sec": 0, 00:19:11.797 "w_mbytes_per_sec": 0 00:19:11.797 }, 00:19:11.797 "claimed": false, 00:19:11.797 "zoned": false, 00:19:11.797 "supported_io_types": { 00:19:11.797 "read": true, 00:19:11.797 "write": true, 00:19:11.797 "unmap": false, 00:19:11.797 "flush": false, 00:19:11.797 "reset": true, 00:19:11.797 "nvme_admin": false, 00:19:11.797 "nvme_io": false, 00:19:11.797 "nvme_io_md": false, 00:19:11.797 "write_zeroes": true, 00:19:11.797 "zcopy": false, 00:19:11.797 "get_zone_info": false, 00:19:11.797 "zone_management": false, 00:19:11.797 "zone_append": false, 00:19:11.797 "compare": false, 00:19:11.797 "compare_and_write": false, 00:19:11.797 "abort": false, 00:19:11.797 "seek_hole": false, 
00:19:11.797 "seek_data": false, 00:19:11.797 "copy": false, 00:19:11.797 "nvme_iov_md": false 00:19:11.797 }, 00:19:11.797 "memory_domains": [ 00:19:11.797 { 00:19:11.797 "dma_device_id": "system", 00:19:11.797 "dma_device_type": 1 00:19:11.797 }, 00:19:11.797 { 00:19:11.797 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:11.797 "dma_device_type": 2 00:19:11.797 }, 00:19:11.797 { 00:19:11.797 "dma_device_id": "system", 00:19:11.797 "dma_device_type": 1 00:19:11.797 }, 00:19:11.797 { 00:19:11.797 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:11.797 "dma_device_type": 2 00:19:11.797 }, 00:19:11.797 { 00:19:11.797 "dma_device_id": "system", 00:19:11.797 "dma_device_type": 1 00:19:11.797 }, 00:19:11.797 { 00:19:11.797 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:11.797 "dma_device_type": 2 00:19:11.797 } 00:19:11.797 ], 00:19:11.797 "driver_specific": { 00:19:11.797 "raid": { 00:19:11.797 "uuid": "79599b1f-f9e9-403c-9931-d46428dc5b80", 00:19:11.797 "strip_size_kb": 0, 00:19:11.797 "state": "online", 00:19:11.797 "raid_level": "raid1", 00:19:11.797 "superblock": true, 00:19:11.797 "num_base_bdevs": 3, 00:19:11.797 "num_base_bdevs_discovered": 3, 00:19:11.797 "num_base_bdevs_operational": 3, 00:19:11.797 "base_bdevs_list": [ 00:19:11.797 { 00:19:11.797 "name": "pt1", 00:19:11.797 "uuid": "00000000-0000-0000-0000-000000000001", 00:19:11.797 "is_configured": true, 00:19:11.797 "data_offset": 2048, 00:19:11.797 "data_size": 63488 00:19:11.797 }, 00:19:11.797 { 00:19:11.797 "name": "pt2", 00:19:11.797 "uuid": "00000000-0000-0000-0000-000000000002", 00:19:11.797 "is_configured": true, 00:19:11.797 "data_offset": 2048, 00:19:11.797 "data_size": 63488 00:19:11.797 }, 00:19:11.797 { 00:19:11.797 "name": "pt3", 00:19:11.797 "uuid": "00000000-0000-0000-0000-000000000003", 00:19:11.797 "is_configured": true, 00:19:11.797 "data_offset": 2048, 00:19:11.797 "data_size": 63488 00:19:11.797 } 00:19:11.797 ] 00:19:11.797 } 00:19:11.797 } 00:19:11.797 }' 00:19:11.797 05:47:26 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:19:11.797 05:47:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:19:11.797 pt2 00:19:11.798 pt3' 00:19:11.798 05:47:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:11.798 05:47:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:19:11.798 05:47:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:12.056 05:47:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:12.056 "name": "pt1", 00:19:12.056 "aliases": [ 00:19:12.056 "00000000-0000-0000-0000-000000000001" 00:19:12.056 ], 00:19:12.056 "product_name": "passthru", 00:19:12.057 "block_size": 512, 00:19:12.057 "num_blocks": 65536, 00:19:12.057 "uuid": "00000000-0000-0000-0000-000000000001", 00:19:12.057 "assigned_rate_limits": { 00:19:12.057 "rw_ios_per_sec": 0, 00:19:12.057 "rw_mbytes_per_sec": 0, 00:19:12.057 "r_mbytes_per_sec": 0, 00:19:12.057 "w_mbytes_per_sec": 0 00:19:12.057 }, 00:19:12.057 "claimed": true, 00:19:12.057 "claim_type": "exclusive_write", 00:19:12.057 "zoned": false, 00:19:12.057 "supported_io_types": { 00:19:12.057 "read": true, 00:19:12.057 "write": true, 00:19:12.057 "unmap": true, 00:19:12.057 "flush": true, 00:19:12.057 "reset": true, 00:19:12.057 "nvme_admin": false, 00:19:12.057 "nvme_io": false, 00:19:12.057 "nvme_io_md": false, 00:19:12.057 "write_zeroes": true, 00:19:12.057 "zcopy": true, 00:19:12.057 "get_zone_info": false, 00:19:12.057 "zone_management": false, 00:19:12.057 "zone_append": false, 00:19:12.057 "compare": false, 00:19:12.057 "compare_and_write": false, 00:19:12.057 "abort": true, 00:19:12.057 "seek_hole": false, 00:19:12.057 "seek_data": false, 
00:19:12.057 "copy": true, 00:19:12.057 "nvme_iov_md": false 00:19:12.057 }, 00:19:12.057 "memory_domains": [ 00:19:12.057 { 00:19:12.057 "dma_device_id": "system", 00:19:12.057 "dma_device_type": 1 00:19:12.057 }, 00:19:12.057 { 00:19:12.057 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:12.057 "dma_device_type": 2 00:19:12.057 } 00:19:12.057 ], 00:19:12.057 "driver_specific": { 00:19:12.057 "passthru": { 00:19:12.057 "name": "pt1", 00:19:12.057 "base_bdev_name": "malloc1" 00:19:12.057 } 00:19:12.057 } 00:19:12.057 }' 00:19:12.057 05:47:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:12.057 05:47:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:12.057 05:47:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:12.057 05:47:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:12.057 05:47:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:12.057 05:47:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:12.315 05:47:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:12.315 05:47:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:12.315 05:47:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:12.315 05:47:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:12.315 05:47:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:12.315 05:47:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:12.315 05:47:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:12.315 05:47:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_get_bdevs -b pt2 00:19:12.315 05:47:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:12.574 05:47:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:12.574 "name": "pt2", 00:19:12.574 "aliases": [ 00:19:12.574 "00000000-0000-0000-0000-000000000002" 00:19:12.574 ], 00:19:12.574 "product_name": "passthru", 00:19:12.574 "block_size": 512, 00:19:12.574 "num_blocks": 65536, 00:19:12.574 "uuid": "00000000-0000-0000-0000-000000000002", 00:19:12.574 "assigned_rate_limits": { 00:19:12.574 "rw_ios_per_sec": 0, 00:19:12.574 "rw_mbytes_per_sec": 0, 00:19:12.574 "r_mbytes_per_sec": 0, 00:19:12.574 "w_mbytes_per_sec": 0 00:19:12.574 }, 00:19:12.574 "claimed": true, 00:19:12.574 "claim_type": "exclusive_write", 00:19:12.574 "zoned": false, 00:19:12.574 "supported_io_types": { 00:19:12.574 "read": true, 00:19:12.574 "write": true, 00:19:12.574 "unmap": true, 00:19:12.574 "flush": true, 00:19:12.574 "reset": true, 00:19:12.574 "nvme_admin": false, 00:19:12.574 "nvme_io": false, 00:19:12.574 "nvme_io_md": false, 00:19:12.574 "write_zeroes": true, 00:19:12.574 "zcopy": true, 00:19:12.574 "get_zone_info": false, 00:19:12.574 "zone_management": false, 00:19:12.574 "zone_append": false, 00:19:12.574 "compare": false, 00:19:12.574 "compare_and_write": false, 00:19:12.574 "abort": true, 00:19:12.574 "seek_hole": false, 00:19:12.574 "seek_data": false, 00:19:12.574 "copy": true, 00:19:12.574 "nvme_iov_md": false 00:19:12.574 }, 00:19:12.574 "memory_domains": [ 00:19:12.574 { 00:19:12.574 "dma_device_id": "system", 00:19:12.574 "dma_device_type": 1 00:19:12.574 }, 00:19:12.574 { 00:19:12.574 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:12.574 "dma_device_type": 2 00:19:12.574 } 00:19:12.574 ], 00:19:12.574 "driver_specific": { 00:19:12.574 "passthru": { 00:19:12.574 "name": "pt2", 00:19:12.574 "base_bdev_name": "malloc2" 00:19:12.574 } 00:19:12.574 } 00:19:12.574 }' 00:19:12.574 05:47:27 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:12.574 05:47:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:12.574 05:47:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:12.574 05:47:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:12.832 05:47:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:12.832 05:47:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:12.832 05:47:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:12.832 05:47:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:12.832 05:47:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:12.832 05:47:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:12.832 05:47:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:12.832 05:47:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:12.832 05:47:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:12.833 05:47:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:19:12.833 05:47:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:13.091 05:47:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:13.091 "name": "pt3", 00:19:13.091 "aliases": [ 00:19:13.091 "00000000-0000-0000-0000-000000000003" 00:19:13.091 ], 00:19:13.091 "product_name": "passthru", 00:19:13.091 "block_size": 512, 00:19:13.091 "num_blocks": 65536, 00:19:13.091 "uuid": "00000000-0000-0000-0000-000000000003", 00:19:13.091 "assigned_rate_limits": { 
00:19:13.091 "rw_ios_per_sec": 0, 00:19:13.091 "rw_mbytes_per_sec": 0, 00:19:13.091 "r_mbytes_per_sec": 0, 00:19:13.091 "w_mbytes_per_sec": 0 00:19:13.091 }, 00:19:13.091 "claimed": true, 00:19:13.091 "claim_type": "exclusive_write", 00:19:13.091 "zoned": false, 00:19:13.091 "supported_io_types": { 00:19:13.091 "read": true, 00:19:13.091 "write": true, 00:19:13.091 "unmap": true, 00:19:13.091 "flush": true, 00:19:13.091 "reset": true, 00:19:13.091 "nvme_admin": false, 00:19:13.091 "nvme_io": false, 00:19:13.091 "nvme_io_md": false, 00:19:13.091 "write_zeroes": true, 00:19:13.091 "zcopy": true, 00:19:13.091 "get_zone_info": false, 00:19:13.091 "zone_management": false, 00:19:13.091 "zone_append": false, 00:19:13.091 "compare": false, 00:19:13.091 "compare_and_write": false, 00:19:13.091 "abort": true, 00:19:13.091 "seek_hole": false, 00:19:13.091 "seek_data": false, 00:19:13.091 "copy": true, 00:19:13.091 "nvme_iov_md": false 00:19:13.091 }, 00:19:13.091 "memory_domains": [ 00:19:13.091 { 00:19:13.091 "dma_device_id": "system", 00:19:13.091 "dma_device_type": 1 00:19:13.091 }, 00:19:13.091 { 00:19:13.091 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:13.091 "dma_device_type": 2 00:19:13.091 } 00:19:13.091 ], 00:19:13.091 "driver_specific": { 00:19:13.091 "passthru": { 00:19:13.091 "name": "pt3", 00:19:13.091 "base_bdev_name": "malloc3" 00:19:13.091 } 00:19:13.091 } 00:19:13.091 }' 00:19:13.092 05:47:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:13.092 05:47:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:13.350 05:47:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:13.350 05:47:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:13.350 05:47:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:13.350 05:47:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 
00:19:13.350 05:47:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:13.350 05:47:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:13.350 05:47:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:13.350 05:47:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:13.609 05:47:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:13.609 05:47:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:13.609 05:47:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:19:13.609 05:47:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:19:13.867 [2024-07-26 05:47:28.528942] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:19:13.867 05:47:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=79599b1f-f9e9-403c-9931-d46428dc5b80 00:19:13.867 05:47:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z 79599b1f-f9e9-403c-9931-d46428dc5b80 ']' 00:19:13.867 05:47:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:19:14.126 [2024-07-26 05:47:28.777330] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:19:14.126 [2024-07-26 05:47:28.777351] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:19:14.126 [2024-07-26 05:47:28.777397] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:19:14.126 [2024-07-26 05:47:28.777467] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free 
all in destruct 00:19:14.126 [2024-07-26 05:47:28.777479] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xdf7ea0 name raid_bdev1, state offline 00:19:14.126 05:47:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:14.126 05:47:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:19:14.385 05:47:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:19:14.385 05:47:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:19:14.385 05:47:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:19:14.385 05:47:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:19:14.385 05:47:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:19:14.385 05:47:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:19:14.644 05:47:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:19:14.644 05:47:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:19:14.902 05:47:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:19:14.902 05:47:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:19:15.161 05:47:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' 
false == true ']' 00:19:15.161 05:47:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:19:15.161 05:47:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:19:15.161 05:47:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:19:15.161 05:47:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:19:15.161 05:47:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:19:15.161 05:47:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:19:15.161 05:47:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:19:15.161 05:47:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:19:15.161 05:47:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:19:15.161 05:47:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:19:15.161 05:47:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:19:15.161 05:47:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r 
raid1 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:19:15.419 [2024-07-26 05:47:30.241179] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:19:15.419 [2024-07-26 05:47:30.242524] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:19:15.419 [2024-07-26 05:47:30.242568] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:19:15.419 [2024-07-26 05:47:30.242612] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:19:15.419 [2024-07-26 05:47:30.242658] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:19:15.419 [2024-07-26 05:47:30.242681] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:19:15.419 [2024-07-26 05:47:30.242699] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:19:15.419 [2024-07-26 05:47:30.242708] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xfa2ff0 name raid_bdev1, state configuring 00:19:15.419 request: 00:19:15.419 { 00:19:15.419 "name": "raid_bdev1", 00:19:15.419 "raid_level": "raid1", 00:19:15.419 "base_bdevs": [ 00:19:15.419 "malloc1", 00:19:15.419 "malloc2", 00:19:15.419 "malloc3" 00:19:15.419 ], 00:19:15.419 "superblock": false, 00:19:15.419 "method": "bdev_raid_create", 00:19:15.419 "req_id": 1 00:19:15.419 } 00:19:15.419 Got JSON-RPC error response 00:19:15.419 response: 00:19:15.419 { 00:19:15.419 "code": -17, 00:19:15.419 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:19:15.419 } 00:19:15.419 05:47:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:19:15.419 05:47:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:19:15.419 05:47:30 bdev_raid.raid_superblock_test -- 
common/autotest_common.sh@670 -- # [[ -n '' ]] 00:19:15.419 05:47:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:19:15.419 05:47:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:15.419 05:47:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:19:15.677 05:47:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:19:15.677 05:47:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:19:15.677 05:47:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:19:15.935 [2024-07-26 05:47:30.734416] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:19:15.935 [2024-07-26 05:47:30.734450] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:15.935 [2024-07-26 05:47:30.734471] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xdff7a0 00:19:15.935 [2024-07-26 05:47:30.734484] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:15.935 [2024-07-26 05:47:30.735981] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:15.935 [2024-07-26 05:47:30.736010] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:19:15.935 [2024-07-26 05:47:30.736070] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:19:15.935 [2024-07-26 05:47:30.736095] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:19:15.935 pt1 00:19:15.936 05:47:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring 
raid1 0 3 00:19:15.936 05:47:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:15.936 05:47:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:15.936 05:47:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:15.936 05:47:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:15.936 05:47:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:19:15.936 05:47:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:15.936 05:47:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:15.936 05:47:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:15.936 05:47:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:15.936 05:47:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:15.936 05:47:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:16.194 05:47:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:16.194 "name": "raid_bdev1", 00:19:16.194 "uuid": "79599b1f-f9e9-403c-9931-d46428dc5b80", 00:19:16.194 "strip_size_kb": 0, 00:19:16.194 "state": "configuring", 00:19:16.194 "raid_level": "raid1", 00:19:16.194 "superblock": true, 00:19:16.194 "num_base_bdevs": 3, 00:19:16.194 "num_base_bdevs_discovered": 1, 00:19:16.194 "num_base_bdevs_operational": 3, 00:19:16.194 "base_bdevs_list": [ 00:19:16.194 { 00:19:16.194 "name": "pt1", 00:19:16.194 "uuid": "00000000-0000-0000-0000-000000000001", 00:19:16.194 "is_configured": true, 00:19:16.194 "data_offset": 2048, 00:19:16.194 
"data_size": 63488 00:19:16.194 }, 00:19:16.194 { 00:19:16.194 "name": null, 00:19:16.194 "uuid": "00000000-0000-0000-0000-000000000002", 00:19:16.194 "is_configured": false, 00:19:16.194 "data_offset": 2048, 00:19:16.194 "data_size": 63488 00:19:16.194 }, 00:19:16.194 { 00:19:16.194 "name": null, 00:19:16.194 "uuid": "00000000-0000-0000-0000-000000000003", 00:19:16.194 "is_configured": false, 00:19:16.194 "data_offset": 2048, 00:19:16.194 "data_size": 63488 00:19:16.194 } 00:19:16.194 ] 00:19:16.194 }' 00:19:16.194 05:47:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:16.194 05:47:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:19:16.762 05:47:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 3 -gt 2 ']' 00:19:16.762 05:47:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:19:17.021 [2024-07-26 05:47:31.829325] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:19:17.021 [2024-07-26 05:47:31.829372] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:17.021 [2024-07-26 05:47:31.829392] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xdf6a10 00:19:17.021 [2024-07-26 05:47:31.829404] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:17.021 [2024-07-26 05:47:31.829752] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:17.021 [2024-07-26 05:47:31.829769] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:19:17.021 [2024-07-26 05:47:31.829828] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:19:17.021 [2024-07-26 05:47:31.829847] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: 
bdev pt2 is claimed 00:19:17.021 pt2 00:19:17.021 05:47:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:19:17.280 [2024-07-26 05:47:32.073971] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:19:17.280 05:47:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 3 00:19:17.280 05:47:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:17.280 05:47:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:17.280 05:47:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:17.280 05:47:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:17.280 05:47:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:19:17.280 05:47:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:17.280 05:47:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:17.280 05:47:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:17.280 05:47:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:17.280 05:47:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:17.280 05:47:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:17.538 05:47:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:17.538 "name": "raid_bdev1", 00:19:17.538 "uuid": "79599b1f-f9e9-403c-9931-d46428dc5b80", 00:19:17.538 "strip_size_kb": 
0, 00:19:17.538 "state": "configuring", 00:19:17.538 "raid_level": "raid1", 00:19:17.538 "superblock": true, 00:19:17.538 "num_base_bdevs": 3, 00:19:17.538 "num_base_bdevs_discovered": 1, 00:19:17.538 "num_base_bdevs_operational": 3, 00:19:17.538 "base_bdevs_list": [ 00:19:17.538 { 00:19:17.538 "name": "pt1", 00:19:17.538 "uuid": "00000000-0000-0000-0000-000000000001", 00:19:17.538 "is_configured": true, 00:19:17.538 "data_offset": 2048, 00:19:17.538 "data_size": 63488 00:19:17.538 }, 00:19:17.538 { 00:19:17.538 "name": null, 00:19:17.538 "uuid": "00000000-0000-0000-0000-000000000002", 00:19:17.538 "is_configured": false, 00:19:17.538 "data_offset": 2048, 00:19:17.538 "data_size": 63488 00:19:17.538 }, 00:19:17.538 { 00:19:17.538 "name": null, 00:19:17.538 "uuid": "00000000-0000-0000-0000-000000000003", 00:19:17.538 "is_configured": false, 00:19:17.538 "data_offset": 2048, 00:19:17.538 "data_size": 63488 00:19:17.539 } 00:19:17.539 ] 00:19:17.539 }' 00:19:17.539 05:47:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:17.539 05:47:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:19:18.105 05:47:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:19:18.105 05:47:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:19:18.105 05:47:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:19:18.364 [2024-07-26 05:47:33.164861] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:19:18.364 [2024-07-26 05:47:33.164906] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:18.364 [2024-07-26 05:47:33.164929] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xdffa10 00:19:18.364 
[2024-07-26 05:47:33.164941] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:18.364 [2024-07-26 05:47:33.165273] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:18.364 [2024-07-26 05:47:33.165290] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:19:18.364 [2024-07-26 05:47:33.165347] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:19:18.364 [2024-07-26 05:47:33.165366] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:19:18.364 pt2 00:19:18.364 05:47:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:19:18.364 05:47:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:19:18.364 05:47:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:19:18.623 [2024-07-26 05:47:33.345341] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:19:18.623 [2024-07-26 05:47:33.345374] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:18.623 [2024-07-26 05:47:33.345391] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xdf66c0 00:19:18.623 [2024-07-26 05:47:33.345404] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:18.623 [2024-07-26 05:47:33.345698] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:18.623 [2024-07-26 05:47:33.345716] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:19:18.623 [2024-07-26 05:47:33.345766] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:19:18.623 [2024-07-26 05:47:33.345783] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: 
bdev pt3 is claimed 00:19:18.623 [2024-07-26 05:47:33.345886] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0xf99c00 00:19:18.623 [2024-07-26 05:47:33.345897] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:19:18.623 [2024-07-26 05:47:33.346060] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xdf9610 00:19:18.623 [2024-07-26 05:47:33.346187] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xf99c00 00:19:18.623 [2024-07-26 05:47:33.346197] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xf99c00 00:19:18.623 [2024-07-26 05:47:33.346294] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:18.623 pt3 00:19:18.623 05:47:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:19:18.623 05:47:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:19:18.623 05:47:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:19:18.623 05:47:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:18.623 05:47:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:18.623 05:47:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:18.623 05:47:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:18.623 05:47:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:19:18.623 05:47:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:18.623 05:47:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:18.623 05:47:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 
00:19:18.623 05:47:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:18.623 05:47:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:18.623 05:47:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:18.883 05:47:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:18.883 "name": "raid_bdev1", 00:19:18.883 "uuid": "79599b1f-f9e9-403c-9931-d46428dc5b80", 00:19:18.883 "strip_size_kb": 0, 00:19:18.883 "state": "online", 00:19:18.883 "raid_level": "raid1", 00:19:18.883 "superblock": true, 00:19:18.883 "num_base_bdevs": 3, 00:19:18.883 "num_base_bdevs_discovered": 3, 00:19:18.883 "num_base_bdevs_operational": 3, 00:19:18.883 "base_bdevs_list": [ 00:19:18.883 { 00:19:18.883 "name": "pt1", 00:19:18.883 "uuid": "00000000-0000-0000-0000-000000000001", 00:19:18.883 "is_configured": true, 00:19:18.883 "data_offset": 2048, 00:19:18.883 "data_size": 63488 00:19:18.883 }, 00:19:18.883 { 00:19:18.883 "name": "pt2", 00:19:18.883 "uuid": "00000000-0000-0000-0000-000000000002", 00:19:18.883 "is_configured": true, 00:19:18.883 "data_offset": 2048, 00:19:18.883 "data_size": 63488 00:19:18.883 }, 00:19:18.883 { 00:19:18.883 "name": "pt3", 00:19:18.883 "uuid": "00000000-0000-0000-0000-000000000003", 00:19:18.883 "is_configured": true, 00:19:18.883 "data_offset": 2048, 00:19:18.883 "data_size": 63488 00:19:18.883 } 00:19:18.883 ] 00:19:18.883 }' 00:19:18.883 05:47:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:18.883 05:47:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:19:19.449 05:47:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:19:19.449 05:47:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 
-- # local raid_bdev_name=raid_bdev1 00:19:19.449 05:47:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:19:19.449 05:47:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:19:19.449 05:47:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:19:19.449 05:47:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:19:19.449 05:47:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:19:19.449 05:47:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:19:19.708 [2024-07-26 05:47:34.436499] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:19:19.708 05:47:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:19:19.708 "name": "raid_bdev1", 00:19:19.708 "aliases": [ 00:19:19.708 "79599b1f-f9e9-403c-9931-d46428dc5b80" 00:19:19.708 ], 00:19:19.708 "product_name": "Raid Volume", 00:19:19.708 "block_size": 512, 00:19:19.708 "num_blocks": 63488, 00:19:19.708 "uuid": "79599b1f-f9e9-403c-9931-d46428dc5b80", 00:19:19.708 "assigned_rate_limits": { 00:19:19.708 "rw_ios_per_sec": 0, 00:19:19.708 "rw_mbytes_per_sec": 0, 00:19:19.708 "r_mbytes_per_sec": 0, 00:19:19.708 "w_mbytes_per_sec": 0 00:19:19.708 }, 00:19:19.708 "claimed": false, 00:19:19.708 "zoned": false, 00:19:19.708 "supported_io_types": { 00:19:19.708 "read": true, 00:19:19.708 "write": true, 00:19:19.708 "unmap": false, 00:19:19.708 "flush": false, 00:19:19.708 "reset": true, 00:19:19.708 "nvme_admin": false, 00:19:19.708 "nvme_io": false, 00:19:19.708 "nvme_io_md": false, 00:19:19.708 "write_zeroes": true, 00:19:19.708 "zcopy": false, 00:19:19.708 "get_zone_info": false, 00:19:19.708 "zone_management": false, 00:19:19.708 "zone_append": false, 00:19:19.708 "compare": false, 
00:19:19.708 "compare_and_write": false, 00:19:19.708 "abort": false, 00:19:19.708 "seek_hole": false, 00:19:19.708 "seek_data": false, 00:19:19.708 "copy": false, 00:19:19.708 "nvme_iov_md": false 00:19:19.708 }, 00:19:19.708 "memory_domains": [ 00:19:19.708 { 00:19:19.708 "dma_device_id": "system", 00:19:19.708 "dma_device_type": 1 00:19:19.708 }, 00:19:19.708 { 00:19:19.708 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:19.708 "dma_device_type": 2 00:19:19.708 }, 00:19:19.708 { 00:19:19.708 "dma_device_id": "system", 00:19:19.708 "dma_device_type": 1 00:19:19.708 }, 00:19:19.708 { 00:19:19.708 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:19.708 "dma_device_type": 2 00:19:19.708 }, 00:19:19.708 { 00:19:19.708 "dma_device_id": "system", 00:19:19.708 "dma_device_type": 1 00:19:19.708 }, 00:19:19.708 { 00:19:19.708 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:19.708 "dma_device_type": 2 00:19:19.708 } 00:19:19.708 ], 00:19:19.708 "driver_specific": { 00:19:19.708 "raid": { 00:19:19.708 "uuid": "79599b1f-f9e9-403c-9931-d46428dc5b80", 00:19:19.708 "strip_size_kb": 0, 00:19:19.708 "state": "online", 00:19:19.708 "raid_level": "raid1", 00:19:19.708 "superblock": true, 00:19:19.708 "num_base_bdevs": 3, 00:19:19.708 "num_base_bdevs_discovered": 3, 00:19:19.708 "num_base_bdevs_operational": 3, 00:19:19.708 "base_bdevs_list": [ 00:19:19.708 { 00:19:19.708 "name": "pt1", 00:19:19.708 "uuid": "00000000-0000-0000-0000-000000000001", 00:19:19.708 "is_configured": true, 00:19:19.708 "data_offset": 2048, 00:19:19.708 "data_size": 63488 00:19:19.708 }, 00:19:19.708 { 00:19:19.708 "name": "pt2", 00:19:19.708 "uuid": "00000000-0000-0000-0000-000000000002", 00:19:19.708 "is_configured": true, 00:19:19.708 "data_offset": 2048, 00:19:19.708 "data_size": 63488 00:19:19.708 }, 00:19:19.708 { 00:19:19.708 "name": "pt3", 00:19:19.708 "uuid": "00000000-0000-0000-0000-000000000003", 00:19:19.708 "is_configured": true, 00:19:19.708 "data_offset": 2048, 00:19:19.708 "data_size": 63488 
00:19:19.708 } 00:19:19.708 ] 00:19:19.708 } 00:19:19.708 } 00:19:19.708 }' 00:19:19.708 05:47:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:19:19.708 05:47:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:19:19.708 pt2 00:19:19.708 pt3' 00:19:19.708 05:47:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:19.708 05:47:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:19:19.708 05:47:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:19.967 05:47:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:19.967 "name": "pt1", 00:19:19.967 "aliases": [ 00:19:19.967 "00000000-0000-0000-0000-000000000001" 00:19:19.967 ], 00:19:19.967 "product_name": "passthru", 00:19:19.967 "block_size": 512, 00:19:19.967 "num_blocks": 65536, 00:19:19.967 "uuid": "00000000-0000-0000-0000-000000000001", 00:19:19.967 "assigned_rate_limits": { 00:19:19.967 "rw_ios_per_sec": 0, 00:19:19.967 "rw_mbytes_per_sec": 0, 00:19:19.967 "r_mbytes_per_sec": 0, 00:19:19.967 "w_mbytes_per_sec": 0 00:19:19.967 }, 00:19:19.967 "claimed": true, 00:19:19.967 "claim_type": "exclusive_write", 00:19:19.967 "zoned": false, 00:19:19.967 "supported_io_types": { 00:19:19.967 "read": true, 00:19:19.967 "write": true, 00:19:19.967 "unmap": true, 00:19:19.967 "flush": true, 00:19:19.967 "reset": true, 00:19:19.967 "nvme_admin": false, 00:19:19.967 "nvme_io": false, 00:19:19.967 "nvme_io_md": false, 00:19:19.967 "write_zeroes": true, 00:19:19.967 "zcopy": true, 00:19:19.967 "get_zone_info": false, 00:19:19.967 "zone_management": false, 00:19:19.967 "zone_append": false, 00:19:19.967 "compare": false, 00:19:19.967 "compare_and_write": false, 
00:19:19.967 "abort": true, 00:19:19.967 "seek_hole": false, 00:19:19.967 "seek_data": false, 00:19:19.967 "copy": true, 00:19:19.967 "nvme_iov_md": false 00:19:19.967 }, 00:19:19.967 "memory_domains": [ 00:19:19.967 { 00:19:19.967 "dma_device_id": "system", 00:19:19.968 "dma_device_type": 1 00:19:19.968 }, 00:19:19.968 { 00:19:19.968 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:19.968 "dma_device_type": 2 00:19:19.968 } 00:19:19.968 ], 00:19:19.968 "driver_specific": { 00:19:19.968 "passthru": { 00:19:19.968 "name": "pt1", 00:19:19.968 "base_bdev_name": "malloc1" 00:19:19.968 } 00:19:19.968 } 00:19:19.968 }' 00:19:19.968 05:47:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:19.968 05:47:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:19.968 05:47:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:19.968 05:47:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:20.226 05:47:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:20.226 05:47:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:20.226 05:47:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:20.226 05:47:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:20.226 05:47:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:20.226 05:47:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:20.226 05:47:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:20.226 05:47:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:20.226 05:47:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:20.226 05:47:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- 
# /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:19:20.226 05:47:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:20.484 05:47:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:20.484 "name": "pt2", 00:19:20.484 "aliases": [ 00:19:20.484 "00000000-0000-0000-0000-000000000002" 00:19:20.484 ], 00:19:20.484 "product_name": "passthru", 00:19:20.484 "block_size": 512, 00:19:20.484 "num_blocks": 65536, 00:19:20.484 "uuid": "00000000-0000-0000-0000-000000000002", 00:19:20.484 "assigned_rate_limits": { 00:19:20.484 "rw_ios_per_sec": 0, 00:19:20.484 "rw_mbytes_per_sec": 0, 00:19:20.484 "r_mbytes_per_sec": 0, 00:19:20.484 "w_mbytes_per_sec": 0 00:19:20.484 }, 00:19:20.484 "claimed": true, 00:19:20.484 "claim_type": "exclusive_write", 00:19:20.484 "zoned": false, 00:19:20.484 "supported_io_types": { 00:19:20.484 "read": true, 00:19:20.484 "write": true, 00:19:20.484 "unmap": true, 00:19:20.484 "flush": true, 00:19:20.484 "reset": true, 00:19:20.484 "nvme_admin": false, 00:19:20.484 "nvme_io": false, 00:19:20.484 "nvme_io_md": false, 00:19:20.484 "write_zeroes": true, 00:19:20.484 "zcopy": true, 00:19:20.484 "get_zone_info": false, 00:19:20.484 "zone_management": false, 00:19:20.484 "zone_append": false, 00:19:20.484 "compare": false, 00:19:20.484 "compare_and_write": false, 00:19:20.484 "abort": true, 00:19:20.484 "seek_hole": false, 00:19:20.484 "seek_data": false, 00:19:20.484 "copy": true, 00:19:20.484 "nvme_iov_md": false 00:19:20.484 }, 00:19:20.484 "memory_domains": [ 00:19:20.484 { 00:19:20.484 "dma_device_id": "system", 00:19:20.484 "dma_device_type": 1 00:19:20.484 }, 00:19:20.484 { 00:19:20.484 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:20.484 "dma_device_type": 2 00:19:20.484 } 00:19:20.484 ], 00:19:20.484 "driver_specific": { 00:19:20.484 "passthru": { 00:19:20.484 "name": "pt2", 00:19:20.484 "base_bdev_name": "malloc2" 
00:19:20.484 } 00:19:20.484 } 00:19:20.484 }' 00:19:20.484 05:47:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:20.742 05:47:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:20.742 05:47:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:20.742 05:47:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:20.742 05:47:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:20.742 05:47:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:20.742 05:47:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:20.742 05:47:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:20.742 05:47:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:20.742 05:47:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:21.000 05:47:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:21.000 05:47:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:21.000 05:47:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:21.000 05:47:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:19:21.000 05:47:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:21.259 05:47:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:21.259 "name": "pt3", 00:19:21.259 "aliases": [ 00:19:21.259 "00000000-0000-0000-0000-000000000003" 00:19:21.259 ], 00:19:21.259 "product_name": "passthru", 00:19:21.259 "block_size": 512, 00:19:21.259 "num_blocks": 65536, 00:19:21.259 "uuid": 
"00000000-0000-0000-0000-000000000003", 00:19:21.259 "assigned_rate_limits": { 00:19:21.259 "rw_ios_per_sec": 0, 00:19:21.259 "rw_mbytes_per_sec": 0, 00:19:21.259 "r_mbytes_per_sec": 0, 00:19:21.259 "w_mbytes_per_sec": 0 00:19:21.259 }, 00:19:21.259 "claimed": true, 00:19:21.259 "claim_type": "exclusive_write", 00:19:21.259 "zoned": false, 00:19:21.259 "supported_io_types": { 00:19:21.259 "read": true, 00:19:21.259 "write": true, 00:19:21.259 "unmap": true, 00:19:21.259 "flush": true, 00:19:21.259 "reset": true, 00:19:21.259 "nvme_admin": false, 00:19:21.259 "nvme_io": false, 00:19:21.259 "nvme_io_md": false, 00:19:21.259 "write_zeroes": true, 00:19:21.259 "zcopy": true, 00:19:21.259 "get_zone_info": false, 00:19:21.259 "zone_management": false, 00:19:21.259 "zone_append": false, 00:19:21.259 "compare": false, 00:19:21.259 "compare_and_write": false, 00:19:21.259 "abort": true, 00:19:21.259 "seek_hole": false, 00:19:21.259 "seek_data": false, 00:19:21.259 "copy": true, 00:19:21.259 "nvme_iov_md": false 00:19:21.259 }, 00:19:21.259 "memory_domains": [ 00:19:21.259 { 00:19:21.259 "dma_device_id": "system", 00:19:21.259 "dma_device_type": 1 00:19:21.259 }, 00:19:21.259 { 00:19:21.259 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:21.259 "dma_device_type": 2 00:19:21.259 } 00:19:21.259 ], 00:19:21.259 "driver_specific": { 00:19:21.259 "passthru": { 00:19:21.259 "name": "pt3", 00:19:21.259 "base_bdev_name": "malloc3" 00:19:21.259 } 00:19:21.259 } 00:19:21.259 }' 00:19:21.259 05:47:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:21.259 05:47:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:21.259 05:47:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:21.259 05:47:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:21.259 05:47:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:21.259 05:47:36 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:21.259 05:47:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:21.518 05:47:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:21.518 05:47:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:21.518 05:47:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:21.518 05:47:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:21.518 05:47:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:21.518 05:47:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:19:21.518 05:47:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:19:21.776 [2024-07-26 05:47:36.530039] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:19:21.776 05:47:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' 79599b1f-f9e9-403c-9931-d46428dc5b80 '!=' 79599b1f-f9e9-403c-9931-d46428dc5b80 ']' 00:19:21.776 05:47:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy raid1 00:19:21.776 05:47:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:19:21.776 05:47:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@214 -- # return 0 00:19:21.776 05:47:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@492 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:19:22.035 [2024-07-26 05:47:36.778451] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:19:22.035 05:47:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@495 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 
00:19:22.035 05:47:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:22.035 05:47:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:22.035 05:47:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:22.035 05:47:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:22.035 05:47:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:19:22.035 05:47:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:22.035 05:47:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:22.035 05:47:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:22.035 05:47:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:22.035 05:47:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:22.035 05:47:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:22.294 05:47:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:22.294 "name": "raid_bdev1", 00:19:22.294 "uuid": "79599b1f-f9e9-403c-9931-d46428dc5b80", 00:19:22.294 "strip_size_kb": 0, 00:19:22.294 "state": "online", 00:19:22.294 "raid_level": "raid1", 00:19:22.294 "superblock": true, 00:19:22.294 "num_base_bdevs": 3, 00:19:22.294 "num_base_bdevs_discovered": 2, 00:19:22.294 "num_base_bdevs_operational": 2, 00:19:22.294 "base_bdevs_list": [ 00:19:22.294 { 00:19:22.294 "name": null, 00:19:22.294 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:22.294 "is_configured": false, 00:19:22.294 "data_offset": 2048, 00:19:22.294 "data_size": 63488 
00:19:22.294 }, 00:19:22.294 { 00:19:22.294 "name": "pt2", 00:19:22.294 "uuid": "00000000-0000-0000-0000-000000000002", 00:19:22.294 "is_configured": true, 00:19:22.294 "data_offset": 2048, 00:19:22.294 "data_size": 63488 00:19:22.294 }, 00:19:22.294 { 00:19:22.294 "name": "pt3", 00:19:22.294 "uuid": "00000000-0000-0000-0000-000000000003", 00:19:22.294 "is_configured": true, 00:19:22.294 "data_offset": 2048, 00:19:22.294 "data_size": 63488 00:19:22.294 } 00:19:22.294 ] 00:19:22.294 }' 00:19:22.294 05:47:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:22.294 05:47:37 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:19:22.861 05:47:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@498 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:19:23.120 [2024-07-26 05:47:37.869309] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:19:23.120 [2024-07-26 05:47:37.869332] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:19:23.120 [2024-07-26 05:47:37.869381] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:19:23.120 [2024-07-26 05:47:37.869433] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:19:23.120 [2024-07-26 05:47:37.869444] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xf99c00 name raid_bdev1, state offline 00:19:23.120 05:47:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:23.120 05:47:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # jq -r '.[]' 00:19:23.378 05:47:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # raid_bdev= 00:19:23.378 05:47:38 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@500 -- # '[' -n '' ']' 00:19:23.378 05:47:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i = 1 )) 00:19:23.378 05:47:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:19:23.378 05:47:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:19:23.637 05:47:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:19:23.637 05:47:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:19:23.637 05:47:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:19:23.897 05:47:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:19:23.897 05:47:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:19:23.897 05:47:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i = 1 )) 00:19:23.897 05:47:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:19:23.897 05:47:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@511 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:19:23.897 [2024-07-26 05:47:38.719505] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:19:23.897 [2024-07-26 05:47:38.719544] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:23.897 [2024-07-26 05:47:38.719561] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xdf7310 00:19:23.897 [2024-07-26 05:47:38.719574] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:23.897 
[2024-07-26 05:47:38.721166] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:23.897 [2024-07-26 05:47:38.721195] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:19:23.897 [2024-07-26 05:47:38.721259] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:19:23.897 [2024-07-26 05:47:38.721286] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:19:23.897 pt2 00:19:23.897 05:47:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@514 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:19:23.897 05:47:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:23.897 05:47:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:23.897 05:47:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:23.897 05:47:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:23.897 05:47:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:19:23.897 05:47:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:23.897 05:47:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:23.897 05:47:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:23.897 05:47:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:23.897 05:47:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:23.897 05:47:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:24.156 05:47:38 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:24.156 "name": "raid_bdev1", 00:19:24.156 "uuid": "79599b1f-f9e9-403c-9931-d46428dc5b80", 00:19:24.156 "strip_size_kb": 0, 00:19:24.156 "state": "configuring", 00:19:24.156 "raid_level": "raid1", 00:19:24.156 "superblock": true, 00:19:24.156 "num_base_bdevs": 3, 00:19:24.156 "num_base_bdevs_discovered": 1, 00:19:24.156 "num_base_bdevs_operational": 2, 00:19:24.156 "base_bdevs_list": [ 00:19:24.156 { 00:19:24.156 "name": null, 00:19:24.156 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:24.156 "is_configured": false, 00:19:24.156 "data_offset": 2048, 00:19:24.156 "data_size": 63488 00:19:24.156 }, 00:19:24.156 { 00:19:24.156 "name": "pt2", 00:19:24.156 "uuid": "00000000-0000-0000-0000-000000000002", 00:19:24.156 "is_configured": true, 00:19:24.156 "data_offset": 2048, 00:19:24.156 "data_size": 63488 00:19:24.156 }, 00:19:24.156 { 00:19:24.156 "name": null, 00:19:24.156 "uuid": "00000000-0000-0000-0000-000000000003", 00:19:24.156 "is_configured": false, 00:19:24.156 "data_offset": 2048, 00:19:24.156 "data_size": 63488 00:19:24.156 } 00:19:24.156 ] 00:19:24.156 }' 00:19:24.156 05:47:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:24.156 05:47:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:19:24.724 05:47:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i++ )) 00:19:24.724 05:47:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:19:24.724 05:47:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@518 -- # i=2 00:19:24.724 05:47:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@519 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:19:24.983 [2024-07-26 05:47:39.662017] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 
00:19:24.983 [2024-07-26 05:47:39.662059] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:24.983 [2024-07-26 05:47:39.662078] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xdf5ec0 00:19:24.983 [2024-07-26 05:47:39.662090] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:24.983 [2024-07-26 05:47:39.662414] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:24.983 [2024-07-26 05:47:39.662431] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:19:24.983 [2024-07-26 05:47:39.662487] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:19:24.983 [2024-07-26 05:47:39.662505] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:19:24.983 [2024-07-26 05:47:39.662602] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0xf97cc0 00:19:24.983 [2024-07-26 05:47:39.662612] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:19:24.983 [2024-07-26 05:47:39.662777] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xf986d0 00:19:24.983 [2024-07-26 05:47:39.662898] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xf97cc0 00:19:24.983 [2024-07-26 05:47:39.662908] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xf97cc0 00:19:24.983 [2024-07-26 05:47:39.663003] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:24.983 pt3 00:19:24.983 05:47:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@522 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:19:24.983 05:47:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:24.983 05:47:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:24.983 05:47:39 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:24.983 05:47:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:24.983 05:47:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:19:24.983 05:47:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:24.983 05:47:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:24.983 05:47:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:24.983 05:47:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:24.983 05:47:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:24.983 05:47:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:25.242 05:47:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:25.242 "name": "raid_bdev1", 00:19:25.242 "uuid": "79599b1f-f9e9-403c-9931-d46428dc5b80", 00:19:25.242 "strip_size_kb": 0, 00:19:25.242 "state": "online", 00:19:25.242 "raid_level": "raid1", 00:19:25.242 "superblock": true, 00:19:25.242 "num_base_bdevs": 3, 00:19:25.242 "num_base_bdevs_discovered": 2, 00:19:25.242 "num_base_bdevs_operational": 2, 00:19:25.242 "base_bdevs_list": [ 00:19:25.242 { 00:19:25.242 "name": null, 00:19:25.242 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:25.242 "is_configured": false, 00:19:25.242 "data_offset": 2048, 00:19:25.242 "data_size": 63488 00:19:25.242 }, 00:19:25.242 { 00:19:25.242 "name": "pt2", 00:19:25.242 "uuid": "00000000-0000-0000-0000-000000000002", 00:19:25.242 "is_configured": true, 00:19:25.242 "data_offset": 2048, 00:19:25.242 "data_size": 63488 00:19:25.242 }, 00:19:25.242 { 
00:19:25.242 "name": "pt3", 00:19:25.242 "uuid": "00000000-0000-0000-0000-000000000003", 00:19:25.242 "is_configured": true, 00:19:25.242 "data_offset": 2048, 00:19:25.242 "data_size": 63488 00:19:25.242 } 00:19:25.242 ] 00:19:25.242 }' 00:19:25.242 05:47:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:25.242 05:47:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:19:25.810 05:47:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@525 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:19:26.378 [2024-07-26 05:47:41.009576] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:19:26.378 [2024-07-26 05:47:41.009604] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:19:26.378 [2024-07-26 05:47:41.009665] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:19:26.378 [2024-07-26 05:47:41.009718] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:19:26.378 [2024-07-26 05:47:41.009729] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xf97cc0 name raid_bdev1, state offline 00:19:26.378 05:47:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:26.378 05:47:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # jq -r '.[]' 00:19:26.378 05:47:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # raid_bdev= 00:19:26.378 05:47:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@527 -- # '[' -n '' ']' 00:19:26.378 05:47:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@531 -- # '[' 3 -gt 2 ']' 00:19:26.636 05:47:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@533 -- # i=2 00:19:26.636 05:47:41 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@534 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:19:26.636 05:47:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@539 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:19:26.894 [2024-07-26 05:47:41.755508] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:19:26.894 [2024-07-26 05:47:41.755549] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:26.894 [2024-07-26 05:47:41.755565] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xdf5ec0 00:19:26.894 [2024-07-26 05:47:41.755578] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:26.894 [2024-07-26 05:47:41.757228] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:26.894 [2024-07-26 05:47:41.757257] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:19:26.894 [2024-07-26 05:47:41.757322] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:19:26.894 [2024-07-26 05:47:41.757347] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:19:26.894 [2024-07-26 05:47:41.757442] bdev_raid.c:3639:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:19:26.894 [2024-07-26 05:47:41.757455] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:19:26.894 [2024-07-26 05:47:41.757468] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xf97f40 name raid_bdev1, state configuring 00:19:26.894 [2024-07-26 05:47:41.757491] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:19:26.894 pt1 00:19:26.894 05:47:41 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@541 -- # '[' 3 -gt 2 ']' 00:19:26.894 05:47:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@544 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:19:26.894 05:47:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:26.894 05:47:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:26.894 05:47:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:26.894 05:47:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:26.894 05:47:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:19:26.894 05:47:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:26.894 05:47:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:26.894 05:47:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:26.894 05:47:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:26.894 05:47:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:26.894 05:47:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:27.153 05:47:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:27.153 "name": "raid_bdev1", 00:19:27.153 "uuid": "79599b1f-f9e9-403c-9931-d46428dc5b80", 00:19:27.153 "strip_size_kb": 0, 00:19:27.153 "state": "configuring", 00:19:27.153 "raid_level": "raid1", 00:19:27.153 "superblock": true, 00:19:27.153 "num_base_bdevs": 3, 00:19:27.153 "num_base_bdevs_discovered": 1, 00:19:27.153 "num_base_bdevs_operational": 2, 00:19:27.153 
"base_bdevs_list": [ 00:19:27.153 { 00:19:27.153 "name": null, 00:19:27.153 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:27.153 "is_configured": false, 00:19:27.153 "data_offset": 2048, 00:19:27.153 "data_size": 63488 00:19:27.153 }, 00:19:27.153 { 00:19:27.153 "name": "pt2", 00:19:27.153 "uuid": "00000000-0000-0000-0000-000000000002", 00:19:27.153 "is_configured": true, 00:19:27.153 "data_offset": 2048, 00:19:27.153 "data_size": 63488 00:19:27.153 }, 00:19:27.153 { 00:19:27.153 "name": null, 00:19:27.153 "uuid": "00000000-0000-0000-0000-000000000003", 00:19:27.153 "is_configured": false, 00:19:27.153 "data_offset": 2048, 00:19:27.153 "data_size": 63488 00:19:27.153 } 00:19:27.153 ] 00:19:27.153 }' 00:19:27.153 05:47:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:27.153 05:47:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:19:27.720 05:47:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@545 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:19:27.720 05:47:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@545 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs configuring 00:19:28.040 05:47:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@545 -- # [[ false == \f\a\l\s\e ]] 00:19:28.040 05:47:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@548 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:19:28.300 [2024-07-26 05:47:43.087222] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:19:28.300 [2024-07-26 05:47:43.087274] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:28.300 [2024-07-26 05:47:43.087294] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xdf90c0 00:19:28.300 [2024-07-26 
05:47:43.087306] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:28.300 [2024-07-26 05:47:43.087669] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:28.300 [2024-07-26 05:47:43.087688] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:19:28.300 [2024-07-26 05:47:43.087751] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:19:28.300 [2024-07-26 05:47:43.087770] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:19:28.300 [2024-07-26 05:47:43.087872] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0xdf9a40 00:19:28.300 [2024-07-26 05:47:43.087883] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:19:28.300 [2024-07-26 05:47:43.088061] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xf986c0 00:19:28.300 [2024-07-26 05:47:43.088189] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xdf9a40 00:19:28.300 [2024-07-26 05:47:43.088199] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xdf9a40 00:19:28.300 [2024-07-26 05:47:43.088295] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:28.300 pt3 00:19:28.300 05:47:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@553 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:19:28.300 05:47:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:28.300 05:47:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:28.300 05:47:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:28.300 05:47:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:28.300 05:47:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local 
num_base_bdevs_operational=2 00:19:28.300 05:47:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:28.300 05:47:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:28.300 05:47:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:28.300 05:47:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:28.300 05:47:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:28.300 05:47:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:28.559 05:47:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:28.559 "name": "raid_bdev1", 00:19:28.559 "uuid": "79599b1f-f9e9-403c-9931-d46428dc5b80", 00:19:28.559 "strip_size_kb": 0, 00:19:28.559 "state": "online", 00:19:28.559 "raid_level": "raid1", 00:19:28.559 "superblock": true, 00:19:28.559 "num_base_bdevs": 3, 00:19:28.559 "num_base_bdevs_discovered": 2, 00:19:28.559 "num_base_bdevs_operational": 2, 00:19:28.559 "base_bdevs_list": [ 00:19:28.559 { 00:19:28.559 "name": null, 00:19:28.559 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:28.559 "is_configured": false, 00:19:28.559 "data_offset": 2048, 00:19:28.559 "data_size": 63488 00:19:28.559 }, 00:19:28.559 { 00:19:28.559 "name": "pt2", 00:19:28.559 "uuid": "00000000-0000-0000-0000-000000000002", 00:19:28.559 "is_configured": true, 00:19:28.559 "data_offset": 2048, 00:19:28.559 "data_size": 63488 00:19:28.559 }, 00:19:28.559 { 00:19:28.559 "name": "pt3", 00:19:28.559 "uuid": "00000000-0000-0000-0000-000000000003", 00:19:28.559 "is_configured": true, 00:19:28.559 "data_offset": 2048, 00:19:28.559 "data_size": 63488 00:19:28.559 } 00:19:28.559 ] 00:19:28.559 }' 00:19:28.559 05:47:43 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:28.559 05:47:43 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:19:29.126 05:47:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:19:29.126 05:47:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:19:29.385 05:47:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # [[ false == \f\a\l\s\e ]] 00:19:29.385 05:47:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:19:29.385 05:47:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # jq -r '.[] | .uuid' 00:19:29.644 [2024-07-26 05:47:44.431037] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:19:29.644 05:47:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # '[' 79599b1f-f9e9-403c-9931-d46428dc5b80 '!=' 79599b1f-f9e9-403c-9931-d46428dc5b80 ']' 00:19:29.644 05:47:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 1182652 00:19:29.644 05:47:44 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 1182652 ']' 00:19:29.644 05:47:44 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 1182652 00:19:29.644 05:47:44 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:19:29.644 05:47:44 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:19:29.644 05:47:44 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1182652 00:19:29.644 05:47:44 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:19:29.644 05:47:44 
bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:19:29.644 05:47:44 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1182652' 00:19:29.644 killing process with pid 1182652 00:19:29.644 05:47:44 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 1182652 00:19:29.644 [2024-07-26 05:47:44.499598] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:19:29.644 [2024-07-26 05:47:44.499662] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:19:29.644 [2024-07-26 05:47:44.499717] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:19:29.644 [2024-07-26 05:47:44.499728] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xdf9a40 name raid_bdev1, state offline 00:19:29.644 05:47:44 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 1182652 00:19:29.644 [2024-07-26 05:47:44.530600] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:19:29.903 05:47:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:19:29.903 00:19:29.903 real 0m22.055s 00:19:29.903 user 0m40.240s 00:19:29.903 sys 0m4.048s 00:19:29.903 05:47:44 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:19:29.903 05:47:44 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:19:29.903 ************************************ 00:19:29.903 END TEST raid_superblock_test 00:19:29.903 ************************************ 00:19:29.904 05:47:44 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:19:29.904 05:47:44 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test raid1 3 read 00:19:29.904 05:47:44 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:19:29.904 05:47:44 bdev_raid -- common/autotest_common.sh@1105 -- # 
xtrace_disable 00:19:29.904 05:47:44 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:19:30.163 ************************************ 00:19:30.163 START TEST raid_read_error_test 00:19:30.163 ************************************ 00:19:30.163 05:47:44 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid1 3 read 00:19:30.163 05:47:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid1 00:19:30.163 05:47:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=3 00:19:30.163 05:47:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:19:30.163 05:47:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:19:30.163 05:47:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:19:30.163 05:47:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:19:30.163 05:47:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:19:30.163 05:47:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:19:30.163 05:47:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:19:30.163 05:47:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:19:30.163 05:47:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:19:30.163 05:47:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:19:30.163 05:47:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:19:30.163 05:47:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:19:30.163 05:47:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:19:30.163 05:47:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # 
local base_bdevs 00:19:30.163 05:47:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:19:30.163 05:47:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:19:30.163 05:47:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:19:30.163 05:47:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:19:30.163 05:47:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:19:30.163 05:47:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid1 '!=' raid1 ']' 00:19:30.163 05:47:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@802 -- # strip_size=0 00:19:30.163 05:47:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:19:30.163 05:47:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.Q460rvmy39 00:19:30.163 05:47:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=1186076 00:19:30.163 05:47:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 1186076 /var/tmp/spdk-raid.sock 00:19:30.163 05:47:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:19:30.163 05:47:44 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 1186076 ']' 00:19:30.163 05:47:44 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:19:30.163 05:47:44 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:19:30.163 05:47:44 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 
00:19:30.163 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:19:30.163 05:47:44 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:19:30.163 05:47:44 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:19:30.163 [2024-07-26 05:47:44.915316] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 00:19:30.163 [2024-07-26 05:47:44.915380] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1186076 ] 00:19:30.163 [2024-07-26 05:47:45.045548] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:19:30.422 [2024-07-26 05:47:45.152938] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:19:30.422 [2024-07-26 05:47:45.220107] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:30.422 [2024-07-26 05:47:45.220144] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:30.991 05:47:45 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:19:30.991 05:47:45 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:19:30.991 05:47:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:19:30.991 05:47:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:19:31.250 BaseBdev1_malloc 00:19:31.250 05:47:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:19:31.509 true 00:19:31.509 05:47:46 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:19:31.768 [2024-07-26 05:47:46.570042] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:19:31.768 [2024-07-26 05:47:46.570086] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:31.768 [2024-07-26 05:47:46.570108] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x14ca0d0 00:19:31.768 [2024-07-26 05:47:46.570120] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:31.768 [2024-07-26 05:47:46.572013] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:31.768 [2024-07-26 05:47:46.572042] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:19:31.768 BaseBdev1 00:19:31.768 05:47:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:19:31.768 05:47:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:19:32.028 BaseBdev2_malloc 00:19:32.028 05:47:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:19:32.287 true 00:19:32.287 05:47:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:19:32.546 [2024-07-26 05:47:47.305789] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:19:32.546 [2024-07-26 05:47:47.305832] vbdev_passthru.c: 
635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:32.546 [2024-07-26 05:47:47.305854] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x14ce910 00:19:32.546 [2024-07-26 05:47:47.305867] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:32.546 [2024-07-26 05:47:47.307432] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:32.546 [2024-07-26 05:47:47.307461] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:19:32.546 BaseBdev2 00:19:32.546 05:47:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:19:32.546 05:47:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:19:32.805 BaseBdev3_malloc 00:19:32.805 05:47:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:19:33.064 true 00:19:33.064 05:47:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:19:33.323 [2024-07-26 05:47:48.032287] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:19:33.323 [2024-07-26 05:47:48.032330] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:33.323 [2024-07-26 05:47:48.032352] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x14d0bd0 00:19:33.323 [2024-07-26 05:47:48.032365] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:33.323 [2024-07-26 05:47:48.033948] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 
00:19:33.323 [2024-07-26 05:47:48.033975] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:19:33.323 BaseBdev3 00:19:33.323 05:47:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:19:33.581 [2024-07-26 05:47:48.264945] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:19:33.581 [2024-07-26 05:47:48.266280] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:19:33.581 [2024-07-26 05:47:48.266349] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:19:33.581 [2024-07-26 05:47:48.266566] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x14d2280 00:19:33.581 [2024-07-26 05:47:48.266577] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:19:33.581 [2024-07-26 05:47:48.266785] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x14d1e20 00:19:33.581 [2024-07-26 05:47:48.266939] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x14d2280 00:19:33.581 [2024-07-26 05:47:48.266949] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x14d2280 00:19:33.581 [2024-07-26 05:47:48.267057] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:33.581 05:47:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:19:33.581 05:47:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:33.581 05:47:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:33.581 05:47:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local 
raid_level=raid1 00:19:33.581 05:47:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:33.581 05:47:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:19:33.581 05:47:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:33.581 05:47:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:33.581 05:47:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:33.581 05:47:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:33.581 05:47:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:33.581 05:47:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:33.840 05:47:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:33.840 "name": "raid_bdev1", 00:19:33.840 "uuid": "456491ba-611b-4d5e-8896-40653b8d0c27", 00:19:33.840 "strip_size_kb": 0, 00:19:33.840 "state": "online", 00:19:33.840 "raid_level": "raid1", 00:19:33.840 "superblock": true, 00:19:33.840 "num_base_bdevs": 3, 00:19:33.840 "num_base_bdevs_discovered": 3, 00:19:33.840 "num_base_bdevs_operational": 3, 00:19:33.840 "base_bdevs_list": [ 00:19:33.840 { 00:19:33.840 "name": "BaseBdev1", 00:19:33.840 "uuid": "4df204c0-e5d8-5d67-af88-6e81d6e281f2", 00:19:33.840 "is_configured": true, 00:19:33.840 "data_offset": 2048, 00:19:33.840 "data_size": 63488 00:19:33.840 }, 00:19:33.840 { 00:19:33.840 "name": "BaseBdev2", 00:19:33.840 "uuid": "c403e854-8934-5397-8a51-90f19383be4e", 00:19:33.840 "is_configured": true, 00:19:33.840 "data_offset": 2048, 00:19:33.840 "data_size": 63488 00:19:33.840 }, 00:19:33.840 { 00:19:33.840 "name": "BaseBdev3", 00:19:33.840 "uuid": 
"0324dcd7-760a-5386-9bfc-12042acb56bd", 00:19:33.840 "is_configured": true, 00:19:33.840 "data_offset": 2048, 00:19:33.840 "data_size": 63488 00:19:33.840 } 00:19:33.840 ] 00:19:33.840 }' 00:19:33.840 05:47:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:33.840 05:47:48 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:19:34.406 05:47:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:19:34.406 05:47:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:19:34.406 [2024-07-26 05:47:49.243797] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x131fe00 00:19:35.342 05:47:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:19:35.601 05:47:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:19:35.601 05:47:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid1 = \r\a\i\d\1 ]] 00:19:35.601 05:47:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ read = \w\r\i\t\e ]] 00:19:35.601 05:47:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=3 00:19:35.601 05:47:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:19:35.601 05:47:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:35.601 05:47:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:35.601 05:47:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:35.601 05:47:50 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:35.601 05:47:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:19:35.601 05:47:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:35.601 05:47:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:35.601 05:47:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:35.601 05:47:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:35.601 05:47:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:35.601 05:47:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:35.859 05:47:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:35.859 "name": "raid_bdev1", 00:19:35.859 "uuid": "456491ba-611b-4d5e-8896-40653b8d0c27", 00:19:35.859 "strip_size_kb": 0, 00:19:35.859 "state": "online", 00:19:35.859 "raid_level": "raid1", 00:19:35.859 "superblock": true, 00:19:35.859 "num_base_bdevs": 3, 00:19:35.859 "num_base_bdevs_discovered": 3, 00:19:35.859 "num_base_bdevs_operational": 3, 00:19:35.859 "base_bdevs_list": [ 00:19:35.859 { 00:19:35.859 "name": "BaseBdev1", 00:19:35.859 "uuid": "4df204c0-e5d8-5d67-af88-6e81d6e281f2", 00:19:35.859 "is_configured": true, 00:19:35.859 "data_offset": 2048, 00:19:35.859 "data_size": 63488 00:19:35.859 }, 00:19:35.859 { 00:19:35.859 "name": "BaseBdev2", 00:19:35.859 "uuid": "c403e854-8934-5397-8a51-90f19383be4e", 00:19:35.859 "is_configured": true, 00:19:35.859 "data_offset": 2048, 00:19:35.859 "data_size": 63488 00:19:35.859 }, 00:19:35.859 { 00:19:35.859 "name": "BaseBdev3", 00:19:35.859 "uuid": "0324dcd7-760a-5386-9bfc-12042acb56bd", 00:19:35.859 "is_configured": true, 
00:19:35.859 "data_offset": 2048, 00:19:35.859 "data_size": 63488 00:19:35.859 } 00:19:35.859 ] 00:19:35.859 }' 00:19:35.859 05:47:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:35.859 05:47:50 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:19:36.427 05:47:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:19:36.686 [2024-07-26 05:47:51.478570] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:19:36.686 [2024-07-26 05:47:51.478612] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:19:36.686 [2024-07-26 05:47:51.481773] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:19:36.686 [2024-07-26 05:47:51.481809] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:36.686 [2024-07-26 05:47:51.481908] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:19:36.686 [2024-07-26 05:47:51.481920] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x14d2280 name raid_bdev1, state offline 00:19:36.686 0 00:19:36.687 05:47:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 1186076 00:19:36.687 05:47:51 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 1186076 ']' 00:19:36.687 05:47:51 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 1186076 00:19:36.687 05:47:51 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:19:36.687 05:47:51 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:19:36.687 05:47:51 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1186076 00:19:36.687 05:47:51 
bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:19:36.687 05:47:51 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:19:36.687 05:47:51 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1186076' 00:19:36.687 killing process with pid 1186076 00:19:36.687 05:47:51 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 1186076 00:19:36.687 [2024-07-26 05:47:51.547660] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:19:36.687 05:47:51 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 1186076 00:19:36.687 [2024-07-26 05:47:51.568548] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:19:36.945 05:47:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.Q460rvmy39 00:19:36.945 05:47:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:19:36.945 05:47:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:19:36.945 05:47:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.00 00:19:36.945 05:47:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid1 00:19:36.945 05:47:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:19:36.945 05:47:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:19:36.945 05:47:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@845 -- # [[ 0.00 = \0\.\0\0 ]] 00:19:36.945 00:19:36.945 real 0m6.971s 00:19:36.945 user 0m11.083s 00:19:36.945 sys 0m1.190s 00:19:36.945 05:47:51 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:19:36.945 05:47:51 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:19:36.945 ************************************ 00:19:36.945 END TEST 
raid_read_error_test 00:19:36.945 ************************************ 00:19:37.204 05:47:51 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:19:37.204 05:47:51 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test raid1 3 write 00:19:37.204 05:47:51 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:19:37.204 05:47:51 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:19:37.204 05:47:51 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:19:37.204 ************************************ 00:19:37.204 START TEST raid_write_error_test 00:19:37.204 ************************************ 00:19:37.204 05:47:51 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid1 3 write 00:19:37.204 05:47:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid1 00:19:37.204 05:47:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=3 00:19:37.204 05:47:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:19:37.204 05:47:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:19:37.204 05:47:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:19:37.204 05:47:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:19:37.204 05:47:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:19:37.204 05:47:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:19:37.204 05:47:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:19:37.204 05:47:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:19:37.204 05:47:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:19:37.204 05:47:51 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:19:37.204 05:47:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:19:37.204 05:47:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:19:37.204 05:47:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:19:37.204 05:47:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:19:37.204 05:47:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:19:37.204 05:47:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:19:37.204 05:47:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:19:37.204 05:47:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:19:37.204 05:47:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:19:37.204 05:47:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid1 '!=' raid1 ']' 00:19:37.204 05:47:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@802 -- # strip_size=0 00:19:37.204 05:47:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:19:37.204 05:47:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.CnffKH956r 00:19:37.204 05:47:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=1187057 00:19:37.204 05:47:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 1187057 /var/tmp/spdk-raid.sock 00:19:37.204 05:47:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:19:37.204 05:47:51 bdev_raid.raid_write_error_test -- 
common/autotest_common.sh@829 -- # '[' -z 1187057 ']' 00:19:37.204 05:47:51 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:19:37.204 05:47:51 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:19:37.204 05:47:51 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:19:37.204 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:19:37.204 05:47:51 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:19:37.204 05:47:51 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:19:37.204 [2024-07-26 05:47:51.972478] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 00:19:37.204 [2024-07-26 05:47:51.972542] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1187057 ] 00:19:37.204 [2024-07-26 05:47:52.088021] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:19:37.463 [2024-07-26 05:47:52.191680] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:19:37.463 [2024-07-26 05:47:52.260728] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:37.463 [2024-07-26 05:47:52.260776] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:38.027 05:47:52 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:19:38.027 05:47:52 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:19:38.027 05:47:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:19:38.027 
05:47:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:19:38.286 BaseBdev1_malloc 00:19:38.286 05:47:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:19:38.544 true 00:19:38.544 05:47:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:19:38.801 [2024-07-26 05:47:53.567730] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:19:38.801 [2024-07-26 05:47:53.567778] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:38.801 [2024-07-26 05:47:53.567797] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xf7b0d0 00:19:38.801 [2024-07-26 05:47:53.567809] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:38.801 [2024-07-26 05:47:53.569510] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:38.801 [2024-07-26 05:47:53.569538] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:19:38.801 BaseBdev1 00:19:38.801 05:47:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:19:38.801 05:47:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:19:39.058 BaseBdev2_malloc 00:19:39.058 05:47:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_error_create BaseBdev2_malloc 00:19:39.317 true 00:19:39.317 05:47:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:19:39.575 [2024-07-26 05:47:54.302224] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:19:39.575 [2024-07-26 05:47:54.302273] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:39.575 [2024-07-26 05:47:54.302293] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xf7f910 00:19:39.575 [2024-07-26 05:47:54.302306] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:39.575 [2024-07-26 05:47:54.303745] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:39.575 [2024-07-26 05:47:54.303772] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:19:39.575 BaseBdev2 00:19:39.575 05:47:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:19:39.575 05:47:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:19:39.833 BaseBdev3_malloc 00:19:39.833 05:47:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:19:40.091 true 00:19:40.091 05:47:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:19:40.348 [2024-07-26 05:47:55.024675] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 
00:19:40.348 [2024-07-26 05:47:55.024722] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:40.348 [2024-07-26 05:47:55.024742] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xf81bd0 00:19:40.348 [2024-07-26 05:47:55.024754] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:40.348 [2024-07-26 05:47:55.026258] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:40.348 [2024-07-26 05:47:55.026286] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:19:40.348 BaseBdev3 00:19:40.348 05:47:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:19:40.606 [2024-07-26 05:47:55.261326] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:19:40.606 [2024-07-26 05:47:55.262525] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:19:40.606 [2024-07-26 05:47:55.262603] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:19:40.606 [2024-07-26 05:47:55.262819] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0xf83280 00:19:40.606 [2024-07-26 05:47:55.262831] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:19:40.606 [2024-07-26 05:47:55.263021] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xf82e20 00:19:40.606 [2024-07-26 05:47:55.263170] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xf83280 00:19:40.606 [2024-07-26 05:47:55.263180] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xf83280 00:19:40.606 [2024-07-26 05:47:55.263277] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: 
raid_bdev_destroy_cb 00:19:40.606 05:47:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:19:40.606 05:47:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:40.606 05:47:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:40.606 05:47:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:40.606 05:47:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:40.606 05:47:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:19:40.606 05:47:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:40.606 05:47:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:40.606 05:47:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:40.606 05:47:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:40.606 05:47:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:40.606 05:47:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:40.863 05:47:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:40.863 "name": "raid_bdev1", 00:19:40.863 "uuid": "6158cf3c-b590-4e68-b0b2-4a30f5cbb841", 00:19:40.863 "strip_size_kb": 0, 00:19:40.863 "state": "online", 00:19:40.863 "raid_level": "raid1", 00:19:40.863 "superblock": true, 00:19:40.863 "num_base_bdevs": 3, 00:19:40.863 "num_base_bdevs_discovered": 3, 00:19:40.863 "num_base_bdevs_operational": 3, 00:19:40.863 "base_bdevs_list": [ 00:19:40.863 { 00:19:40.863 "name": "BaseBdev1", 
00:19:40.863 "uuid": "9d7d202c-3c61-5ba9-a575-ea4134888824", 00:19:40.863 "is_configured": true, 00:19:40.863 "data_offset": 2048, 00:19:40.863 "data_size": 63488 00:19:40.863 }, 00:19:40.863 { 00:19:40.863 "name": "BaseBdev2", 00:19:40.863 "uuid": "0554c40f-a690-5b12-a901-42b2415a935e", 00:19:40.863 "is_configured": true, 00:19:40.863 "data_offset": 2048, 00:19:40.863 "data_size": 63488 00:19:40.863 }, 00:19:40.863 { 00:19:40.863 "name": "BaseBdev3", 00:19:40.863 "uuid": "42d23aed-11b1-5292-bd32-dc1d68cc88e5", 00:19:40.863 "is_configured": true, 00:19:40.863 "data_offset": 2048, 00:19:40.863 "data_size": 63488 00:19:40.863 } 00:19:40.863 ] 00:19:40.863 }' 00:19:40.863 05:47:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:40.863 05:47:55 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:19:41.430 05:47:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:19:41.430 05:47:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:19:41.430 [2024-07-26 05:47:56.204113] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xdd0e00 00:19:42.366 05:47:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:19:42.625 [2024-07-26 05:47:57.332688] bdev_raid.c:2247:_raid_bdev_fail_base_bdev: *NOTICE*: Failing base bdev in slot 0 ('BaseBdev1') of raid bdev 'raid_bdev1' 00:19:42.625 [2024-07-26 05:47:57.332751] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:19:42.625 [2024-07-26 05:47:57.332953] bdev_raid.c:1945:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0xdd0e00 00:19:42.625 05:47:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # 
local expected_num_base_bdevs 00:19:42.625 05:47:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid1 = \r\a\i\d\1 ]] 00:19:42.625 05:47:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ write = \w\r\i\t\e ]] 00:19:42.625 05:47:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # expected_num_base_bdevs=2 00:19:42.625 05:47:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:19:42.625 05:47:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:42.625 05:47:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:42.625 05:47:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:42.625 05:47:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:42.625 05:47:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:19:42.625 05:47:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:42.625 05:47:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:42.625 05:47:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:42.625 05:47:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:42.625 05:47:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:42.625 05:47:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:42.884 05:47:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:42.884 "name": "raid_bdev1", 00:19:42.884 "uuid": "6158cf3c-b590-4e68-b0b2-4a30f5cbb841", 
00:19:42.884 "strip_size_kb": 0, 00:19:42.884 "state": "online", 00:19:42.884 "raid_level": "raid1", 00:19:42.884 "superblock": true, 00:19:42.884 "num_base_bdevs": 3, 00:19:42.884 "num_base_bdevs_discovered": 2, 00:19:42.884 "num_base_bdevs_operational": 2, 00:19:42.884 "base_bdevs_list": [ 00:19:42.884 { 00:19:42.884 "name": null, 00:19:42.884 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:42.884 "is_configured": false, 00:19:42.884 "data_offset": 2048, 00:19:42.884 "data_size": 63488 00:19:42.884 }, 00:19:42.884 { 00:19:42.884 "name": "BaseBdev2", 00:19:42.884 "uuid": "0554c40f-a690-5b12-a901-42b2415a935e", 00:19:42.884 "is_configured": true, 00:19:42.884 "data_offset": 2048, 00:19:42.884 "data_size": 63488 00:19:42.884 }, 00:19:42.884 { 00:19:42.884 "name": "BaseBdev3", 00:19:42.884 "uuid": "42d23aed-11b1-5292-bd32-dc1d68cc88e5", 00:19:42.884 "is_configured": true, 00:19:42.884 "data_offset": 2048, 00:19:42.884 "data_size": 63488 00:19:42.884 } 00:19:42.884 ] 00:19:42.884 }' 00:19:42.884 05:47:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:42.884 05:47:57 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:19:43.451 05:47:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:19:43.451 [2024-07-26 05:47:58.355620] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:19:43.451 [2024-07-26 05:47:58.355666] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:19:43.451 [2024-07-26 05:47:58.358805] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:19:43.451 [2024-07-26 05:47:58.358837] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:43.451 [2024-07-26 05:47:58.358911] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 
0, going to free all in destruct 00:19:43.451 [2024-07-26 05:47:58.358922] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xf83280 name raid_bdev1, state offline 00:19:43.710 0 00:19:43.710 05:47:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 1187057 00:19:43.710 05:47:58 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 1187057 ']' 00:19:43.710 05:47:58 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 1187057 00:19:43.710 05:47:58 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:19:43.710 05:47:58 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:19:43.710 05:47:58 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1187057 00:19:43.710 05:47:58 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:19:43.710 05:47:58 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:19:43.710 05:47:58 bdev_raid.raid_write_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1187057' 00:19:43.710 killing process with pid 1187057 00:19:43.710 05:47:58 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 1187057 00:19:43.710 [2024-07-26 05:47:58.423497] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:19:43.710 05:47:58 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 1187057 00:19:43.710 [2024-07-26 05:47:58.444912] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:19:43.970 05:47:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.CnffKH956r 00:19:43.970 05:47:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:19:43.970 05:47:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk 
'{print $6}' 00:19:43.970 05:47:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.00 00:19:43.970 05:47:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid1 00:19:43.970 05:47:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:19:43.970 05:47:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:19:43.970 05:47:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@845 -- # [[ 0.00 = \0\.\0\0 ]] 00:19:43.970 00:19:43.970 real 0m6.786s 00:19:43.970 user 0m10.693s 00:19:43.970 sys 0m1.209s 00:19:43.970 05:47:58 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:19:43.970 05:47:58 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:19:43.970 ************************************ 00:19:43.970 END TEST raid_write_error_test 00:19:43.970 ************************************ 00:19:43.970 05:47:58 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:19:43.970 05:47:58 bdev_raid -- bdev/bdev_raid.sh@865 -- # for n in {2..4} 00:19:43.970 05:47:58 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:19:43.970 05:47:58 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test raid0 4 false 00:19:43.970 05:47:58 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:19:43.970 05:47:58 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:19:43.970 05:47:58 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:19:43.970 ************************************ 00:19:43.970 START TEST raid_state_function_test 00:19:43.970 ************************************ 00:19:43.970 05:47:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test raid0 4 false 00:19:43.970 05:47:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:19:43.970 
05:47:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:19:43.970 05:47:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:19:43.970 05:47:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:19:43.970 05:47:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:19:43.970 05:47:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:19:43.970 05:47:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:19:43.970 05:47:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:19:43.970 05:47:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:19:43.970 05:47:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:19:43.970 05:47:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:19:43.970 05:47:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:19:43.970 05:47:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:19:43.970 05:47:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:19:43.970 05:47:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:19:43.970 05:47:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:19:43.970 05:47:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:19:43.970 05:47:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:19:43.970 05:47:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:19:43.970 05:47:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 
-- # local base_bdevs 00:19:43.970 05:47:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:19:43.970 05:47:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:19:43.970 05:47:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:19:43.970 05:47:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:19:43.970 05:47:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:19:43.970 05:47:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:19:43.970 05:47:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:19:43.970 05:47:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:19:43.970 05:47:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:19:43.970 05:47:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=1188035 00:19:43.970 05:47:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:19:43.970 05:47:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1188035' 00:19:43.970 Process raid pid: 1188035 00:19:43.970 05:47:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 1188035 /var/tmp/spdk-raid.sock 00:19:43.970 05:47:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 1188035 ']' 00:19:43.970 05:47:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:19:43.970 05:47:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 
00:19:43.970 05:47:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:19:43.970 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:19:43.970 05:47:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:19:43.970 05:47:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:43.970 [2024-07-26 05:47:58.839980] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 00:19:43.970 [2024-07-26 05:47:58.840046] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:19:44.229 [2024-07-26 05:47:58.965783] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:19:44.229 [2024-07-26 05:47:59.073485] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:19:44.505 [2024-07-26 05:47:59.143592] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:44.505 [2024-07-26 05:47:59.143623] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:45.089 05:47:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:19:45.089 05:47:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:19:45.089 05:47:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:19:45.347 [2024-07-26 05:48:00.015077] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:19:45.347 [2024-07-26 
05:48:00.015116] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:19:45.347 [2024-07-26 05:48:00.015128] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:19:45.347 [2024-07-26 05:48:00.015139] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:19:45.347 [2024-07-26 05:48:00.015148] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:19:45.347 [2024-07-26 05:48:00.015160] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:19:45.347 [2024-07-26 05:48:00.015168] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:19:45.347 [2024-07-26 05:48:00.015180] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:19:45.347 05:48:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:19:45.347 05:48:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:45.347 05:48:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:45.347 05:48:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:45.347 05:48:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:45.347 05:48:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:45.347 05:48:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:45.347 05:48:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:45.347 05:48:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:45.347 05:48:00 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:45.347 05:48:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:45.347 05:48:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:45.605 05:48:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:45.605 "name": "Existed_Raid", 00:19:45.605 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:45.605 "strip_size_kb": 64, 00:19:45.605 "state": "configuring", 00:19:45.605 "raid_level": "raid0", 00:19:45.605 "superblock": false, 00:19:45.605 "num_base_bdevs": 4, 00:19:45.605 "num_base_bdevs_discovered": 0, 00:19:45.605 "num_base_bdevs_operational": 4, 00:19:45.605 "base_bdevs_list": [ 00:19:45.605 { 00:19:45.605 "name": "BaseBdev1", 00:19:45.605 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:45.605 "is_configured": false, 00:19:45.605 "data_offset": 0, 00:19:45.605 "data_size": 0 00:19:45.605 }, 00:19:45.605 { 00:19:45.605 "name": "BaseBdev2", 00:19:45.606 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:45.606 "is_configured": false, 00:19:45.606 "data_offset": 0, 00:19:45.606 "data_size": 0 00:19:45.606 }, 00:19:45.606 { 00:19:45.606 "name": "BaseBdev3", 00:19:45.606 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:45.606 "is_configured": false, 00:19:45.606 "data_offset": 0, 00:19:45.606 "data_size": 0 00:19:45.606 }, 00:19:45.606 { 00:19:45.606 "name": "BaseBdev4", 00:19:45.606 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:45.606 "is_configured": false, 00:19:45.606 "data_offset": 0, 00:19:45.606 "data_size": 0 00:19:45.606 } 00:19:45.606 ] 00:19:45.606 }' 00:19:45.606 05:48:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:45.606 05:48:00 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@10 -- # set +x 00:19:46.172 05:48:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:19:46.172 [2024-07-26 05:48:01.053686] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:19:46.172 [2024-07-26 05:48:01.053718] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1cf5aa0 name Existed_Raid, state configuring 00:19:46.172 05:48:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:19:46.430 [2024-07-26 05:48:01.238196] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:19:46.430 [2024-07-26 05:48:01.238225] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:19:46.430 [2024-07-26 05:48:01.238234] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:19:46.430 [2024-07-26 05:48:01.238245] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:19:46.430 [2024-07-26 05:48:01.238254] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:19:46.430 [2024-07-26 05:48:01.238266] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:19:46.430 [2024-07-26 05:48:01.238274] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:19:46.430 [2024-07-26 05:48:01.238285] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:19:46.430 05:48:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:19:46.688 [2024-07-26 05:48:01.428459] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:19:46.688 BaseBdev1 00:19:46.688 05:48:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:19:46.688 05:48:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:19:46.688 05:48:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:46.688 05:48:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:19:46.688 05:48:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:46.688 05:48:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:46.688 05:48:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:46.946 05:48:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:19:47.204 [ 00:19:47.204 { 00:19:47.204 "name": "BaseBdev1", 00:19:47.204 "aliases": [ 00:19:47.204 "8cd37e3b-10be-464e-8683-fb477550478c" 00:19:47.204 ], 00:19:47.204 "product_name": "Malloc disk", 00:19:47.204 "block_size": 512, 00:19:47.204 "num_blocks": 65536, 00:19:47.204 "uuid": "8cd37e3b-10be-464e-8683-fb477550478c", 00:19:47.204 "assigned_rate_limits": { 00:19:47.204 "rw_ios_per_sec": 0, 00:19:47.204 "rw_mbytes_per_sec": 0, 00:19:47.204 "r_mbytes_per_sec": 0, 00:19:47.204 "w_mbytes_per_sec": 0 00:19:47.204 }, 00:19:47.204 "claimed": true, 00:19:47.204 "claim_type": "exclusive_write", 00:19:47.204 "zoned": false, 00:19:47.204 "supported_io_types": { 00:19:47.204 "read": 
true, 00:19:47.204 "write": true, 00:19:47.204 "unmap": true, 00:19:47.204 "flush": true, 00:19:47.204 "reset": true, 00:19:47.204 "nvme_admin": false, 00:19:47.204 "nvme_io": false, 00:19:47.204 "nvme_io_md": false, 00:19:47.204 "write_zeroes": true, 00:19:47.204 "zcopy": true, 00:19:47.204 "get_zone_info": false, 00:19:47.204 "zone_management": false, 00:19:47.204 "zone_append": false, 00:19:47.204 "compare": false, 00:19:47.204 "compare_and_write": false, 00:19:47.204 "abort": true, 00:19:47.204 "seek_hole": false, 00:19:47.204 "seek_data": false, 00:19:47.204 "copy": true, 00:19:47.204 "nvme_iov_md": false 00:19:47.204 }, 00:19:47.204 "memory_domains": [ 00:19:47.204 { 00:19:47.204 "dma_device_id": "system", 00:19:47.204 "dma_device_type": 1 00:19:47.204 }, 00:19:47.204 { 00:19:47.204 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:47.204 "dma_device_type": 2 00:19:47.204 } 00:19:47.204 ], 00:19:47.204 "driver_specific": {} 00:19:47.204 } 00:19:47.204 ] 00:19:47.204 05:48:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:19:47.204 05:48:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:19:47.204 05:48:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:47.204 05:48:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:47.204 05:48:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:47.204 05:48:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:47.204 05:48:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:47.204 05:48:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:47.204 05:48:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # 
local num_base_bdevs 00:19:47.204 05:48:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:47.204 05:48:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:47.204 05:48:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:47.204 05:48:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:47.462 05:48:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:47.462 "name": "Existed_Raid", 00:19:47.462 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:47.462 "strip_size_kb": 64, 00:19:47.462 "state": "configuring", 00:19:47.462 "raid_level": "raid0", 00:19:47.462 "superblock": false, 00:19:47.462 "num_base_bdevs": 4, 00:19:47.463 "num_base_bdevs_discovered": 1, 00:19:47.463 "num_base_bdevs_operational": 4, 00:19:47.463 "base_bdevs_list": [ 00:19:47.463 { 00:19:47.463 "name": "BaseBdev1", 00:19:47.463 "uuid": "8cd37e3b-10be-464e-8683-fb477550478c", 00:19:47.463 "is_configured": true, 00:19:47.463 "data_offset": 0, 00:19:47.463 "data_size": 65536 00:19:47.463 }, 00:19:47.463 { 00:19:47.463 "name": "BaseBdev2", 00:19:47.463 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:47.463 "is_configured": false, 00:19:47.463 "data_offset": 0, 00:19:47.463 "data_size": 0 00:19:47.463 }, 00:19:47.463 { 00:19:47.463 "name": "BaseBdev3", 00:19:47.463 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:47.463 "is_configured": false, 00:19:47.463 "data_offset": 0, 00:19:47.463 "data_size": 0 00:19:47.463 }, 00:19:47.463 { 00:19:47.463 "name": "BaseBdev4", 00:19:47.463 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:47.463 "is_configured": false, 00:19:47.463 "data_offset": 0, 00:19:47.463 "data_size": 0 00:19:47.463 } 00:19:47.463 ] 00:19:47.463 }' 
00:19:47.463 05:48:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:47.463 05:48:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:48.030 05:48:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:19:48.289 [2024-07-26 05:48:02.960526] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:19:48.289 [2024-07-26 05:48:02.960563] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1cf5310 name Existed_Raid, state configuring 00:19:48.289 05:48:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:19:48.548 [2024-07-26 05:48:03.205208] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:19:48.548 [2024-07-26 05:48:03.206651] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:19:48.548 [2024-07-26 05:48:03.206683] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:19:48.548 [2024-07-26 05:48:03.206694] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:19:48.548 [2024-07-26 05:48:03.206705] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:19:48.548 [2024-07-26 05:48:03.206714] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:19:48.548 [2024-07-26 05:48:03.206725] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:19:48.548 05:48:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:19:48.548 05:48:03 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:19:48.548 05:48:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:19:48.548 05:48:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:48.548 05:48:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:48.548 05:48:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:48.548 05:48:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:48.548 05:48:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:48.548 05:48:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:48.548 05:48:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:48.548 05:48:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:48.548 05:48:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:48.548 05:48:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:48.548 05:48:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:48.807 05:48:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:48.807 "name": "Existed_Raid", 00:19:48.807 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:48.807 "strip_size_kb": 64, 00:19:48.807 "state": "configuring", 00:19:48.807 "raid_level": "raid0", 00:19:48.807 "superblock": false, 00:19:48.807 "num_base_bdevs": 4, 00:19:48.807 
"num_base_bdevs_discovered": 1, 00:19:48.807 "num_base_bdevs_operational": 4, 00:19:48.807 "base_bdevs_list": [ 00:19:48.807 { 00:19:48.807 "name": "BaseBdev1", 00:19:48.807 "uuid": "8cd37e3b-10be-464e-8683-fb477550478c", 00:19:48.807 "is_configured": true, 00:19:48.807 "data_offset": 0, 00:19:48.807 "data_size": 65536 00:19:48.807 }, 00:19:48.807 { 00:19:48.807 "name": "BaseBdev2", 00:19:48.807 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:48.807 "is_configured": false, 00:19:48.807 "data_offset": 0, 00:19:48.807 "data_size": 0 00:19:48.807 }, 00:19:48.807 { 00:19:48.807 "name": "BaseBdev3", 00:19:48.807 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:48.807 "is_configured": false, 00:19:48.807 "data_offset": 0, 00:19:48.807 "data_size": 0 00:19:48.807 }, 00:19:48.807 { 00:19:48.807 "name": "BaseBdev4", 00:19:48.807 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:48.807 "is_configured": false, 00:19:48.807 "data_offset": 0, 00:19:48.807 "data_size": 0 00:19:48.807 } 00:19:48.807 ] 00:19:48.807 }' 00:19:48.807 05:48:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:48.807 05:48:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:49.375 05:48:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:19:49.375 [2024-07-26 05:48:04.215311] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:19:49.375 BaseBdev2 00:19:49.375 05:48:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:19:49.375 05:48:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:19:49.375 05:48:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:49.375 05:48:04 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:19:49.375 05:48:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:49.375 05:48:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:49.375 05:48:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:49.634 05:48:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:19:49.893 [ 00:19:49.893 { 00:19:49.893 "name": "BaseBdev2", 00:19:49.893 "aliases": [ 00:19:49.893 "257e812a-2808-4fdc-9868-7c0c102d8540" 00:19:49.893 ], 00:19:49.893 "product_name": "Malloc disk", 00:19:49.893 "block_size": 512, 00:19:49.893 "num_blocks": 65536, 00:19:49.893 "uuid": "257e812a-2808-4fdc-9868-7c0c102d8540", 00:19:49.893 "assigned_rate_limits": { 00:19:49.893 "rw_ios_per_sec": 0, 00:19:49.893 "rw_mbytes_per_sec": 0, 00:19:49.893 "r_mbytes_per_sec": 0, 00:19:49.893 "w_mbytes_per_sec": 0 00:19:49.893 }, 00:19:49.893 "claimed": true, 00:19:49.893 "claim_type": "exclusive_write", 00:19:49.893 "zoned": false, 00:19:49.893 "supported_io_types": { 00:19:49.893 "read": true, 00:19:49.893 "write": true, 00:19:49.893 "unmap": true, 00:19:49.893 "flush": true, 00:19:49.893 "reset": true, 00:19:49.893 "nvme_admin": false, 00:19:49.893 "nvme_io": false, 00:19:49.893 "nvme_io_md": false, 00:19:49.893 "write_zeroes": true, 00:19:49.893 "zcopy": true, 00:19:49.893 "get_zone_info": false, 00:19:49.893 "zone_management": false, 00:19:49.893 "zone_append": false, 00:19:49.893 "compare": false, 00:19:49.893 "compare_and_write": false, 00:19:49.893 "abort": true, 00:19:49.893 "seek_hole": false, 00:19:49.893 "seek_data": false, 00:19:49.893 "copy": 
true, 00:19:49.893 "nvme_iov_md": false 00:19:49.893 }, 00:19:49.893 "memory_domains": [ 00:19:49.893 { 00:19:49.893 "dma_device_id": "system", 00:19:49.893 "dma_device_type": 1 00:19:49.893 }, 00:19:49.893 { 00:19:49.893 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:49.893 "dma_device_type": 2 00:19:49.893 } 00:19:49.893 ], 00:19:49.893 "driver_specific": {} 00:19:49.893 } 00:19:49.893 ] 00:19:49.893 05:48:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:19:49.893 05:48:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:19:49.893 05:48:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:19:49.893 05:48:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:19:49.893 05:48:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:49.893 05:48:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:49.893 05:48:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:49.893 05:48:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:49.893 05:48:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:49.893 05:48:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:49.893 05:48:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:49.893 05:48:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:49.893 05:48:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:49.893 05:48:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:49.893 05:48:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:50.152 05:48:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:50.152 "name": "Existed_Raid", 00:19:50.152 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:50.152 "strip_size_kb": 64, 00:19:50.152 "state": "configuring", 00:19:50.152 "raid_level": "raid0", 00:19:50.152 "superblock": false, 00:19:50.152 "num_base_bdevs": 4, 00:19:50.152 "num_base_bdevs_discovered": 2, 00:19:50.152 "num_base_bdevs_operational": 4, 00:19:50.152 "base_bdevs_list": [ 00:19:50.152 { 00:19:50.152 "name": "BaseBdev1", 00:19:50.152 "uuid": "8cd37e3b-10be-464e-8683-fb477550478c", 00:19:50.152 "is_configured": true, 00:19:50.152 "data_offset": 0, 00:19:50.152 "data_size": 65536 00:19:50.152 }, 00:19:50.152 { 00:19:50.152 "name": "BaseBdev2", 00:19:50.152 "uuid": "257e812a-2808-4fdc-9868-7c0c102d8540", 00:19:50.152 "is_configured": true, 00:19:50.152 "data_offset": 0, 00:19:50.152 "data_size": 65536 00:19:50.152 }, 00:19:50.152 { 00:19:50.152 "name": "BaseBdev3", 00:19:50.152 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:50.152 "is_configured": false, 00:19:50.152 "data_offset": 0, 00:19:50.152 "data_size": 0 00:19:50.152 }, 00:19:50.152 { 00:19:50.152 "name": "BaseBdev4", 00:19:50.152 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:50.152 "is_configured": false, 00:19:50.152 "data_offset": 0, 00:19:50.152 "data_size": 0 00:19:50.152 } 00:19:50.152 ] 00:19:50.152 }' 00:19:50.152 05:48:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:50.152 05:48:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:50.720 05:48:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:19:50.980 [2024-07-26 05:48:05.750753] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:19:50.980 BaseBdev3 00:19:50.980 05:48:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:19:50.980 05:48:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:19:50.980 05:48:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:50.980 05:48:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:19:50.980 05:48:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:50.980 05:48:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:50.980 05:48:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:51.239 05:48:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:19:51.498 [ 00:19:51.498 { 00:19:51.498 "name": "BaseBdev3", 00:19:51.498 "aliases": [ 00:19:51.498 "32d29f07-8b5a-4bb6-a1cc-da2932e3dc19" 00:19:51.498 ], 00:19:51.498 "product_name": "Malloc disk", 00:19:51.498 "block_size": 512, 00:19:51.498 "num_blocks": 65536, 00:19:51.498 "uuid": "32d29f07-8b5a-4bb6-a1cc-da2932e3dc19", 00:19:51.498 "assigned_rate_limits": { 00:19:51.498 "rw_ios_per_sec": 0, 00:19:51.498 "rw_mbytes_per_sec": 0, 00:19:51.498 "r_mbytes_per_sec": 0, 00:19:51.498 "w_mbytes_per_sec": 0 00:19:51.498 }, 00:19:51.498 "claimed": true, 00:19:51.498 "claim_type": "exclusive_write", 00:19:51.498 "zoned": 
false, 00:19:51.498 "supported_io_types": { 00:19:51.498 "read": true, 00:19:51.498 "write": true, 00:19:51.498 "unmap": true, 00:19:51.498 "flush": true, 00:19:51.498 "reset": true, 00:19:51.498 "nvme_admin": false, 00:19:51.498 "nvme_io": false, 00:19:51.498 "nvme_io_md": false, 00:19:51.498 "write_zeroes": true, 00:19:51.498 "zcopy": true, 00:19:51.498 "get_zone_info": false, 00:19:51.498 "zone_management": false, 00:19:51.498 "zone_append": false, 00:19:51.498 "compare": false, 00:19:51.498 "compare_and_write": false, 00:19:51.498 "abort": true, 00:19:51.498 "seek_hole": false, 00:19:51.498 "seek_data": false, 00:19:51.498 "copy": true, 00:19:51.498 "nvme_iov_md": false 00:19:51.498 }, 00:19:51.498 "memory_domains": [ 00:19:51.498 { 00:19:51.498 "dma_device_id": "system", 00:19:51.498 "dma_device_type": 1 00:19:51.498 }, 00:19:51.498 { 00:19:51.498 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:51.498 "dma_device_type": 2 00:19:51.498 } 00:19:51.498 ], 00:19:51.498 "driver_specific": {} 00:19:51.498 } 00:19:51.498 ] 00:19:51.498 05:48:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:19:51.498 05:48:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:19:51.498 05:48:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:19:51.498 05:48:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:19:51.498 05:48:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:51.498 05:48:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:51.498 05:48:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:51.498 05:48:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:51.498 05:48:06 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:51.498 05:48:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:51.498 05:48:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:51.498 05:48:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:51.498 05:48:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:51.498 05:48:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:51.498 05:48:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:51.757 05:48:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:51.757 "name": "Existed_Raid", 00:19:51.757 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:51.757 "strip_size_kb": 64, 00:19:51.757 "state": "configuring", 00:19:51.757 "raid_level": "raid0", 00:19:51.757 "superblock": false, 00:19:51.757 "num_base_bdevs": 4, 00:19:51.757 "num_base_bdevs_discovered": 3, 00:19:51.757 "num_base_bdevs_operational": 4, 00:19:51.757 "base_bdevs_list": [ 00:19:51.757 { 00:19:51.757 "name": "BaseBdev1", 00:19:51.757 "uuid": "8cd37e3b-10be-464e-8683-fb477550478c", 00:19:51.757 "is_configured": true, 00:19:51.757 "data_offset": 0, 00:19:51.757 "data_size": 65536 00:19:51.757 }, 00:19:51.757 { 00:19:51.757 "name": "BaseBdev2", 00:19:51.757 "uuid": "257e812a-2808-4fdc-9868-7c0c102d8540", 00:19:51.757 "is_configured": true, 00:19:51.757 "data_offset": 0, 00:19:51.757 "data_size": 65536 00:19:51.757 }, 00:19:51.757 { 00:19:51.757 "name": "BaseBdev3", 00:19:51.757 "uuid": "32d29f07-8b5a-4bb6-a1cc-da2932e3dc19", 00:19:51.757 "is_configured": true, 00:19:51.757 "data_offset": 0, 
00:19:51.757 "data_size": 65536 00:19:51.757 }, 00:19:51.757 { 00:19:51.757 "name": "BaseBdev4", 00:19:51.757 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:51.757 "is_configured": false, 00:19:51.757 "data_offset": 0, 00:19:51.757 "data_size": 0 00:19:51.757 } 00:19:51.757 ] 00:19:51.757 }' 00:19:51.757 05:48:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:51.757 05:48:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:52.325 05:48:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:19:52.325 [2024-07-26 05:48:07.161951] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:19:52.325 [2024-07-26 05:48:07.161986] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1cf6350 00:19:52.325 [2024-07-26 05:48:07.161995] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 262144, blocklen 512 00:19:52.325 [2024-07-26 05:48:07.162242] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1cf6020 00:19:52.325 [2024-07-26 05:48:07.162362] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1cf6350 00:19:52.325 [2024-07-26 05:48:07.162377] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1cf6350 00:19:52.325 [2024-07-26 05:48:07.162542] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:52.325 BaseBdev4 00:19:52.325 05:48:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:19:52.325 05:48:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:19:52.325 05:48:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:52.325 05:48:07 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:19:52.325 05:48:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:52.325 05:48:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:52.325 05:48:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:52.585 05:48:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:19:52.844 [ 00:19:52.844 { 00:19:52.844 "name": "BaseBdev4", 00:19:52.844 "aliases": [ 00:19:52.844 "b40e3f3e-a5e9-4133-a634-07225ea46d66" 00:19:52.844 ], 00:19:52.844 "product_name": "Malloc disk", 00:19:52.844 "block_size": 512, 00:19:52.844 "num_blocks": 65536, 00:19:52.844 "uuid": "b40e3f3e-a5e9-4133-a634-07225ea46d66", 00:19:52.844 "assigned_rate_limits": { 00:19:52.844 "rw_ios_per_sec": 0, 00:19:52.844 "rw_mbytes_per_sec": 0, 00:19:52.844 "r_mbytes_per_sec": 0, 00:19:52.844 "w_mbytes_per_sec": 0 00:19:52.844 }, 00:19:52.844 "claimed": true, 00:19:52.844 "claim_type": "exclusive_write", 00:19:52.844 "zoned": false, 00:19:52.844 "supported_io_types": { 00:19:52.844 "read": true, 00:19:52.844 "write": true, 00:19:52.844 "unmap": true, 00:19:52.844 "flush": true, 00:19:52.844 "reset": true, 00:19:52.844 "nvme_admin": false, 00:19:52.844 "nvme_io": false, 00:19:52.844 "nvme_io_md": false, 00:19:52.844 "write_zeroes": true, 00:19:52.844 "zcopy": true, 00:19:52.844 "get_zone_info": false, 00:19:52.844 "zone_management": false, 00:19:52.844 "zone_append": false, 00:19:52.844 "compare": false, 00:19:52.844 "compare_and_write": false, 00:19:52.844 "abort": true, 00:19:52.844 "seek_hole": false, 00:19:52.844 "seek_data": false, 00:19:52.844 "copy": 
true, 00:19:52.844 "nvme_iov_md": false 00:19:52.844 }, 00:19:52.844 "memory_domains": [ 00:19:52.844 { 00:19:52.844 "dma_device_id": "system", 00:19:52.844 "dma_device_type": 1 00:19:52.844 }, 00:19:52.844 { 00:19:52.844 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:52.844 "dma_device_type": 2 00:19:52.844 } 00:19:52.844 ], 00:19:52.844 "driver_specific": {} 00:19:52.844 } 00:19:52.844 ] 00:19:52.844 05:48:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:19:52.844 05:48:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:19:52.844 05:48:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:19:52.844 05:48:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 4 00:19:52.844 05:48:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:52.844 05:48:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:52.844 05:48:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:52.844 05:48:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:52.844 05:48:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:52.844 05:48:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:52.844 05:48:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:52.844 05:48:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:52.844 05:48:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:52.844 05:48:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:52.844 05:48:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:52.844 05:48:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:52.844 "name": "Existed_Raid", 00:19:52.844 "uuid": "4342f879-0ae3-4f98-afeb-2589e264af94", 00:19:52.844 "strip_size_kb": 64, 00:19:52.844 "state": "online", 00:19:52.844 "raid_level": "raid0", 00:19:52.844 "superblock": false, 00:19:52.844 "num_base_bdevs": 4, 00:19:52.844 "num_base_bdevs_discovered": 4, 00:19:52.844 "num_base_bdevs_operational": 4, 00:19:52.844 "base_bdevs_list": [ 00:19:52.844 { 00:19:52.844 "name": "BaseBdev1", 00:19:52.844 "uuid": "8cd37e3b-10be-464e-8683-fb477550478c", 00:19:52.844 "is_configured": true, 00:19:52.844 "data_offset": 0, 00:19:52.844 "data_size": 65536 00:19:52.844 }, 00:19:52.844 { 00:19:52.844 "name": "BaseBdev2", 00:19:52.844 "uuid": "257e812a-2808-4fdc-9868-7c0c102d8540", 00:19:52.844 "is_configured": true, 00:19:52.844 "data_offset": 0, 00:19:52.844 "data_size": 65536 00:19:52.844 }, 00:19:52.844 { 00:19:52.844 "name": "BaseBdev3", 00:19:52.844 "uuid": "32d29f07-8b5a-4bb6-a1cc-da2932e3dc19", 00:19:52.844 "is_configured": true, 00:19:52.844 "data_offset": 0, 00:19:52.844 "data_size": 65536 00:19:52.844 }, 00:19:52.844 { 00:19:52.844 "name": "BaseBdev4", 00:19:52.844 "uuid": "b40e3f3e-a5e9-4133-a634-07225ea46d66", 00:19:52.844 "is_configured": true, 00:19:52.844 "data_offset": 0, 00:19:52.844 "data_size": 65536 00:19:52.844 } 00:19:52.844 ] 00:19:52.844 }' 00:19:52.844 05:48:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:52.844 05:48:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:53.411 05:48:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties 
Existed_Raid 00:19:53.411 05:48:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:19:53.411 05:48:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:19:53.411 05:48:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:19:53.411 05:48:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:19:53.411 05:48:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:19:53.411 05:48:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:19:53.411 05:48:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:19:53.671 [2024-07-26 05:48:08.457728] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:19:53.671 05:48:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:19:53.671 "name": "Existed_Raid", 00:19:53.671 "aliases": [ 00:19:53.671 "4342f879-0ae3-4f98-afeb-2589e264af94" 00:19:53.671 ], 00:19:53.671 "product_name": "Raid Volume", 00:19:53.671 "block_size": 512, 00:19:53.671 "num_blocks": 262144, 00:19:53.671 "uuid": "4342f879-0ae3-4f98-afeb-2589e264af94", 00:19:53.671 "assigned_rate_limits": { 00:19:53.671 "rw_ios_per_sec": 0, 00:19:53.671 "rw_mbytes_per_sec": 0, 00:19:53.671 "r_mbytes_per_sec": 0, 00:19:53.671 "w_mbytes_per_sec": 0 00:19:53.671 }, 00:19:53.671 "claimed": false, 00:19:53.671 "zoned": false, 00:19:53.671 "supported_io_types": { 00:19:53.671 "read": true, 00:19:53.671 "write": true, 00:19:53.671 "unmap": true, 00:19:53.671 "flush": true, 00:19:53.671 "reset": true, 00:19:53.671 "nvme_admin": false, 00:19:53.671 "nvme_io": false, 00:19:53.671 "nvme_io_md": false, 00:19:53.671 "write_zeroes": true, 00:19:53.671 "zcopy": false, 00:19:53.671 
"get_zone_info": false, 00:19:53.671 "zone_management": false, 00:19:53.671 "zone_append": false, 00:19:53.671 "compare": false, 00:19:53.671 "compare_and_write": false, 00:19:53.671 "abort": false, 00:19:53.671 "seek_hole": false, 00:19:53.671 "seek_data": false, 00:19:53.671 "copy": false, 00:19:53.671 "nvme_iov_md": false 00:19:53.671 }, 00:19:53.671 "memory_domains": [ 00:19:53.671 { 00:19:53.671 "dma_device_id": "system", 00:19:53.671 "dma_device_type": 1 00:19:53.671 }, 00:19:53.671 { 00:19:53.671 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:53.671 "dma_device_type": 2 00:19:53.671 }, 00:19:53.671 { 00:19:53.671 "dma_device_id": "system", 00:19:53.671 "dma_device_type": 1 00:19:53.671 }, 00:19:53.671 { 00:19:53.671 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:53.671 "dma_device_type": 2 00:19:53.671 }, 00:19:53.671 { 00:19:53.671 "dma_device_id": "system", 00:19:53.671 "dma_device_type": 1 00:19:53.671 }, 00:19:53.671 { 00:19:53.671 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:53.671 "dma_device_type": 2 00:19:53.671 }, 00:19:53.671 { 00:19:53.671 "dma_device_id": "system", 00:19:53.671 "dma_device_type": 1 00:19:53.671 }, 00:19:53.671 { 00:19:53.671 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:53.671 "dma_device_type": 2 00:19:53.671 } 00:19:53.671 ], 00:19:53.671 "driver_specific": { 00:19:53.671 "raid": { 00:19:53.671 "uuid": "4342f879-0ae3-4f98-afeb-2589e264af94", 00:19:53.671 "strip_size_kb": 64, 00:19:53.671 "state": "online", 00:19:53.671 "raid_level": "raid0", 00:19:53.671 "superblock": false, 00:19:53.671 "num_base_bdevs": 4, 00:19:53.671 "num_base_bdevs_discovered": 4, 00:19:53.671 "num_base_bdevs_operational": 4, 00:19:53.671 "base_bdevs_list": [ 00:19:53.671 { 00:19:53.671 "name": "BaseBdev1", 00:19:53.671 "uuid": "8cd37e3b-10be-464e-8683-fb477550478c", 00:19:53.671 "is_configured": true, 00:19:53.671 "data_offset": 0, 00:19:53.671 "data_size": 65536 00:19:53.671 }, 00:19:53.671 { 00:19:53.671 "name": "BaseBdev2", 00:19:53.671 "uuid": 
"257e812a-2808-4fdc-9868-7c0c102d8540", 00:19:53.671 "is_configured": true, 00:19:53.671 "data_offset": 0, 00:19:53.671 "data_size": 65536 00:19:53.671 }, 00:19:53.671 { 00:19:53.671 "name": "BaseBdev3", 00:19:53.671 "uuid": "32d29f07-8b5a-4bb6-a1cc-da2932e3dc19", 00:19:53.671 "is_configured": true, 00:19:53.671 "data_offset": 0, 00:19:53.671 "data_size": 65536 00:19:53.671 }, 00:19:53.671 { 00:19:53.671 "name": "BaseBdev4", 00:19:53.671 "uuid": "b40e3f3e-a5e9-4133-a634-07225ea46d66", 00:19:53.671 "is_configured": true, 00:19:53.671 "data_offset": 0, 00:19:53.671 "data_size": 65536 00:19:53.671 } 00:19:53.671 ] 00:19:53.671 } 00:19:53.671 } 00:19:53.671 }' 00:19:53.671 05:48:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:19:53.671 05:48:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:19:53.671 BaseBdev2 00:19:53.671 BaseBdev3 00:19:53.671 BaseBdev4' 00:19:53.671 05:48:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:53.671 05:48:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:19:53.671 05:48:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:53.930 05:48:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:53.930 "name": "BaseBdev1", 00:19:53.930 "aliases": [ 00:19:53.930 "8cd37e3b-10be-464e-8683-fb477550478c" 00:19:53.930 ], 00:19:53.930 "product_name": "Malloc disk", 00:19:53.930 "block_size": 512, 00:19:53.930 "num_blocks": 65536, 00:19:53.930 "uuid": "8cd37e3b-10be-464e-8683-fb477550478c", 00:19:53.930 "assigned_rate_limits": { 00:19:53.930 "rw_ios_per_sec": 0, 00:19:53.930 "rw_mbytes_per_sec": 0, 00:19:53.930 "r_mbytes_per_sec": 0, 
00:19:53.930 "w_mbytes_per_sec": 0 00:19:53.930 }, 00:19:53.930 "claimed": true, 00:19:53.930 "claim_type": "exclusive_write", 00:19:53.930 "zoned": false, 00:19:53.930 "supported_io_types": { 00:19:53.930 "read": true, 00:19:53.930 "write": true, 00:19:53.930 "unmap": true, 00:19:53.930 "flush": true, 00:19:53.930 "reset": true, 00:19:53.930 "nvme_admin": false, 00:19:53.930 "nvme_io": false, 00:19:53.930 "nvme_io_md": false, 00:19:53.930 "write_zeroes": true, 00:19:53.930 "zcopy": true, 00:19:53.930 "get_zone_info": false, 00:19:53.930 "zone_management": false, 00:19:53.930 "zone_append": false, 00:19:53.930 "compare": false, 00:19:53.930 "compare_and_write": false, 00:19:53.930 "abort": true, 00:19:53.930 "seek_hole": false, 00:19:53.930 "seek_data": false, 00:19:53.930 "copy": true, 00:19:53.930 "nvme_iov_md": false 00:19:53.930 }, 00:19:53.930 "memory_domains": [ 00:19:53.930 { 00:19:53.930 "dma_device_id": "system", 00:19:53.930 "dma_device_type": 1 00:19:53.930 }, 00:19:53.930 { 00:19:53.930 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:53.930 "dma_device_type": 2 00:19:53.930 } 00:19:53.930 ], 00:19:53.930 "driver_specific": {} 00:19:53.930 }' 00:19:53.930 05:48:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:53.930 05:48:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:53.930 05:48:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:53.930 05:48:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:54.189 05:48:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:54.189 05:48:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:54.189 05:48:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:54.189 05:48:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 
00:19:54.189 05:48:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:54.189 05:48:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:54.189 05:48:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:54.189 05:48:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:54.189 05:48:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:54.189 05:48:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:19:54.189 05:48:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:54.448 05:48:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:54.448 "name": "BaseBdev2", 00:19:54.448 "aliases": [ 00:19:54.448 "257e812a-2808-4fdc-9868-7c0c102d8540" 00:19:54.448 ], 00:19:54.448 "product_name": "Malloc disk", 00:19:54.448 "block_size": 512, 00:19:54.448 "num_blocks": 65536, 00:19:54.448 "uuid": "257e812a-2808-4fdc-9868-7c0c102d8540", 00:19:54.448 "assigned_rate_limits": { 00:19:54.448 "rw_ios_per_sec": 0, 00:19:54.448 "rw_mbytes_per_sec": 0, 00:19:54.448 "r_mbytes_per_sec": 0, 00:19:54.448 "w_mbytes_per_sec": 0 00:19:54.448 }, 00:19:54.448 "claimed": true, 00:19:54.448 "claim_type": "exclusive_write", 00:19:54.448 "zoned": false, 00:19:54.448 "supported_io_types": { 00:19:54.448 "read": true, 00:19:54.448 "write": true, 00:19:54.448 "unmap": true, 00:19:54.448 "flush": true, 00:19:54.448 "reset": true, 00:19:54.448 "nvme_admin": false, 00:19:54.448 "nvme_io": false, 00:19:54.448 "nvme_io_md": false, 00:19:54.448 "write_zeroes": true, 00:19:54.448 "zcopy": true, 00:19:54.448 "get_zone_info": false, 00:19:54.448 "zone_management": false, 00:19:54.448 "zone_append": false, 00:19:54.448 
"compare": false, 00:19:54.448 "compare_and_write": false, 00:19:54.448 "abort": true, 00:19:54.448 "seek_hole": false, 00:19:54.448 "seek_data": false, 00:19:54.448 "copy": true, 00:19:54.448 "nvme_iov_md": false 00:19:54.448 }, 00:19:54.448 "memory_domains": [ 00:19:54.448 { 00:19:54.448 "dma_device_id": "system", 00:19:54.448 "dma_device_type": 1 00:19:54.448 }, 00:19:54.448 { 00:19:54.448 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:54.448 "dma_device_type": 2 00:19:54.448 } 00:19:54.448 ], 00:19:54.448 "driver_specific": {} 00:19:54.448 }' 00:19:54.448 05:48:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:54.706 05:48:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:54.706 05:48:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:54.706 05:48:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:54.706 05:48:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:54.706 05:48:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:54.706 05:48:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:54.706 05:48:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:54.706 05:48:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:54.706 05:48:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:54.965 05:48:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:54.965 05:48:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:54.965 05:48:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:54.965 05:48:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:19:54.965 05:48:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:55.224 05:48:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:55.224 "name": "BaseBdev3", 00:19:55.224 "aliases": [ 00:19:55.224 "32d29f07-8b5a-4bb6-a1cc-da2932e3dc19" 00:19:55.224 ], 00:19:55.224 "product_name": "Malloc disk", 00:19:55.224 "block_size": 512, 00:19:55.224 "num_blocks": 65536, 00:19:55.224 "uuid": "32d29f07-8b5a-4bb6-a1cc-da2932e3dc19", 00:19:55.224 "assigned_rate_limits": { 00:19:55.224 "rw_ios_per_sec": 0, 00:19:55.224 "rw_mbytes_per_sec": 0, 00:19:55.224 "r_mbytes_per_sec": 0, 00:19:55.224 "w_mbytes_per_sec": 0 00:19:55.224 }, 00:19:55.224 "claimed": true, 00:19:55.224 "claim_type": "exclusive_write", 00:19:55.224 "zoned": false, 00:19:55.224 "supported_io_types": { 00:19:55.224 "read": true, 00:19:55.224 "write": true, 00:19:55.224 "unmap": true, 00:19:55.224 "flush": true, 00:19:55.224 "reset": true, 00:19:55.224 "nvme_admin": false, 00:19:55.224 "nvme_io": false, 00:19:55.224 "nvme_io_md": false, 00:19:55.224 "write_zeroes": true, 00:19:55.224 "zcopy": true, 00:19:55.224 "get_zone_info": false, 00:19:55.224 "zone_management": false, 00:19:55.224 "zone_append": false, 00:19:55.224 "compare": false, 00:19:55.224 "compare_and_write": false, 00:19:55.224 "abort": true, 00:19:55.224 "seek_hole": false, 00:19:55.224 "seek_data": false, 00:19:55.224 "copy": true, 00:19:55.224 "nvme_iov_md": false 00:19:55.224 }, 00:19:55.224 "memory_domains": [ 00:19:55.224 { 00:19:55.224 "dma_device_id": "system", 00:19:55.224 "dma_device_type": 1 00:19:55.224 }, 00:19:55.224 { 00:19:55.224 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:55.224 "dma_device_type": 2 00:19:55.224 } 00:19:55.224 ], 00:19:55.224 "driver_specific": {} 00:19:55.224 }' 00:19:55.224 05:48:09 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:55.224 05:48:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:55.224 05:48:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:55.224 05:48:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:55.224 05:48:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:55.224 05:48:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:55.224 05:48:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:55.483 05:48:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:55.483 05:48:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:55.483 05:48:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:55.483 05:48:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:55.483 05:48:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:55.483 05:48:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:55.483 05:48:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:19:55.483 05:48:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:55.741 05:48:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:55.741 "name": "BaseBdev4", 00:19:55.741 "aliases": [ 00:19:55.741 "b40e3f3e-a5e9-4133-a634-07225ea46d66" 00:19:55.741 ], 00:19:55.741 "product_name": "Malloc disk", 00:19:55.741 "block_size": 512, 00:19:55.741 "num_blocks": 65536, 00:19:55.741 "uuid": "b40e3f3e-a5e9-4133-a634-07225ea46d66", 
00:19:55.741 "assigned_rate_limits": { 00:19:55.741 "rw_ios_per_sec": 0, 00:19:55.741 "rw_mbytes_per_sec": 0, 00:19:55.741 "r_mbytes_per_sec": 0, 00:19:55.741 "w_mbytes_per_sec": 0 00:19:55.741 }, 00:19:55.741 "claimed": true, 00:19:55.741 "claim_type": "exclusive_write", 00:19:55.741 "zoned": false, 00:19:55.741 "supported_io_types": { 00:19:55.741 "read": true, 00:19:55.741 "write": true, 00:19:55.741 "unmap": true, 00:19:55.741 "flush": true, 00:19:55.741 "reset": true, 00:19:55.741 "nvme_admin": false, 00:19:55.741 "nvme_io": false, 00:19:55.741 "nvme_io_md": false, 00:19:55.741 "write_zeroes": true, 00:19:55.741 "zcopy": true, 00:19:55.741 "get_zone_info": false, 00:19:55.741 "zone_management": false, 00:19:55.741 "zone_append": false, 00:19:55.741 "compare": false, 00:19:55.741 "compare_and_write": false, 00:19:55.741 "abort": true, 00:19:55.741 "seek_hole": false, 00:19:55.741 "seek_data": false, 00:19:55.741 "copy": true, 00:19:55.741 "nvme_iov_md": false 00:19:55.741 }, 00:19:55.741 "memory_domains": [ 00:19:55.741 { 00:19:55.741 "dma_device_id": "system", 00:19:55.741 "dma_device_type": 1 00:19:55.741 }, 00:19:55.741 { 00:19:55.741 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:55.741 "dma_device_type": 2 00:19:55.741 } 00:19:55.741 ], 00:19:55.741 "driver_specific": {} 00:19:55.741 }' 00:19:55.741 05:48:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:55.741 05:48:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:55.741 05:48:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:55.741 05:48:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:55.741 05:48:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:55.999 05:48:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:55.999 05:48:10 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:55.999 05:48:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:55.999 05:48:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:55.999 05:48:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:55.999 05:48:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:55.999 05:48:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:55.999 05:48:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:19:56.257 [2024-07-26 05:48:11.084442] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:19:56.257 [2024-07-26 05:48:11.084468] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:19:56.257 [2024-07-26 05:48:11.084513] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:19:56.257 05:48:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:19:56.257 05:48:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:19:56.257 05:48:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:19:56.257 05:48:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:19:56.257 05:48:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:19:56.257 05:48:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 3 00:19:56.257 05:48:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:56.257 05:48:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local 
expected_state=offline 00:19:56.257 05:48:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:56.257 05:48:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:56.257 05:48:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:19:56.257 05:48:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:56.257 05:48:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:56.257 05:48:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:56.257 05:48:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:56.257 05:48:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:56.257 05:48:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:56.515 05:48:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:56.515 "name": "Existed_Raid", 00:19:56.515 "uuid": "4342f879-0ae3-4f98-afeb-2589e264af94", 00:19:56.515 "strip_size_kb": 64, 00:19:56.515 "state": "offline", 00:19:56.515 "raid_level": "raid0", 00:19:56.515 "superblock": false, 00:19:56.515 "num_base_bdevs": 4, 00:19:56.515 "num_base_bdevs_discovered": 3, 00:19:56.515 "num_base_bdevs_operational": 3, 00:19:56.515 "base_bdevs_list": [ 00:19:56.516 { 00:19:56.516 "name": null, 00:19:56.516 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:56.516 "is_configured": false, 00:19:56.516 "data_offset": 0, 00:19:56.516 "data_size": 65536 00:19:56.516 }, 00:19:56.516 { 00:19:56.516 "name": "BaseBdev2", 00:19:56.516 "uuid": "257e812a-2808-4fdc-9868-7c0c102d8540", 00:19:56.516 "is_configured": true, 
00:19:56.516 "data_offset": 0, 00:19:56.516 "data_size": 65536 00:19:56.516 }, 00:19:56.516 { 00:19:56.516 "name": "BaseBdev3", 00:19:56.516 "uuid": "32d29f07-8b5a-4bb6-a1cc-da2932e3dc19", 00:19:56.516 "is_configured": true, 00:19:56.516 "data_offset": 0, 00:19:56.516 "data_size": 65536 00:19:56.516 }, 00:19:56.516 { 00:19:56.516 "name": "BaseBdev4", 00:19:56.516 "uuid": "b40e3f3e-a5e9-4133-a634-07225ea46d66", 00:19:56.516 "is_configured": true, 00:19:56.516 "data_offset": 0, 00:19:56.516 "data_size": 65536 00:19:56.516 } 00:19:56.516 ] 00:19:56.516 }' 00:19:56.516 05:48:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:56.516 05:48:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:57.082 05:48:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:19:57.082 05:48:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:19:57.082 05:48:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:57.082 05:48:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:19:57.340 05:48:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:19:57.340 05:48:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:19:57.340 05:48:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:19:57.906 [2024-07-26 05:48:12.693740] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:19:57.906 05:48:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:19:57.906 05:48:12 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:19:57.906 05:48:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:57.906 05:48:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:19:58.164 05:48:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:19:58.164 05:48:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:19:58.164 05:48:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:19:58.729 [2024-07-26 05:48:13.462335] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:19:58.729 05:48:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:19:58.729 05:48:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:19:58.729 05:48:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:58.729 05:48:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:19:58.986 05:48:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:19:58.986 05:48:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:19:58.986 05:48:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:19:59.244 [2024-07-26 05:48:13.976149] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:19:59.244 
[2024-07-26 05:48:13.976188] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1cf6350 name Existed_Raid, state offline 00:19:59.244 05:48:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:19:59.244 05:48:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:19:59.244 05:48:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:59.244 05:48:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:19:59.503 05:48:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:19:59.503 05:48:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:19:59.503 05:48:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:19:59.503 05:48:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:19:59.503 05:48:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:19:59.503 05:48:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:19:59.761 BaseBdev2 00:19:59.761 05:48:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:19:59.761 05:48:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:19:59.761 05:48:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:59.761 05:48:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:19:59.761 05:48:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:59.761 05:48:14 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:59.761 05:48:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:00.019 05:48:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:20:00.277 [ 00:20:00.277 { 00:20:00.277 "name": "BaseBdev2", 00:20:00.277 "aliases": [ 00:20:00.277 "c8217333-71cd-4461-b668-3bf0d4052242" 00:20:00.277 ], 00:20:00.277 "product_name": "Malloc disk", 00:20:00.277 "block_size": 512, 00:20:00.277 "num_blocks": 65536, 00:20:00.277 "uuid": "c8217333-71cd-4461-b668-3bf0d4052242", 00:20:00.277 "assigned_rate_limits": { 00:20:00.277 "rw_ios_per_sec": 0, 00:20:00.277 "rw_mbytes_per_sec": 0, 00:20:00.277 "r_mbytes_per_sec": 0, 00:20:00.277 "w_mbytes_per_sec": 0 00:20:00.277 }, 00:20:00.277 "claimed": false, 00:20:00.277 "zoned": false, 00:20:00.277 "supported_io_types": { 00:20:00.277 "read": true, 00:20:00.277 "write": true, 00:20:00.277 "unmap": true, 00:20:00.277 "flush": true, 00:20:00.277 "reset": true, 00:20:00.277 "nvme_admin": false, 00:20:00.277 "nvme_io": false, 00:20:00.277 "nvme_io_md": false, 00:20:00.277 "write_zeroes": true, 00:20:00.277 "zcopy": true, 00:20:00.277 "get_zone_info": false, 00:20:00.277 "zone_management": false, 00:20:00.277 "zone_append": false, 00:20:00.277 "compare": false, 00:20:00.277 "compare_and_write": false, 00:20:00.277 "abort": true, 00:20:00.277 "seek_hole": false, 00:20:00.277 "seek_data": false, 00:20:00.277 "copy": true, 00:20:00.277 "nvme_iov_md": false 00:20:00.277 }, 00:20:00.277 "memory_domains": [ 00:20:00.277 { 00:20:00.277 "dma_device_id": "system", 00:20:00.277 "dma_device_type": 1 00:20:00.277 }, 00:20:00.277 { 00:20:00.277 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:20:00.277 "dma_device_type": 2 00:20:00.277 } 00:20:00.277 ], 00:20:00.277 "driver_specific": {} 00:20:00.277 } 00:20:00.277 ] 00:20:00.277 05:48:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:20:00.277 05:48:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:20:00.277 05:48:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:20:00.277 05:48:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:20:00.536 BaseBdev3 00:20:00.536 05:48:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:20:00.536 05:48:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:20:00.536 05:48:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:20:00.536 05:48:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:20:00.536 05:48:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:00.536 05:48:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:00.536 05:48:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:00.794 05:48:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:20:01.056 [ 00:20:01.056 { 00:20:01.056 "name": "BaseBdev3", 00:20:01.056 "aliases": [ 00:20:01.056 "f1984a25-abdd-4fa0-8e40-cd8de4efae26" 00:20:01.056 ], 00:20:01.056 "product_name": "Malloc 
disk", 00:20:01.056 "block_size": 512, 00:20:01.056 "num_blocks": 65536, 00:20:01.056 "uuid": "f1984a25-abdd-4fa0-8e40-cd8de4efae26", 00:20:01.056 "assigned_rate_limits": { 00:20:01.056 "rw_ios_per_sec": 0, 00:20:01.056 "rw_mbytes_per_sec": 0, 00:20:01.056 "r_mbytes_per_sec": 0, 00:20:01.056 "w_mbytes_per_sec": 0 00:20:01.056 }, 00:20:01.056 "claimed": false, 00:20:01.056 "zoned": false, 00:20:01.056 "supported_io_types": { 00:20:01.056 "read": true, 00:20:01.056 "write": true, 00:20:01.056 "unmap": true, 00:20:01.056 "flush": true, 00:20:01.056 "reset": true, 00:20:01.056 "nvme_admin": false, 00:20:01.056 "nvme_io": false, 00:20:01.056 "nvme_io_md": false, 00:20:01.056 "write_zeroes": true, 00:20:01.056 "zcopy": true, 00:20:01.056 "get_zone_info": false, 00:20:01.056 "zone_management": false, 00:20:01.056 "zone_append": false, 00:20:01.056 "compare": false, 00:20:01.056 "compare_and_write": false, 00:20:01.056 "abort": true, 00:20:01.056 "seek_hole": false, 00:20:01.056 "seek_data": false, 00:20:01.056 "copy": true, 00:20:01.056 "nvme_iov_md": false 00:20:01.056 }, 00:20:01.056 "memory_domains": [ 00:20:01.056 { 00:20:01.056 "dma_device_id": "system", 00:20:01.056 "dma_device_type": 1 00:20:01.056 }, 00:20:01.056 { 00:20:01.056 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:01.056 "dma_device_type": 2 00:20:01.056 } 00:20:01.056 ], 00:20:01.056 "driver_specific": {} 00:20:01.056 } 00:20:01.056 ] 00:20:01.056 05:48:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:20:01.056 05:48:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:20:01.056 05:48:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:20:01.056 05:48:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:20:01.056 BaseBdev4 00:20:01.344 05:48:15 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4 00:20:01.344 05:48:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:20:01.345 05:48:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:20:01.345 05:48:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:20:01.345 05:48:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:01.345 05:48:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:01.345 05:48:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:01.345 05:48:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:20:01.617 [ 00:20:01.617 { 00:20:01.617 "name": "BaseBdev4", 00:20:01.617 "aliases": [ 00:20:01.617 "98e8e89c-9010-4a9d-bb75-29ba5f520b8e" 00:20:01.617 ], 00:20:01.617 "product_name": "Malloc disk", 00:20:01.617 "block_size": 512, 00:20:01.617 "num_blocks": 65536, 00:20:01.617 "uuid": "98e8e89c-9010-4a9d-bb75-29ba5f520b8e", 00:20:01.617 "assigned_rate_limits": { 00:20:01.617 "rw_ios_per_sec": 0, 00:20:01.617 "rw_mbytes_per_sec": 0, 00:20:01.617 "r_mbytes_per_sec": 0, 00:20:01.617 "w_mbytes_per_sec": 0 00:20:01.617 }, 00:20:01.617 "claimed": false, 00:20:01.617 "zoned": false, 00:20:01.617 "supported_io_types": { 00:20:01.617 "read": true, 00:20:01.617 "write": true, 00:20:01.617 "unmap": true, 00:20:01.617 "flush": true, 00:20:01.617 "reset": true, 00:20:01.617 "nvme_admin": false, 00:20:01.617 "nvme_io": false, 00:20:01.617 "nvme_io_md": false, 00:20:01.617 "write_zeroes": true, 00:20:01.617 "zcopy": true, 
00:20:01.617 "get_zone_info": false, 00:20:01.617 "zone_management": false, 00:20:01.617 "zone_append": false, 00:20:01.617 "compare": false, 00:20:01.617 "compare_and_write": false, 00:20:01.617 "abort": true, 00:20:01.617 "seek_hole": false, 00:20:01.617 "seek_data": false, 00:20:01.617 "copy": true, 00:20:01.617 "nvme_iov_md": false 00:20:01.617 }, 00:20:01.617 "memory_domains": [ 00:20:01.617 { 00:20:01.617 "dma_device_id": "system", 00:20:01.617 "dma_device_type": 1 00:20:01.617 }, 00:20:01.617 { 00:20:01.617 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:01.617 "dma_device_type": 2 00:20:01.617 } 00:20:01.617 ], 00:20:01.617 "driver_specific": {} 00:20:01.617 } 00:20:01.617 ] 00:20:01.617 05:48:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:20:01.617 05:48:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:20:01.617 05:48:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:20:01.617 05:48:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:20:01.875 [2024-07-26 05:48:16.647219] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:20:01.875 [2024-07-26 05:48:16.647258] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:20:01.875 [2024-07-26 05:48:16.647277] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:20:01.875 [2024-07-26 05:48:16.648582] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:20:01.875 [2024-07-26 05:48:16.648623] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:20:01.875 05:48:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # 
verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:20:01.875 05:48:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:01.875 05:48:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:01.875 05:48:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:20:01.875 05:48:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:01.876 05:48:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:01.876 05:48:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:01.876 05:48:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:01.876 05:48:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:01.876 05:48:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:01.876 05:48:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:01.876 05:48:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:02.134 05:48:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:02.134 "name": "Existed_Raid", 00:20:02.134 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:02.134 "strip_size_kb": 64, 00:20:02.134 "state": "configuring", 00:20:02.134 "raid_level": "raid0", 00:20:02.134 "superblock": false, 00:20:02.134 "num_base_bdevs": 4, 00:20:02.134 "num_base_bdevs_discovered": 3, 00:20:02.134 "num_base_bdevs_operational": 4, 00:20:02.134 "base_bdevs_list": [ 00:20:02.134 { 00:20:02.134 "name": "BaseBdev1", 00:20:02.134 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:20:02.134 "is_configured": false, 00:20:02.134 "data_offset": 0, 00:20:02.134 "data_size": 0 00:20:02.134 }, 00:20:02.134 { 00:20:02.134 "name": "BaseBdev2", 00:20:02.134 "uuid": "c8217333-71cd-4461-b668-3bf0d4052242", 00:20:02.134 "is_configured": true, 00:20:02.134 "data_offset": 0, 00:20:02.134 "data_size": 65536 00:20:02.134 }, 00:20:02.134 { 00:20:02.134 "name": "BaseBdev3", 00:20:02.134 "uuid": "f1984a25-abdd-4fa0-8e40-cd8de4efae26", 00:20:02.134 "is_configured": true, 00:20:02.134 "data_offset": 0, 00:20:02.134 "data_size": 65536 00:20:02.134 }, 00:20:02.134 { 00:20:02.134 "name": "BaseBdev4", 00:20:02.134 "uuid": "98e8e89c-9010-4a9d-bb75-29ba5f520b8e", 00:20:02.134 "is_configured": true, 00:20:02.134 "data_offset": 0, 00:20:02.134 "data_size": 65536 00:20:02.134 } 00:20:02.134 ] 00:20:02.134 }' 00:20:02.134 05:48:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:02.134 05:48:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:02.700 05:48:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:20:02.959 [2024-07-26 05:48:17.677933] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:20:02.959 05:48:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:20:02.959 05:48:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:02.959 05:48:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:02.959 05:48:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:20:02.959 05:48:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:02.959 
05:48:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:02.959 05:48:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:02.959 05:48:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:02.959 05:48:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:02.959 05:48:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:02.959 05:48:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:02.959 05:48:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:03.218 05:48:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:03.218 "name": "Existed_Raid", 00:20:03.218 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:03.218 "strip_size_kb": 64, 00:20:03.218 "state": "configuring", 00:20:03.218 "raid_level": "raid0", 00:20:03.218 "superblock": false, 00:20:03.218 "num_base_bdevs": 4, 00:20:03.218 "num_base_bdevs_discovered": 2, 00:20:03.218 "num_base_bdevs_operational": 4, 00:20:03.218 "base_bdevs_list": [ 00:20:03.218 { 00:20:03.218 "name": "BaseBdev1", 00:20:03.218 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:03.218 "is_configured": false, 00:20:03.218 "data_offset": 0, 00:20:03.218 "data_size": 0 00:20:03.218 }, 00:20:03.218 { 00:20:03.218 "name": null, 00:20:03.218 "uuid": "c8217333-71cd-4461-b668-3bf0d4052242", 00:20:03.218 "is_configured": false, 00:20:03.218 "data_offset": 0, 00:20:03.218 "data_size": 65536 00:20:03.218 }, 00:20:03.218 { 00:20:03.218 "name": "BaseBdev3", 00:20:03.218 "uuid": "f1984a25-abdd-4fa0-8e40-cd8de4efae26", 00:20:03.218 "is_configured": true, 00:20:03.218 "data_offset": 0, 
00:20:03.218 "data_size": 65536 00:20:03.218 }, 00:20:03.218 { 00:20:03.218 "name": "BaseBdev4", 00:20:03.218 "uuid": "98e8e89c-9010-4a9d-bb75-29ba5f520b8e", 00:20:03.218 "is_configured": true, 00:20:03.218 "data_offset": 0, 00:20:03.218 "data_size": 65536 00:20:03.218 } 00:20:03.218 ] 00:20:03.218 }' 00:20:03.218 05:48:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:03.218 05:48:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:03.787 05:48:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:03.787 05:48:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:20:04.046 05:48:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:20:04.046 05:48:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:20:04.046 [2024-07-26 05:48:18.953834] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:20:04.305 BaseBdev1 00:20:04.305 05:48:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:20:04.305 05:48:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:20:04.306 05:48:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:20:04.306 05:48:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:20:04.306 05:48:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:04.306 05:48:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:04.306 
05:48:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:04.306 05:48:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:20:04.564 [ 00:20:04.564 { 00:20:04.564 "name": "BaseBdev1", 00:20:04.564 "aliases": [ 00:20:04.564 "373e279b-4758-4f8a-a253-07e3e50b031f" 00:20:04.564 ], 00:20:04.564 "product_name": "Malloc disk", 00:20:04.564 "block_size": 512, 00:20:04.564 "num_blocks": 65536, 00:20:04.564 "uuid": "373e279b-4758-4f8a-a253-07e3e50b031f", 00:20:04.564 "assigned_rate_limits": { 00:20:04.564 "rw_ios_per_sec": 0, 00:20:04.564 "rw_mbytes_per_sec": 0, 00:20:04.564 "r_mbytes_per_sec": 0, 00:20:04.564 "w_mbytes_per_sec": 0 00:20:04.564 }, 00:20:04.564 "claimed": true, 00:20:04.564 "claim_type": "exclusive_write", 00:20:04.564 "zoned": false, 00:20:04.564 "supported_io_types": { 00:20:04.564 "read": true, 00:20:04.564 "write": true, 00:20:04.564 "unmap": true, 00:20:04.564 "flush": true, 00:20:04.564 "reset": true, 00:20:04.564 "nvme_admin": false, 00:20:04.564 "nvme_io": false, 00:20:04.564 "nvme_io_md": false, 00:20:04.564 "write_zeroes": true, 00:20:04.564 "zcopy": true, 00:20:04.564 "get_zone_info": false, 00:20:04.564 "zone_management": false, 00:20:04.564 "zone_append": false, 00:20:04.564 "compare": false, 00:20:04.564 "compare_and_write": false, 00:20:04.564 "abort": true, 00:20:04.564 "seek_hole": false, 00:20:04.564 "seek_data": false, 00:20:04.564 "copy": true, 00:20:04.564 "nvme_iov_md": false 00:20:04.564 }, 00:20:04.564 "memory_domains": [ 00:20:04.564 { 00:20:04.564 "dma_device_id": "system", 00:20:04.564 "dma_device_type": 1 00:20:04.564 }, 00:20:04.564 { 00:20:04.564 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:04.564 "dma_device_type": 2 00:20:04.564 } 
00:20:04.564 ], 00:20:04.564 "driver_specific": {} 00:20:04.564 } 00:20:04.564 ] 00:20:04.564 05:48:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:20:04.564 05:48:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:20:04.564 05:48:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:04.564 05:48:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:04.564 05:48:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:20:04.564 05:48:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:04.564 05:48:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:04.564 05:48:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:04.564 05:48:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:04.564 05:48:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:04.564 05:48:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:04.564 05:48:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:04.564 05:48:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:04.823 05:48:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:04.823 "name": "Existed_Raid", 00:20:04.823 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:04.823 "strip_size_kb": 64, 00:20:04.823 "state": "configuring", 00:20:04.823 "raid_level": "raid0", 00:20:04.824 
"superblock": false, 00:20:04.824 "num_base_bdevs": 4, 00:20:04.824 "num_base_bdevs_discovered": 3, 00:20:04.824 "num_base_bdevs_operational": 4, 00:20:04.824 "base_bdevs_list": [ 00:20:04.824 { 00:20:04.824 "name": "BaseBdev1", 00:20:04.824 "uuid": "373e279b-4758-4f8a-a253-07e3e50b031f", 00:20:04.824 "is_configured": true, 00:20:04.824 "data_offset": 0, 00:20:04.824 "data_size": 65536 00:20:04.824 }, 00:20:04.824 { 00:20:04.824 "name": null, 00:20:04.824 "uuid": "c8217333-71cd-4461-b668-3bf0d4052242", 00:20:04.824 "is_configured": false, 00:20:04.824 "data_offset": 0, 00:20:04.824 "data_size": 65536 00:20:04.824 }, 00:20:04.824 { 00:20:04.824 "name": "BaseBdev3", 00:20:04.824 "uuid": "f1984a25-abdd-4fa0-8e40-cd8de4efae26", 00:20:04.824 "is_configured": true, 00:20:04.824 "data_offset": 0, 00:20:04.824 "data_size": 65536 00:20:04.824 }, 00:20:04.824 { 00:20:04.824 "name": "BaseBdev4", 00:20:04.824 "uuid": "98e8e89c-9010-4a9d-bb75-29ba5f520b8e", 00:20:04.824 "is_configured": true, 00:20:04.824 "data_offset": 0, 00:20:04.824 "data_size": 65536 00:20:04.824 } 00:20:04.824 ] 00:20:04.824 }' 00:20:04.824 05:48:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:04.824 05:48:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:05.390 05:48:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:05.649 05:48:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:20:05.908 05:48:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:20:05.908 05:48:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:20:05.908 [2024-07-26 
05:48:20.778705] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:20:05.908 05:48:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:20:05.908 05:48:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:05.908 05:48:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:05.908 05:48:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:20:05.908 05:48:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:05.908 05:48:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:05.908 05:48:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:05.908 05:48:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:05.909 05:48:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:05.909 05:48:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:05.909 05:48:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:05.909 05:48:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:06.168 05:48:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:06.168 "name": "Existed_Raid", 00:20:06.168 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:06.168 "strip_size_kb": 64, 00:20:06.168 "state": "configuring", 00:20:06.168 "raid_level": "raid0", 00:20:06.168 "superblock": false, 00:20:06.168 "num_base_bdevs": 4, 00:20:06.168 "num_base_bdevs_discovered": 2, 
00:20:06.168 "num_base_bdevs_operational": 4, 00:20:06.168 "base_bdevs_list": [ 00:20:06.168 { 00:20:06.168 "name": "BaseBdev1", 00:20:06.168 "uuid": "373e279b-4758-4f8a-a253-07e3e50b031f", 00:20:06.168 "is_configured": true, 00:20:06.168 "data_offset": 0, 00:20:06.168 "data_size": 65536 00:20:06.168 }, 00:20:06.168 { 00:20:06.168 "name": null, 00:20:06.168 "uuid": "c8217333-71cd-4461-b668-3bf0d4052242", 00:20:06.168 "is_configured": false, 00:20:06.168 "data_offset": 0, 00:20:06.168 "data_size": 65536 00:20:06.168 }, 00:20:06.168 { 00:20:06.168 "name": null, 00:20:06.168 "uuid": "f1984a25-abdd-4fa0-8e40-cd8de4efae26", 00:20:06.168 "is_configured": false, 00:20:06.168 "data_offset": 0, 00:20:06.168 "data_size": 65536 00:20:06.168 }, 00:20:06.168 { 00:20:06.168 "name": "BaseBdev4", 00:20:06.168 "uuid": "98e8e89c-9010-4a9d-bb75-29ba5f520b8e", 00:20:06.168 "is_configured": true, 00:20:06.168 "data_offset": 0, 00:20:06.168 "data_size": 65536 00:20:06.168 } 00:20:06.168 ] 00:20:06.168 }' 00:20:06.168 05:48:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:06.168 05:48:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:07.105 05:48:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:07.105 05:48:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:20:07.105 05:48:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:20:07.105 05:48:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:20:07.364 [2024-07-26 05:48:22.122341] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 
00:20:07.364 05:48:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:20:07.364 05:48:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:07.364 05:48:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:07.364 05:48:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:20:07.364 05:48:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:07.364 05:48:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:07.364 05:48:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:07.364 05:48:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:07.364 05:48:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:07.364 05:48:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:07.364 05:48:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:07.364 05:48:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:07.623 05:48:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:07.623 "name": "Existed_Raid", 00:20:07.623 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:07.623 "strip_size_kb": 64, 00:20:07.623 "state": "configuring", 00:20:07.623 "raid_level": "raid0", 00:20:07.623 "superblock": false, 00:20:07.623 "num_base_bdevs": 4, 00:20:07.623 "num_base_bdevs_discovered": 3, 00:20:07.623 "num_base_bdevs_operational": 4, 00:20:07.623 "base_bdevs_list": [ 
00:20:07.623 { 00:20:07.623 "name": "BaseBdev1", 00:20:07.623 "uuid": "373e279b-4758-4f8a-a253-07e3e50b031f", 00:20:07.623 "is_configured": true, 00:20:07.623 "data_offset": 0, 00:20:07.623 "data_size": 65536 00:20:07.623 }, 00:20:07.623 { 00:20:07.623 "name": null, 00:20:07.623 "uuid": "c8217333-71cd-4461-b668-3bf0d4052242", 00:20:07.623 "is_configured": false, 00:20:07.623 "data_offset": 0, 00:20:07.623 "data_size": 65536 00:20:07.623 }, 00:20:07.623 { 00:20:07.623 "name": "BaseBdev3", 00:20:07.623 "uuid": "f1984a25-abdd-4fa0-8e40-cd8de4efae26", 00:20:07.623 "is_configured": true, 00:20:07.623 "data_offset": 0, 00:20:07.623 "data_size": 65536 00:20:07.623 }, 00:20:07.623 { 00:20:07.623 "name": "BaseBdev4", 00:20:07.623 "uuid": "98e8e89c-9010-4a9d-bb75-29ba5f520b8e", 00:20:07.623 "is_configured": true, 00:20:07.623 "data_offset": 0, 00:20:07.623 "data_size": 65536 00:20:07.623 } 00:20:07.623 ] 00:20:07.623 }' 00:20:07.623 05:48:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:07.623 05:48:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:08.191 05:48:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:08.191 05:48:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:20:08.449 05:48:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:20:08.449 05:48:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:20:08.707 [2024-07-26 05:48:23.454039] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:20:08.707 05:48:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state 
Existed_Raid configuring raid0 64 4 00:20:08.707 05:48:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:08.707 05:48:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:08.707 05:48:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:20:08.707 05:48:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:08.707 05:48:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:08.707 05:48:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:08.707 05:48:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:08.707 05:48:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:08.707 05:48:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:08.707 05:48:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:08.707 05:48:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:08.966 05:48:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:08.966 "name": "Existed_Raid", 00:20:08.966 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:08.966 "strip_size_kb": 64, 00:20:08.966 "state": "configuring", 00:20:08.966 "raid_level": "raid0", 00:20:08.966 "superblock": false, 00:20:08.966 "num_base_bdevs": 4, 00:20:08.966 "num_base_bdevs_discovered": 2, 00:20:08.966 "num_base_bdevs_operational": 4, 00:20:08.966 "base_bdevs_list": [ 00:20:08.966 { 00:20:08.966 "name": null, 00:20:08.966 "uuid": "373e279b-4758-4f8a-a253-07e3e50b031f", 
00:20:08.966 "is_configured": false, 00:20:08.966 "data_offset": 0, 00:20:08.966 "data_size": 65536 00:20:08.966 }, 00:20:08.966 { 00:20:08.966 "name": null, 00:20:08.966 "uuid": "c8217333-71cd-4461-b668-3bf0d4052242", 00:20:08.966 "is_configured": false, 00:20:08.966 "data_offset": 0, 00:20:08.966 "data_size": 65536 00:20:08.966 }, 00:20:08.966 { 00:20:08.966 "name": "BaseBdev3", 00:20:08.966 "uuid": "f1984a25-abdd-4fa0-8e40-cd8de4efae26", 00:20:08.966 "is_configured": true, 00:20:08.966 "data_offset": 0, 00:20:08.966 "data_size": 65536 00:20:08.966 }, 00:20:08.966 { 00:20:08.966 "name": "BaseBdev4", 00:20:08.966 "uuid": "98e8e89c-9010-4a9d-bb75-29ba5f520b8e", 00:20:08.966 "is_configured": true, 00:20:08.966 "data_offset": 0, 00:20:08.966 "data_size": 65536 00:20:08.966 } 00:20:08.966 ] 00:20:08.966 }' 00:20:08.966 05:48:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:08.966 05:48:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:09.534 05:48:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:09.534 05:48:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:20:09.793 05:48:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:20:09.793 05:48:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:20:10.052 [2024-07-26 05:48:24.757808] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:20:10.052 05:48:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:20:10.052 05:48:24 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:10.052 05:48:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:10.052 05:48:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:20:10.052 05:48:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:10.052 05:48:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:10.052 05:48:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:10.052 05:48:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:10.052 05:48:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:10.052 05:48:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:10.052 05:48:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:10.052 05:48:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:10.311 05:48:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:10.311 "name": "Existed_Raid", 00:20:10.311 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:10.311 "strip_size_kb": 64, 00:20:10.311 "state": "configuring", 00:20:10.311 "raid_level": "raid0", 00:20:10.311 "superblock": false, 00:20:10.311 "num_base_bdevs": 4, 00:20:10.311 "num_base_bdevs_discovered": 3, 00:20:10.311 "num_base_bdevs_operational": 4, 00:20:10.311 "base_bdevs_list": [ 00:20:10.311 { 00:20:10.311 "name": null, 00:20:10.311 "uuid": "373e279b-4758-4f8a-a253-07e3e50b031f", 00:20:10.311 "is_configured": false, 00:20:10.311 "data_offset": 0, 
00:20:10.311 "data_size": 65536 00:20:10.311 }, 00:20:10.311 { 00:20:10.311 "name": "BaseBdev2", 00:20:10.311 "uuid": "c8217333-71cd-4461-b668-3bf0d4052242", 00:20:10.311 "is_configured": true, 00:20:10.311 "data_offset": 0, 00:20:10.311 "data_size": 65536 00:20:10.311 }, 00:20:10.311 { 00:20:10.311 "name": "BaseBdev3", 00:20:10.311 "uuid": "f1984a25-abdd-4fa0-8e40-cd8de4efae26", 00:20:10.311 "is_configured": true, 00:20:10.311 "data_offset": 0, 00:20:10.311 "data_size": 65536 00:20:10.311 }, 00:20:10.311 { 00:20:10.311 "name": "BaseBdev4", 00:20:10.311 "uuid": "98e8e89c-9010-4a9d-bb75-29ba5f520b8e", 00:20:10.311 "is_configured": true, 00:20:10.311 "data_offset": 0, 00:20:10.311 "data_size": 65536 00:20:10.311 } 00:20:10.311 ] 00:20:10.311 }' 00:20:10.311 05:48:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:10.311 05:48:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:10.899 05:48:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:10.899 05:48:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:20:11.158 05:48:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:20:11.158 05:48:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:11.158 05:48:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:20:11.417 05:48:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 373e279b-4758-4f8a-a253-07e3e50b031f 00:20:11.676 
[2024-07-26 05:48:26.353339] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:20:11.676 [2024-07-26 05:48:26.353376] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1cfa040 00:20:11.676 [2024-07-26 05:48:26.353384] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 262144, blocklen 512 00:20:11.676 [2024-07-26 05:48:26.353577] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1cf5a70 00:20:11.676 [2024-07-26 05:48:26.353699] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1cfa040 00:20:11.676 [2024-07-26 05:48:26.353710] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1cfa040 00:20:11.676 [2024-07-26 05:48:26.353867] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:11.676 NewBaseBdev 00:20:11.676 05:48:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:20:11.676 05:48:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:20:11.676 05:48:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:20:11.676 05:48:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:20:11.676 05:48:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:11.676 05:48:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:11.676 05:48:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:11.934 05:48:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 
00:20:12.194 [ 00:20:12.194 { 00:20:12.194 "name": "NewBaseBdev", 00:20:12.194 "aliases": [ 00:20:12.194 "373e279b-4758-4f8a-a253-07e3e50b031f" 00:20:12.194 ], 00:20:12.194 "product_name": "Malloc disk", 00:20:12.194 "block_size": 512, 00:20:12.194 "num_blocks": 65536, 00:20:12.194 "uuid": "373e279b-4758-4f8a-a253-07e3e50b031f", 00:20:12.194 "assigned_rate_limits": { 00:20:12.194 "rw_ios_per_sec": 0, 00:20:12.194 "rw_mbytes_per_sec": 0, 00:20:12.194 "r_mbytes_per_sec": 0, 00:20:12.194 "w_mbytes_per_sec": 0 00:20:12.194 }, 00:20:12.194 "claimed": true, 00:20:12.194 "claim_type": "exclusive_write", 00:20:12.194 "zoned": false, 00:20:12.194 "supported_io_types": { 00:20:12.194 "read": true, 00:20:12.194 "write": true, 00:20:12.194 "unmap": true, 00:20:12.194 "flush": true, 00:20:12.194 "reset": true, 00:20:12.194 "nvme_admin": false, 00:20:12.194 "nvme_io": false, 00:20:12.194 "nvme_io_md": false, 00:20:12.194 "write_zeroes": true, 00:20:12.194 "zcopy": true, 00:20:12.194 "get_zone_info": false, 00:20:12.194 "zone_management": false, 00:20:12.194 "zone_append": false, 00:20:12.194 "compare": false, 00:20:12.194 "compare_and_write": false, 00:20:12.194 "abort": true, 00:20:12.194 "seek_hole": false, 00:20:12.194 "seek_data": false, 00:20:12.194 "copy": true, 00:20:12.194 "nvme_iov_md": false 00:20:12.194 }, 00:20:12.194 "memory_domains": [ 00:20:12.194 { 00:20:12.194 "dma_device_id": "system", 00:20:12.194 "dma_device_type": 1 00:20:12.194 }, 00:20:12.194 { 00:20:12.194 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:12.194 "dma_device_type": 2 00:20:12.194 } 00:20:12.194 ], 00:20:12.194 "driver_specific": {} 00:20:12.194 } 00:20:12.194 ] 00:20:12.194 05:48:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:20:12.194 05:48:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid0 64 4 00:20:12.194 05:48:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # 
local raid_bdev_name=Existed_Raid 00:20:12.194 05:48:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:12.194 05:48:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:20:12.194 05:48:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:12.194 05:48:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:12.194 05:48:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:12.194 05:48:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:12.194 05:48:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:12.194 05:48:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:12.194 05:48:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:12.194 05:48:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:12.453 05:48:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:12.453 "name": "Existed_Raid", 00:20:12.453 "uuid": "e94c3981-3303-434e-81fe-7d9c1e3181f7", 00:20:12.453 "strip_size_kb": 64, 00:20:12.453 "state": "online", 00:20:12.453 "raid_level": "raid0", 00:20:12.453 "superblock": false, 00:20:12.453 "num_base_bdevs": 4, 00:20:12.453 "num_base_bdevs_discovered": 4, 00:20:12.453 "num_base_bdevs_operational": 4, 00:20:12.453 "base_bdevs_list": [ 00:20:12.453 { 00:20:12.453 "name": "NewBaseBdev", 00:20:12.453 "uuid": "373e279b-4758-4f8a-a253-07e3e50b031f", 00:20:12.453 "is_configured": true, 00:20:12.453 "data_offset": 0, 00:20:12.453 "data_size": 65536 00:20:12.453 }, 00:20:12.453 { 
00:20:12.453 "name": "BaseBdev2", 00:20:12.453 "uuid": "c8217333-71cd-4461-b668-3bf0d4052242", 00:20:12.453 "is_configured": true, 00:20:12.453 "data_offset": 0, 00:20:12.453 "data_size": 65536 00:20:12.453 }, 00:20:12.453 { 00:20:12.453 "name": "BaseBdev3", 00:20:12.453 "uuid": "f1984a25-abdd-4fa0-8e40-cd8de4efae26", 00:20:12.453 "is_configured": true, 00:20:12.453 "data_offset": 0, 00:20:12.453 "data_size": 65536 00:20:12.453 }, 00:20:12.453 { 00:20:12.453 "name": "BaseBdev4", 00:20:12.453 "uuid": "98e8e89c-9010-4a9d-bb75-29ba5f520b8e", 00:20:12.453 "is_configured": true, 00:20:12.453 "data_offset": 0, 00:20:12.453 "data_size": 65536 00:20:12.453 } 00:20:12.453 ] 00:20:12.453 }' 00:20:12.453 05:48:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:12.453 05:48:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:13.021 05:48:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:20:13.021 05:48:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:20:13.021 05:48:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:20:13.021 05:48:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:20:13.021 05:48:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:20:13.021 05:48:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:20:13.021 05:48:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:20:13.021 05:48:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:20:13.280 [2024-07-26 05:48:27.941880] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 
00:20:13.280 05:48:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:20:13.280 "name": "Existed_Raid", 00:20:13.280 "aliases": [ 00:20:13.280 "e94c3981-3303-434e-81fe-7d9c1e3181f7" 00:20:13.280 ], 00:20:13.280 "product_name": "Raid Volume", 00:20:13.280 "block_size": 512, 00:20:13.280 "num_blocks": 262144, 00:20:13.280 "uuid": "e94c3981-3303-434e-81fe-7d9c1e3181f7", 00:20:13.280 "assigned_rate_limits": { 00:20:13.280 "rw_ios_per_sec": 0, 00:20:13.280 "rw_mbytes_per_sec": 0, 00:20:13.280 "r_mbytes_per_sec": 0, 00:20:13.280 "w_mbytes_per_sec": 0 00:20:13.280 }, 00:20:13.280 "claimed": false, 00:20:13.280 "zoned": false, 00:20:13.280 "supported_io_types": { 00:20:13.280 "read": true, 00:20:13.280 "write": true, 00:20:13.280 "unmap": true, 00:20:13.280 "flush": true, 00:20:13.280 "reset": true, 00:20:13.280 "nvme_admin": false, 00:20:13.280 "nvme_io": false, 00:20:13.280 "nvme_io_md": false, 00:20:13.280 "write_zeroes": true, 00:20:13.280 "zcopy": false, 00:20:13.280 "get_zone_info": false, 00:20:13.280 "zone_management": false, 00:20:13.280 "zone_append": false, 00:20:13.280 "compare": false, 00:20:13.280 "compare_and_write": false, 00:20:13.280 "abort": false, 00:20:13.280 "seek_hole": false, 00:20:13.280 "seek_data": false, 00:20:13.280 "copy": false, 00:20:13.280 "nvme_iov_md": false 00:20:13.280 }, 00:20:13.280 "memory_domains": [ 00:20:13.280 { 00:20:13.280 "dma_device_id": "system", 00:20:13.280 "dma_device_type": 1 00:20:13.280 }, 00:20:13.280 { 00:20:13.280 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:13.280 "dma_device_type": 2 00:20:13.280 }, 00:20:13.280 { 00:20:13.280 "dma_device_id": "system", 00:20:13.280 "dma_device_type": 1 00:20:13.280 }, 00:20:13.280 { 00:20:13.280 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:13.280 "dma_device_type": 2 00:20:13.280 }, 00:20:13.280 { 00:20:13.280 "dma_device_id": "system", 00:20:13.280 "dma_device_type": 1 00:20:13.280 }, 00:20:13.280 { 00:20:13.280 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:20:13.280 "dma_device_type": 2 00:20:13.280 }, 00:20:13.280 { 00:20:13.280 "dma_device_id": "system", 00:20:13.280 "dma_device_type": 1 00:20:13.280 }, 00:20:13.280 { 00:20:13.280 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:13.280 "dma_device_type": 2 00:20:13.280 } 00:20:13.280 ], 00:20:13.280 "driver_specific": { 00:20:13.280 "raid": { 00:20:13.280 "uuid": "e94c3981-3303-434e-81fe-7d9c1e3181f7", 00:20:13.280 "strip_size_kb": 64, 00:20:13.280 "state": "online", 00:20:13.280 "raid_level": "raid0", 00:20:13.280 "superblock": false, 00:20:13.280 "num_base_bdevs": 4, 00:20:13.280 "num_base_bdevs_discovered": 4, 00:20:13.280 "num_base_bdevs_operational": 4, 00:20:13.280 "base_bdevs_list": [ 00:20:13.280 { 00:20:13.280 "name": "NewBaseBdev", 00:20:13.280 "uuid": "373e279b-4758-4f8a-a253-07e3e50b031f", 00:20:13.280 "is_configured": true, 00:20:13.280 "data_offset": 0, 00:20:13.280 "data_size": 65536 00:20:13.280 }, 00:20:13.280 { 00:20:13.280 "name": "BaseBdev2", 00:20:13.280 "uuid": "c8217333-71cd-4461-b668-3bf0d4052242", 00:20:13.280 "is_configured": true, 00:20:13.280 "data_offset": 0, 00:20:13.280 "data_size": 65536 00:20:13.280 }, 00:20:13.280 { 00:20:13.280 "name": "BaseBdev3", 00:20:13.280 "uuid": "f1984a25-abdd-4fa0-8e40-cd8de4efae26", 00:20:13.280 "is_configured": true, 00:20:13.280 "data_offset": 0, 00:20:13.280 "data_size": 65536 00:20:13.280 }, 00:20:13.280 { 00:20:13.280 "name": "BaseBdev4", 00:20:13.280 "uuid": "98e8e89c-9010-4a9d-bb75-29ba5f520b8e", 00:20:13.280 "is_configured": true, 00:20:13.280 "data_offset": 0, 00:20:13.280 "data_size": 65536 00:20:13.280 } 00:20:13.280 ] 00:20:13.280 } 00:20:13.280 } 00:20:13.280 }' 00:20:13.280 05:48:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:20:13.280 05:48:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:20:13.280 
BaseBdev2 00:20:13.280 BaseBdev3 00:20:13.280 BaseBdev4' 00:20:13.280 05:48:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:13.280 05:48:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:20:13.280 05:48:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:13.538 05:48:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:13.538 "name": "NewBaseBdev", 00:20:13.538 "aliases": [ 00:20:13.538 "373e279b-4758-4f8a-a253-07e3e50b031f" 00:20:13.538 ], 00:20:13.538 "product_name": "Malloc disk", 00:20:13.538 "block_size": 512, 00:20:13.538 "num_blocks": 65536, 00:20:13.538 "uuid": "373e279b-4758-4f8a-a253-07e3e50b031f", 00:20:13.538 "assigned_rate_limits": { 00:20:13.538 "rw_ios_per_sec": 0, 00:20:13.538 "rw_mbytes_per_sec": 0, 00:20:13.538 "r_mbytes_per_sec": 0, 00:20:13.538 "w_mbytes_per_sec": 0 00:20:13.538 }, 00:20:13.538 "claimed": true, 00:20:13.538 "claim_type": "exclusive_write", 00:20:13.538 "zoned": false, 00:20:13.538 "supported_io_types": { 00:20:13.538 "read": true, 00:20:13.538 "write": true, 00:20:13.538 "unmap": true, 00:20:13.538 "flush": true, 00:20:13.538 "reset": true, 00:20:13.538 "nvme_admin": false, 00:20:13.538 "nvme_io": false, 00:20:13.538 "nvme_io_md": false, 00:20:13.538 "write_zeroes": true, 00:20:13.538 "zcopy": true, 00:20:13.538 "get_zone_info": false, 00:20:13.538 "zone_management": false, 00:20:13.538 "zone_append": false, 00:20:13.538 "compare": false, 00:20:13.538 "compare_and_write": false, 00:20:13.538 "abort": true, 00:20:13.538 "seek_hole": false, 00:20:13.538 "seek_data": false, 00:20:13.538 "copy": true, 00:20:13.538 "nvme_iov_md": false 00:20:13.538 }, 00:20:13.538 "memory_domains": [ 00:20:13.538 { 00:20:13.538 "dma_device_id": "system", 00:20:13.538 "dma_device_type": 1 
00:20:13.538 }, 00:20:13.538 { 00:20:13.538 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:13.538 "dma_device_type": 2 00:20:13.538 } 00:20:13.538 ], 00:20:13.538 "driver_specific": {} 00:20:13.538 }' 00:20:13.538 05:48:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:13.538 05:48:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:13.538 05:48:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:13.538 05:48:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:13.538 05:48:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:13.538 05:48:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:13.538 05:48:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:13.796 05:48:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:13.796 05:48:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:13.796 05:48:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:13.796 05:48:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:13.796 05:48:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:13.796 05:48:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:13.796 05:48:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:20:13.796 05:48:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:14.053 05:48:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:14.053 "name": "BaseBdev2", 
00:20:14.053 "aliases": [ 00:20:14.053 "c8217333-71cd-4461-b668-3bf0d4052242" 00:20:14.053 ], 00:20:14.053 "product_name": "Malloc disk", 00:20:14.053 "block_size": 512, 00:20:14.053 "num_blocks": 65536, 00:20:14.053 "uuid": "c8217333-71cd-4461-b668-3bf0d4052242", 00:20:14.053 "assigned_rate_limits": { 00:20:14.053 "rw_ios_per_sec": 0, 00:20:14.053 "rw_mbytes_per_sec": 0, 00:20:14.054 "r_mbytes_per_sec": 0, 00:20:14.054 "w_mbytes_per_sec": 0 00:20:14.054 }, 00:20:14.054 "claimed": true, 00:20:14.054 "claim_type": "exclusive_write", 00:20:14.054 "zoned": false, 00:20:14.054 "supported_io_types": { 00:20:14.054 "read": true, 00:20:14.054 "write": true, 00:20:14.054 "unmap": true, 00:20:14.054 "flush": true, 00:20:14.054 "reset": true, 00:20:14.054 "nvme_admin": false, 00:20:14.054 "nvme_io": false, 00:20:14.054 "nvme_io_md": false, 00:20:14.054 "write_zeroes": true, 00:20:14.054 "zcopy": true, 00:20:14.054 "get_zone_info": false, 00:20:14.054 "zone_management": false, 00:20:14.054 "zone_append": false, 00:20:14.054 "compare": false, 00:20:14.054 "compare_and_write": false, 00:20:14.054 "abort": true, 00:20:14.054 "seek_hole": false, 00:20:14.054 "seek_data": false, 00:20:14.054 "copy": true, 00:20:14.054 "nvme_iov_md": false 00:20:14.054 }, 00:20:14.054 "memory_domains": [ 00:20:14.054 { 00:20:14.054 "dma_device_id": "system", 00:20:14.054 "dma_device_type": 1 00:20:14.054 }, 00:20:14.054 { 00:20:14.054 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:14.054 "dma_device_type": 2 00:20:14.054 } 00:20:14.054 ], 00:20:14.054 "driver_specific": {} 00:20:14.054 }' 00:20:14.054 05:48:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:14.054 05:48:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:14.054 05:48:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:14.054 05:48:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 
00:20:14.054 05:48:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:14.054 05:48:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:14.054 05:48:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:14.054 05:48:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:14.312 05:48:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:14.312 05:48:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:14.312 05:48:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:14.312 05:48:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:14.312 05:48:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:14.312 05:48:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:20:14.312 05:48:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:14.570 05:48:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:14.570 "name": "BaseBdev3", 00:20:14.570 "aliases": [ 00:20:14.570 "f1984a25-abdd-4fa0-8e40-cd8de4efae26" 00:20:14.570 ], 00:20:14.570 "product_name": "Malloc disk", 00:20:14.570 "block_size": 512, 00:20:14.570 "num_blocks": 65536, 00:20:14.570 "uuid": "f1984a25-abdd-4fa0-8e40-cd8de4efae26", 00:20:14.570 "assigned_rate_limits": { 00:20:14.570 "rw_ios_per_sec": 0, 00:20:14.570 "rw_mbytes_per_sec": 0, 00:20:14.570 "r_mbytes_per_sec": 0, 00:20:14.570 "w_mbytes_per_sec": 0 00:20:14.570 }, 00:20:14.570 "claimed": true, 00:20:14.570 "claim_type": "exclusive_write", 00:20:14.570 "zoned": false, 00:20:14.570 "supported_io_types": { 00:20:14.570 
"read": true, 00:20:14.571 "write": true, 00:20:14.571 "unmap": true, 00:20:14.571 "flush": true, 00:20:14.571 "reset": true, 00:20:14.571 "nvme_admin": false, 00:20:14.571 "nvme_io": false, 00:20:14.571 "nvme_io_md": false, 00:20:14.571 "write_zeroes": true, 00:20:14.571 "zcopy": true, 00:20:14.571 "get_zone_info": false, 00:20:14.571 "zone_management": false, 00:20:14.571 "zone_append": false, 00:20:14.571 "compare": false, 00:20:14.571 "compare_and_write": false, 00:20:14.571 "abort": true, 00:20:14.571 "seek_hole": false, 00:20:14.571 "seek_data": false, 00:20:14.571 "copy": true, 00:20:14.571 "nvme_iov_md": false 00:20:14.571 }, 00:20:14.571 "memory_domains": [ 00:20:14.571 { 00:20:14.571 "dma_device_id": "system", 00:20:14.571 "dma_device_type": 1 00:20:14.571 }, 00:20:14.571 { 00:20:14.571 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:14.571 "dma_device_type": 2 00:20:14.571 } 00:20:14.571 ], 00:20:14.571 "driver_specific": {} 00:20:14.571 }' 00:20:14.571 05:48:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:14.571 05:48:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:14.571 05:48:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:14.571 05:48:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:14.571 05:48:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:14.571 05:48:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:14.571 05:48:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:14.829 05:48:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:14.829 05:48:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:14.829 05:48:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 
00:20:14.829 05:48:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:14.829 05:48:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:14.829 05:48:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:14.829 05:48:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:20:14.829 05:48:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:15.087 05:48:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:15.087 "name": "BaseBdev4", 00:20:15.087 "aliases": [ 00:20:15.087 "98e8e89c-9010-4a9d-bb75-29ba5f520b8e" 00:20:15.087 ], 00:20:15.087 "product_name": "Malloc disk", 00:20:15.087 "block_size": 512, 00:20:15.087 "num_blocks": 65536, 00:20:15.087 "uuid": "98e8e89c-9010-4a9d-bb75-29ba5f520b8e", 00:20:15.087 "assigned_rate_limits": { 00:20:15.087 "rw_ios_per_sec": 0, 00:20:15.087 "rw_mbytes_per_sec": 0, 00:20:15.087 "r_mbytes_per_sec": 0, 00:20:15.087 "w_mbytes_per_sec": 0 00:20:15.087 }, 00:20:15.087 "claimed": true, 00:20:15.087 "claim_type": "exclusive_write", 00:20:15.087 "zoned": false, 00:20:15.087 "supported_io_types": { 00:20:15.087 "read": true, 00:20:15.087 "write": true, 00:20:15.087 "unmap": true, 00:20:15.087 "flush": true, 00:20:15.087 "reset": true, 00:20:15.087 "nvme_admin": false, 00:20:15.087 "nvme_io": false, 00:20:15.087 "nvme_io_md": false, 00:20:15.087 "write_zeroes": true, 00:20:15.087 "zcopy": true, 00:20:15.087 "get_zone_info": false, 00:20:15.087 "zone_management": false, 00:20:15.087 "zone_append": false, 00:20:15.087 "compare": false, 00:20:15.087 "compare_and_write": false, 00:20:15.087 "abort": true, 00:20:15.087 "seek_hole": false, 00:20:15.087 "seek_data": false, 00:20:15.087 "copy": true, 00:20:15.087 "nvme_iov_md": 
false 00:20:15.087 }, 00:20:15.087 "memory_domains": [ 00:20:15.087 { 00:20:15.087 "dma_device_id": "system", 00:20:15.087 "dma_device_type": 1 00:20:15.087 }, 00:20:15.087 { 00:20:15.087 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:15.087 "dma_device_type": 2 00:20:15.087 } 00:20:15.087 ], 00:20:15.087 "driver_specific": {} 00:20:15.087 }' 00:20:15.087 05:48:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:15.087 05:48:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:15.087 05:48:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:15.087 05:48:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:15.087 05:48:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:15.345 05:48:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:15.345 05:48:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:15.345 05:48:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:15.345 05:48:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:15.345 05:48:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:15.345 05:48:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:15.345 05:48:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:15.345 05:48:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:20:15.605 [2024-07-26 05:48:30.416139] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:20:15.605 [2024-07-26 05:48:30.416165] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid 
bdev state changing from online to offline 00:20:15.605 [2024-07-26 05:48:30.416221] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:20:15.605 [2024-07-26 05:48:30.416280] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:20:15.605 [2024-07-26 05:48:30.416292] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1cfa040 name Existed_Raid, state offline 00:20:15.605 05:48:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 1188035 00:20:15.605 05:48:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 1188035 ']' 00:20:15.605 05:48:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 1188035 00:20:15.605 05:48:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:20:15.605 05:48:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:20:15.605 05:48:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1188035 00:20:15.605 05:48:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:20:15.605 05:48:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:20:15.605 05:48:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1188035' 00:20:15.605 killing process with pid 1188035 00:20:15.605 05:48:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 1188035 00:20:15.605 [2024-07-26 05:48:30.487559] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:20:15.605 05:48:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 1188035 00:20:15.864 [2024-07-26 05:48:30.525325] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:20:15.864 
05:48:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:20:15.864 00:20:15.864 real 0m31.970s 00:20:15.864 user 0m58.653s 00:20:15.864 sys 0m5.782s 00:20:15.864 05:48:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:20:15.864 05:48:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:15.864 ************************************ 00:20:15.864 END TEST raid_state_function_test 00:20:15.864 ************************************ 00:20:16.123 05:48:30 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:20:16.123 05:48:30 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test raid0 4 true 00:20:16.123 05:48:30 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:20:16.123 05:48:30 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:20:16.123 05:48:30 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:20:16.123 ************************************ 00:20:16.123 START TEST raid_state_function_test_sb 00:20:16.123 ************************************ 00:20:16.123 05:48:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test raid0 4 true 00:20:16.123 05:48:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:20:16.123 05:48:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:20:16.123 05:48:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:20:16.123 05:48:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:20:16.123 05:48:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:20:16.123 05:48:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:20:16.123 05:48:30 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:20:16.123 05:48:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:20:16.123 05:48:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:20:16.123 05:48:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:20:16.123 05:48:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:20:16.123 05:48:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:20:16.123 05:48:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:20:16.123 05:48:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:20:16.123 05:48:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:20:16.123 05:48:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:20:16.123 05:48:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:20:16.123 05:48:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:20:16.123 05:48:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:20:16.123 05:48:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:20:16.123 05:48:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:20:16.123 05:48:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:20:16.123 05:48:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:20:16.123 05:48:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:20:16.123 
05:48:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:20:16.123 05:48:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:20:16.123 05:48:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:20:16.123 05:48:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:20:16.123 05:48:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:20:16.123 05:48:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=1193284 00:20:16.123 05:48:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1193284' 00:20:16.123 Process raid pid: 1193284 00:20:16.123 05:48:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:20:16.123 05:48:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 1193284 /var/tmp/spdk-raid.sock 00:20:16.123 05:48:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 -- # '[' -z 1193284 ']' 00:20:16.123 05:48:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:20:16.123 05:48:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:20:16.123 05:48:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:20:16.124 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
00:20:16.124 05:48:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:20:16.124 05:48:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:16.124 [2024-07-26 05:48:30.892937] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 00:20:16.124 [2024-07-26 05:48:30.893002] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:20:16.124 [2024-07-26 05:48:31.014847] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:20:16.382 [2024-07-26 05:48:31.122293] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:20:16.382 [2024-07-26 05:48:31.185842] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:16.382 [2024-07-26 05:48:31.185872] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:17.317 05:48:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:20:17.317 05:48:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:20:17.317 05:48:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:20:17.576 [2024-07-26 05:48:32.317080] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:20:17.576 [2024-07-26 05:48:32.317120] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:20:17.576 [2024-07-26 05:48:32.317131] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:20:17.576 [2024-07-26 05:48:32.317143] 
bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:20:17.576 [2024-07-26 05:48:32.317152] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:20:17.576 [2024-07-26 05:48:32.317163] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:20:17.576 [2024-07-26 05:48:32.317172] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:20:17.576 [2024-07-26 05:48:32.317183] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:20:17.576 05:48:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:20:17.576 05:48:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:17.576 05:48:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:17.576 05:48:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:20:17.576 05:48:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:17.576 05:48:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:17.576 05:48:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:17.576 05:48:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:17.576 05:48:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:17.576 05:48:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:17.576 05:48:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:17.576 05:48:32 bdev_raid.raid_state_function_test_sb 
-- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:17.874 05:48:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:17.874 "name": "Existed_Raid", 00:20:17.874 "uuid": "1993e465-2439-4d23-8c5f-3f61e5a3089d", 00:20:17.874 "strip_size_kb": 64, 00:20:17.874 "state": "configuring", 00:20:17.874 "raid_level": "raid0", 00:20:17.874 "superblock": true, 00:20:17.874 "num_base_bdevs": 4, 00:20:17.874 "num_base_bdevs_discovered": 0, 00:20:17.874 "num_base_bdevs_operational": 4, 00:20:17.874 "base_bdevs_list": [ 00:20:17.874 { 00:20:17.874 "name": "BaseBdev1", 00:20:17.874 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:17.874 "is_configured": false, 00:20:17.874 "data_offset": 0, 00:20:17.874 "data_size": 0 00:20:17.874 }, 00:20:17.874 { 00:20:17.874 "name": "BaseBdev2", 00:20:17.874 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:17.874 "is_configured": false, 00:20:17.874 "data_offset": 0, 00:20:17.874 "data_size": 0 00:20:17.874 }, 00:20:17.874 { 00:20:17.874 "name": "BaseBdev3", 00:20:17.874 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:17.874 "is_configured": false, 00:20:17.874 "data_offset": 0, 00:20:17.874 "data_size": 0 00:20:17.874 }, 00:20:17.874 { 00:20:17.874 "name": "BaseBdev4", 00:20:17.874 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:17.874 "is_configured": false, 00:20:17.874 "data_offset": 0, 00:20:17.874 "data_size": 0 00:20:17.874 } 00:20:17.874 ] 00:20:17.874 }' 00:20:17.874 05:48:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:17.874 05:48:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:18.466 05:48:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:20:18.466 
[2024-07-26 05:48:33.319582] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:20:18.466 [2024-07-26 05:48:33.319614] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1fcbaa0 name Existed_Raid, state configuring 00:20:18.466 05:48:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:20:18.725 [2024-07-26 05:48:33.484054] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:20:18.725 [2024-07-26 05:48:33.484085] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:20:18.725 [2024-07-26 05:48:33.484095] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:20:18.725 [2024-07-26 05:48:33.484106] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:20:18.725 [2024-07-26 05:48:33.484115] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:20:18.725 [2024-07-26 05:48:33.484126] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:20:18.725 [2024-07-26 05:48:33.484134] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:20:18.725 [2024-07-26 05:48:33.484145] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:20:18.725 05:48:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:20:18.983 [2024-07-26 05:48:33.674448] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:20:18.983 BaseBdev1 00:20:18.983 05:48:33 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:20:18.983 05:48:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:20:18.983 05:48:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:20:18.983 05:48:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:20:18.983 05:48:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:18.983 05:48:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:18.983 05:48:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:19.242 05:48:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:20:19.242 [ 00:20:19.242 { 00:20:19.242 "name": "BaseBdev1", 00:20:19.242 "aliases": [ 00:20:19.242 "f14e0b61-4de2-41c0-8f35-14a75237f08a" 00:20:19.242 ], 00:20:19.242 "product_name": "Malloc disk", 00:20:19.242 "block_size": 512, 00:20:19.242 "num_blocks": 65536, 00:20:19.242 "uuid": "f14e0b61-4de2-41c0-8f35-14a75237f08a", 00:20:19.242 "assigned_rate_limits": { 00:20:19.242 "rw_ios_per_sec": 0, 00:20:19.242 "rw_mbytes_per_sec": 0, 00:20:19.242 "r_mbytes_per_sec": 0, 00:20:19.242 "w_mbytes_per_sec": 0 00:20:19.242 }, 00:20:19.242 "claimed": true, 00:20:19.242 "claim_type": "exclusive_write", 00:20:19.242 "zoned": false, 00:20:19.242 "supported_io_types": { 00:20:19.242 "read": true, 00:20:19.242 "write": true, 00:20:19.242 "unmap": true, 00:20:19.242 "flush": true, 00:20:19.242 "reset": true, 00:20:19.242 "nvme_admin": false, 00:20:19.242 "nvme_io": false, 00:20:19.242 "nvme_io_md": 
false, 00:20:19.242 "write_zeroes": true, 00:20:19.242 "zcopy": true, 00:20:19.242 "get_zone_info": false, 00:20:19.242 "zone_management": false, 00:20:19.242 "zone_append": false, 00:20:19.242 "compare": false, 00:20:19.242 "compare_and_write": false, 00:20:19.242 "abort": true, 00:20:19.242 "seek_hole": false, 00:20:19.242 "seek_data": false, 00:20:19.242 "copy": true, 00:20:19.242 "nvme_iov_md": false 00:20:19.242 }, 00:20:19.242 "memory_domains": [ 00:20:19.242 { 00:20:19.242 "dma_device_id": "system", 00:20:19.242 "dma_device_type": 1 00:20:19.242 }, 00:20:19.242 { 00:20:19.242 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:19.242 "dma_device_type": 2 00:20:19.242 } 00:20:19.242 ], 00:20:19.242 "driver_specific": {} 00:20:19.242 } 00:20:19.242 ] 00:20:19.242 05:48:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:20:19.242 05:48:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:20:19.242 05:48:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:19.242 05:48:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:19.242 05:48:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:20:19.242 05:48:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:19.242 05:48:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:19.242 05:48:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:19.242 05:48:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:19.242 05:48:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:19.242 05:48:34 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:19.242 05:48:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:19.242 05:48:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:19.502 05:48:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:19.502 "name": "Existed_Raid", 00:20:19.502 "uuid": "7f57acc4-e425-472d-acd9-2c4632bd5e4f", 00:20:19.502 "strip_size_kb": 64, 00:20:19.502 "state": "configuring", 00:20:19.502 "raid_level": "raid0", 00:20:19.502 "superblock": true, 00:20:19.502 "num_base_bdevs": 4, 00:20:19.502 "num_base_bdevs_discovered": 1, 00:20:19.502 "num_base_bdevs_operational": 4, 00:20:19.502 "base_bdevs_list": [ 00:20:19.502 { 00:20:19.502 "name": "BaseBdev1", 00:20:19.502 "uuid": "f14e0b61-4de2-41c0-8f35-14a75237f08a", 00:20:19.502 "is_configured": true, 00:20:19.502 "data_offset": 2048, 00:20:19.502 "data_size": 63488 00:20:19.502 }, 00:20:19.502 { 00:20:19.502 "name": "BaseBdev2", 00:20:19.502 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:19.502 "is_configured": false, 00:20:19.502 "data_offset": 0, 00:20:19.502 "data_size": 0 00:20:19.502 }, 00:20:19.502 { 00:20:19.502 "name": "BaseBdev3", 00:20:19.502 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:19.502 "is_configured": false, 00:20:19.502 "data_offset": 0, 00:20:19.502 "data_size": 0 00:20:19.502 }, 00:20:19.502 { 00:20:19.502 "name": "BaseBdev4", 00:20:19.502 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:19.502 "is_configured": false, 00:20:19.502 "data_offset": 0, 00:20:19.502 "data_size": 0 00:20:19.502 } 00:20:19.502 ] 00:20:19.502 }' 00:20:19.502 05:48:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:19.502 05:48:34 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:20.069 05:48:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:20:20.327 [2024-07-26 05:48:35.102225] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:20:20.327 [2024-07-26 05:48:35.102264] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1fcb310 name Existed_Raid, state configuring 00:20:20.327 05:48:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:20:20.586 [2024-07-26 05:48:35.350929] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:20:20.586 [2024-07-26 05:48:35.352417] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:20:20.586 [2024-07-26 05:48:35.352449] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:20:20.586 [2024-07-26 05:48:35.352459] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:20:20.586 [2024-07-26 05:48:35.352471] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:20:20.586 [2024-07-26 05:48:35.352480] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:20:20.586 [2024-07-26 05:48:35.352491] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:20:20.586 05:48:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:20:20.586 05:48:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:20:20.586 05:48:35 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:20:20.586 05:48:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:20.586 05:48:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:20.586 05:48:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:20:20.586 05:48:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:20.586 05:48:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:20.586 05:48:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:20.586 05:48:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:20.586 05:48:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:20.586 05:48:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:20.586 05:48:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:20.586 05:48:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:20.846 05:48:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:20.846 "name": "Existed_Raid", 00:20:20.846 "uuid": "e91a675d-2251-4a8f-89f2-178f4bc78242", 00:20:20.846 "strip_size_kb": 64, 00:20:20.846 "state": "configuring", 00:20:20.846 "raid_level": "raid0", 00:20:20.846 "superblock": true, 00:20:20.846 "num_base_bdevs": 4, 00:20:20.846 "num_base_bdevs_discovered": 1, 00:20:20.846 "num_base_bdevs_operational": 4, 00:20:20.846 
"base_bdevs_list": [ 00:20:20.846 { 00:20:20.846 "name": "BaseBdev1", 00:20:20.846 "uuid": "f14e0b61-4de2-41c0-8f35-14a75237f08a", 00:20:20.846 "is_configured": true, 00:20:20.846 "data_offset": 2048, 00:20:20.846 "data_size": 63488 00:20:20.846 }, 00:20:20.846 { 00:20:20.846 "name": "BaseBdev2", 00:20:20.846 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:20.846 "is_configured": false, 00:20:20.846 "data_offset": 0, 00:20:20.846 "data_size": 0 00:20:20.846 }, 00:20:20.846 { 00:20:20.847 "name": "BaseBdev3", 00:20:20.847 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:20.847 "is_configured": false, 00:20:20.847 "data_offset": 0, 00:20:20.847 "data_size": 0 00:20:20.847 }, 00:20:20.847 { 00:20:20.847 "name": "BaseBdev4", 00:20:20.847 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:20.847 "is_configured": false, 00:20:20.847 "data_offset": 0, 00:20:20.847 "data_size": 0 00:20:20.847 } 00:20:20.847 ] 00:20:20.847 }' 00:20:20.847 05:48:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:20.847 05:48:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:21.784 05:48:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:20:21.784 [2024-07-26 05:48:36.585616] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:20:21.784 BaseBdev2 00:20:21.784 05:48:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:20:21.784 05:48:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:20:21.784 05:48:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:20:21.784 05:48:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:20:21.784 
05:48:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:21.784 05:48:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:21.784 05:48:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:22.042 05:48:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:20:22.042 [ 00:20:22.042 { 00:20:22.042 "name": "BaseBdev2", 00:20:22.042 "aliases": [ 00:20:22.042 "d68b575f-e22d-42c6-86b1-b41d7f503649" 00:20:22.042 ], 00:20:22.042 "product_name": "Malloc disk", 00:20:22.042 "block_size": 512, 00:20:22.042 "num_blocks": 65536, 00:20:22.042 "uuid": "d68b575f-e22d-42c6-86b1-b41d7f503649", 00:20:22.042 "assigned_rate_limits": { 00:20:22.042 "rw_ios_per_sec": 0, 00:20:22.042 "rw_mbytes_per_sec": 0, 00:20:22.042 "r_mbytes_per_sec": 0, 00:20:22.042 "w_mbytes_per_sec": 0 00:20:22.042 }, 00:20:22.042 "claimed": true, 00:20:22.042 "claim_type": "exclusive_write", 00:20:22.042 "zoned": false, 00:20:22.042 "supported_io_types": { 00:20:22.042 "read": true, 00:20:22.042 "write": true, 00:20:22.042 "unmap": true, 00:20:22.042 "flush": true, 00:20:22.042 "reset": true, 00:20:22.042 "nvme_admin": false, 00:20:22.042 "nvme_io": false, 00:20:22.042 "nvme_io_md": false, 00:20:22.042 "write_zeroes": true, 00:20:22.042 "zcopy": true, 00:20:22.042 "get_zone_info": false, 00:20:22.042 "zone_management": false, 00:20:22.042 "zone_append": false, 00:20:22.042 "compare": false, 00:20:22.042 "compare_and_write": false, 00:20:22.042 "abort": true, 00:20:22.042 "seek_hole": false, 00:20:22.042 "seek_data": false, 00:20:22.042 "copy": true, 00:20:22.042 "nvme_iov_md": false 00:20:22.042 }, 00:20:22.042 
"memory_domains": [ 00:20:22.042 { 00:20:22.042 "dma_device_id": "system", 00:20:22.042 "dma_device_type": 1 00:20:22.042 }, 00:20:22.042 { 00:20:22.042 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:22.042 "dma_device_type": 2 00:20:22.042 } 00:20:22.042 ], 00:20:22.042 "driver_specific": {} 00:20:22.042 } 00:20:22.042 ] 00:20:22.301 05:48:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:20:22.301 05:48:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:20:22.301 05:48:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:20:22.301 05:48:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:20:22.301 05:48:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:22.301 05:48:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:22.301 05:48:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:20:22.301 05:48:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:22.301 05:48:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:22.301 05:48:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:22.301 05:48:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:22.301 05:48:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:22.301 05:48:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:22.301 05:48:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:22.301 05:48:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:22.301 05:48:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:22.301 "name": "Existed_Raid", 00:20:22.301 "uuid": "e91a675d-2251-4a8f-89f2-178f4bc78242", 00:20:22.301 "strip_size_kb": 64, 00:20:22.301 "state": "configuring", 00:20:22.301 "raid_level": "raid0", 00:20:22.301 "superblock": true, 00:20:22.301 "num_base_bdevs": 4, 00:20:22.301 "num_base_bdevs_discovered": 2, 00:20:22.301 "num_base_bdevs_operational": 4, 00:20:22.301 "base_bdevs_list": [ 00:20:22.301 { 00:20:22.301 "name": "BaseBdev1", 00:20:22.301 "uuid": "f14e0b61-4de2-41c0-8f35-14a75237f08a", 00:20:22.301 "is_configured": true, 00:20:22.301 "data_offset": 2048, 00:20:22.301 "data_size": 63488 00:20:22.301 }, 00:20:22.301 { 00:20:22.301 "name": "BaseBdev2", 00:20:22.301 "uuid": "d68b575f-e22d-42c6-86b1-b41d7f503649", 00:20:22.301 "is_configured": true, 00:20:22.301 "data_offset": 2048, 00:20:22.301 "data_size": 63488 00:20:22.301 }, 00:20:22.301 { 00:20:22.301 "name": "BaseBdev3", 00:20:22.301 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:22.301 "is_configured": false, 00:20:22.301 "data_offset": 0, 00:20:22.301 "data_size": 0 00:20:22.301 }, 00:20:22.301 { 00:20:22.301 "name": "BaseBdev4", 00:20:22.301 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:22.301 "is_configured": false, 00:20:22.301 "data_offset": 0, 00:20:22.301 "data_size": 0 00:20:22.301 } 00:20:22.301 ] 00:20:22.301 }' 00:20:22.301 05:48:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:22.301 05:48:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:22.867 05:48:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:20:23.125 [2024-07-26 05:48:37.872485] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:20:23.125 BaseBdev3 00:20:23.125 05:48:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:20:23.125 05:48:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:20:23.125 05:48:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:20:23.125 05:48:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:20:23.125 05:48:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:23.125 05:48:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:23.125 05:48:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:23.383 05:48:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:20:23.383 [ 00:20:23.383 { 00:20:23.383 "name": "BaseBdev3", 00:20:23.383 "aliases": [ 00:20:23.383 "28ba8dfd-ef84-4ec2-944f-f994f51a294a" 00:20:23.383 ], 00:20:23.383 "product_name": "Malloc disk", 00:20:23.383 "block_size": 512, 00:20:23.383 "num_blocks": 65536, 00:20:23.383 "uuid": "28ba8dfd-ef84-4ec2-944f-f994f51a294a", 00:20:23.383 "assigned_rate_limits": { 00:20:23.383 "rw_ios_per_sec": 0, 00:20:23.383 "rw_mbytes_per_sec": 0, 00:20:23.383 "r_mbytes_per_sec": 0, 00:20:23.383 "w_mbytes_per_sec": 0 00:20:23.383 }, 00:20:23.383 "claimed": true, 00:20:23.383 "claim_type": "exclusive_write", 00:20:23.383 "zoned": false, 00:20:23.383 "supported_io_types": { 
00:20:23.383 "read": true, 00:20:23.383 "write": true, 00:20:23.383 "unmap": true, 00:20:23.383 "flush": true, 00:20:23.383 "reset": true, 00:20:23.383 "nvme_admin": false, 00:20:23.383 "nvme_io": false, 00:20:23.383 "nvme_io_md": false, 00:20:23.383 "write_zeroes": true, 00:20:23.383 "zcopy": true, 00:20:23.383 "get_zone_info": false, 00:20:23.383 "zone_management": false, 00:20:23.383 "zone_append": false, 00:20:23.383 "compare": false, 00:20:23.383 "compare_and_write": false, 00:20:23.383 "abort": true, 00:20:23.383 "seek_hole": false, 00:20:23.383 "seek_data": false, 00:20:23.383 "copy": true, 00:20:23.383 "nvme_iov_md": false 00:20:23.383 }, 00:20:23.383 "memory_domains": [ 00:20:23.383 { 00:20:23.383 "dma_device_id": "system", 00:20:23.383 "dma_device_type": 1 00:20:23.383 }, 00:20:23.383 { 00:20:23.383 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:23.383 "dma_device_type": 2 00:20:23.383 } 00:20:23.383 ], 00:20:23.383 "driver_specific": {} 00:20:23.383 } 00:20:23.383 ] 00:20:23.383 05:48:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:20:23.383 05:48:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:20:23.384 05:48:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:20:23.384 05:48:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:20:23.384 05:48:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:23.384 05:48:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:23.384 05:48:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:20:23.384 05:48:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:23.384 05:48:38 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:23.384 05:48:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:23.384 05:48:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:23.384 05:48:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:23.384 05:48:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:23.384 05:48:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:23.384 05:48:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:23.642 05:48:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:23.642 "name": "Existed_Raid", 00:20:23.642 "uuid": "e91a675d-2251-4a8f-89f2-178f4bc78242", 00:20:23.642 "strip_size_kb": 64, 00:20:23.642 "state": "configuring", 00:20:23.642 "raid_level": "raid0", 00:20:23.642 "superblock": true, 00:20:23.642 "num_base_bdevs": 4, 00:20:23.642 "num_base_bdevs_discovered": 3, 00:20:23.642 "num_base_bdevs_operational": 4, 00:20:23.642 "base_bdevs_list": [ 00:20:23.642 { 00:20:23.642 "name": "BaseBdev1", 00:20:23.642 "uuid": "f14e0b61-4de2-41c0-8f35-14a75237f08a", 00:20:23.642 "is_configured": true, 00:20:23.642 "data_offset": 2048, 00:20:23.642 "data_size": 63488 00:20:23.642 }, 00:20:23.642 { 00:20:23.642 "name": "BaseBdev2", 00:20:23.642 "uuid": "d68b575f-e22d-42c6-86b1-b41d7f503649", 00:20:23.642 "is_configured": true, 00:20:23.642 "data_offset": 2048, 00:20:23.642 "data_size": 63488 00:20:23.642 }, 00:20:23.642 { 00:20:23.642 "name": "BaseBdev3", 00:20:23.642 "uuid": "28ba8dfd-ef84-4ec2-944f-f994f51a294a", 00:20:23.642 "is_configured": true, 00:20:23.642 "data_offset": 2048, 00:20:23.642 
"data_size": 63488 00:20:23.642 }, 00:20:23.642 { 00:20:23.642 "name": "BaseBdev4", 00:20:23.642 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:23.642 "is_configured": false, 00:20:23.642 "data_offset": 0, 00:20:23.642 "data_size": 0 00:20:23.642 } 00:20:23.642 ] 00:20:23.642 }' 00:20:23.642 05:48:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:23.642 05:48:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:24.209 05:48:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:20:24.467 [2024-07-26 05:48:39.123324] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:20:24.467 [2024-07-26 05:48:39.123491] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1fcc350 00:20:24.467 [2024-07-26 05:48:39.123505] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:20:24.467 [2024-07-26 05:48:39.123687] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1fcc020 00:20:24.467 [2024-07-26 05:48:39.123806] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1fcc350 00:20:24.467 [2024-07-26 05:48:39.123816] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1fcc350 00:20:24.467 [2024-07-26 05:48:39.123905] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:24.467 BaseBdev4 00:20:24.467 05:48:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:20:24.467 05:48:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:20:24.467 05:48:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:20:24.467 05:48:39 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:20:24.467 05:48:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:24.467 05:48:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:24.467 05:48:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:24.725 05:48:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:20:24.725 [ 00:20:24.725 { 00:20:24.725 "name": "BaseBdev4", 00:20:24.725 "aliases": [ 00:20:24.725 "82c1f86b-dde0-4504-98c9-a09320db2501" 00:20:24.725 ], 00:20:24.725 "product_name": "Malloc disk", 00:20:24.725 "block_size": 512, 00:20:24.725 "num_blocks": 65536, 00:20:24.725 "uuid": "82c1f86b-dde0-4504-98c9-a09320db2501", 00:20:24.725 "assigned_rate_limits": { 00:20:24.725 "rw_ios_per_sec": 0, 00:20:24.725 "rw_mbytes_per_sec": 0, 00:20:24.725 "r_mbytes_per_sec": 0, 00:20:24.725 "w_mbytes_per_sec": 0 00:20:24.725 }, 00:20:24.725 "claimed": true, 00:20:24.725 "claim_type": "exclusive_write", 00:20:24.725 "zoned": false, 00:20:24.725 "supported_io_types": { 00:20:24.725 "read": true, 00:20:24.725 "write": true, 00:20:24.725 "unmap": true, 00:20:24.725 "flush": true, 00:20:24.725 "reset": true, 00:20:24.725 "nvme_admin": false, 00:20:24.725 "nvme_io": false, 00:20:24.725 "nvme_io_md": false, 00:20:24.725 "write_zeroes": true, 00:20:24.725 "zcopy": true, 00:20:24.725 "get_zone_info": false, 00:20:24.725 "zone_management": false, 00:20:24.725 "zone_append": false, 00:20:24.725 "compare": false, 00:20:24.725 "compare_and_write": false, 00:20:24.725 "abort": true, 00:20:24.725 "seek_hole": false, 00:20:24.725 "seek_data": false, 
00:20:24.725 "copy": true, 00:20:24.725 "nvme_iov_md": false 00:20:24.725 }, 00:20:24.725 "memory_domains": [ 00:20:24.725 { 00:20:24.725 "dma_device_id": "system", 00:20:24.725 "dma_device_type": 1 00:20:24.725 }, 00:20:24.725 { 00:20:24.725 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:24.725 "dma_device_type": 2 00:20:24.725 } 00:20:24.725 ], 00:20:24.725 "driver_specific": {} 00:20:24.725 } 00:20:24.725 ] 00:20:24.725 05:48:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:20:24.725 05:48:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:20:24.725 05:48:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:20:24.725 05:48:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 4 00:20:24.725 05:48:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:24.725 05:48:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:24.725 05:48:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:20:24.725 05:48:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:24.725 05:48:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:24.725 05:48:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:24.725 05:48:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:24.725 05:48:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:24.725 05:48:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:24.725 05:48:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 
-- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:24.725 05:48:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:24.983 05:48:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:24.983 "name": "Existed_Raid", 00:20:24.983 "uuid": "e91a675d-2251-4a8f-89f2-178f4bc78242", 00:20:24.983 "strip_size_kb": 64, 00:20:24.983 "state": "online", 00:20:24.983 "raid_level": "raid0", 00:20:24.983 "superblock": true, 00:20:24.983 "num_base_bdevs": 4, 00:20:24.983 "num_base_bdevs_discovered": 4, 00:20:24.983 "num_base_bdevs_operational": 4, 00:20:24.983 "base_bdevs_list": [ 00:20:24.983 { 00:20:24.983 "name": "BaseBdev1", 00:20:24.983 "uuid": "f14e0b61-4de2-41c0-8f35-14a75237f08a", 00:20:24.983 "is_configured": true, 00:20:24.983 "data_offset": 2048, 00:20:24.983 "data_size": 63488 00:20:24.983 }, 00:20:24.983 { 00:20:24.983 "name": "BaseBdev2", 00:20:24.983 "uuid": "d68b575f-e22d-42c6-86b1-b41d7f503649", 00:20:24.983 "is_configured": true, 00:20:24.983 "data_offset": 2048, 00:20:24.983 "data_size": 63488 00:20:24.983 }, 00:20:24.983 { 00:20:24.983 "name": "BaseBdev3", 00:20:24.983 "uuid": "28ba8dfd-ef84-4ec2-944f-f994f51a294a", 00:20:24.983 "is_configured": true, 00:20:24.983 "data_offset": 2048, 00:20:24.983 "data_size": 63488 00:20:24.983 }, 00:20:24.983 { 00:20:24.984 "name": "BaseBdev4", 00:20:24.984 "uuid": "82c1f86b-dde0-4504-98c9-a09320db2501", 00:20:24.984 "is_configured": true, 00:20:24.984 "data_offset": 2048, 00:20:24.984 "data_size": 63488 00:20:24.984 } 00:20:24.984 ] 00:20:24.984 }' 00:20:24.984 05:48:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:24.984 05:48:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:25.918 05:48:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # 
verify_raid_bdev_properties Existed_Raid 00:20:25.918 05:48:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:20:25.918 05:48:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:20:25.918 05:48:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:20:25.918 05:48:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:20:25.918 05:48:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:20:25.918 05:48:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:20:25.918 05:48:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:20:26.176 [2024-07-26 05:48:40.840233] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:20:26.176 05:48:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:20:26.176 "name": "Existed_Raid", 00:20:26.176 "aliases": [ 00:20:26.176 "e91a675d-2251-4a8f-89f2-178f4bc78242" 00:20:26.176 ], 00:20:26.176 "product_name": "Raid Volume", 00:20:26.176 "block_size": 512, 00:20:26.176 "num_blocks": 253952, 00:20:26.176 "uuid": "e91a675d-2251-4a8f-89f2-178f4bc78242", 00:20:26.176 "assigned_rate_limits": { 00:20:26.176 "rw_ios_per_sec": 0, 00:20:26.176 "rw_mbytes_per_sec": 0, 00:20:26.176 "r_mbytes_per_sec": 0, 00:20:26.176 "w_mbytes_per_sec": 0 00:20:26.176 }, 00:20:26.176 "claimed": false, 00:20:26.176 "zoned": false, 00:20:26.176 "supported_io_types": { 00:20:26.176 "read": true, 00:20:26.176 "write": true, 00:20:26.176 "unmap": true, 00:20:26.176 "flush": true, 00:20:26.176 "reset": true, 00:20:26.176 "nvme_admin": false, 00:20:26.176 "nvme_io": false, 00:20:26.176 "nvme_io_md": false, 00:20:26.176 
"write_zeroes": true, 00:20:26.176 "zcopy": false, 00:20:26.176 "get_zone_info": false, 00:20:26.176 "zone_management": false, 00:20:26.176 "zone_append": false, 00:20:26.176 "compare": false, 00:20:26.176 "compare_and_write": false, 00:20:26.176 "abort": false, 00:20:26.176 "seek_hole": false, 00:20:26.176 "seek_data": false, 00:20:26.176 "copy": false, 00:20:26.176 "nvme_iov_md": false 00:20:26.176 }, 00:20:26.176 "memory_domains": [ 00:20:26.176 { 00:20:26.176 "dma_device_id": "system", 00:20:26.176 "dma_device_type": 1 00:20:26.176 }, 00:20:26.176 { 00:20:26.176 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:26.176 "dma_device_type": 2 00:20:26.176 }, 00:20:26.176 { 00:20:26.176 "dma_device_id": "system", 00:20:26.176 "dma_device_type": 1 00:20:26.176 }, 00:20:26.176 { 00:20:26.176 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:26.176 "dma_device_type": 2 00:20:26.176 }, 00:20:26.176 { 00:20:26.176 "dma_device_id": "system", 00:20:26.176 "dma_device_type": 1 00:20:26.176 }, 00:20:26.176 { 00:20:26.176 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:26.176 "dma_device_type": 2 00:20:26.176 }, 00:20:26.176 { 00:20:26.176 "dma_device_id": "system", 00:20:26.176 "dma_device_type": 1 00:20:26.176 }, 00:20:26.176 { 00:20:26.176 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:26.176 "dma_device_type": 2 00:20:26.176 } 00:20:26.176 ], 00:20:26.176 "driver_specific": { 00:20:26.176 "raid": { 00:20:26.176 "uuid": "e91a675d-2251-4a8f-89f2-178f4bc78242", 00:20:26.176 "strip_size_kb": 64, 00:20:26.176 "state": "online", 00:20:26.176 "raid_level": "raid0", 00:20:26.176 "superblock": true, 00:20:26.176 "num_base_bdevs": 4, 00:20:26.176 "num_base_bdevs_discovered": 4, 00:20:26.176 "num_base_bdevs_operational": 4, 00:20:26.176 "base_bdevs_list": [ 00:20:26.176 { 00:20:26.176 "name": "BaseBdev1", 00:20:26.176 "uuid": "f14e0b61-4de2-41c0-8f35-14a75237f08a", 00:20:26.176 "is_configured": true, 00:20:26.176 "data_offset": 2048, 00:20:26.176 "data_size": 63488 00:20:26.176 }, 
00:20:26.176 { 00:20:26.176 "name": "BaseBdev2", 00:20:26.176 "uuid": "d68b575f-e22d-42c6-86b1-b41d7f503649", 00:20:26.176 "is_configured": true, 00:20:26.176 "data_offset": 2048, 00:20:26.176 "data_size": 63488 00:20:26.176 }, 00:20:26.176 { 00:20:26.176 "name": "BaseBdev3", 00:20:26.176 "uuid": "28ba8dfd-ef84-4ec2-944f-f994f51a294a", 00:20:26.176 "is_configured": true, 00:20:26.176 "data_offset": 2048, 00:20:26.176 "data_size": 63488 00:20:26.176 }, 00:20:26.176 { 00:20:26.176 "name": "BaseBdev4", 00:20:26.176 "uuid": "82c1f86b-dde0-4504-98c9-a09320db2501", 00:20:26.176 "is_configured": true, 00:20:26.176 "data_offset": 2048, 00:20:26.176 "data_size": 63488 00:20:26.176 } 00:20:26.176 ] 00:20:26.176 } 00:20:26.176 } 00:20:26.176 }' 00:20:26.176 05:48:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:20:26.176 05:48:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:20:26.176 BaseBdev2 00:20:26.176 BaseBdev3 00:20:26.176 BaseBdev4' 00:20:26.176 05:48:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:26.176 05:48:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:20:26.176 05:48:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:26.434 05:48:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:26.434 "name": "BaseBdev1", 00:20:26.434 "aliases": [ 00:20:26.434 "f14e0b61-4de2-41c0-8f35-14a75237f08a" 00:20:26.434 ], 00:20:26.434 "product_name": "Malloc disk", 00:20:26.434 "block_size": 512, 00:20:26.434 "num_blocks": 65536, 00:20:26.434 "uuid": "f14e0b61-4de2-41c0-8f35-14a75237f08a", 00:20:26.434 "assigned_rate_limits": { 00:20:26.434 
"rw_ios_per_sec": 0, 00:20:26.434 "rw_mbytes_per_sec": 0, 00:20:26.434 "r_mbytes_per_sec": 0, 00:20:26.434 "w_mbytes_per_sec": 0 00:20:26.434 }, 00:20:26.434 "claimed": true, 00:20:26.434 "claim_type": "exclusive_write", 00:20:26.434 "zoned": false, 00:20:26.434 "supported_io_types": { 00:20:26.434 "read": true, 00:20:26.434 "write": true, 00:20:26.434 "unmap": true, 00:20:26.434 "flush": true, 00:20:26.434 "reset": true, 00:20:26.434 "nvme_admin": false, 00:20:26.434 "nvme_io": false, 00:20:26.434 "nvme_io_md": false, 00:20:26.434 "write_zeroes": true, 00:20:26.434 "zcopy": true, 00:20:26.434 "get_zone_info": false, 00:20:26.434 "zone_management": false, 00:20:26.434 "zone_append": false, 00:20:26.434 "compare": false, 00:20:26.434 "compare_and_write": false, 00:20:26.434 "abort": true, 00:20:26.434 "seek_hole": false, 00:20:26.434 "seek_data": false, 00:20:26.434 "copy": true, 00:20:26.434 "nvme_iov_md": false 00:20:26.434 }, 00:20:26.434 "memory_domains": [ 00:20:26.434 { 00:20:26.434 "dma_device_id": "system", 00:20:26.434 "dma_device_type": 1 00:20:26.434 }, 00:20:26.434 { 00:20:26.434 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:26.434 "dma_device_type": 2 00:20:26.434 } 00:20:26.434 ], 00:20:26.434 "driver_specific": {} 00:20:26.434 }' 00:20:26.434 05:48:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:26.434 05:48:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:26.434 05:48:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:26.435 05:48:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:26.435 05:48:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:26.435 05:48:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:26.435 05:48:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq 
.md_interleave 00:20:26.435 05:48:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:26.693 05:48:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:26.693 05:48:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:26.693 05:48:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:26.693 05:48:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:26.693 05:48:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:26.693 05:48:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:20:26.693 05:48:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:26.951 05:48:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:26.951 "name": "BaseBdev2", 00:20:26.951 "aliases": [ 00:20:26.951 "d68b575f-e22d-42c6-86b1-b41d7f503649" 00:20:26.951 ], 00:20:26.951 "product_name": "Malloc disk", 00:20:26.951 "block_size": 512, 00:20:26.951 "num_blocks": 65536, 00:20:26.951 "uuid": "d68b575f-e22d-42c6-86b1-b41d7f503649", 00:20:26.951 "assigned_rate_limits": { 00:20:26.951 "rw_ios_per_sec": 0, 00:20:26.951 "rw_mbytes_per_sec": 0, 00:20:26.951 "r_mbytes_per_sec": 0, 00:20:26.951 "w_mbytes_per_sec": 0 00:20:26.951 }, 00:20:26.951 "claimed": true, 00:20:26.951 "claim_type": "exclusive_write", 00:20:26.951 "zoned": false, 00:20:26.951 "supported_io_types": { 00:20:26.951 "read": true, 00:20:26.951 "write": true, 00:20:26.951 "unmap": true, 00:20:26.951 "flush": true, 00:20:26.951 "reset": true, 00:20:26.951 "nvme_admin": false, 00:20:26.951 "nvme_io": false, 00:20:26.951 "nvme_io_md": false, 00:20:26.951 "write_zeroes": true, 
00:20:26.951 "zcopy": true, 00:20:26.951 "get_zone_info": false, 00:20:26.951 "zone_management": false, 00:20:26.951 "zone_append": false, 00:20:26.951 "compare": false, 00:20:26.951 "compare_and_write": false, 00:20:26.951 "abort": true, 00:20:26.951 "seek_hole": false, 00:20:26.951 "seek_data": false, 00:20:26.951 "copy": true, 00:20:26.951 "nvme_iov_md": false 00:20:26.951 }, 00:20:26.951 "memory_domains": [ 00:20:26.951 { 00:20:26.951 "dma_device_id": "system", 00:20:26.951 "dma_device_type": 1 00:20:26.951 }, 00:20:26.951 { 00:20:26.951 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:26.951 "dma_device_type": 2 00:20:26.951 } 00:20:26.951 ], 00:20:26.951 "driver_specific": {} 00:20:26.951 }' 00:20:26.951 05:48:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:26.951 05:48:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:26.951 05:48:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:26.951 05:48:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:26.951 05:48:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:27.209 05:48:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:27.209 05:48:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:27.209 05:48:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:27.209 05:48:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:27.209 05:48:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:27.209 05:48:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:27.209 05:48:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:27.209 05:48:42 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:27.209 05:48:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:20:27.209 05:48:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:27.467 05:48:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:27.467 "name": "BaseBdev3", 00:20:27.467 "aliases": [ 00:20:27.467 "28ba8dfd-ef84-4ec2-944f-f994f51a294a" 00:20:27.467 ], 00:20:27.467 "product_name": "Malloc disk", 00:20:27.467 "block_size": 512, 00:20:27.467 "num_blocks": 65536, 00:20:27.467 "uuid": "28ba8dfd-ef84-4ec2-944f-f994f51a294a", 00:20:27.467 "assigned_rate_limits": { 00:20:27.467 "rw_ios_per_sec": 0, 00:20:27.467 "rw_mbytes_per_sec": 0, 00:20:27.467 "r_mbytes_per_sec": 0, 00:20:27.467 "w_mbytes_per_sec": 0 00:20:27.467 }, 00:20:27.467 "claimed": true, 00:20:27.467 "claim_type": "exclusive_write", 00:20:27.467 "zoned": false, 00:20:27.467 "supported_io_types": { 00:20:27.467 "read": true, 00:20:27.467 "write": true, 00:20:27.467 "unmap": true, 00:20:27.467 "flush": true, 00:20:27.467 "reset": true, 00:20:27.467 "nvme_admin": false, 00:20:27.467 "nvme_io": false, 00:20:27.467 "nvme_io_md": false, 00:20:27.467 "write_zeroes": true, 00:20:27.467 "zcopy": true, 00:20:27.467 "get_zone_info": false, 00:20:27.467 "zone_management": false, 00:20:27.467 "zone_append": false, 00:20:27.467 "compare": false, 00:20:27.467 "compare_and_write": false, 00:20:27.467 "abort": true, 00:20:27.467 "seek_hole": false, 00:20:27.467 "seek_data": false, 00:20:27.467 "copy": true, 00:20:27.467 "nvme_iov_md": false 00:20:27.467 }, 00:20:27.467 "memory_domains": [ 00:20:27.467 { 00:20:27.467 "dma_device_id": "system", 00:20:27.467 "dma_device_type": 1 00:20:27.467 }, 00:20:27.467 { 00:20:27.467 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:20:27.467 "dma_device_type": 2 00:20:27.467 } 00:20:27.467 ], 00:20:27.467 "driver_specific": {} 00:20:27.467 }' 00:20:27.467 05:48:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:27.467 05:48:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:27.726 05:48:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:27.726 05:48:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:27.726 05:48:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:27.726 05:48:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:27.726 05:48:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:27.726 05:48:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:27.726 05:48:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:27.726 05:48:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:27.726 05:48:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:27.984 05:48:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:27.984 05:48:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:27.984 05:48:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:20:27.984 05:48:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:28.241 05:48:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:28.241 "name": "BaseBdev4", 00:20:28.241 
"aliases": [ 00:20:28.241 "82c1f86b-dde0-4504-98c9-a09320db2501" 00:20:28.241 ], 00:20:28.241 "product_name": "Malloc disk", 00:20:28.241 "block_size": 512, 00:20:28.241 "num_blocks": 65536, 00:20:28.241 "uuid": "82c1f86b-dde0-4504-98c9-a09320db2501", 00:20:28.241 "assigned_rate_limits": { 00:20:28.241 "rw_ios_per_sec": 0, 00:20:28.241 "rw_mbytes_per_sec": 0, 00:20:28.241 "r_mbytes_per_sec": 0, 00:20:28.241 "w_mbytes_per_sec": 0 00:20:28.241 }, 00:20:28.241 "claimed": true, 00:20:28.242 "claim_type": "exclusive_write", 00:20:28.242 "zoned": false, 00:20:28.242 "supported_io_types": { 00:20:28.242 "read": true, 00:20:28.242 "write": true, 00:20:28.242 "unmap": true, 00:20:28.242 "flush": true, 00:20:28.242 "reset": true, 00:20:28.242 "nvme_admin": false, 00:20:28.242 "nvme_io": false, 00:20:28.242 "nvme_io_md": false, 00:20:28.242 "write_zeroes": true, 00:20:28.242 "zcopy": true, 00:20:28.242 "get_zone_info": false, 00:20:28.242 "zone_management": false, 00:20:28.242 "zone_append": false, 00:20:28.242 "compare": false, 00:20:28.242 "compare_and_write": false, 00:20:28.242 "abort": true, 00:20:28.242 "seek_hole": false, 00:20:28.242 "seek_data": false, 00:20:28.242 "copy": true, 00:20:28.242 "nvme_iov_md": false 00:20:28.242 }, 00:20:28.242 "memory_domains": [ 00:20:28.242 { 00:20:28.242 "dma_device_id": "system", 00:20:28.242 "dma_device_type": 1 00:20:28.242 }, 00:20:28.242 { 00:20:28.242 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:28.242 "dma_device_type": 2 00:20:28.242 } 00:20:28.242 ], 00:20:28.242 "driver_specific": {} 00:20:28.242 }' 00:20:28.242 05:48:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:28.242 05:48:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:28.242 05:48:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:28.242 05:48:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 
00:20:28.242 05:48:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:28.242 05:48:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:28.242 05:48:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:28.242 05:48:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:28.242 05:48:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:28.242 05:48:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:28.499 05:48:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:28.499 05:48:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:28.500 05:48:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:20:28.757 [2024-07-26 05:48:43.442875] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:20:28.757 [2024-07-26 05:48:43.442902] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:20:28.757 [2024-07-26 05:48:43.442949] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:20:28.757 05:48:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:20:28.757 05:48:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:20:28.757 05:48:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:20:28.757 05:48:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:20:28.757 05:48:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:20:28.757 05:48:43 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 3 00:20:28.757 05:48:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:28.757 05:48:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:20:28.757 05:48:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:20:28.757 05:48:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:28.757 05:48:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:20:28.757 05:48:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:28.757 05:48:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:28.757 05:48:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:28.757 05:48:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:28.757 05:48:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:28.757 05:48:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:29.014 05:48:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:29.014 "name": "Existed_Raid", 00:20:29.014 "uuid": "e91a675d-2251-4a8f-89f2-178f4bc78242", 00:20:29.014 "strip_size_kb": 64, 00:20:29.014 "state": "offline", 00:20:29.014 "raid_level": "raid0", 00:20:29.014 "superblock": true, 00:20:29.014 "num_base_bdevs": 4, 00:20:29.014 "num_base_bdevs_discovered": 3, 00:20:29.014 "num_base_bdevs_operational": 3, 00:20:29.014 "base_bdevs_list": [ 
00:20:29.014 { 00:20:29.014 "name": null, 00:20:29.014 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:29.014 "is_configured": false, 00:20:29.014 "data_offset": 2048, 00:20:29.014 "data_size": 63488 00:20:29.014 }, 00:20:29.014 { 00:20:29.014 "name": "BaseBdev2", 00:20:29.014 "uuid": "d68b575f-e22d-42c6-86b1-b41d7f503649", 00:20:29.014 "is_configured": true, 00:20:29.014 "data_offset": 2048, 00:20:29.014 "data_size": 63488 00:20:29.014 }, 00:20:29.014 { 00:20:29.014 "name": "BaseBdev3", 00:20:29.014 "uuid": "28ba8dfd-ef84-4ec2-944f-f994f51a294a", 00:20:29.014 "is_configured": true, 00:20:29.014 "data_offset": 2048, 00:20:29.014 "data_size": 63488 00:20:29.014 }, 00:20:29.014 { 00:20:29.014 "name": "BaseBdev4", 00:20:29.014 "uuid": "82c1f86b-dde0-4504-98c9-a09320db2501", 00:20:29.014 "is_configured": true, 00:20:29.014 "data_offset": 2048, 00:20:29.014 "data_size": 63488 00:20:29.014 } 00:20:29.014 ] 00:20:29.014 }' 00:20:29.014 05:48:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:29.014 05:48:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:29.578 05:48:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:20:29.578 05:48:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:20:29.578 05:48:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:29.578 05:48:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:20:29.836 05:48:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:20:29.836 05:48:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:20:29.836 05:48:44 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:20:30.093 [2024-07-26 05:48:44.751429] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:20:30.093 05:48:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:20:30.093 05:48:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:20:30.093 05:48:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:30.093 05:48:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:20:30.350 05:48:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:20:30.350 05:48:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:20:30.350 05:48:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:20:30.608 [2024-07-26 05:48:45.507959] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:20:30.865 05:48:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:20:30.865 05:48:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:20:30.865 05:48:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:30.865 05:48:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:20:31.125 05:48:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:20:31.125 
05:48:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:20:31.125 05:48:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:20:31.125 [2024-07-26 05:48:46.029710] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:20:31.125 [2024-07-26 05:48:46.029754] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1fcc350 name Existed_Raid, state offline 00:20:31.443 05:48:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:20:31.443 05:48:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:20:31.443 05:48:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:31.443 05:48:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:20:31.443 05:48:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:20:31.443 05:48:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:20:31.443 05:48:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:20:31.443 05:48:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:20:31.443 05:48:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:20:31.443 05:48:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:20:31.983 BaseBdev2 00:20:31.983 05:48:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev 
BaseBdev2 00:20:31.983 05:48:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:20:31.983 05:48:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:20:31.984 05:48:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:20:31.984 05:48:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:31.984 05:48:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:31.984 05:48:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:32.243 05:48:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:20:32.809 [ 00:20:32.809 { 00:20:32.809 "name": "BaseBdev2", 00:20:32.809 "aliases": [ 00:20:32.809 "2251c7f6-3ac5-4d0b-a627-0ce7fc7f6130" 00:20:32.809 ], 00:20:32.809 "product_name": "Malloc disk", 00:20:32.809 "block_size": 512, 00:20:32.809 "num_blocks": 65536, 00:20:32.809 "uuid": "2251c7f6-3ac5-4d0b-a627-0ce7fc7f6130", 00:20:32.809 "assigned_rate_limits": { 00:20:32.809 "rw_ios_per_sec": 0, 00:20:32.809 "rw_mbytes_per_sec": 0, 00:20:32.809 "r_mbytes_per_sec": 0, 00:20:32.809 "w_mbytes_per_sec": 0 00:20:32.809 }, 00:20:32.809 "claimed": false, 00:20:32.809 "zoned": false, 00:20:32.809 "supported_io_types": { 00:20:32.809 "read": true, 00:20:32.809 "write": true, 00:20:32.809 "unmap": true, 00:20:32.809 "flush": true, 00:20:32.809 "reset": true, 00:20:32.809 "nvme_admin": false, 00:20:32.809 "nvme_io": false, 00:20:32.809 "nvme_io_md": false, 00:20:32.809 "write_zeroes": true, 00:20:32.809 "zcopy": true, 00:20:32.809 "get_zone_info": false, 00:20:32.809 
"zone_management": false, 00:20:32.809 "zone_append": false, 00:20:32.809 "compare": false, 00:20:32.809 "compare_and_write": false, 00:20:32.809 "abort": true, 00:20:32.809 "seek_hole": false, 00:20:32.809 "seek_data": false, 00:20:32.809 "copy": true, 00:20:32.810 "nvme_iov_md": false 00:20:32.810 }, 00:20:32.810 "memory_domains": [ 00:20:32.810 { 00:20:32.810 "dma_device_id": "system", 00:20:32.810 "dma_device_type": 1 00:20:32.810 }, 00:20:32.810 { 00:20:32.810 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:32.810 "dma_device_type": 2 00:20:32.810 } 00:20:32.810 ], 00:20:32.810 "driver_specific": {} 00:20:32.810 } 00:20:32.810 ] 00:20:32.810 05:48:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:20:32.810 05:48:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:20:32.810 05:48:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:20:32.810 05:48:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:20:33.068 BaseBdev3 00:20:33.068 05:48:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:20:33.068 05:48:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:20:33.068 05:48:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:20:33.068 05:48:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:20:33.068 05:48:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:33.068 05:48:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:33.068 05:48:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:33.635 05:48:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:20:33.893 [ 00:20:33.893 { 00:20:33.893 "name": "BaseBdev3", 00:20:33.893 "aliases": [ 00:20:33.893 "9c402a8c-79df-4e5a-ad62-8807b94667c9" 00:20:33.893 ], 00:20:33.893 "product_name": "Malloc disk", 00:20:33.893 "block_size": 512, 00:20:33.893 "num_blocks": 65536, 00:20:33.893 "uuid": "9c402a8c-79df-4e5a-ad62-8807b94667c9", 00:20:33.893 "assigned_rate_limits": { 00:20:33.893 "rw_ios_per_sec": 0, 00:20:33.893 "rw_mbytes_per_sec": 0, 00:20:33.893 "r_mbytes_per_sec": 0, 00:20:33.893 "w_mbytes_per_sec": 0 00:20:33.893 }, 00:20:33.893 "claimed": false, 00:20:33.893 "zoned": false, 00:20:33.893 "supported_io_types": { 00:20:33.893 "read": true, 00:20:33.893 "write": true, 00:20:33.893 "unmap": true, 00:20:33.893 "flush": true, 00:20:33.893 "reset": true, 00:20:33.893 "nvme_admin": false, 00:20:33.893 "nvme_io": false, 00:20:33.893 "nvme_io_md": false, 00:20:33.893 "write_zeroes": true, 00:20:33.893 "zcopy": true, 00:20:33.893 "get_zone_info": false, 00:20:33.893 "zone_management": false, 00:20:33.893 "zone_append": false, 00:20:33.893 "compare": false, 00:20:33.893 "compare_and_write": false, 00:20:33.893 "abort": true, 00:20:33.893 "seek_hole": false, 00:20:33.893 "seek_data": false, 00:20:33.893 "copy": true, 00:20:33.893 "nvme_iov_md": false 00:20:33.893 }, 00:20:33.893 "memory_domains": [ 00:20:33.893 { 00:20:33.893 "dma_device_id": "system", 00:20:33.893 "dma_device_type": 1 00:20:33.893 }, 00:20:33.893 { 00:20:33.893 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:33.893 "dma_device_type": 2 00:20:33.893 } 00:20:33.893 ], 00:20:33.893 "driver_specific": {} 00:20:33.893 } 00:20:33.893 ] 00:20:33.893 05:48:48 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:20:33.893 05:48:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:20:33.893 05:48:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:20:33.893 05:48:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:20:34.461 BaseBdev4 00:20:34.461 05:48:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4 00:20:34.461 05:48:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:20:34.461 05:48:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:20:34.461 05:48:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:20:34.461 05:48:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:34.461 05:48:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:34.461 05:48:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:34.461 05:48:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:20:35.030 [ 00:20:35.030 { 00:20:35.030 "name": "BaseBdev4", 00:20:35.030 "aliases": [ 00:20:35.030 "728b2cbc-fe16-4d30-9ebb-966c4e275069" 00:20:35.030 ], 00:20:35.030 "product_name": "Malloc disk", 00:20:35.030 "block_size": 512, 00:20:35.030 "num_blocks": 65536, 00:20:35.030 "uuid": "728b2cbc-fe16-4d30-9ebb-966c4e275069", 
00:20:35.030 "assigned_rate_limits": { 00:20:35.030 "rw_ios_per_sec": 0, 00:20:35.030 "rw_mbytes_per_sec": 0, 00:20:35.030 "r_mbytes_per_sec": 0, 00:20:35.030 "w_mbytes_per_sec": 0 00:20:35.030 }, 00:20:35.030 "claimed": false, 00:20:35.030 "zoned": false, 00:20:35.030 "supported_io_types": { 00:20:35.030 "read": true, 00:20:35.030 "write": true, 00:20:35.030 "unmap": true, 00:20:35.030 "flush": true, 00:20:35.030 "reset": true, 00:20:35.030 "nvme_admin": false, 00:20:35.030 "nvme_io": false, 00:20:35.030 "nvme_io_md": false, 00:20:35.030 "write_zeroes": true, 00:20:35.030 "zcopy": true, 00:20:35.030 "get_zone_info": false, 00:20:35.030 "zone_management": false, 00:20:35.030 "zone_append": false, 00:20:35.030 "compare": false, 00:20:35.030 "compare_and_write": false, 00:20:35.030 "abort": true, 00:20:35.030 "seek_hole": false, 00:20:35.030 "seek_data": false, 00:20:35.030 "copy": true, 00:20:35.030 "nvme_iov_md": false 00:20:35.030 }, 00:20:35.030 "memory_domains": [ 00:20:35.030 { 00:20:35.030 "dma_device_id": "system", 00:20:35.030 "dma_device_type": 1 00:20:35.030 }, 00:20:35.030 { 00:20:35.030 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:35.030 "dma_device_type": 2 00:20:35.030 } 00:20:35.030 ], 00:20:35.030 "driver_specific": {} 00:20:35.030 } 00:20:35.030 ] 00:20:35.030 05:48:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:20:35.030 05:48:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:20:35.030 05:48:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:20:35.030 05:48:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:20:35.290 [2024-07-26 05:48:50.088676] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with 
name: BaseBdev1 00:20:35.290 [2024-07-26 05:48:50.088718] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:20:35.290 [2024-07-26 05:48:50.088737] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:20:35.290 [2024-07-26 05:48:50.090059] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:20:35.290 [2024-07-26 05:48:50.090100] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:20:35.290 05:48:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:20:35.290 05:48:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:35.290 05:48:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:35.290 05:48:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:20:35.290 05:48:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:35.290 05:48:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:35.290 05:48:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:35.290 05:48:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:35.290 05:48:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:35.290 05:48:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:35.290 05:48:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:35.290 05:48:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 
-- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:35.549 05:48:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:35.549 "name": "Existed_Raid", 00:20:35.549 "uuid": "fa2e354a-763b-4e82-af82-1a6af57051ed", 00:20:35.549 "strip_size_kb": 64, 00:20:35.549 "state": "configuring", 00:20:35.549 "raid_level": "raid0", 00:20:35.549 "superblock": true, 00:20:35.549 "num_base_bdevs": 4, 00:20:35.549 "num_base_bdevs_discovered": 3, 00:20:35.549 "num_base_bdevs_operational": 4, 00:20:35.549 "base_bdevs_list": [ 00:20:35.549 { 00:20:35.549 "name": "BaseBdev1", 00:20:35.549 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:35.549 "is_configured": false, 00:20:35.549 "data_offset": 0, 00:20:35.549 "data_size": 0 00:20:35.549 }, 00:20:35.549 { 00:20:35.549 "name": "BaseBdev2", 00:20:35.549 "uuid": "2251c7f6-3ac5-4d0b-a627-0ce7fc7f6130", 00:20:35.549 "is_configured": true, 00:20:35.549 "data_offset": 2048, 00:20:35.549 "data_size": 63488 00:20:35.549 }, 00:20:35.549 { 00:20:35.549 "name": "BaseBdev3", 00:20:35.549 "uuid": "9c402a8c-79df-4e5a-ad62-8807b94667c9", 00:20:35.549 "is_configured": true, 00:20:35.549 "data_offset": 2048, 00:20:35.549 "data_size": 63488 00:20:35.549 }, 00:20:35.549 { 00:20:35.549 "name": "BaseBdev4", 00:20:35.549 "uuid": "728b2cbc-fe16-4d30-9ebb-966c4e275069", 00:20:35.549 "is_configured": true, 00:20:35.549 "data_offset": 2048, 00:20:35.549 "data_size": 63488 00:20:35.549 } 00:20:35.549 ] 00:20:35.549 }' 00:20:35.549 05:48:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:35.549 05:48:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:36.117 05:48:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:20:36.376 [2024-07-26 05:48:51.115348] 
bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:20:36.376 05:48:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:20:36.376 05:48:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:36.376 05:48:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:36.376 05:48:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:20:36.376 05:48:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:36.376 05:48:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:36.376 05:48:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:36.376 05:48:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:36.376 05:48:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:36.376 05:48:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:36.376 05:48:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:36.376 05:48:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:36.635 05:48:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:36.635 "name": "Existed_Raid", 00:20:36.635 "uuid": "fa2e354a-763b-4e82-af82-1a6af57051ed", 00:20:36.635 "strip_size_kb": 64, 00:20:36.635 "state": "configuring", 00:20:36.635 "raid_level": "raid0", 00:20:36.635 "superblock": true, 00:20:36.635 "num_base_bdevs": 4, 00:20:36.635 
"num_base_bdevs_discovered": 2, 00:20:36.635 "num_base_bdevs_operational": 4, 00:20:36.635 "base_bdevs_list": [ 00:20:36.635 { 00:20:36.635 "name": "BaseBdev1", 00:20:36.635 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:36.635 "is_configured": false, 00:20:36.635 "data_offset": 0, 00:20:36.635 "data_size": 0 00:20:36.635 }, 00:20:36.635 { 00:20:36.635 "name": null, 00:20:36.635 "uuid": "2251c7f6-3ac5-4d0b-a627-0ce7fc7f6130", 00:20:36.635 "is_configured": false, 00:20:36.635 "data_offset": 2048, 00:20:36.635 "data_size": 63488 00:20:36.635 }, 00:20:36.635 { 00:20:36.635 "name": "BaseBdev3", 00:20:36.635 "uuid": "9c402a8c-79df-4e5a-ad62-8807b94667c9", 00:20:36.635 "is_configured": true, 00:20:36.635 "data_offset": 2048, 00:20:36.635 "data_size": 63488 00:20:36.635 }, 00:20:36.635 { 00:20:36.635 "name": "BaseBdev4", 00:20:36.635 "uuid": "728b2cbc-fe16-4d30-9ebb-966c4e275069", 00:20:36.635 "is_configured": true, 00:20:36.635 "data_offset": 2048, 00:20:36.635 "data_size": 63488 00:20:36.635 } 00:20:36.635 ] 00:20:36.635 }' 00:20:36.635 05:48:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:36.635 05:48:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:37.202 05:48:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:37.202 05:48:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:20:37.202 05:48:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:20:37.202 05:48:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:20:37.461 [2024-07-26 05:48:52.309890] 
bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:20:37.461 BaseBdev1 00:20:37.461 05:48:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:20:37.461 05:48:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:20:37.461 05:48:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:20:37.461 05:48:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:20:37.461 05:48:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:37.461 05:48:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:37.461 05:48:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:37.719 05:48:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:20:37.978 [ 00:20:37.978 { 00:20:37.978 "name": "BaseBdev1", 00:20:37.978 "aliases": [ 00:20:37.978 "883a9dcc-5911-4e0c-b935-d4b1da16e1de" 00:20:37.978 ], 00:20:37.978 "product_name": "Malloc disk", 00:20:37.978 "block_size": 512, 00:20:37.978 "num_blocks": 65536, 00:20:37.978 "uuid": "883a9dcc-5911-4e0c-b935-d4b1da16e1de", 00:20:37.978 "assigned_rate_limits": { 00:20:37.978 "rw_ios_per_sec": 0, 00:20:37.978 "rw_mbytes_per_sec": 0, 00:20:37.978 "r_mbytes_per_sec": 0, 00:20:37.978 "w_mbytes_per_sec": 0 00:20:37.978 }, 00:20:37.978 "claimed": true, 00:20:37.978 "claim_type": "exclusive_write", 00:20:37.978 "zoned": false, 00:20:37.978 "supported_io_types": { 00:20:37.978 "read": true, 00:20:37.978 "write": true, 00:20:37.978 "unmap": true, 00:20:37.978 "flush": 
true, 00:20:37.978 "reset": true, 00:20:37.978 "nvme_admin": false, 00:20:37.978 "nvme_io": false, 00:20:37.978 "nvme_io_md": false, 00:20:37.978 "write_zeroes": true, 00:20:37.978 "zcopy": true, 00:20:37.978 "get_zone_info": false, 00:20:37.978 "zone_management": false, 00:20:37.978 "zone_append": false, 00:20:37.978 "compare": false, 00:20:37.978 "compare_and_write": false, 00:20:37.978 "abort": true, 00:20:37.978 "seek_hole": false, 00:20:37.978 "seek_data": false, 00:20:37.978 "copy": true, 00:20:37.978 "nvme_iov_md": false 00:20:37.978 }, 00:20:37.978 "memory_domains": [ 00:20:37.978 { 00:20:37.978 "dma_device_id": "system", 00:20:37.978 "dma_device_type": 1 00:20:37.978 }, 00:20:37.978 { 00:20:37.978 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:37.978 "dma_device_type": 2 00:20:37.978 } 00:20:37.978 ], 00:20:37.978 "driver_specific": {} 00:20:37.978 } 00:20:37.978 ] 00:20:37.978 05:48:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:20:37.978 05:48:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:20:37.978 05:48:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:37.978 05:48:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:37.978 05:48:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:20:37.978 05:48:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:37.978 05:48:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:37.978 05:48:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:37.978 05:48:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:37.978 05:48:52 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:37.978 05:48:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:37.978 05:48:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:37.978 05:48:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:38.236 05:48:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:38.236 "name": "Existed_Raid", 00:20:38.236 "uuid": "fa2e354a-763b-4e82-af82-1a6af57051ed", 00:20:38.236 "strip_size_kb": 64, 00:20:38.236 "state": "configuring", 00:20:38.236 "raid_level": "raid0", 00:20:38.236 "superblock": true, 00:20:38.237 "num_base_bdevs": 4, 00:20:38.237 "num_base_bdevs_discovered": 3, 00:20:38.237 "num_base_bdevs_operational": 4, 00:20:38.237 "base_bdevs_list": [ 00:20:38.237 { 00:20:38.237 "name": "BaseBdev1", 00:20:38.237 "uuid": "883a9dcc-5911-4e0c-b935-d4b1da16e1de", 00:20:38.237 "is_configured": true, 00:20:38.237 "data_offset": 2048, 00:20:38.237 "data_size": 63488 00:20:38.237 }, 00:20:38.237 { 00:20:38.237 "name": null, 00:20:38.237 "uuid": "2251c7f6-3ac5-4d0b-a627-0ce7fc7f6130", 00:20:38.237 "is_configured": false, 00:20:38.237 "data_offset": 2048, 00:20:38.237 "data_size": 63488 00:20:38.237 }, 00:20:38.237 { 00:20:38.237 "name": "BaseBdev3", 00:20:38.237 "uuid": "9c402a8c-79df-4e5a-ad62-8807b94667c9", 00:20:38.237 "is_configured": true, 00:20:38.237 "data_offset": 2048, 00:20:38.237 "data_size": 63488 00:20:38.237 }, 00:20:38.237 { 00:20:38.237 "name": "BaseBdev4", 00:20:38.237 "uuid": "728b2cbc-fe16-4d30-9ebb-966c4e275069", 00:20:38.237 "is_configured": true, 00:20:38.237 "data_offset": 2048, 00:20:38.237 "data_size": 63488 00:20:38.237 } 00:20:38.237 ] 00:20:38.237 }' 00:20:38.237 
05:48:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:38.237 05:48:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:38.804 05:48:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:38.804 05:48:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:20:39.062 05:48:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:20:39.062 05:48:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:20:39.630 [2024-07-26 05:48:54.395452] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:20:39.630 05:48:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:20:39.630 05:48:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:39.630 05:48:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:39.630 05:48:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:20:39.630 05:48:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:39.630 05:48:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:39.630 05:48:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:39.631 05:48:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:39.631 05:48:54 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:39.631 05:48:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:39.631 05:48:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:39.631 05:48:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:39.890 05:48:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:39.890 "name": "Existed_Raid", 00:20:39.890 "uuid": "fa2e354a-763b-4e82-af82-1a6af57051ed", 00:20:39.890 "strip_size_kb": 64, 00:20:39.890 "state": "configuring", 00:20:39.890 "raid_level": "raid0", 00:20:39.890 "superblock": true, 00:20:39.890 "num_base_bdevs": 4, 00:20:39.890 "num_base_bdevs_discovered": 2, 00:20:39.890 "num_base_bdevs_operational": 4, 00:20:39.890 "base_bdevs_list": [ 00:20:39.890 { 00:20:39.890 "name": "BaseBdev1", 00:20:39.890 "uuid": "883a9dcc-5911-4e0c-b935-d4b1da16e1de", 00:20:39.890 "is_configured": true, 00:20:39.890 "data_offset": 2048, 00:20:39.890 "data_size": 63488 00:20:39.890 }, 00:20:39.890 { 00:20:39.890 "name": null, 00:20:39.890 "uuid": "2251c7f6-3ac5-4d0b-a627-0ce7fc7f6130", 00:20:39.890 "is_configured": false, 00:20:39.890 "data_offset": 2048, 00:20:39.890 "data_size": 63488 00:20:39.890 }, 00:20:39.890 { 00:20:39.890 "name": null, 00:20:39.890 "uuid": "9c402a8c-79df-4e5a-ad62-8807b94667c9", 00:20:39.890 "is_configured": false, 00:20:39.890 "data_offset": 2048, 00:20:39.890 "data_size": 63488 00:20:39.890 }, 00:20:39.890 { 00:20:39.890 "name": "BaseBdev4", 00:20:39.890 "uuid": "728b2cbc-fe16-4d30-9ebb-966c4e275069", 00:20:39.890 "is_configured": true, 00:20:39.890 "data_offset": 2048, 00:20:39.890 "data_size": 63488 00:20:39.890 } 00:20:39.890 ] 00:20:39.890 }' 00:20:39.890 05:48:54 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:39.890 05:48:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:40.458 05:48:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:40.458 05:48:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:20:40.717 05:48:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:20:40.717 05:48:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:20:40.975 [2024-07-26 05:48:55.718968] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:20:40.975 05:48:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:20:40.975 05:48:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:40.975 05:48:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:40.975 05:48:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:20:40.975 05:48:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:40.976 05:48:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:40.976 05:48:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:40.976 05:48:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:40.976 05:48:55 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:40.976 05:48:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:40.976 05:48:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:40.976 05:48:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:41.235 05:48:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:41.235 "name": "Existed_Raid", 00:20:41.235 "uuid": "fa2e354a-763b-4e82-af82-1a6af57051ed", 00:20:41.235 "strip_size_kb": 64, 00:20:41.235 "state": "configuring", 00:20:41.235 "raid_level": "raid0", 00:20:41.235 "superblock": true, 00:20:41.235 "num_base_bdevs": 4, 00:20:41.235 "num_base_bdevs_discovered": 3, 00:20:41.235 "num_base_bdevs_operational": 4, 00:20:41.235 "base_bdevs_list": [ 00:20:41.235 { 00:20:41.235 "name": "BaseBdev1", 00:20:41.235 "uuid": "883a9dcc-5911-4e0c-b935-d4b1da16e1de", 00:20:41.235 "is_configured": true, 00:20:41.235 "data_offset": 2048, 00:20:41.235 "data_size": 63488 00:20:41.235 }, 00:20:41.235 { 00:20:41.235 "name": null, 00:20:41.235 "uuid": "2251c7f6-3ac5-4d0b-a627-0ce7fc7f6130", 00:20:41.235 "is_configured": false, 00:20:41.235 "data_offset": 2048, 00:20:41.235 "data_size": 63488 00:20:41.235 }, 00:20:41.235 { 00:20:41.235 "name": "BaseBdev3", 00:20:41.235 "uuid": "9c402a8c-79df-4e5a-ad62-8807b94667c9", 00:20:41.235 "is_configured": true, 00:20:41.235 "data_offset": 2048, 00:20:41.235 "data_size": 63488 00:20:41.235 }, 00:20:41.235 { 00:20:41.235 "name": "BaseBdev4", 00:20:41.235 "uuid": "728b2cbc-fe16-4d30-9ebb-966c4e275069", 00:20:41.235 "is_configured": true, 00:20:41.235 "data_offset": 2048, 00:20:41.235 "data_size": 63488 00:20:41.235 } 00:20:41.235 ] 00:20:41.235 }' 00:20:41.235 05:48:55 bdev_raid.raid_state_function_test_sb 
-- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:41.235 05:48:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:41.802 05:48:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:41.802 05:48:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:20:42.061 05:48:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:20:42.061 05:48:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:20:42.629 [2024-07-26 05:48:57.315206] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:20:42.629 05:48:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:20:42.629 05:48:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:42.629 05:48:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:42.629 05:48:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:20:42.629 05:48:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:42.629 05:48:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:42.629 05:48:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:42.629 05:48:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:42.629 05:48:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local 
num_base_bdevs_discovered 00:20:42.629 05:48:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:42.629 05:48:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:42.629 05:48:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:42.888 05:48:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:42.888 "name": "Existed_Raid", 00:20:42.888 "uuid": "fa2e354a-763b-4e82-af82-1a6af57051ed", 00:20:42.888 "strip_size_kb": 64, 00:20:42.888 "state": "configuring", 00:20:42.888 "raid_level": "raid0", 00:20:42.888 "superblock": true, 00:20:42.888 "num_base_bdevs": 4, 00:20:42.888 "num_base_bdevs_discovered": 2, 00:20:42.888 "num_base_bdevs_operational": 4, 00:20:42.888 "base_bdevs_list": [ 00:20:42.888 { 00:20:42.888 "name": null, 00:20:42.888 "uuid": "883a9dcc-5911-4e0c-b935-d4b1da16e1de", 00:20:42.888 "is_configured": false, 00:20:42.888 "data_offset": 2048, 00:20:42.888 "data_size": 63488 00:20:42.888 }, 00:20:42.888 { 00:20:42.888 "name": null, 00:20:42.888 "uuid": "2251c7f6-3ac5-4d0b-a627-0ce7fc7f6130", 00:20:42.888 "is_configured": false, 00:20:42.888 "data_offset": 2048, 00:20:42.888 "data_size": 63488 00:20:42.888 }, 00:20:42.888 { 00:20:42.888 "name": "BaseBdev3", 00:20:42.888 "uuid": "9c402a8c-79df-4e5a-ad62-8807b94667c9", 00:20:42.888 "is_configured": true, 00:20:42.888 "data_offset": 2048, 00:20:42.888 "data_size": 63488 00:20:42.888 }, 00:20:42.888 { 00:20:42.888 "name": "BaseBdev4", 00:20:42.888 "uuid": "728b2cbc-fe16-4d30-9ebb-966c4e275069", 00:20:42.888 "is_configured": true, 00:20:42.888 "data_offset": 2048, 00:20:42.888 "data_size": 63488 00:20:42.888 } 00:20:42.888 ] 00:20:42.888 }' 00:20:42.888 05:48:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # 
xtrace_disable 00:20:42.888 05:48:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:43.456 05:48:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:43.456 05:48:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:20:43.715 05:48:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:20:43.715 05:48:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:20:43.974 [2024-07-26 05:48:58.670066] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:20:43.974 05:48:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:20:43.974 05:48:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:43.974 05:48:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:43.974 05:48:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:20:43.974 05:48:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:43.974 05:48:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:43.974 05:48:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:43.974 05:48:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:43.974 05:48:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local 
num_base_bdevs_discovered 00:20:43.974 05:48:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:43.974 05:48:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:43.974 05:48:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:44.234 05:48:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:44.234 "name": "Existed_Raid", 00:20:44.234 "uuid": "fa2e354a-763b-4e82-af82-1a6af57051ed", 00:20:44.234 "strip_size_kb": 64, 00:20:44.234 "state": "configuring", 00:20:44.234 "raid_level": "raid0", 00:20:44.234 "superblock": true, 00:20:44.234 "num_base_bdevs": 4, 00:20:44.234 "num_base_bdevs_discovered": 3, 00:20:44.234 "num_base_bdevs_operational": 4, 00:20:44.234 "base_bdevs_list": [ 00:20:44.234 { 00:20:44.234 "name": null, 00:20:44.234 "uuid": "883a9dcc-5911-4e0c-b935-d4b1da16e1de", 00:20:44.234 "is_configured": false, 00:20:44.234 "data_offset": 2048, 00:20:44.234 "data_size": 63488 00:20:44.234 }, 00:20:44.234 { 00:20:44.234 "name": "BaseBdev2", 00:20:44.234 "uuid": "2251c7f6-3ac5-4d0b-a627-0ce7fc7f6130", 00:20:44.234 "is_configured": true, 00:20:44.234 "data_offset": 2048, 00:20:44.234 "data_size": 63488 00:20:44.234 }, 00:20:44.234 { 00:20:44.234 "name": "BaseBdev3", 00:20:44.234 "uuid": "9c402a8c-79df-4e5a-ad62-8807b94667c9", 00:20:44.234 "is_configured": true, 00:20:44.234 "data_offset": 2048, 00:20:44.234 "data_size": 63488 00:20:44.234 }, 00:20:44.234 { 00:20:44.234 "name": "BaseBdev4", 00:20:44.234 "uuid": "728b2cbc-fe16-4d30-9ebb-966c4e275069", 00:20:44.234 "is_configured": true, 00:20:44.234 "data_offset": 2048, 00:20:44.234 "data_size": 63488 00:20:44.234 } 00:20:44.234 ] 00:20:44.234 }' 00:20:44.234 05:48:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # 
xtrace_disable 00:20:44.234 05:48:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:44.801 05:48:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:44.801 05:48:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:20:45.060 05:48:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:20:45.060 05:48:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:45.060 05:48:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:20:45.318 05:49:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 883a9dcc-5911-4e0c-b935-d4b1da16e1de 00:20:45.577 [2024-07-26 05:49:00.262847] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:20:45.577 [2024-07-26 05:49:00.263007] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1fd2470 00:20:45.577 [2024-07-26 05:49:00.263020] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:20:45.577 [2024-07-26 05:49:00.263195] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1fc2c40 00:20:45.577 [2024-07-26 05:49:00.263309] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1fd2470 00:20:45.577 [2024-07-26 05:49:00.263319] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1fd2470 00:20:45.577 [2024-07-26 05:49:00.263409] bdev_raid.c: 
343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:45.577 NewBaseBdev 00:20:45.577 05:49:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:20:45.577 05:49:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:20:45.577 05:49:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:20:45.577 05:49:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:20:45.577 05:49:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:45.577 05:49:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:45.577 05:49:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:45.835 05:49:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:20:46.093 [ 00:20:46.093 { 00:20:46.093 "name": "NewBaseBdev", 00:20:46.093 "aliases": [ 00:20:46.093 "883a9dcc-5911-4e0c-b935-d4b1da16e1de" 00:20:46.093 ], 00:20:46.093 "product_name": "Malloc disk", 00:20:46.093 "block_size": 512, 00:20:46.093 "num_blocks": 65536, 00:20:46.093 "uuid": "883a9dcc-5911-4e0c-b935-d4b1da16e1de", 00:20:46.093 "assigned_rate_limits": { 00:20:46.093 "rw_ios_per_sec": 0, 00:20:46.093 "rw_mbytes_per_sec": 0, 00:20:46.093 "r_mbytes_per_sec": 0, 00:20:46.093 "w_mbytes_per_sec": 0 00:20:46.093 }, 00:20:46.093 "claimed": true, 00:20:46.093 "claim_type": "exclusive_write", 00:20:46.093 "zoned": false, 00:20:46.093 "supported_io_types": { 00:20:46.093 "read": true, 00:20:46.093 "write": true, 00:20:46.093 "unmap": true, 00:20:46.093 "flush": true, 
00:20:46.093 "reset": true, 00:20:46.093 "nvme_admin": false, 00:20:46.093 "nvme_io": false, 00:20:46.093 "nvme_io_md": false, 00:20:46.093 "write_zeroes": true, 00:20:46.093 "zcopy": true, 00:20:46.093 "get_zone_info": false, 00:20:46.093 "zone_management": false, 00:20:46.093 "zone_append": false, 00:20:46.093 "compare": false, 00:20:46.093 "compare_and_write": false, 00:20:46.093 "abort": true, 00:20:46.093 "seek_hole": false, 00:20:46.093 "seek_data": false, 00:20:46.093 "copy": true, 00:20:46.093 "nvme_iov_md": false 00:20:46.093 }, 00:20:46.093 "memory_domains": [ 00:20:46.093 { 00:20:46.093 "dma_device_id": "system", 00:20:46.093 "dma_device_type": 1 00:20:46.094 }, 00:20:46.094 { 00:20:46.094 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:46.094 "dma_device_type": 2 00:20:46.094 } 00:20:46.094 ], 00:20:46.094 "driver_specific": {} 00:20:46.094 } 00:20:46.094 ] 00:20:46.094 05:49:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:20:46.094 05:49:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid0 64 4 00:20:46.094 05:49:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:46.094 05:49:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:46.094 05:49:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:20:46.094 05:49:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:46.094 05:49:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:46.094 05:49:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:46.094 05:49:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:46.094 05:49:00 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:46.094 05:49:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:46.094 05:49:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:46.094 05:49:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:46.352 05:49:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:46.352 "name": "Existed_Raid", 00:20:46.352 "uuid": "fa2e354a-763b-4e82-af82-1a6af57051ed", 00:20:46.352 "strip_size_kb": 64, 00:20:46.352 "state": "online", 00:20:46.352 "raid_level": "raid0", 00:20:46.352 "superblock": true, 00:20:46.352 "num_base_bdevs": 4, 00:20:46.352 "num_base_bdevs_discovered": 4, 00:20:46.352 "num_base_bdevs_operational": 4, 00:20:46.352 "base_bdevs_list": [ 00:20:46.352 { 00:20:46.352 "name": "NewBaseBdev", 00:20:46.352 "uuid": "883a9dcc-5911-4e0c-b935-d4b1da16e1de", 00:20:46.352 "is_configured": true, 00:20:46.352 "data_offset": 2048, 00:20:46.352 "data_size": 63488 00:20:46.352 }, 00:20:46.352 { 00:20:46.352 "name": "BaseBdev2", 00:20:46.352 "uuid": "2251c7f6-3ac5-4d0b-a627-0ce7fc7f6130", 00:20:46.352 "is_configured": true, 00:20:46.352 "data_offset": 2048, 00:20:46.352 "data_size": 63488 00:20:46.352 }, 00:20:46.352 { 00:20:46.352 "name": "BaseBdev3", 00:20:46.352 "uuid": "9c402a8c-79df-4e5a-ad62-8807b94667c9", 00:20:46.352 "is_configured": true, 00:20:46.352 "data_offset": 2048, 00:20:46.352 "data_size": 63488 00:20:46.352 }, 00:20:46.352 { 00:20:46.352 "name": "BaseBdev4", 00:20:46.352 "uuid": "728b2cbc-fe16-4d30-9ebb-966c4e275069", 00:20:46.352 "is_configured": true, 00:20:46.352 "data_offset": 2048, 00:20:46.352 "data_size": 63488 00:20:46.352 } 00:20:46.352 ] 00:20:46.352 }' 00:20:46.352 
05:49:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:46.352 05:49:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:46.980 05:49:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:20:46.980 05:49:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:20:46.980 05:49:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:20:46.980 05:49:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:20:46.980 05:49:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:20:46.980 05:49:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:20:46.980 05:49:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:20:46.980 05:49:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:20:46.980 [2024-07-26 05:49:01.847370] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:20:46.980 05:49:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:20:46.980 "name": "Existed_Raid", 00:20:46.980 "aliases": [ 00:20:46.980 "fa2e354a-763b-4e82-af82-1a6af57051ed" 00:20:46.980 ], 00:20:46.980 "product_name": "Raid Volume", 00:20:46.980 "block_size": 512, 00:20:46.980 "num_blocks": 253952, 00:20:46.980 "uuid": "fa2e354a-763b-4e82-af82-1a6af57051ed", 00:20:46.980 "assigned_rate_limits": { 00:20:46.980 "rw_ios_per_sec": 0, 00:20:46.980 "rw_mbytes_per_sec": 0, 00:20:46.980 "r_mbytes_per_sec": 0, 00:20:46.980 "w_mbytes_per_sec": 0 00:20:46.980 }, 00:20:46.981 "claimed": false, 00:20:46.981 "zoned": false, 00:20:46.981 
"supported_io_types": { 00:20:46.981 "read": true, 00:20:46.981 "write": true, 00:20:46.981 "unmap": true, 00:20:46.981 "flush": true, 00:20:46.981 "reset": true, 00:20:46.981 "nvme_admin": false, 00:20:46.981 "nvme_io": false, 00:20:46.981 "nvme_io_md": false, 00:20:46.981 "write_zeroes": true, 00:20:46.981 "zcopy": false, 00:20:46.981 "get_zone_info": false, 00:20:46.981 "zone_management": false, 00:20:46.981 "zone_append": false, 00:20:46.981 "compare": false, 00:20:46.981 "compare_and_write": false, 00:20:46.981 "abort": false, 00:20:46.981 "seek_hole": false, 00:20:46.981 "seek_data": false, 00:20:46.981 "copy": false, 00:20:46.981 "nvme_iov_md": false 00:20:46.981 }, 00:20:46.981 "memory_domains": [ 00:20:46.981 { 00:20:46.981 "dma_device_id": "system", 00:20:46.981 "dma_device_type": 1 00:20:46.981 }, 00:20:46.981 { 00:20:46.981 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:46.981 "dma_device_type": 2 00:20:46.981 }, 00:20:46.981 { 00:20:46.981 "dma_device_id": "system", 00:20:46.981 "dma_device_type": 1 00:20:46.981 }, 00:20:46.981 { 00:20:46.981 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:46.981 "dma_device_type": 2 00:20:46.981 }, 00:20:46.982 { 00:20:46.982 "dma_device_id": "system", 00:20:46.982 "dma_device_type": 1 00:20:46.982 }, 00:20:46.982 { 00:20:46.982 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:46.982 "dma_device_type": 2 00:20:46.982 }, 00:20:46.982 { 00:20:46.982 "dma_device_id": "system", 00:20:46.982 "dma_device_type": 1 00:20:46.982 }, 00:20:46.982 { 00:20:46.982 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:46.982 "dma_device_type": 2 00:20:46.982 } 00:20:46.982 ], 00:20:46.982 "driver_specific": { 00:20:46.982 "raid": { 00:20:46.982 "uuid": "fa2e354a-763b-4e82-af82-1a6af57051ed", 00:20:46.982 "strip_size_kb": 64, 00:20:46.982 "state": "online", 00:20:46.982 "raid_level": "raid0", 00:20:46.982 "superblock": true, 00:20:46.982 "num_base_bdevs": 4, 00:20:46.982 "num_base_bdevs_discovered": 4, 00:20:46.982 
"num_base_bdevs_operational": 4, 00:20:46.982 "base_bdevs_list": [ 00:20:46.982 { 00:20:46.982 "name": "NewBaseBdev", 00:20:46.982 "uuid": "883a9dcc-5911-4e0c-b935-d4b1da16e1de", 00:20:46.982 "is_configured": true, 00:20:46.982 "data_offset": 2048, 00:20:46.982 "data_size": 63488 00:20:46.982 }, 00:20:46.982 { 00:20:46.982 "name": "BaseBdev2", 00:20:46.982 "uuid": "2251c7f6-3ac5-4d0b-a627-0ce7fc7f6130", 00:20:46.982 "is_configured": true, 00:20:46.982 "data_offset": 2048, 00:20:46.982 "data_size": 63488 00:20:46.982 }, 00:20:46.982 { 00:20:46.982 "name": "BaseBdev3", 00:20:46.982 "uuid": "9c402a8c-79df-4e5a-ad62-8807b94667c9", 00:20:46.983 "is_configured": true, 00:20:46.983 "data_offset": 2048, 00:20:46.983 "data_size": 63488 00:20:46.983 }, 00:20:46.983 { 00:20:46.983 "name": "BaseBdev4", 00:20:46.983 "uuid": "728b2cbc-fe16-4d30-9ebb-966c4e275069", 00:20:46.983 "is_configured": true, 00:20:46.983 "data_offset": 2048, 00:20:46.983 "data_size": 63488 00:20:46.983 } 00:20:46.983 ] 00:20:46.983 } 00:20:46.983 } 00:20:46.983 }' 00:20:46.983 05:49:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:20:47.247 05:49:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:20:47.247 BaseBdev2 00:20:47.247 BaseBdev3 00:20:47.247 BaseBdev4' 00:20:47.247 05:49:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:47.247 05:49:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:20:47.247 05:49:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:47.247 05:49:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:47.247 "name": "NewBaseBdev", 00:20:47.247 
"aliases": [ 00:20:47.247 "883a9dcc-5911-4e0c-b935-d4b1da16e1de" 00:20:47.247 ], 00:20:47.247 "product_name": "Malloc disk", 00:20:47.247 "block_size": 512, 00:20:47.247 "num_blocks": 65536, 00:20:47.247 "uuid": "883a9dcc-5911-4e0c-b935-d4b1da16e1de", 00:20:47.247 "assigned_rate_limits": { 00:20:47.247 "rw_ios_per_sec": 0, 00:20:47.248 "rw_mbytes_per_sec": 0, 00:20:47.248 "r_mbytes_per_sec": 0, 00:20:47.248 "w_mbytes_per_sec": 0 00:20:47.248 }, 00:20:47.248 "claimed": true, 00:20:47.248 "claim_type": "exclusive_write", 00:20:47.248 "zoned": false, 00:20:47.248 "supported_io_types": { 00:20:47.248 "read": true, 00:20:47.248 "write": true, 00:20:47.248 "unmap": true, 00:20:47.248 "flush": true, 00:20:47.248 "reset": true, 00:20:47.248 "nvme_admin": false, 00:20:47.248 "nvme_io": false, 00:20:47.248 "nvme_io_md": false, 00:20:47.248 "write_zeroes": true, 00:20:47.248 "zcopy": true, 00:20:47.248 "get_zone_info": false, 00:20:47.248 "zone_management": false, 00:20:47.248 "zone_append": false, 00:20:47.248 "compare": false, 00:20:47.248 "compare_and_write": false, 00:20:47.248 "abort": true, 00:20:47.248 "seek_hole": false, 00:20:47.248 "seek_data": false, 00:20:47.248 "copy": true, 00:20:47.248 "nvme_iov_md": false 00:20:47.248 }, 00:20:47.248 "memory_domains": [ 00:20:47.248 { 00:20:47.248 "dma_device_id": "system", 00:20:47.248 "dma_device_type": 1 00:20:47.248 }, 00:20:47.248 { 00:20:47.248 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:47.248 "dma_device_type": 2 00:20:47.248 } 00:20:47.248 ], 00:20:47.248 "driver_specific": {} 00:20:47.248 }' 00:20:47.248 05:49:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:47.248 05:49:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:47.506 05:49:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:47.506 05:49:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 
00:20:47.506 05:49:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:47.506 05:49:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:47.506 05:49:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:47.506 05:49:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:47.506 05:49:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:47.506 05:49:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:47.765 05:49:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:47.765 05:49:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:47.765 05:49:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:47.765 05:49:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:47.765 05:49:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:20:48.024 05:49:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:48.024 "name": "BaseBdev2", 00:20:48.024 "aliases": [ 00:20:48.024 "2251c7f6-3ac5-4d0b-a627-0ce7fc7f6130" 00:20:48.024 ], 00:20:48.024 "product_name": "Malloc disk", 00:20:48.024 "block_size": 512, 00:20:48.024 "num_blocks": 65536, 00:20:48.024 "uuid": "2251c7f6-3ac5-4d0b-a627-0ce7fc7f6130", 00:20:48.024 "assigned_rate_limits": { 00:20:48.024 "rw_ios_per_sec": 0, 00:20:48.024 "rw_mbytes_per_sec": 0, 00:20:48.024 "r_mbytes_per_sec": 0, 00:20:48.024 "w_mbytes_per_sec": 0 00:20:48.024 }, 00:20:48.024 "claimed": true, 00:20:48.024 "claim_type": "exclusive_write", 00:20:48.024 "zoned": false, 00:20:48.024 
"supported_io_types": { 00:20:48.024 "read": true, 00:20:48.024 "write": true, 00:20:48.024 "unmap": true, 00:20:48.024 "flush": true, 00:20:48.024 "reset": true, 00:20:48.024 "nvme_admin": false, 00:20:48.024 "nvme_io": false, 00:20:48.024 "nvme_io_md": false, 00:20:48.024 "write_zeroes": true, 00:20:48.024 "zcopy": true, 00:20:48.024 "get_zone_info": false, 00:20:48.024 "zone_management": false, 00:20:48.024 "zone_append": false, 00:20:48.024 "compare": false, 00:20:48.024 "compare_and_write": false, 00:20:48.024 "abort": true, 00:20:48.024 "seek_hole": false, 00:20:48.024 "seek_data": false, 00:20:48.024 "copy": true, 00:20:48.024 "nvme_iov_md": false 00:20:48.024 }, 00:20:48.024 "memory_domains": [ 00:20:48.024 { 00:20:48.024 "dma_device_id": "system", 00:20:48.024 "dma_device_type": 1 00:20:48.024 }, 00:20:48.024 { 00:20:48.024 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:48.024 "dma_device_type": 2 00:20:48.024 } 00:20:48.024 ], 00:20:48.024 "driver_specific": {} 00:20:48.024 }' 00:20:48.024 05:49:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:48.024 05:49:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:48.024 05:49:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:48.024 05:49:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:48.024 05:49:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:48.024 05:49:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:48.024 05:49:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:48.283 05:49:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:48.283 05:49:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:48.283 05:49:02 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:48.283 05:49:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:48.283 05:49:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:48.283 05:49:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:48.283 05:49:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:48.283 05:49:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:20:48.541 05:49:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:48.541 "name": "BaseBdev3", 00:20:48.541 "aliases": [ 00:20:48.541 "9c402a8c-79df-4e5a-ad62-8807b94667c9" 00:20:48.541 ], 00:20:48.541 "product_name": "Malloc disk", 00:20:48.541 "block_size": 512, 00:20:48.541 "num_blocks": 65536, 00:20:48.541 "uuid": "9c402a8c-79df-4e5a-ad62-8807b94667c9", 00:20:48.541 "assigned_rate_limits": { 00:20:48.541 "rw_ios_per_sec": 0, 00:20:48.541 "rw_mbytes_per_sec": 0, 00:20:48.541 "r_mbytes_per_sec": 0, 00:20:48.541 "w_mbytes_per_sec": 0 00:20:48.541 }, 00:20:48.541 "claimed": true, 00:20:48.541 "claim_type": "exclusive_write", 00:20:48.541 "zoned": false, 00:20:48.541 "supported_io_types": { 00:20:48.541 "read": true, 00:20:48.541 "write": true, 00:20:48.541 "unmap": true, 00:20:48.541 "flush": true, 00:20:48.541 "reset": true, 00:20:48.541 "nvme_admin": false, 00:20:48.541 "nvme_io": false, 00:20:48.541 "nvme_io_md": false, 00:20:48.541 "write_zeroes": true, 00:20:48.541 "zcopy": true, 00:20:48.541 "get_zone_info": false, 00:20:48.541 "zone_management": false, 00:20:48.541 "zone_append": false, 00:20:48.541 "compare": false, 00:20:48.541 "compare_and_write": false, 00:20:48.541 "abort": true, 00:20:48.541 
"seek_hole": false, 00:20:48.541 "seek_data": false, 00:20:48.541 "copy": true, 00:20:48.541 "nvme_iov_md": false 00:20:48.541 }, 00:20:48.541 "memory_domains": [ 00:20:48.541 { 00:20:48.541 "dma_device_id": "system", 00:20:48.542 "dma_device_type": 1 00:20:48.542 }, 00:20:48.542 { 00:20:48.542 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:48.542 "dma_device_type": 2 00:20:48.542 } 00:20:48.542 ], 00:20:48.542 "driver_specific": {} 00:20:48.542 }' 00:20:48.542 05:49:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:48.542 05:49:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:48.542 05:49:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:48.542 05:49:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:48.803 05:49:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:48.803 05:49:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:48.803 05:49:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:48.803 05:49:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:48.803 05:49:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:48.803 05:49:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:48.803 05:49:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:48.803 05:49:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:48.803 05:49:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:48.803 05:49:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:20:48.803 05:49:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:49.062 05:49:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:49.062 "name": "BaseBdev4", 00:20:49.062 "aliases": [ 00:20:49.062 "728b2cbc-fe16-4d30-9ebb-966c4e275069" 00:20:49.062 ], 00:20:49.062 "product_name": "Malloc disk", 00:20:49.062 "block_size": 512, 00:20:49.062 "num_blocks": 65536, 00:20:49.062 "uuid": "728b2cbc-fe16-4d30-9ebb-966c4e275069", 00:20:49.062 "assigned_rate_limits": { 00:20:49.062 "rw_ios_per_sec": 0, 00:20:49.062 "rw_mbytes_per_sec": 0, 00:20:49.062 "r_mbytes_per_sec": 0, 00:20:49.062 "w_mbytes_per_sec": 0 00:20:49.062 }, 00:20:49.062 "claimed": true, 00:20:49.062 "claim_type": "exclusive_write", 00:20:49.062 "zoned": false, 00:20:49.062 "supported_io_types": { 00:20:49.062 "read": true, 00:20:49.062 "write": true, 00:20:49.062 "unmap": true, 00:20:49.062 "flush": true, 00:20:49.062 "reset": true, 00:20:49.062 "nvme_admin": false, 00:20:49.062 "nvme_io": false, 00:20:49.062 "nvme_io_md": false, 00:20:49.062 "write_zeroes": true, 00:20:49.062 "zcopy": true, 00:20:49.062 "get_zone_info": false, 00:20:49.062 "zone_management": false, 00:20:49.062 "zone_append": false, 00:20:49.062 "compare": false, 00:20:49.062 "compare_and_write": false, 00:20:49.062 "abort": true, 00:20:49.062 "seek_hole": false, 00:20:49.062 "seek_data": false, 00:20:49.062 "copy": true, 00:20:49.062 "nvme_iov_md": false 00:20:49.062 }, 00:20:49.062 "memory_domains": [ 00:20:49.062 { 00:20:49.062 "dma_device_id": "system", 00:20:49.062 "dma_device_type": 1 00:20:49.062 }, 00:20:49.062 { 00:20:49.062 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:49.062 "dma_device_type": 2 00:20:49.062 } 00:20:49.062 ], 00:20:49.062 "driver_specific": {} 00:20:49.062 }' 00:20:49.062 05:49:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:49.321 
05:49:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:49.321 05:49:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:49.321 05:49:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:49.321 05:49:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:49.321 05:49:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:49.321 05:49:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:49.321 05:49:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:49.321 05:49:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:49.321 05:49:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:49.580 05:49:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:49.580 05:49:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:49.580 05:49:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:20:49.839 [2024-07-26 05:49:04.498075] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:20:49.839 [2024-07-26 05:49:04.498104] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:20:49.839 [2024-07-26 05:49:04.498160] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:20:49.839 [2024-07-26 05:49:04.498224] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:20:49.839 [2024-07-26 05:49:04.498235] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 
0x1fd2470 name Existed_Raid, state offline 00:20:49.839 05:49:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 1193284 00:20:49.839 05:49:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 1193284 ']' 00:20:49.839 05:49:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # kill -0 1193284 00:20:49.839 05:49:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:20:49.839 05:49:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:20:49.839 05:49:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1193284 00:20:49.839 05:49:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:20:49.839 05:49:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:20:49.839 05:49:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1193284' 00:20:49.839 killing process with pid 1193284 00:20:49.839 05:49:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 1193284 00:20:49.839 [2024-07-26 05:49:04.569419] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:20:49.839 05:49:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 1193284 00:20:49.839 [2024-07-26 05:49:04.612008] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:20:50.098 05:49:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:20:50.098 00:20:50.098 real 0m34.009s 00:20:50.098 user 1m2.351s 00:20:50.098 sys 0m6.049s 00:20:50.098 05:49:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:20:50.098 05:49:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 
-- # set +x 00:20:50.098 ************************************ 00:20:50.098 END TEST raid_state_function_test_sb 00:20:50.098 ************************************ 00:20:50.098 05:49:04 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:20:50.098 05:49:04 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test raid0 4 00:20:50.098 05:49:04 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:20:50.099 05:49:04 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:20:50.099 05:49:04 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:20:50.099 ************************************ 00:20:50.099 START TEST raid_superblock_test 00:20:50.099 ************************************ 00:20:50.099 05:49:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test raid0 4 00:20:50.099 05:49:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=raid0 00:20:50.099 05:49:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=4 00:20:50.099 05:49:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:20:50.099 05:49:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:20:50.099 05:49:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:20:50.099 05:49:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:20:50.099 05:49:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:20:50.099 05:49:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:20:50.099 05:49:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:20:50.099 05:49:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:20:50.099 05:49:04 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:20:50.099 05:49:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:20:50.099 05:49:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:20:50.099 05:49:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' raid0 '!=' raid1 ']' 00:20:50.099 05:49:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # strip_size=64 00:20:50.099 05:49:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size_create_arg='-z 64' 00:20:50.099 05:49:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=1198492 00:20:50.099 05:49:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 1198492 /var/tmp/spdk-raid.sock 00:20:50.099 05:49:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:20:50.099 05:49:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 1198492 ']' 00:20:50.099 05:49:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:20:50.099 05:49:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:20:50.099 05:49:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:20:50.099 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:20:50.099 05:49:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:20:50.099 05:49:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:20:50.099 [2024-07-26 05:49:04.971161] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 
00:20:50.099 [2024-07-26 05:49:04.971230] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1198492 ] 00:20:50.357 [2024-07-26 05:49:05.090705] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:20:50.357 [2024-07-26 05:49:05.192438] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:20:50.357 [2024-07-26 05:49:05.253294] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:50.357 [2024-07-26 05:49:05.253342] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:51.293 05:49:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:20:51.293 05:49:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:20:51.293 05:49:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:20:51.293 05:49:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:20:51.293 05:49:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:20:51.293 05:49:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:20:51.293 05:49:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:20:51.293 05:49:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:20:51.293 05:49:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:20:51.293 05:49:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:20:51.293 05:49:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:20:51.293 malloc1 00:20:51.293 05:49:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:20:51.552 [2024-07-26 05:49:06.389235] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:20:51.552 [2024-07-26 05:49:06.389285] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:51.552 [2024-07-26 05:49:06.389306] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xc55570 00:20:51.552 [2024-07-26 05:49:06.389319] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:51.552 [2024-07-26 05:49:06.390909] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:51.552 [2024-07-26 05:49:06.390938] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:20:51.552 pt1 00:20:51.552 05:49:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:20:51.552 05:49:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:20:51.552 05:49:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:20:51.552 05:49:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:20:51.552 05:49:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:20:51.552 05:49:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:20:51.552 05:49:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:20:51.552 05:49:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:20:51.552 05:49:06 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:20:51.810 malloc2 00:20:51.810 05:49:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:20:52.069 [2024-07-26 05:49:06.887303] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:20:52.069 [2024-07-26 05:49:06.887352] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:52.069 [2024-07-26 05:49:06.887370] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xc56970 00:20:52.069 [2024-07-26 05:49:06.887382] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:52.069 [2024-07-26 05:49:06.888886] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:52.069 [2024-07-26 05:49:06.888914] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:20:52.069 pt2 00:20:52.070 05:49:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:20:52.070 05:49:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:20:52.070 05:49:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc3 00:20:52.070 05:49:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt3 00:20:52.070 05:49:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:20:52.070 05:49:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:20:52.070 05:49:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:20:52.070 05:49:06 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:20:52.070 05:49:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:20:52.329 malloc3 00:20:52.329 05:49:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:20:52.588 [2024-07-26 05:49:07.381228] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:20:52.588 [2024-07-26 05:49:07.381278] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:52.588 [2024-07-26 05:49:07.381296] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xded340 00:20:52.588 [2024-07-26 05:49:07.381308] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:52.588 [2024-07-26 05:49:07.382837] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:52.588 [2024-07-26 05:49:07.382866] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:20:52.588 pt3 00:20:52.588 05:49:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:20:52.588 05:49:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:20:52.588 05:49:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc4 00:20:52.588 05:49:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt4 00:20:52.588 05:49:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000004 00:20:52.588 05:49:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:20:52.588 
05:49:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:20:52.588 05:49:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:20:52.588 05:49:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc4 00:20:52.847 malloc4 00:20:52.847 05:49:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:20:53.106 [2024-07-26 05:49:07.879150] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:20:53.106 [2024-07-26 05:49:07.879195] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:53.106 [2024-07-26 05:49:07.879213] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xdefc60 00:20:53.106 [2024-07-26 05:49:07.879226] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:53.106 [2024-07-26 05:49:07.880586] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:53.106 [2024-07-26 05:49:07.880613] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:20:53.106 pt4 00:20:53.106 05:49:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:20:53.106 05:49:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:20:53.106 05:49:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'pt1 pt2 pt3 pt4' -n raid_bdev1 -s 00:20:53.365 [2024-07-26 05:49:08.123818] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 
00:20:53.365 [2024-07-26 05:49:08.124967] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:20:53.365 [2024-07-26 05:49:08.125021] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:20:53.365 [2024-07-26 05:49:08.125064] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:20:53.365 [2024-07-26 05:49:08.125227] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0xc4d530 00:20:53.365 [2024-07-26 05:49:08.125238] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:20:53.365 [2024-07-26 05:49:08.125411] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xc4b770 00:20:53.365 [2024-07-26 05:49:08.125549] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xc4d530 00:20:53.365 [2024-07-26 05:49:08.125559] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xc4d530 00:20:53.365 [2024-07-26 05:49:08.125654] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:53.365 05:49:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:20:53.365 05:49:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:53.365 05:49:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:53.365 05:49:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:20:53.365 05:49:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:53.365 05:49:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:53.365 05:49:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:53.365 05:49:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local 
num_base_bdevs 00:20:53.365 05:49:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:53.365 05:49:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:53.365 05:49:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:53.365 05:49:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:53.624 05:49:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:53.624 "name": "raid_bdev1", 00:20:53.624 "uuid": "b441125c-3e58-462b-b537-1a2e29abf9b1", 00:20:53.624 "strip_size_kb": 64, 00:20:53.624 "state": "online", 00:20:53.624 "raid_level": "raid0", 00:20:53.624 "superblock": true, 00:20:53.624 "num_base_bdevs": 4, 00:20:53.624 "num_base_bdevs_discovered": 4, 00:20:53.624 "num_base_bdevs_operational": 4, 00:20:53.624 "base_bdevs_list": [ 00:20:53.624 { 00:20:53.624 "name": "pt1", 00:20:53.624 "uuid": "00000000-0000-0000-0000-000000000001", 00:20:53.624 "is_configured": true, 00:20:53.624 "data_offset": 2048, 00:20:53.624 "data_size": 63488 00:20:53.624 }, 00:20:53.624 { 00:20:53.624 "name": "pt2", 00:20:53.624 "uuid": "00000000-0000-0000-0000-000000000002", 00:20:53.624 "is_configured": true, 00:20:53.624 "data_offset": 2048, 00:20:53.624 "data_size": 63488 00:20:53.624 }, 00:20:53.624 { 00:20:53.624 "name": "pt3", 00:20:53.624 "uuid": "00000000-0000-0000-0000-000000000003", 00:20:53.624 "is_configured": true, 00:20:53.624 "data_offset": 2048, 00:20:53.624 "data_size": 63488 00:20:53.624 }, 00:20:53.624 { 00:20:53.624 "name": "pt4", 00:20:53.624 "uuid": "00000000-0000-0000-0000-000000000004", 00:20:53.624 "is_configured": true, 00:20:53.624 "data_offset": 2048, 00:20:53.624 "data_size": 63488 00:20:53.624 } 00:20:53.624 ] 00:20:53.624 }' 00:20:53.624 05:49:08 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:53.624 05:49:08 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:20:54.191 05:49:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:20:54.191 05:49:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:20:54.191 05:49:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:20:54.191 05:49:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:20:54.191 05:49:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:20:54.191 05:49:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:20:54.191 05:49:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:20:54.191 05:49:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:20:54.451 [2024-07-26 05:49:09.150832] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:20:54.451 05:49:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:20:54.451 "name": "raid_bdev1", 00:20:54.451 "aliases": [ 00:20:54.451 "b441125c-3e58-462b-b537-1a2e29abf9b1" 00:20:54.451 ], 00:20:54.451 "product_name": "Raid Volume", 00:20:54.451 "block_size": 512, 00:20:54.451 "num_blocks": 253952, 00:20:54.451 "uuid": "b441125c-3e58-462b-b537-1a2e29abf9b1", 00:20:54.451 "assigned_rate_limits": { 00:20:54.451 "rw_ios_per_sec": 0, 00:20:54.451 "rw_mbytes_per_sec": 0, 00:20:54.451 "r_mbytes_per_sec": 0, 00:20:54.451 "w_mbytes_per_sec": 0 00:20:54.451 }, 00:20:54.451 "claimed": false, 00:20:54.451 "zoned": false, 00:20:54.451 "supported_io_types": { 00:20:54.451 "read": true, 00:20:54.451 "write": true, 00:20:54.451 
"unmap": true, 00:20:54.451 "flush": true, 00:20:54.451 "reset": true, 00:20:54.451 "nvme_admin": false, 00:20:54.451 "nvme_io": false, 00:20:54.451 "nvme_io_md": false, 00:20:54.451 "write_zeroes": true, 00:20:54.451 "zcopy": false, 00:20:54.451 "get_zone_info": false, 00:20:54.451 "zone_management": false, 00:20:54.451 "zone_append": false, 00:20:54.451 "compare": false, 00:20:54.451 "compare_and_write": false, 00:20:54.451 "abort": false, 00:20:54.451 "seek_hole": false, 00:20:54.451 "seek_data": false, 00:20:54.451 "copy": false, 00:20:54.451 "nvme_iov_md": false 00:20:54.451 }, 00:20:54.451 "memory_domains": [ 00:20:54.451 { 00:20:54.451 "dma_device_id": "system", 00:20:54.451 "dma_device_type": 1 00:20:54.451 }, 00:20:54.451 { 00:20:54.451 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:54.451 "dma_device_type": 2 00:20:54.451 }, 00:20:54.451 { 00:20:54.451 "dma_device_id": "system", 00:20:54.451 "dma_device_type": 1 00:20:54.451 }, 00:20:54.451 { 00:20:54.451 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:54.451 "dma_device_type": 2 00:20:54.451 }, 00:20:54.451 { 00:20:54.451 "dma_device_id": "system", 00:20:54.451 "dma_device_type": 1 00:20:54.451 }, 00:20:54.451 { 00:20:54.451 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:54.451 "dma_device_type": 2 00:20:54.451 }, 00:20:54.451 { 00:20:54.451 "dma_device_id": "system", 00:20:54.451 "dma_device_type": 1 00:20:54.451 }, 00:20:54.451 { 00:20:54.451 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:54.451 "dma_device_type": 2 00:20:54.451 } 00:20:54.451 ], 00:20:54.451 "driver_specific": { 00:20:54.451 "raid": { 00:20:54.451 "uuid": "b441125c-3e58-462b-b537-1a2e29abf9b1", 00:20:54.451 "strip_size_kb": 64, 00:20:54.451 "state": "online", 00:20:54.451 "raid_level": "raid0", 00:20:54.451 "superblock": true, 00:20:54.451 "num_base_bdevs": 4, 00:20:54.451 "num_base_bdevs_discovered": 4, 00:20:54.451 "num_base_bdevs_operational": 4, 00:20:54.451 "base_bdevs_list": [ 00:20:54.451 { 00:20:54.451 "name": "pt1", 
00:20:54.451 "uuid": "00000000-0000-0000-0000-000000000001", 00:20:54.451 "is_configured": true, 00:20:54.451 "data_offset": 2048, 00:20:54.451 "data_size": 63488 00:20:54.451 }, 00:20:54.451 { 00:20:54.451 "name": "pt2", 00:20:54.451 "uuid": "00000000-0000-0000-0000-000000000002", 00:20:54.451 "is_configured": true, 00:20:54.451 "data_offset": 2048, 00:20:54.451 "data_size": 63488 00:20:54.451 }, 00:20:54.451 { 00:20:54.451 "name": "pt3", 00:20:54.451 "uuid": "00000000-0000-0000-0000-000000000003", 00:20:54.451 "is_configured": true, 00:20:54.451 "data_offset": 2048, 00:20:54.451 "data_size": 63488 00:20:54.451 }, 00:20:54.451 { 00:20:54.451 "name": "pt4", 00:20:54.451 "uuid": "00000000-0000-0000-0000-000000000004", 00:20:54.451 "is_configured": true, 00:20:54.451 "data_offset": 2048, 00:20:54.451 "data_size": 63488 00:20:54.451 } 00:20:54.451 ] 00:20:54.451 } 00:20:54.451 } 00:20:54.451 }' 00:20:54.451 05:49:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:20:54.451 05:49:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:20:54.451 pt2 00:20:54.451 pt3 00:20:54.451 pt4' 00:20:54.451 05:49:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:54.451 05:49:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:20:54.451 05:49:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:54.710 05:49:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:54.710 "name": "pt1", 00:20:54.710 "aliases": [ 00:20:54.710 "00000000-0000-0000-0000-000000000001" 00:20:54.710 ], 00:20:54.710 "product_name": "passthru", 00:20:54.710 "block_size": 512, 00:20:54.710 "num_blocks": 65536, 00:20:54.711 "uuid": 
"00000000-0000-0000-0000-000000000001", 00:20:54.711 "assigned_rate_limits": { 00:20:54.711 "rw_ios_per_sec": 0, 00:20:54.711 "rw_mbytes_per_sec": 0, 00:20:54.711 "r_mbytes_per_sec": 0, 00:20:54.711 "w_mbytes_per_sec": 0 00:20:54.711 }, 00:20:54.711 "claimed": true, 00:20:54.711 "claim_type": "exclusive_write", 00:20:54.711 "zoned": false, 00:20:54.711 "supported_io_types": { 00:20:54.711 "read": true, 00:20:54.711 "write": true, 00:20:54.711 "unmap": true, 00:20:54.711 "flush": true, 00:20:54.711 "reset": true, 00:20:54.711 "nvme_admin": false, 00:20:54.711 "nvme_io": false, 00:20:54.711 "nvme_io_md": false, 00:20:54.711 "write_zeroes": true, 00:20:54.711 "zcopy": true, 00:20:54.711 "get_zone_info": false, 00:20:54.711 "zone_management": false, 00:20:54.711 "zone_append": false, 00:20:54.711 "compare": false, 00:20:54.711 "compare_and_write": false, 00:20:54.711 "abort": true, 00:20:54.711 "seek_hole": false, 00:20:54.711 "seek_data": false, 00:20:54.711 "copy": true, 00:20:54.711 "nvme_iov_md": false 00:20:54.711 }, 00:20:54.711 "memory_domains": [ 00:20:54.711 { 00:20:54.711 "dma_device_id": "system", 00:20:54.711 "dma_device_type": 1 00:20:54.711 }, 00:20:54.711 { 00:20:54.711 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:54.711 "dma_device_type": 2 00:20:54.711 } 00:20:54.711 ], 00:20:54.711 "driver_specific": { 00:20:54.711 "passthru": { 00:20:54.711 "name": "pt1", 00:20:54.711 "base_bdev_name": "malloc1" 00:20:54.711 } 00:20:54.711 } 00:20:54.711 }' 00:20:54.711 05:49:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:54.711 05:49:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:54.711 05:49:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:54.711 05:49:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:54.970 05:49:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:54.970 05:49:09 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:54.970 05:49:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:54.970 05:49:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:54.970 05:49:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:54.970 05:49:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:54.970 05:49:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:54.970 05:49:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:54.970 05:49:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:54.970 05:49:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:20:54.970 05:49:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:55.229 05:49:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:55.229 "name": "pt2", 00:20:55.229 "aliases": [ 00:20:55.229 "00000000-0000-0000-0000-000000000002" 00:20:55.229 ], 00:20:55.229 "product_name": "passthru", 00:20:55.229 "block_size": 512, 00:20:55.229 "num_blocks": 65536, 00:20:55.229 "uuid": "00000000-0000-0000-0000-000000000002", 00:20:55.229 "assigned_rate_limits": { 00:20:55.229 "rw_ios_per_sec": 0, 00:20:55.229 "rw_mbytes_per_sec": 0, 00:20:55.229 "r_mbytes_per_sec": 0, 00:20:55.229 "w_mbytes_per_sec": 0 00:20:55.229 }, 00:20:55.229 "claimed": true, 00:20:55.229 "claim_type": "exclusive_write", 00:20:55.229 "zoned": false, 00:20:55.229 "supported_io_types": { 00:20:55.229 "read": true, 00:20:55.229 "write": true, 00:20:55.229 "unmap": true, 00:20:55.229 "flush": true, 00:20:55.229 "reset": true, 00:20:55.229 "nvme_admin": false, 00:20:55.229 
"nvme_io": false, 00:20:55.229 "nvme_io_md": false, 00:20:55.229 "write_zeroes": true, 00:20:55.229 "zcopy": true, 00:20:55.229 "get_zone_info": false, 00:20:55.229 "zone_management": false, 00:20:55.229 "zone_append": false, 00:20:55.229 "compare": false, 00:20:55.229 "compare_and_write": false, 00:20:55.229 "abort": true, 00:20:55.229 "seek_hole": false, 00:20:55.229 "seek_data": false, 00:20:55.229 "copy": true, 00:20:55.229 "nvme_iov_md": false 00:20:55.229 }, 00:20:55.229 "memory_domains": [ 00:20:55.229 { 00:20:55.229 "dma_device_id": "system", 00:20:55.229 "dma_device_type": 1 00:20:55.229 }, 00:20:55.229 { 00:20:55.229 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:55.229 "dma_device_type": 2 00:20:55.229 } 00:20:55.229 ], 00:20:55.229 "driver_specific": { 00:20:55.229 "passthru": { 00:20:55.229 "name": "pt2", 00:20:55.229 "base_bdev_name": "malloc2" 00:20:55.229 } 00:20:55.229 } 00:20:55.229 }' 00:20:55.229 05:49:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:55.229 05:49:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:55.488 05:49:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:55.488 05:49:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:55.488 05:49:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:55.488 05:49:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:55.488 05:49:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:55.488 05:49:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:55.488 05:49:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:55.488 05:49:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:55.488 05:49:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq 
.dif_type 00:20:55.747 05:49:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:55.747 05:49:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:55.747 05:49:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:20:55.747 05:49:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:55.747 05:49:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:55.747 "name": "pt3", 00:20:55.747 "aliases": [ 00:20:55.747 "00000000-0000-0000-0000-000000000003" 00:20:55.747 ], 00:20:55.747 "product_name": "passthru", 00:20:55.747 "block_size": 512, 00:20:55.747 "num_blocks": 65536, 00:20:55.747 "uuid": "00000000-0000-0000-0000-000000000003", 00:20:55.747 "assigned_rate_limits": { 00:20:55.747 "rw_ios_per_sec": 0, 00:20:55.747 "rw_mbytes_per_sec": 0, 00:20:55.747 "r_mbytes_per_sec": 0, 00:20:55.747 "w_mbytes_per_sec": 0 00:20:55.747 }, 00:20:55.747 "claimed": true, 00:20:55.747 "claim_type": "exclusive_write", 00:20:55.747 "zoned": false, 00:20:55.747 "supported_io_types": { 00:20:55.747 "read": true, 00:20:55.747 "write": true, 00:20:55.747 "unmap": true, 00:20:55.747 "flush": true, 00:20:55.747 "reset": true, 00:20:55.747 "nvme_admin": false, 00:20:55.747 "nvme_io": false, 00:20:55.747 "nvme_io_md": false, 00:20:55.747 "write_zeroes": true, 00:20:55.747 "zcopy": true, 00:20:55.747 "get_zone_info": false, 00:20:55.747 "zone_management": false, 00:20:55.747 "zone_append": false, 00:20:55.747 "compare": false, 00:20:55.747 "compare_and_write": false, 00:20:55.747 "abort": true, 00:20:55.747 "seek_hole": false, 00:20:55.747 "seek_data": false, 00:20:55.747 "copy": true, 00:20:55.747 "nvme_iov_md": false 00:20:55.747 }, 00:20:55.747 "memory_domains": [ 00:20:55.747 { 00:20:55.747 "dma_device_id": "system", 00:20:55.747 
"dma_device_type": 1 00:20:55.747 }, 00:20:55.747 { 00:20:55.747 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:55.747 "dma_device_type": 2 00:20:55.747 } 00:20:55.747 ], 00:20:55.747 "driver_specific": { 00:20:55.747 "passthru": { 00:20:55.747 "name": "pt3", 00:20:55.747 "base_bdev_name": "malloc3" 00:20:55.747 } 00:20:55.747 } 00:20:55.747 }' 00:20:55.747 05:49:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:56.006 05:49:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:56.006 05:49:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:56.006 05:49:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:56.006 05:49:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:56.006 05:49:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:56.006 05:49:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:56.006 05:49:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:56.006 05:49:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:56.006 05:49:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:56.265 05:49:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:56.265 05:49:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:56.265 05:49:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:56.265 05:49:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:20:56.265 05:49:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:56.524 05:49:11 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:56.524 "name": "pt4", 00:20:56.524 "aliases": [ 00:20:56.524 "00000000-0000-0000-0000-000000000004" 00:20:56.524 ], 00:20:56.524 "product_name": "passthru", 00:20:56.524 "block_size": 512, 00:20:56.524 "num_blocks": 65536, 00:20:56.524 "uuid": "00000000-0000-0000-0000-000000000004", 00:20:56.524 "assigned_rate_limits": { 00:20:56.524 "rw_ios_per_sec": 0, 00:20:56.524 "rw_mbytes_per_sec": 0, 00:20:56.524 "r_mbytes_per_sec": 0, 00:20:56.524 "w_mbytes_per_sec": 0 00:20:56.524 }, 00:20:56.524 "claimed": true, 00:20:56.524 "claim_type": "exclusive_write", 00:20:56.524 "zoned": false, 00:20:56.524 "supported_io_types": { 00:20:56.524 "read": true, 00:20:56.524 "write": true, 00:20:56.524 "unmap": true, 00:20:56.524 "flush": true, 00:20:56.524 "reset": true, 00:20:56.524 "nvme_admin": false, 00:20:56.524 "nvme_io": false, 00:20:56.524 "nvme_io_md": false, 00:20:56.524 "write_zeroes": true, 00:20:56.524 "zcopy": true, 00:20:56.524 "get_zone_info": false, 00:20:56.524 "zone_management": false, 00:20:56.524 "zone_append": false, 00:20:56.524 "compare": false, 00:20:56.524 "compare_and_write": false, 00:20:56.524 "abort": true, 00:20:56.524 "seek_hole": false, 00:20:56.524 "seek_data": false, 00:20:56.524 "copy": true, 00:20:56.524 "nvme_iov_md": false 00:20:56.524 }, 00:20:56.524 "memory_domains": [ 00:20:56.524 { 00:20:56.524 "dma_device_id": "system", 00:20:56.524 "dma_device_type": 1 00:20:56.524 }, 00:20:56.524 { 00:20:56.524 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:56.524 "dma_device_type": 2 00:20:56.524 } 00:20:56.524 ], 00:20:56.524 "driver_specific": { 00:20:56.524 "passthru": { 00:20:56.524 "name": "pt4", 00:20:56.524 "base_bdev_name": "malloc4" 00:20:56.524 } 00:20:56.524 } 00:20:56.524 }' 00:20:56.524 05:49:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:56.524 05:49:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:56.524 05:49:11 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:56.524 05:49:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:56.524 05:49:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:56.524 05:49:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:56.524 05:49:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:56.524 05:49:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:56.783 05:49:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:56.783 05:49:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:56.783 05:49:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:56.783 05:49:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:56.784 05:49:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:20:56.784 05:49:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:20:57.042 [2024-07-26 05:49:11.773751] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:20:57.042 05:49:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=b441125c-3e58-462b-b537-1a2e29abf9b1 00:20:57.042 05:49:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z b441125c-3e58-462b-b537-1a2e29abf9b1 ']' 00:20:57.042 05:49:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:20:57.301 [2024-07-26 05:49:12.022105] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:20:57.301 
[2024-07-26 05:49:12.022131] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:20:57.301 [2024-07-26 05:49:12.022183] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:20:57.301 [2024-07-26 05:49:12.022248] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:20:57.301 [2024-07-26 05:49:12.022259] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xc4d530 name raid_bdev1, state offline 00:20:57.301 05:49:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:57.301 05:49:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:20:57.560 05:49:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:20:57.560 05:49:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:20:57.560 05:49:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:20:57.560 05:49:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:20:57.818 05:49:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:20:57.818 05:49:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:20:58.077 05:49:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:20:58.077 05:49:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:20:58.335 05:49:13 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:20:58.335 05:49:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4 00:20:58.613 05:49:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:20:58.613 05:49:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:20:58.613 05:49:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:20:58.613 05:49:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:20:58.613 05:49:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:20:58.613 05:49:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:20:58.613 05:49:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:20:58.613 05:49:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:20:58.613 05:49:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:20:58.613 05:49:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:20:58.613 05:49:13 bdev_raid.raid_superblock_test -- 
common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:20:58.613 05:49:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:20:58.613 05:49:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:20:58.613 05:49:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:20:58.613 05:49:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:20:58.871 [2024-07-26 05:49:13.746593] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:20:58.871 [2024-07-26 05:49:13.748007] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:20:58.871 [2024-07-26 05:49:13.748055] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:20:58.871 [2024-07-26 05:49:13.748090] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc4 is claimed 00:20:58.871 [2024-07-26 05:49:13.748138] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:20:58.871 [2024-07-26 05:49:13.748181] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:20:58.871 [2024-07-26 05:49:13.748204] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:20:58.871 [2024-07-26 05:49:13.748233] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc4 00:20:58.871 
[2024-07-26 05:49:13.748251] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:20:58.871 [2024-07-26 05:49:13.748261] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xdf8ff0 name raid_bdev1, state configuring 00:20:58.871 request: 00:20:58.871 { 00:20:58.871 "name": "raid_bdev1", 00:20:58.871 "raid_level": "raid0", 00:20:58.871 "base_bdevs": [ 00:20:58.871 "malloc1", 00:20:58.871 "malloc2", 00:20:58.871 "malloc3", 00:20:58.871 "malloc4" 00:20:58.871 ], 00:20:58.871 "strip_size_kb": 64, 00:20:58.871 "superblock": false, 00:20:58.871 "method": "bdev_raid_create", 00:20:58.871 "req_id": 1 00:20:58.871 } 00:20:58.871 Got JSON-RPC error response 00:20:58.871 response: 00:20:58.871 { 00:20:58.871 "code": -17, 00:20:58.871 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:20:58.871 } 00:20:58.871 05:49:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:20:58.871 05:49:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:20:58.871 05:49:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:20:58.871 05:49:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:20:58.871 05:49:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:20:58.871 05:49:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:59.130 05:49:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:20:59.130 05:49:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:20:59.130 05:49:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 
00:20:59.438 [2024-07-26 05:49:14.239840] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:20:59.439 [2024-07-26 05:49:14.239895] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:59.439 [2024-07-26 05:49:14.239919] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xc557a0 00:20:59.439 [2024-07-26 05:49:14.239932] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:59.439 [2024-07-26 05:49:14.241553] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:59.439 [2024-07-26 05:49:14.241585] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:20:59.439 [2024-07-26 05:49:14.241670] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:20:59.439 [2024-07-26 05:49:14.241698] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:20:59.439 pt1 00:20:59.439 05:49:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid0 64 4 00:20:59.439 05:49:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:59.439 05:49:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:59.439 05:49:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:20:59.439 05:49:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:59.439 05:49:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:59.439 05:49:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:59.439 05:49:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:59.439 05:49:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local 
num_base_bdevs_discovered 00:20:59.439 05:49:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:59.439 05:49:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:59.439 05:49:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:59.696 05:49:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:59.696 "name": "raid_bdev1", 00:20:59.696 "uuid": "b441125c-3e58-462b-b537-1a2e29abf9b1", 00:20:59.696 "strip_size_kb": 64, 00:20:59.696 "state": "configuring", 00:20:59.696 "raid_level": "raid0", 00:20:59.696 "superblock": true, 00:20:59.696 "num_base_bdevs": 4, 00:20:59.696 "num_base_bdevs_discovered": 1, 00:20:59.696 "num_base_bdevs_operational": 4, 00:20:59.696 "base_bdevs_list": [ 00:20:59.696 { 00:20:59.696 "name": "pt1", 00:20:59.696 "uuid": "00000000-0000-0000-0000-000000000001", 00:20:59.696 "is_configured": true, 00:20:59.696 "data_offset": 2048, 00:20:59.696 "data_size": 63488 00:20:59.696 }, 00:20:59.696 { 00:20:59.696 "name": null, 00:20:59.696 "uuid": "00000000-0000-0000-0000-000000000002", 00:20:59.696 "is_configured": false, 00:20:59.696 "data_offset": 2048, 00:20:59.696 "data_size": 63488 00:20:59.696 }, 00:20:59.696 { 00:20:59.696 "name": null, 00:20:59.696 "uuid": "00000000-0000-0000-0000-000000000003", 00:20:59.696 "is_configured": false, 00:20:59.696 "data_offset": 2048, 00:20:59.696 "data_size": 63488 00:20:59.696 }, 00:20:59.696 { 00:20:59.696 "name": null, 00:20:59.696 "uuid": "00000000-0000-0000-0000-000000000004", 00:20:59.696 "is_configured": false, 00:20:59.696 "data_offset": 2048, 00:20:59.696 "data_size": 63488 00:20:59.696 } 00:20:59.696 ] 00:20:59.696 }' 00:20:59.696 05:49:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:59.696 05:49:14 
bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:21:00.303 05:49:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 4 -gt 2 ']' 00:21:00.303 05:49:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:21:00.576 [2024-07-26 05:49:15.246519] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:21:00.576 [2024-07-26 05:49:15.246576] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:00.576 [2024-07-26 05:49:15.246598] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xdee940 00:21:00.576 [2024-07-26 05:49:15.246610] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:00.576 [2024-07-26 05:49:15.246973] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:00.576 [2024-07-26 05:49:15.246992] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:21:00.576 [2024-07-26 05:49:15.247056] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:21:00.576 [2024-07-26 05:49:15.247076] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:21:00.577 pt2 00:21:00.577 05:49:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:21:00.577 [2024-07-26 05:49:15.422990] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:21:00.577 05:49:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # verify_raid_bdev_state raid_bdev1 configuring raid0 64 4 00:21:00.577 05:49:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:00.577 05:49:15 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:00.577 05:49:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:21:00.577 05:49:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:00.577 05:49:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:00.577 05:49:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:00.577 05:49:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:00.577 05:49:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:00.577 05:49:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:00.577 05:49:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:00.577 05:49:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:00.835 05:49:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:00.835 "name": "raid_bdev1", 00:21:00.835 "uuid": "b441125c-3e58-462b-b537-1a2e29abf9b1", 00:21:00.835 "strip_size_kb": 64, 00:21:00.835 "state": "configuring", 00:21:00.835 "raid_level": "raid0", 00:21:00.835 "superblock": true, 00:21:00.835 "num_base_bdevs": 4, 00:21:00.835 "num_base_bdevs_discovered": 1, 00:21:00.835 "num_base_bdevs_operational": 4, 00:21:00.835 "base_bdevs_list": [ 00:21:00.835 { 00:21:00.835 "name": "pt1", 00:21:00.835 "uuid": "00000000-0000-0000-0000-000000000001", 00:21:00.835 "is_configured": true, 00:21:00.835 "data_offset": 2048, 00:21:00.835 "data_size": 63488 00:21:00.835 }, 00:21:00.835 { 00:21:00.835 "name": null, 00:21:00.835 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:00.835 
"is_configured": false, 00:21:00.835 "data_offset": 2048, 00:21:00.835 "data_size": 63488 00:21:00.835 }, 00:21:00.835 { 00:21:00.835 "name": null, 00:21:00.835 "uuid": "00000000-0000-0000-0000-000000000003", 00:21:00.835 "is_configured": false, 00:21:00.835 "data_offset": 2048, 00:21:00.835 "data_size": 63488 00:21:00.835 }, 00:21:00.835 { 00:21:00.835 "name": null, 00:21:00.835 "uuid": "00000000-0000-0000-0000-000000000004", 00:21:00.835 "is_configured": false, 00:21:00.835 "data_offset": 2048, 00:21:00.835 "data_size": 63488 00:21:00.835 } 00:21:00.835 ] 00:21:00.835 }' 00:21:00.835 05:49:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:00.835 05:49:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:21:01.404 05:49:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:21:01.404 05:49:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:21:01.404 05:49:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:21:01.664 [2024-07-26 05:49:16.469784] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:21:01.664 [2024-07-26 05:49:16.469841] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:01.664 [2024-07-26 05:49:16.469861] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xc4c060 00:21:01.664 [2024-07-26 05:49:16.469874] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:01.664 [2024-07-26 05:49:16.470230] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:01.664 [2024-07-26 05:49:16.470248] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:21:01.664 [2024-07-26 05:49:16.470313] 
bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:21:01.664 [2024-07-26 05:49:16.470332] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:21:01.664 pt2 00:21:01.664 05:49:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:21:01.664 05:49:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:21:01.664 05:49:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:21:01.925 [2024-07-26 05:49:16.630205] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:21:01.925 [2024-07-26 05:49:16.630248] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:01.925 [2024-07-26 05:49:16.630268] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xc4e8d0 00:21:01.925 [2024-07-26 05:49:16.630281] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:01.925 [2024-07-26 05:49:16.630607] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:01.925 [2024-07-26 05:49:16.630624] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:21:01.925 [2024-07-26 05:49:16.630695] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:21:01.925 [2024-07-26 05:49:16.630715] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:21:01.925 pt3 00:21:01.925 05:49:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:21:01.925 05:49:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:21:01.925 05:49:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
-s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:21:02.187 [2024-07-26 05:49:16.874853] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:21:02.187 [2024-07-26 05:49:16.874904] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:02.187 [2024-07-26 05:49:16.874922] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xc4fb80 00:21:02.187 [2024-07-26 05:49:16.874934] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:02.187 [2024-07-26 05:49:16.875275] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:02.187 [2024-07-26 05:49:16.875294] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:21:02.187 [2024-07-26 05:49:16.875356] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt4 00:21:02.187 [2024-07-26 05:49:16.875377] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:21:02.187 [2024-07-26 05:49:16.875500] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0xc4c780 00:21:02.187 [2024-07-26 05:49:16.875512] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:21:02.187 [2024-07-26 05:49:16.875702] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xc51d70 00:21:02.187 [2024-07-26 05:49:16.875839] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xc4c780 00:21:02.187 [2024-07-26 05:49:16.875848] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xc4c780 00:21:02.187 [2024-07-26 05:49:16.875950] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:02.187 pt4 00:21:02.187 05:49:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:21:02.187 05:49:16 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:21:02.187 05:49:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:21:02.188 05:49:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:02.188 05:49:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:02.188 05:49:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:21:02.188 05:49:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:02.188 05:49:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:02.188 05:49:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:02.188 05:49:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:02.188 05:49:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:02.188 05:49:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:02.188 05:49:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:02.188 05:49:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:02.447 05:49:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:02.448 "name": "raid_bdev1", 00:21:02.448 "uuid": "b441125c-3e58-462b-b537-1a2e29abf9b1", 00:21:02.448 "strip_size_kb": 64, 00:21:02.448 "state": "online", 00:21:02.448 "raid_level": "raid0", 00:21:02.448 "superblock": true, 00:21:02.448 "num_base_bdevs": 4, 00:21:02.448 "num_base_bdevs_discovered": 4, 00:21:02.448 "num_base_bdevs_operational": 4, 00:21:02.448 "base_bdevs_list": [ 00:21:02.448 { 00:21:02.448 
"name": "pt1", 00:21:02.448 "uuid": "00000000-0000-0000-0000-000000000001", 00:21:02.448 "is_configured": true, 00:21:02.448 "data_offset": 2048, 00:21:02.448 "data_size": 63488 00:21:02.448 }, 00:21:02.448 { 00:21:02.448 "name": "pt2", 00:21:02.448 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:02.448 "is_configured": true, 00:21:02.448 "data_offset": 2048, 00:21:02.448 "data_size": 63488 00:21:02.448 }, 00:21:02.448 { 00:21:02.448 "name": "pt3", 00:21:02.448 "uuid": "00000000-0000-0000-0000-000000000003", 00:21:02.448 "is_configured": true, 00:21:02.448 "data_offset": 2048, 00:21:02.448 "data_size": 63488 00:21:02.448 }, 00:21:02.448 { 00:21:02.448 "name": "pt4", 00:21:02.448 "uuid": "00000000-0000-0000-0000-000000000004", 00:21:02.448 "is_configured": true, 00:21:02.448 "data_offset": 2048, 00:21:02.448 "data_size": 63488 00:21:02.448 } 00:21:02.448 ] 00:21:02.448 }' 00:21:02.448 05:49:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:02.448 05:49:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:21:03.386 05:49:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:21:03.386 05:49:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:21:03.386 05:49:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:21:03.386 05:49:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:21:03.386 05:49:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:21:03.386 05:49:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:21:03.386 05:49:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:21:03.386 05:49:18 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@200 -- # jq '.[]' 00:21:03.386 [2024-07-26 05:49:18.238792] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:21:03.386 05:49:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:21:03.386 "name": "raid_bdev1", 00:21:03.386 "aliases": [ 00:21:03.386 "b441125c-3e58-462b-b537-1a2e29abf9b1" 00:21:03.386 ], 00:21:03.386 "product_name": "Raid Volume", 00:21:03.386 "block_size": 512, 00:21:03.386 "num_blocks": 253952, 00:21:03.386 "uuid": "b441125c-3e58-462b-b537-1a2e29abf9b1", 00:21:03.386 "assigned_rate_limits": { 00:21:03.386 "rw_ios_per_sec": 0, 00:21:03.386 "rw_mbytes_per_sec": 0, 00:21:03.386 "r_mbytes_per_sec": 0, 00:21:03.386 "w_mbytes_per_sec": 0 00:21:03.386 }, 00:21:03.386 "claimed": false, 00:21:03.386 "zoned": false, 00:21:03.386 "supported_io_types": { 00:21:03.386 "read": true, 00:21:03.386 "write": true, 00:21:03.386 "unmap": true, 00:21:03.386 "flush": true, 00:21:03.386 "reset": true, 00:21:03.386 "nvme_admin": false, 00:21:03.386 "nvme_io": false, 00:21:03.386 "nvme_io_md": false, 00:21:03.386 "write_zeroes": true, 00:21:03.386 "zcopy": false, 00:21:03.386 "get_zone_info": false, 00:21:03.386 "zone_management": false, 00:21:03.386 "zone_append": false, 00:21:03.386 "compare": false, 00:21:03.386 "compare_and_write": false, 00:21:03.386 "abort": false, 00:21:03.386 "seek_hole": false, 00:21:03.386 "seek_data": false, 00:21:03.386 "copy": false, 00:21:03.386 "nvme_iov_md": false 00:21:03.386 }, 00:21:03.387 "memory_domains": [ 00:21:03.387 { 00:21:03.387 "dma_device_id": "system", 00:21:03.387 "dma_device_type": 1 00:21:03.387 }, 00:21:03.387 { 00:21:03.387 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:03.387 "dma_device_type": 2 00:21:03.387 }, 00:21:03.387 { 00:21:03.387 "dma_device_id": "system", 00:21:03.387 "dma_device_type": 1 00:21:03.387 }, 00:21:03.387 { 00:21:03.387 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:03.387 "dma_device_type": 2 00:21:03.387 }, 
00:21:03.387 { 00:21:03.387 "dma_device_id": "system", 00:21:03.387 "dma_device_type": 1 00:21:03.387 }, 00:21:03.387 { 00:21:03.387 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:03.387 "dma_device_type": 2 00:21:03.387 }, 00:21:03.387 { 00:21:03.387 "dma_device_id": "system", 00:21:03.387 "dma_device_type": 1 00:21:03.387 }, 00:21:03.387 { 00:21:03.387 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:03.387 "dma_device_type": 2 00:21:03.387 } 00:21:03.387 ], 00:21:03.387 "driver_specific": { 00:21:03.387 "raid": { 00:21:03.387 "uuid": "b441125c-3e58-462b-b537-1a2e29abf9b1", 00:21:03.387 "strip_size_kb": 64, 00:21:03.387 "state": "online", 00:21:03.387 "raid_level": "raid0", 00:21:03.387 "superblock": true, 00:21:03.387 "num_base_bdevs": 4, 00:21:03.387 "num_base_bdevs_discovered": 4, 00:21:03.387 "num_base_bdevs_operational": 4, 00:21:03.387 "base_bdevs_list": [ 00:21:03.387 { 00:21:03.387 "name": "pt1", 00:21:03.387 "uuid": "00000000-0000-0000-0000-000000000001", 00:21:03.387 "is_configured": true, 00:21:03.387 "data_offset": 2048, 00:21:03.387 "data_size": 63488 00:21:03.387 }, 00:21:03.387 { 00:21:03.387 "name": "pt2", 00:21:03.387 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:03.387 "is_configured": true, 00:21:03.387 "data_offset": 2048, 00:21:03.387 "data_size": 63488 00:21:03.387 }, 00:21:03.387 { 00:21:03.387 "name": "pt3", 00:21:03.387 "uuid": "00000000-0000-0000-0000-000000000003", 00:21:03.387 "is_configured": true, 00:21:03.387 "data_offset": 2048, 00:21:03.387 "data_size": 63488 00:21:03.387 }, 00:21:03.387 { 00:21:03.387 "name": "pt4", 00:21:03.387 "uuid": "00000000-0000-0000-0000-000000000004", 00:21:03.387 "is_configured": true, 00:21:03.387 "data_offset": 2048, 00:21:03.387 "data_size": 63488 00:21:03.387 } 00:21:03.387 ] 00:21:03.387 } 00:21:03.387 } 00:21:03.387 }' 00:21:03.387 05:49:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 
00:21:03.646 05:49:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:21:03.646 pt2 00:21:03.646 pt3 00:21:03.646 pt4' 00:21:03.646 05:49:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:03.646 05:49:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:21:03.646 05:49:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:03.905 05:49:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:03.905 "name": "pt1", 00:21:03.905 "aliases": [ 00:21:03.905 "00000000-0000-0000-0000-000000000001" 00:21:03.905 ], 00:21:03.905 "product_name": "passthru", 00:21:03.905 "block_size": 512, 00:21:03.905 "num_blocks": 65536, 00:21:03.905 "uuid": "00000000-0000-0000-0000-000000000001", 00:21:03.905 "assigned_rate_limits": { 00:21:03.905 "rw_ios_per_sec": 0, 00:21:03.905 "rw_mbytes_per_sec": 0, 00:21:03.905 "r_mbytes_per_sec": 0, 00:21:03.905 "w_mbytes_per_sec": 0 00:21:03.905 }, 00:21:03.905 "claimed": true, 00:21:03.905 "claim_type": "exclusive_write", 00:21:03.905 "zoned": false, 00:21:03.905 "supported_io_types": { 00:21:03.905 "read": true, 00:21:03.905 "write": true, 00:21:03.905 "unmap": true, 00:21:03.905 "flush": true, 00:21:03.905 "reset": true, 00:21:03.905 "nvme_admin": false, 00:21:03.905 "nvme_io": false, 00:21:03.905 "nvme_io_md": false, 00:21:03.906 "write_zeroes": true, 00:21:03.906 "zcopy": true, 00:21:03.906 "get_zone_info": false, 00:21:03.906 "zone_management": false, 00:21:03.906 "zone_append": false, 00:21:03.906 "compare": false, 00:21:03.906 "compare_and_write": false, 00:21:03.906 "abort": true, 00:21:03.906 "seek_hole": false, 00:21:03.906 "seek_data": false, 00:21:03.906 "copy": true, 00:21:03.906 "nvme_iov_md": false 00:21:03.906 }, 00:21:03.906 "memory_domains": [ 00:21:03.906 { 
00:21:03.906 "dma_device_id": "system", 00:21:03.906 "dma_device_type": 1 00:21:03.906 }, 00:21:03.906 { 00:21:03.906 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:03.906 "dma_device_type": 2 00:21:03.906 } 00:21:03.906 ], 00:21:03.906 "driver_specific": { 00:21:03.906 "passthru": { 00:21:03.906 "name": "pt1", 00:21:03.906 "base_bdev_name": "malloc1" 00:21:03.906 } 00:21:03.906 } 00:21:03.906 }' 00:21:03.906 05:49:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:03.906 05:49:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:03.906 05:49:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:03.906 05:49:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:03.906 05:49:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:03.906 05:49:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:03.906 05:49:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:03.906 05:49:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:04.165 05:49:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:04.165 05:49:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:04.165 05:49:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:04.165 05:49:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:04.165 05:49:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:04.165 05:49:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:21:04.165 05:49:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:04.424 
05:49:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:04.424 "name": "pt2", 00:21:04.424 "aliases": [ 00:21:04.424 "00000000-0000-0000-0000-000000000002" 00:21:04.424 ], 00:21:04.424 "product_name": "passthru", 00:21:04.424 "block_size": 512, 00:21:04.424 "num_blocks": 65536, 00:21:04.424 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:04.424 "assigned_rate_limits": { 00:21:04.424 "rw_ios_per_sec": 0, 00:21:04.424 "rw_mbytes_per_sec": 0, 00:21:04.424 "r_mbytes_per_sec": 0, 00:21:04.424 "w_mbytes_per_sec": 0 00:21:04.424 }, 00:21:04.424 "claimed": true, 00:21:04.424 "claim_type": "exclusive_write", 00:21:04.424 "zoned": false, 00:21:04.424 "supported_io_types": { 00:21:04.424 "read": true, 00:21:04.424 "write": true, 00:21:04.424 "unmap": true, 00:21:04.424 "flush": true, 00:21:04.424 "reset": true, 00:21:04.424 "nvme_admin": false, 00:21:04.424 "nvme_io": false, 00:21:04.424 "nvme_io_md": false, 00:21:04.424 "write_zeroes": true, 00:21:04.424 "zcopy": true, 00:21:04.424 "get_zone_info": false, 00:21:04.424 "zone_management": false, 00:21:04.424 "zone_append": false, 00:21:04.424 "compare": false, 00:21:04.424 "compare_and_write": false, 00:21:04.424 "abort": true, 00:21:04.424 "seek_hole": false, 00:21:04.424 "seek_data": false, 00:21:04.424 "copy": true, 00:21:04.424 "nvme_iov_md": false 00:21:04.424 }, 00:21:04.424 "memory_domains": [ 00:21:04.424 { 00:21:04.424 "dma_device_id": "system", 00:21:04.424 "dma_device_type": 1 00:21:04.424 }, 00:21:04.424 { 00:21:04.424 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:04.424 "dma_device_type": 2 00:21:04.424 } 00:21:04.424 ], 00:21:04.424 "driver_specific": { 00:21:04.424 "passthru": { 00:21:04.424 "name": "pt2", 00:21:04.424 "base_bdev_name": "malloc2" 00:21:04.424 } 00:21:04.424 } 00:21:04.424 }' 00:21:04.424 05:49:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:04.424 05:49:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- 
# jq .block_size 00:21:04.424 05:49:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:04.424 05:49:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:04.424 05:49:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:04.683 05:49:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:04.683 05:49:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:04.683 05:49:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:04.683 05:49:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:04.683 05:49:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:04.683 05:49:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:04.683 05:49:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:04.683 05:49:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:04.683 05:49:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:21:04.683 05:49:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:04.943 05:49:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:04.943 "name": "pt3", 00:21:04.943 "aliases": [ 00:21:04.943 "00000000-0000-0000-0000-000000000003" 00:21:04.943 ], 00:21:04.943 "product_name": "passthru", 00:21:04.943 "block_size": 512, 00:21:04.943 "num_blocks": 65536, 00:21:04.943 "uuid": "00000000-0000-0000-0000-000000000003", 00:21:04.943 "assigned_rate_limits": { 00:21:04.943 "rw_ios_per_sec": 0, 00:21:04.943 "rw_mbytes_per_sec": 0, 00:21:04.943 "r_mbytes_per_sec": 0, 00:21:04.943 "w_mbytes_per_sec": 0 00:21:04.943 }, 
00:21:04.943 "claimed": true, 00:21:04.943 "claim_type": "exclusive_write", 00:21:04.943 "zoned": false, 00:21:04.943 "supported_io_types": { 00:21:04.943 "read": true, 00:21:04.943 "write": true, 00:21:04.943 "unmap": true, 00:21:04.943 "flush": true, 00:21:04.943 "reset": true, 00:21:04.943 "nvme_admin": false, 00:21:04.943 "nvme_io": false, 00:21:04.943 "nvme_io_md": false, 00:21:04.943 "write_zeroes": true, 00:21:04.943 "zcopy": true, 00:21:04.943 "get_zone_info": false, 00:21:04.943 "zone_management": false, 00:21:04.943 "zone_append": false, 00:21:04.943 "compare": false, 00:21:04.943 "compare_and_write": false, 00:21:04.943 "abort": true, 00:21:04.943 "seek_hole": false, 00:21:04.943 "seek_data": false, 00:21:04.943 "copy": true, 00:21:04.943 "nvme_iov_md": false 00:21:04.943 }, 00:21:04.943 "memory_domains": [ 00:21:04.943 { 00:21:04.943 "dma_device_id": "system", 00:21:04.943 "dma_device_type": 1 00:21:04.943 }, 00:21:04.943 { 00:21:04.943 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:04.943 "dma_device_type": 2 00:21:04.943 } 00:21:04.943 ], 00:21:04.943 "driver_specific": { 00:21:04.943 "passthru": { 00:21:04.943 "name": "pt3", 00:21:04.943 "base_bdev_name": "malloc3" 00:21:04.943 } 00:21:04.943 } 00:21:04.943 }' 00:21:04.943 05:49:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:04.943 05:49:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:04.943 05:49:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:04.943 05:49:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:05.202 05:49:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:05.202 05:49:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:05.202 05:49:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:05.202 05:49:19 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:05.202 05:49:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:05.202 05:49:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:05.202 05:49:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:05.202 05:49:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:05.202 05:49:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:05.202 05:49:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:21:05.202 05:49:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:05.461 05:49:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:05.461 "name": "pt4", 00:21:05.461 "aliases": [ 00:21:05.461 "00000000-0000-0000-0000-000000000004" 00:21:05.461 ], 00:21:05.461 "product_name": "passthru", 00:21:05.461 "block_size": 512, 00:21:05.461 "num_blocks": 65536, 00:21:05.461 "uuid": "00000000-0000-0000-0000-000000000004", 00:21:05.461 "assigned_rate_limits": { 00:21:05.461 "rw_ios_per_sec": 0, 00:21:05.461 "rw_mbytes_per_sec": 0, 00:21:05.461 "r_mbytes_per_sec": 0, 00:21:05.461 "w_mbytes_per_sec": 0 00:21:05.461 }, 00:21:05.461 "claimed": true, 00:21:05.461 "claim_type": "exclusive_write", 00:21:05.461 "zoned": false, 00:21:05.461 "supported_io_types": { 00:21:05.461 "read": true, 00:21:05.461 "write": true, 00:21:05.461 "unmap": true, 00:21:05.461 "flush": true, 00:21:05.461 "reset": true, 00:21:05.461 "nvme_admin": false, 00:21:05.461 "nvme_io": false, 00:21:05.461 "nvme_io_md": false, 00:21:05.461 "write_zeroes": true, 00:21:05.461 "zcopy": true, 00:21:05.461 "get_zone_info": false, 00:21:05.461 "zone_management": false, 00:21:05.461 "zone_append": false, 00:21:05.461 
"compare": false, 00:21:05.461 "compare_and_write": false, 00:21:05.461 "abort": true, 00:21:05.461 "seek_hole": false, 00:21:05.461 "seek_data": false, 00:21:05.461 "copy": true, 00:21:05.461 "nvme_iov_md": false 00:21:05.461 }, 00:21:05.461 "memory_domains": [ 00:21:05.461 { 00:21:05.461 "dma_device_id": "system", 00:21:05.461 "dma_device_type": 1 00:21:05.461 }, 00:21:05.461 { 00:21:05.461 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:05.461 "dma_device_type": 2 00:21:05.461 } 00:21:05.461 ], 00:21:05.461 "driver_specific": { 00:21:05.461 "passthru": { 00:21:05.461 "name": "pt4", 00:21:05.461 "base_bdev_name": "malloc4" 00:21:05.461 } 00:21:05.461 } 00:21:05.461 }' 00:21:05.461 05:49:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:05.461 05:49:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:05.719 05:49:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:05.719 05:49:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:05.719 05:49:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:05.719 05:49:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:05.719 05:49:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:05.719 05:49:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:05.719 05:49:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:05.719 05:49:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:05.719 05:49:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:05.977 05:49:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:05.978 05:49:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:21:05.978 05:49:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:21:05.978 [2024-07-26 05:49:20.885764] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:21:06.236 05:49:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' b441125c-3e58-462b-b537-1a2e29abf9b1 '!=' b441125c-3e58-462b-b537-1a2e29abf9b1 ']' 00:21:06.236 05:49:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy raid0 00:21:06.236 05:49:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:21:06.236 05:49:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:21:06.236 05:49:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 1198492 00:21:06.236 05:49:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 1198492 ']' 00:21:06.236 05:49:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 1198492 00:21:06.236 05:49:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:21:06.236 05:49:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:21:06.236 05:49:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1198492 00:21:06.236 05:49:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:21:06.236 05:49:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:21:06.236 05:49:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1198492' 00:21:06.237 killing process with pid 1198492 00:21:06.237 05:49:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 1198492 00:21:06.237 [2024-07-26 
05:49:20.959894] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:21:06.237 [2024-07-26 05:49:20.959958] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:06.237 [2024-07-26 05:49:20.960019] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:21:06.237 [2024-07-26 05:49:20.960032] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xc4c780 name raid_bdev1, state offline 00:21:06.237 05:49:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 1198492 00:21:06.237 [2024-07-26 05:49:20.999764] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:21:06.496 05:49:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:21:06.496 00:21:06.496 real 0m16.320s 00:21:06.496 user 0m29.506s 00:21:06.496 sys 0m2.896s 00:21:06.496 05:49:21 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:21:06.496 05:49:21 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:21:06.496 ************************************ 00:21:06.496 END TEST raid_superblock_test 00:21:06.496 ************************************ 00:21:06.496 05:49:21 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:21:06.496 05:49:21 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test raid0 4 read 00:21:06.496 05:49:21 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:21:06.496 05:49:21 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:21:06.496 05:49:21 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:21:06.496 ************************************ 00:21:06.496 START TEST raid_read_error_test 00:21:06.496 ************************************ 00:21:06.496 05:49:21 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid0 4 read 00:21:06.496 05:49:21 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid0 00:21:06.496 05:49:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=4 00:21:06.496 05:49:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:21:06.496 05:49:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:21:06.496 05:49:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:21:06.496 05:49:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:21:06.496 05:49:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:21:06.496 05:49:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:21:06.496 05:49:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:21:06.496 05:49:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:21:06.496 05:49:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:21:06.496 05:49:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:21:06.496 05:49:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:21:06.496 05:49:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:21:06.496 05:49:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev4 00:21:06.496 05:49:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:21:06.496 05:49:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:21:06.496 05:49:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:21:06.496 05:49:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:21:06.496 05:49:21 bdev_raid.raid_read_error_test 
-- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:21:06.496 05:49:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:21:06.496 05:49:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:21:06.496 05:49:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:21:06.496 05:49:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:21:06.496 05:49:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid0 '!=' raid1 ']' 00:21:06.496 05:49:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:21:06.496 05:49:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:21:06.496 05:49:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:21:06.496 05:49:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.TGp3pS4RUS 00:21:06.496 05:49:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:21:06.496 05:49:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=1200924 00:21:06.496 05:49:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 1200924 /var/tmp/spdk-raid.sock 00:21:06.496 05:49:21 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 1200924 ']' 00:21:06.496 05:49:21 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:21:06.496 05:49:21 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:21:06.496 05:49:21 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket 
/var/tmp/spdk-raid.sock...' 00:21:06.496 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:21:06.496 05:49:21 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:21:06.496 05:49:21 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:21:06.496 [2024-07-26 05:49:21.375105] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 00:21:06.496 [2024-07-26 05:49:21.375166] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1200924 ] 00:21:06.756 [2024-07-26 05:49:21.504820] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:21:06.756 [2024-07-26 05:49:21.613421] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:21:07.015 [2024-07-26 05:49:21.667192] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:07.015 [2024-07-26 05:49:21.667220] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:07.015 05:49:21 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:21:07.015 05:49:21 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:21:07.015 05:49:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:21:07.015 05:49:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:21:07.273 BaseBdev1_malloc 00:21:07.273 05:49:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:21:07.532 true 00:21:07.532 
05:49:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:21:07.791 [2024-07-26 05:49:22.560807] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:21:07.791 [2024-07-26 05:49:22.560855] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:07.791 [2024-07-26 05:49:22.560875] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x22b70d0 00:21:07.791 [2024-07-26 05:49:22.560889] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:07.791 [2024-07-26 05:49:22.562788] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:07.792 [2024-07-26 05:49:22.562817] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:21:07.792 BaseBdev1 00:21:07.792 05:49:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:21:07.792 05:49:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:21:08.050 BaseBdev2_malloc 00:21:08.050 05:49:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:21:08.310 true 00:21:08.310 05:49:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:21:08.573 [2024-07-26 05:49:23.296534] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:21:08.573 [2024-07-26 05:49:23.296580] vbdev_passthru.c: 
635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:08.573 [2024-07-26 05:49:23.296601] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x22bb910 00:21:08.573 [2024-07-26 05:49:23.296614] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:08.573 [2024-07-26 05:49:23.298230] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:08.573 [2024-07-26 05:49:23.298258] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:21:08.573 BaseBdev2 00:21:08.573 05:49:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:21:08.574 05:49:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:21:09.144 BaseBdev3_malloc 00:21:09.144 05:49:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:21:09.405 true 00:21:09.405 05:49:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:21:09.405 [2024-07-26 05:49:24.291711] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:21:09.405 [2024-07-26 05:49:24.291753] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:09.405 [2024-07-26 05:49:24.291773] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x22bdbd0 00:21:09.405 [2024-07-26 05:49:24.291786] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:09.405 [2024-07-26 05:49:24.293347] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 
00:21:09.405 [2024-07-26 05:49:24.293375] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:21:09.405 BaseBdev3 00:21:09.405 05:49:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:21:09.405 05:49:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:21:09.664 BaseBdev4_malloc 00:21:09.664 05:49:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:21:09.923 true 00:21:09.923 05:49:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:21:10.182 [2024-07-26 05:49:24.889915] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:21:10.182 [2024-07-26 05:49:24.889958] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:10.182 [2024-07-26 05:49:24.889979] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x22beaa0 00:21:10.183 [2024-07-26 05:49:24.889992] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:10.183 [2024-07-26 05:49:24.891596] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:10.183 [2024-07-26 05:49:24.891625] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:21:10.183 BaseBdev4 00:21:10.183 05:49:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 
00:21:10.442 [2024-07-26 05:49:25.122570] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:21:10.442 [2024-07-26 05:49:25.123946] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:21:10.442 [2024-07-26 05:49:25.124015] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:21:10.442 [2024-07-26 05:49:25.124077] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:21:10.442 [2024-07-26 05:49:25.124313] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x22b8c20 00:21:10.442 [2024-07-26 05:49:25.124325] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:21:10.442 [2024-07-26 05:49:25.124527] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x210d260 00:21:10.442 [2024-07-26 05:49:25.124695] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x22b8c20 00:21:10.442 [2024-07-26 05:49:25.124706] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x22b8c20 00:21:10.442 [2024-07-26 05:49:25.124812] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:10.442 05:49:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:21:10.442 05:49:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:10.442 05:49:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:10.442 05:49:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:21:10.442 05:49:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:10.442 05:49:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:10.442 05:49:25 bdev_raid.raid_read_error_test 
-- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:10.442 05:49:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:10.442 05:49:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:10.442 05:49:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:10.442 05:49:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:10.442 05:49:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:10.700 05:49:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:10.700 "name": "raid_bdev1", 00:21:10.700 "uuid": "e18543cd-e617-4b5a-a193-ef980aeb9d8e", 00:21:10.700 "strip_size_kb": 64, 00:21:10.700 "state": "online", 00:21:10.700 "raid_level": "raid0", 00:21:10.700 "superblock": true, 00:21:10.700 "num_base_bdevs": 4, 00:21:10.700 "num_base_bdevs_discovered": 4, 00:21:10.700 "num_base_bdevs_operational": 4, 00:21:10.700 "base_bdevs_list": [ 00:21:10.700 { 00:21:10.700 "name": "BaseBdev1", 00:21:10.700 "uuid": "a27cf128-9eab-5f03-bfd5-e109ab4980fe", 00:21:10.700 "is_configured": true, 00:21:10.700 "data_offset": 2048, 00:21:10.700 "data_size": 63488 00:21:10.700 }, 00:21:10.700 { 00:21:10.700 "name": "BaseBdev2", 00:21:10.700 "uuid": "a308f3af-7556-5d89-ac2e-070c6115a8f6", 00:21:10.700 "is_configured": true, 00:21:10.700 "data_offset": 2048, 00:21:10.700 "data_size": 63488 00:21:10.700 }, 00:21:10.700 { 00:21:10.700 "name": "BaseBdev3", 00:21:10.700 "uuid": "d6624e35-6bc9-59f6-8e45-80097d2ba789", 00:21:10.700 "is_configured": true, 00:21:10.700 "data_offset": 2048, 00:21:10.700 "data_size": 63488 00:21:10.700 }, 00:21:10.700 { 00:21:10.700 "name": "BaseBdev4", 00:21:10.700 "uuid": "4dde7121-e8eb-51f9-951b-9a29ee34dcde", 00:21:10.700 
"is_configured": true, 00:21:10.700 "data_offset": 2048, 00:21:10.700 "data_size": 63488 00:21:10.700 } 00:21:10.700 ] 00:21:10.700 }' 00:21:10.700 05:49:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:10.700 05:49:25 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:21:11.268 05:49:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:21:11.268 05:49:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:21:11.268 [2024-07-26 05:49:26.025230] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x22aafc0 00:21:12.205 05:49:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:21:12.464 05:49:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:21:12.464 05:49:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid0 = \r\a\i\d\1 ]] 00:21:12.464 05:49:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=4 00:21:12.464 05:49:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:21:12.464 05:49:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:12.464 05:49:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:12.464 05:49:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:21:12.464 05:49:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:12.464 05:49:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 
00:21:12.464 05:49:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:12.464 05:49:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:12.464 05:49:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:12.464 05:49:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:12.464 05:49:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:12.464 05:49:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:12.723 05:49:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:12.723 "name": "raid_bdev1", 00:21:12.723 "uuid": "e18543cd-e617-4b5a-a193-ef980aeb9d8e", 00:21:12.723 "strip_size_kb": 64, 00:21:12.723 "state": "online", 00:21:12.723 "raid_level": "raid0", 00:21:12.723 "superblock": true, 00:21:12.723 "num_base_bdevs": 4, 00:21:12.723 "num_base_bdevs_discovered": 4, 00:21:12.723 "num_base_bdevs_operational": 4, 00:21:12.723 "base_bdevs_list": [ 00:21:12.723 { 00:21:12.723 "name": "BaseBdev1", 00:21:12.723 "uuid": "a27cf128-9eab-5f03-bfd5-e109ab4980fe", 00:21:12.723 "is_configured": true, 00:21:12.723 "data_offset": 2048, 00:21:12.723 "data_size": 63488 00:21:12.723 }, 00:21:12.723 { 00:21:12.723 "name": "BaseBdev2", 00:21:12.723 "uuid": "a308f3af-7556-5d89-ac2e-070c6115a8f6", 00:21:12.723 "is_configured": true, 00:21:12.723 "data_offset": 2048, 00:21:12.723 "data_size": 63488 00:21:12.723 }, 00:21:12.723 { 00:21:12.723 "name": "BaseBdev3", 00:21:12.723 "uuid": "d6624e35-6bc9-59f6-8e45-80097d2ba789", 00:21:12.723 "is_configured": true, 00:21:12.723 "data_offset": 2048, 00:21:12.723 "data_size": 63488 00:21:12.723 }, 00:21:12.723 { 00:21:12.723 "name": "BaseBdev4", 00:21:12.723 "uuid": 
"4dde7121-e8eb-51f9-951b-9a29ee34dcde", 00:21:12.723 "is_configured": true, 00:21:12.723 "data_offset": 2048, 00:21:12.723 "data_size": 63488 00:21:12.723 } 00:21:12.723 ] 00:21:12.723 }' 00:21:12.723 05:49:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:12.723 05:49:27 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:21:13.291 05:49:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:21:13.291 [2024-07-26 05:49:28.157857] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:21:13.291 [2024-07-26 05:49:28.157899] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:21:13.291 [2024-07-26 05:49:28.161115] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:13.291 [2024-07-26 05:49:28.161152] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:13.291 [2024-07-26 05:49:28.161199] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:21:13.291 [2024-07-26 05:49:28.161210] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x22b8c20 name raid_bdev1, state offline 00:21:13.291 0 00:21:13.291 05:49:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 1200924 00:21:13.291 05:49:28 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 1200924 ']' 00:21:13.291 05:49:28 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 1200924 00:21:13.291 05:49:28 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:21:13.291 05:49:28 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:21:13.291 05:49:28 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # 
ps --no-headers -o comm= 1200924 00:21:13.551 05:49:28 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:21:13.551 05:49:28 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:21:13.551 05:49:28 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1200924' 00:21:13.551 killing process with pid 1200924 00:21:13.551 05:49:28 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 1200924 00:21:13.551 [2024-07-26 05:49:28.225552] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:21:13.551 05:49:28 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 1200924 00:21:13.551 [2024-07-26 05:49:28.257399] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:21:13.810 05:49:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.TGp3pS4RUS 00:21:13.810 05:49:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:21:13.810 05:49:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:21:13.810 05:49:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.47 00:21:13.810 05:49:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid0 00:21:13.810 05:49:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:21:13.810 05:49:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:21:13.810 05:49:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.47 != \0\.\0\0 ]] 00:21:13.810 00:21:13.810 real 0m7.190s 00:21:13.810 user 0m11.866s 00:21:13.810 sys 0m1.321s 00:21:13.810 05:49:28 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:21:13.810 05:49:28 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:21:13.810 
************************************ 00:21:13.810 END TEST raid_read_error_test 00:21:13.810 ************************************ 00:21:13.810 05:49:28 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:21:13.810 05:49:28 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test raid0 4 write 00:21:13.810 05:49:28 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:21:13.810 05:49:28 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:21:13.810 05:49:28 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:21:13.810 ************************************ 00:21:13.810 START TEST raid_write_error_test 00:21:13.810 ************************************ 00:21:13.810 05:49:28 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid0 4 write 00:21:13.810 05:49:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid0 00:21:13.810 05:49:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=4 00:21:13.810 05:49:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:21:13.810 05:49:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:21:13.810 05:49:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:21:13.810 05:49:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:21:13.810 05:49:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:21:13.810 05:49:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:21:13.810 05:49:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:21:13.810 05:49:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:21:13.810 05:49:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 
00:21:13.810 05:49:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:21:13.810 05:49:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:21:13.810 05:49:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:21:13.810 05:49:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev4 00:21:13.810 05:49:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:21:13.810 05:49:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:21:13.810 05:49:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:21:13.810 05:49:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:21:13.810 05:49:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:21:13.810 05:49:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:21:13.810 05:49:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:21:13.810 05:49:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:21:13.810 05:49:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:21:13.810 05:49:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid0 '!=' raid1 ']' 00:21:13.810 05:49:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:21:13.810 05:49:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:21:13.810 05:49:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:21:13.810 05:49:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.eFhnjXWgsf 00:21:13.810 05:49:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # 
raid_pid=1201905 00:21:13.810 05:49:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 1201905 /var/tmp/spdk-raid.sock 00:21:13.810 05:49:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:21:13.810 05:49:28 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 1201905 ']' 00:21:13.810 05:49:28 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:21:13.810 05:49:28 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:21:13.810 05:49:28 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:21:13.810 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:21:13.810 05:49:28 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:21:13.810 05:49:28 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:21:13.810 [2024-07-26 05:49:28.669806] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 
00:21:13.810 [2024-07-26 05:49:28.669881] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1201905 ] 00:21:14.070 [2024-07-26 05:49:28.804983] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:21:14.070 [2024-07-26 05:49:28.906508] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:21:14.070 [2024-07-26 05:49:28.966719] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:14.070 [2024-07-26 05:49:28.966763] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:14.638 05:49:29 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:21:14.638 05:49:29 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:21:14.638 05:49:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:21:14.638 05:49:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:21:14.926 BaseBdev1_malloc 00:21:14.926 05:49:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:21:15.184 true 00:21:15.185 05:49:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:21:15.443 [2024-07-26 05:49:30.255355] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:21:15.443 [2024-07-26 05:49:30.255401] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base 
bdev opened 00:21:15.443 [2024-07-26 05:49:30.255422] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x17190d0 00:21:15.443 [2024-07-26 05:49:30.255435] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:15.443 [2024-07-26 05:49:30.257290] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:15.443 [2024-07-26 05:49:30.257322] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:21:15.443 BaseBdev1 00:21:15.444 05:49:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:21:15.444 05:49:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:21:15.712 BaseBdev2_malloc 00:21:15.712 05:49:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:21:15.974 true 00:21:15.974 05:49:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:21:16.232 [2024-07-26 05:49:30.981936] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:21:16.232 [2024-07-26 05:49:30.981982] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:16.232 [2024-07-26 05:49:30.982003] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x171d910 00:21:16.232 [2024-07-26 05:49:30.982016] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:16.232 [2024-07-26 05:49:30.983521] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:16.232 [2024-07-26 05:49:30.983551] 
vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:21:16.232 BaseBdev2 00:21:16.232 05:49:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:21:16.232 05:49:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:21:16.491 BaseBdev3_malloc 00:21:16.491 05:49:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:21:16.749 true 00:21:16.749 05:49:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:21:17.008 [2024-07-26 05:49:31.720469] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:21:17.008 [2024-07-26 05:49:31.720515] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:17.008 [2024-07-26 05:49:31.720533] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x171fbd0 00:21:17.008 [2024-07-26 05:49:31.720546] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:17.008 [2024-07-26 05:49:31.721924] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:17.008 [2024-07-26 05:49:31.721956] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:21:17.008 BaseBdev3 00:21:17.008 05:49:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:21:17.008 05:49:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:21:17.267 BaseBdev4_malloc 00:21:17.267 05:49:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:21:17.267 true 00:21:17.526 05:49:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:21:17.526 [2024-07-26 05:49:32.402819] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:21:17.526 [2024-07-26 05:49:32.402865] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:17.526 [2024-07-26 05:49:32.402885] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1720aa0 00:21:17.526 [2024-07-26 05:49:32.402897] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:17.526 [2024-07-26 05:49:32.404432] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:17.526 [2024-07-26 05:49:32.404461] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:21:17.526 BaseBdev4 00:21:17.526 05:49:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 00:21:17.786 [2024-07-26 05:49:32.647506] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:21:17.786 [2024-07-26 05:49:32.648784] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:21:17.786 [2024-07-26 05:49:32.648854] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:21:17.786 [2024-07-26 05:49:32.648915] 
bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:21:17.786 [2024-07-26 05:49:32.649145] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x171ac20 00:21:17.786 [2024-07-26 05:49:32.649156] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:21:17.786 [2024-07-26 05:49:32.649356] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x156f260 00:21:17.786 [2024-07-26 05:49:32.649506] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x171ac20 00:21:17.786 [2024-07-26 05:49:32.649516] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x171ac20 00:21:17.786 [2024-07-26 05:49:32.649617] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:17.786 05:49:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:21:17.786 05:49:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:17.786 05:49:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:17.786 05:49:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:21:17.786 05:49:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:17.786 05:49:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:17.786 05:49:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:17.786 05:49:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:17.786 05:49:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:17.786 05:49:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:17.786 05:49:32 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:17.786 05:49:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:18.045 05:49:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:18.045 "name": "raid_bdev1", 00:21:18.045 "uuid": "a13e99c2-1b40-4a72-bf70-6c718a52ba3e", 00:21:18.045 "strip_size_kb": 64, 00:21:18.045 "state": "online", 00:21:18.045 "raid_level": "raid0", 00:21:18.045 "superblock": true, 00:21:18.045 "num_base_bdevs": 4, 00:21:18.045 "num_base_bdevs_discovered": 4, 00:21:18.045 "num_base_bdevs_operational": 4, 00:21:18.045 "base_bdevs_list": [ 00:21:18.045 { 00:21:18.045 "name": "BaseBdev1", 00:21:18.045 "uuid": "a692160a-7472-5132-900f-77e57bcd691d", 00:21:18.045 "is_configured": true, 00:21:18.045 "data_offset": 2048, 00:21:18.045 "data_size": 63488 00:21:18.045 }, 00:21:18.045 { 00:21:18.045 "name": "BaseBdev2", 00:21:18.045 "uuid": "cee3d894-c5a2-5ae3-aadc-23d5d36730d5", 00:21:18.045 "is_configured": true, 00:21:18.045 "data_offset": 2048, 00:21:18.045 "data_size": 63488 00:21:18.045 }, 00:21:18.045 { 00:21:18.045 "name": "BaseBdev3", 00:21:18.045 "uuid": "5246e5c1-7b75-541a-a5f6-ee19a7fe0956", 00:21:18.045 "is_configured": true, 00:21:18.045 "data_offset": 2048, 00:21:18.045 "data_size": 63488 00:21:18.045 }, 00:21:18.045 { 00:21:18.045 "name": "BaseBdev4", 00:21:18.045 "uuid": "f67ba5dd-ca2f-5fbf-9efc-e5127c4ad3b5", 00:21:18.045 "is_configured": true, 00:21:18.045 "data_offset": 2048, 00:21:18.045 "data_size": 63488 00:21:18.045 } 00:21:18.045 ] 00:21:18.045 }' 00:21:18.045 05:49:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:18.045 05:49:32 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:21:18.612 05:49:33 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@824 -- # sleep 1 00:21:18.612 05:49:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:21:18.870 [2024-07-26 05:49:33.618340] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x170cfc0 00:21:19.804 05:49:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:21:20.062 05:49:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:21:20.062 05:49:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid0 = \r\a\i\d\1 ]] 00:21:20.062 05:49:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=4 00:21:20.062 05:49:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:21:20.062 05:49:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:20.062 05:49:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:20.062 05:49:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:21:20.062 05:49:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:20.062 05:49:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:20.062 05:49:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:20.062 05:49:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:20.062 05:49:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:20.062 05:49:34 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:20.062 05:49:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:20.062 05:49:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:20.062 05:49:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:20.062 "name": "raid_bdev1", 00:21:20.062 "uuid": "a13e99c2-1b40-4a72-bf70-6c718a52ba3e", 00:21:20.062 "strip_size_kb": 64, 00:21:20.062 "state": "online", 00:21:20.062 "raid_level": "raid0", 00:21:20.062 "superblock": true, 00:21:20.062 "num_base_bdevs": 4, 00:21:20.062 "num_base_bdevs_discovered": 4, 00:21:20.062 "num_base_bdevs_operational": 4, 00:21:20.062 "base_bdevs_list": [ 00:21:20.062 { 00:21:20.062 "name": "BaseBdev1", 00:21:20.062 "uuid": "a692160a-7472-5132-900f-77e57bcd691d", 00:21:20.062 "is_configured": true, 00:21:20.063 "data_offset": 2048, 00:21:20.063 "data_size": 63488 00:21:20.063 }, 00:21:20.063 { 00:21:20.063 "name": "BaseBdev2", 00:21:20.063 "uuid": "cee3d894-c5a2-5ae3-aadc-23d5d36730d5", 00:21:20.063 "is_configured": true, 00:21:20.063 "data_offset": 2048, 00:21:20.063 "data_size": 63488 00:21:20.063 }, 00:21:20.063 { 00:21:20.063 "name": "BaseBdev3", 00:21:20.063 "uuid": "5246e5c1-7b75-541a-a5f6-ee19a7fe0956", 00:21:20.063 "is_configured": true, 00:21:20.063 "data_offset": 2048, 00:21:20.063 "data_size": 63488 00:21:20.063 }, 00:21:20.063 { 00:21:20.063 "name": "BaseBdev4", 00:21:20.063 "uuid": "f67ba5dd-ca2f-5fbf-9efc-e5127c4ad3b5", 00:21:20.063 "is_configured": true, 00:21:20.063 "data_offset": 2048, 00:21:20.063 "data_size": 63488 00:21:20.063 } 00:21:20.063 ] 00:21:20.063 }' 00:21:20.063 05:49:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:20.063 05:49:34 bdev_raid.raid_write_error_test -- 
common/autotest_common.sh@10 -- # set +x 00:21:20.999 05:49:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:21:20.999 [2024-07-26 05:49:35.774207] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:21:20.999 [2024-07-26 05:49:35.774245] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:21:20.999 [2024-07-26 05:49:35.777419] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:21.000 [2024-07-26 05:49:35.777459] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:21.000 [2024-07-26 05:49:35.777501] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:21:21.000 [2024-07-26 05:49:35.777512] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x171ac20 name raid_bdev1, state offline 00:21:21.000 0 00:21:21.000 05:49:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 1201905 00:21:21.000 05:49:35 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 1201905 ']' 00:21:21.000 05:49:35 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 1201905 00:21:21.000 05:49:35 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:21:21.000 05:49:35 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:21:21.000 05:49:35 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1201905 00:21:21.000 05:49:35 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:21:21.000 05:49:35 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:21:21.000 05:49:35 bdev_raid.raid_write_error_test -- 
common/autotest_common.sh@966 -- # echo 'killing process with pid 1201905' 00:21:21.000 killing process with pid 1201905 00:21:21.000 05:49:35 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 1201905 00:21:21.000 [2024-07-26 05:49:35.847012] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:21:21.000 05:49:35 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 1201905 00:21:21.000 [2024-07-26 05:49:35.879442] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:21:21.258 05:49:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.eFhnjXWgsf 00:21:21.258 05:49:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:21:21.258 05:49:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:21:21.258 05:49:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.47 00:21:21.258 05:49:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid0 00:21:21.259 05:49:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:21:21.259 05:49:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:21:21.259 05:49:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.47 != \0\.\0\0 ]] 00:21:21.259 00:21:21.259 real 0m7.535s 00:21:21.259 user 0m12.051s 00:21:21.259 sys 0m1.319s 00:21:21.259 05:49:36 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:21:21.259 05:49:36 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:21:21.259 ************************************ 00:21:21.259 END TEST raid_write_error_test 00:21:21.259 ************************************ 00:21:21.518 05:49:36 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:21:21.518 05:49:36 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:21:21.518 
05:49:36 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test concat 4 false 00:21:21.518 05:49:36 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:21:21.518 05:49:36 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:21:21.518 05:49:36 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:21:21.518 ************************************ 00:21:21.518 START TEST raid_state_function_test 00:21:21.518 ************************************ 00:21:21.518 05:49:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test concat 4 false 00:21:21.518 05:49:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:21:21.518 05:49:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:21:21.518 05:49:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:21:21.518 05:49:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:21:21.518 05:49:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:21:21.518 05:49:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:21:21.518 05:49:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:21:21.518 05:49:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:21:21.518 05:49:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:21:21.518 05:49:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:21:21.518 05:49:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:21:21.518 05:49:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:21:21.518 05:49:36 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:21:21.518 05:49:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:21:21.518 05:49:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:21:21.518 05:49:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:21:21.518 05:49:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:21:21.518 05:49:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:21:21.518 05:49:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:21:21.518 05:49:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:21:21.518 05:49:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:21:21.518 05:49:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:21:21.518 05:49:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:21:21.518 05:49:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:21:21.518 05:49:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:21:21.518 05:49:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:21:21.518 05:49:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:21:21.518 05:49:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:21:21.518 05:49:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:21:21.518 05:49:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=1203059 00:21:21.518 05:49:36 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1203059' 00:21:21.518 Process raid pid: 1203059 00:21:21.518 05:49:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:21:21.518 05:49:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 1203059 /var/tmp/spdk-raid.sock 00:21:21.518 05:49:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 1203059 ']' 00:21:21.518 05:49:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:21:21.518 05:49:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:21:21.518 05:49:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:21:21.518 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:21:21.518 05:49:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:21:21.518 05:49:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:21:21.518 [2024-07-26 05:49:36.288258] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 
00:21:21.518 [2024-07-26 05:49:36.288329] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:21:21.518 [2024-07-26 05:49:36.418283] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:21:21.777 [2024-07-26 05:49:36.515576] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:21:21.777 [2024-07-26 05:49:36.584270] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:21.777 [2024-07-26 05:49:36.584306] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:22.344 05:49:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:21:22.344 05:49:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:21:22.344 05:49:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:21:22.602 [2024-07-26 05:49:37.446623] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:21:22.602 [2024-07-26 05:49:37.446671] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:21:22.602 [2024-07-26 05:49:37.446682] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:21:22.602 [2024-07-26 05:49:37.446694] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:21:22.602 [2024-07-26 05:49:37.446703] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:21:22.602 [2024-07-26 05:49:37.446714] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 
00:21:22.602 [2024-07-26 05:49:37.446722] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:21:22.602 [2024-07-26 05:49:37.446733] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:21:22.602 05:49:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:21:22.602 05:49:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:22.602 05:49:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:22.602 05:49:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:21:22.602 05:49:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:22.602 05:49:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:22.602 05:49:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:22.602 05:49:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:22.602 05:49:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:22.602 05:49:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:22.602 05:49:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:22.602 05:49:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:22.861 05:49:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:22.861 "name": "Existed_Raid", 00:21:22.861 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:22.861 "strip_size_kb": 64, 
00:21:22.861 "state": "configuring", 00:21:22.861 "raid_level": "concat", 00:21:22.861 "superblock": false, 00:21:22.861 "num_base_bdevs": 4, 00:21:22.861 "num_base_bdevs_discovered": 0, 00:21:22.861 "num_base_bdevs_operational": 4, 00:21:22.861 "base_bdevs_list": [ 00:21:22.861 { 00:21:22.862 "name": "BaseBdev1", 00:21:22.862 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:22.862 "is_configured": false, 00:21:22.862 "data_offset": 0, 00:21:22.862 "data_size": 0 00:21:22.862 }, 00:21:22.862 { 00:21:22.862 "name": "BaseBdev2", 00:21:22.862 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:22.862 "is_configured": false, 00:21:22.862 "data_offset": 0, 00:21:22.862 "data_size": 0 00:21:22.862 }, 00:21:22.862 { 00:21:22.862 "name": "BaseBdev3", 00:21:22.862 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:22.862 "is_configured": false, 00:21:22.862 "data_offset": 0, 00:21:22.862 "data_size": 0 00:21:22.862 }, 00:21:22.862 { 00:21:22.862 "name": "BaseBdev4", 00:21:22.862 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:22.862 "is_configured": false, 00:21:22.862 "data_offset": 0, 00:21:22.862 "data_size": 0 00:21:22.862 } 00:21:22.862 ] 00:21:22.862 }' 00:21:22.862 05:49:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:22.862 05:49:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:21:23.429 05:49:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:21:23.687 [2024-07-26 05:49:38.553398] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:21:23.687 [2024-07-26 05:49:38.553430] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x12d6aa0 name Existed_Raid, state configuring 00:21:23.687 05:49:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:21:23.946 [2024-07-26 05:49:38.802084] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:21:23.946 [2024-07-26 05:49:38.802113] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:21:23.946 [2024-07-26 05:49:38.802122] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:21:23.946 [2024-07-26 05:49:38.802133] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:21:23.946 [2024-07-26 05:49:38.802142] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:21:23.946 [2024-07-26 05:49:38.802153] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:21:23.946 [2024-07-26 05:49:38.802161] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:21:23.946 [2024-07-26 05:49:38.802172] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:21:23.946 05:49:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:21:24.205 [2024-07-26 05:49:39.056587] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:21:24.205 BaseBdev1 00:21:24.205 05:49:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:21:24.205 05:49:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:21:24.205 05:49:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:21:24.205 05:49:39 bdev_raid.raid_state_function_test 
-- common/autotest_common.sh@899 -- # local i 00:21:24.205 05:49:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:21:24.205 05:49:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:21:24.205 05:49:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:24.464 05:49:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:21:24.723 [ 00:21:24.723 { 00:21:24.723 "name": "BaseBdev1", 00:21:24.723 "aliases": [ 00:21:24.723 "70337620-fc39-4fff-94ef-98837fcf11ba" 00:21:24.723 ], 00:21:24.723 "product_name": "Malloc disk", 00:21:24.723 "block_size": 512, 00:21:24.723 "num_blocks": 65536, 00:21:24.723 "uuid": "70337620-fc39-4fff-94ef-98837fcf11ba", 00:21:24.723 "assigned_rate_limits": { 00:21:24.723 "rw_ios_per_sec": 0, 00:21:24.723 "rw_mbytes_per_sec": 0, 00:21:24.723 "r_mbytes_per_sec": 0, 00:21:24.723 "w_mbytes_per_sec": 0 00:21:24.723 }, 00:21:24.723 "claimed": true, 00:21:24.723 "claim_type": "exclusive_write", 00:21:24.723 "zoned": false, 00:21:24.723 "supported_io_types": { 00:21:24.723 "read": true, 00:21:24.723 "write": true, 00:21:24.723 "unmap": true, 00:21:24.723 "flush": true, 00:21:24.723 "reset": true, 00:21:24.723 "nvme_admin": false, 00:21:24.723 "nvme_io": false, 00:21:24.723 "nvme_io_md": false, 00:21:24.723 "write_zeroes": true, 00:21:24.723 "zcopy": true, 00:21:24.723 "get_zone_info": false, 00:21:24.723 "zone_management": false, 00:21:24.723 "zone_append": false, 00:21:24.723 "compare": false, 00:21:24.723 "compare_and_write": false, 00:21:24.723 "abort": true, 00:21:24.723 "seek_hole": false, 00:21:24.723 "seek_data": false, 00:21:24.723 "copy": true, 00:21:24.723 "nvme_iov_md": 
false 00:21:24.723 }, 00:21:24.723 "memory_domains": [ 00:21:24.723 { 00:21:24.723 "dma_device_id": "system", 00:21:24.723 "dma_device_type": 1 00:21:24.723 }, 00:21:24.723 { 00:21:24.723 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:24.723 "dma_device_type": 2 00:21:24.723 } 00:21:24.723 ], 00:21:24.723 "driver_specific": {} 00:21:24.723 } 00:21:24.723 ] 00:21:24.723 05:49:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:21:24.723 05:49:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:21:24.723 05:49:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:24.723 05:49:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:24.723 05:49:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:21:24.723 05:49:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:24.723 05:49:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:24.723 05:49:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:24.723 05:49:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:24.723 05:49:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:24.723 05:49:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:24.723 05:49:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:24.723 05:49:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:24.982 05:49:39 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:24.982 "name": "Existed_Raid", 00:21:24.982 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:24.982 "strip_size_kb": 64, 00:21:24.982 "state": "configuring", 00:21:24.982 "raid_level": "concat", 00:21:24.982 "superblock": false, 00:21:24.982 "num_base_bdevs": 4, 00:21:24.982 "num_base_bdevs_discovered": 1, 00:21:24.982 "num_base_bdevs_operational": 4, 00:21:24.982 "base_bdevs_list": [ 00:21:24.982 { 00:21:24.982 "name": "BaseBdev1", 00:21:24.982 "uuid": "70337620-fc39-4fff-94ef-98837fcf11ba", 00:21:24.982 "is_configured": true, 00:21:24.982 "data_offset": 0, 00:21:24.982 "data_size": 65536 00:21:24.982 }, 00:21:24.982 { 00:21:24.982 "name": "BaseBdev2", 00:21:24.982 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:24.982 "is_configured": false, 00:21:24.982 "data_offset": 0, 00:21:24.982 "data_size": 0 00:21:24.982 }, 00:21:24.982 { 00:21:24.982 "name": "BaseBdev3", 00:21:24.982 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:24.982 "is_configured": false, 00:21:24.982 "data_offset": 0, 00:21:24.982 "data_size": 0 00:21:24.982 }, 00:21:24.982 { 00:21:24.982 "name": "BaseBdev4", 00:21:24.982 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:24.982 "is_configured": false, 00:21:24.982 "data_offset": 0, 00:21:24.982 "data_size": 0 00:21:24.982 } 00:21:24.982 ] 00:21:24.982 }' 00:21:24.982 05:49:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:24.982 05:49:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:21:25.548 05:49:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:21:25.808 [2024-07-26 05:49:40.640893] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:21:25.808 [2024-07-26 05:49:40.640932] bdev_raid.c: 
378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x12d6310 name Existed_Raid, state configuring 00:21:25.808 05:49:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:21:26.066 [2024-07-26 05:49:40.813386] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:21:26.066 [2024-07-26 05:49:40.814840] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:21:26.066 [2024-07-26 05:49:40.814875] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:21:26.066 [2024-07-26 05:49:40.814885] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:21:26.066 [2024-07-26 05:49:40.814897] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:21:26.066 [2024-07-26 05:49:40.814906] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:21:26.066 [2024-07-26 05:49:40.814916] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:21:26.066 05:49:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:21:26.066 05:49:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:21:26.066 05:49:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:21:26.066 05:49:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:26.066 05:49:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:26.066 05:49:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 
00:21:26.066 05:49:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:26.066 05:49:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:26.066 05:49:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:26.066 05:49:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:26.066 05:49:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:26.066 05:49:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:26.066 05:49:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:26.066 05:49:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:26.325 05:49:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:26.325 "name": "Existed_Raid", 00:21:26.325 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:26.325 "strip_size_kb": 64, 00:21:26.325 "state": "configuring", 00:21:26.325 "raid_level": "concat", 00:21:26.325 "superblock": false, 00:21:26.325 "num_base_bdevs": 4, 00:21:26.325 "num_base_bdevs_discovered": 1, 00:21:26.325 "num_base_bdevs_operational": 4, 00:21:26.325 "base_bdevs_list": [ 00:21:26.325 { 00:21:26.325 "name": "BaseBdev1", 00:21:26.325 "uuid": "70337620-fc39-4fff-94ef-98837fcf11ba", 00:21:26.325 "is_configured": true, 00:21:26.325 "data_offset": 0, 00:21:26.325 "data_size": 65536 00:21:26.325 }, 00:21:26.325 { 00:21:26.325 "name": "BaseBdev2", 00:21:26.325 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:26.325 "is_configured": false, 00:21:26.325 "data_offset": 0, 00:21:26.325 "data_size": 0 00:21:26.325 }, 00:21:26.325 { 00:21:26.325 "name": "BaseBdev3", 
00:21:26.325 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:26.325 "is_configured": false, 00:21:26.325 "data_offset": 0, 00:21:26.325 "data_size": 0 00:21:26.325 }, 00:21:26.325 { 00:21:26.325 "name": "BaseBdev4", 00:21:26.325 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:26.325 "is_configured": false, 00:21:26.325 "data_offset": 0, 00:21:26.325 "data_size": 0 00:21:26.325 } 00:21:26.325 ] 00:21:26.325 }' 00:21:26.325 05:49:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:26.325 05:49:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:21:26.891 05:49:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:21:27.149 [2024-07-26 05:49:41.867619] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:21:27.149 BaseBdev2 00:21:27.149 05:49:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:21:27.149 05:49:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:21:27.149 05:49:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:21:27.149 05:49:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:21:27.149 05:49:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:21:27.149 05:49:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:21:27.149 05:49:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:27.408 05:49:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:21:27.408 [ 00:21:27.408 { 00:21:27.408 "name": "BaseBdev2", 00:21:27.408 "aliases": [ 00:21:27.408 "1b1b267c-6838-4d6d-b21c-52b7b61197e7" 00:21:27.408 ], 00:21:27.408 "product_name": "Malloc disk", 00:21:27.408 "block_size": 512, 00:21:27.408 "num_blocks": 65536, 00:21:27.408 "uuid": "1b1b267c-6838-4d6d-b21c-52b7b61197e7", 00:21:27.408 "assigned_rate_limits": { 00:21:27.408 "rw_ios_per_sec": 0, 00:21:27.408 "rw_mbytes_per_sec": 0, 00:21:27.408 "r_mbytes_per_sec": 0, 00:21:27.408 "w_mbytes_per_sec": 0 00:21:27.408 }, 00:21:27.408 "claimed": true, 00:21:27.408 "claim_type": "exclusive_write", 00:21:27.408 "zoned": false, 00:21:27.408 "supported_io_types": { 00:21:27.408 "read": true, 00:21:27.408 "write": true, 00:21:27.408 "unmap": true, 00:21:27.408 "flush": true, 00:21:27.408 "reset": true, 00:21:27.408 "nvme_admin": false, 00:21:27.408 "nvme_io": false, 00:21:27.408 "nvme_io_md": false, 00:21:27.408 "write_zeroes": true, 00:21:27.408 "zcopy": true, 00:21:27.408 "get_zone_info": false, 00:21:27.408 "zone_management": false, 00:21:27.408 "zone_append": false, 00:21:27.408 "compare": false, 00:21:27.408 "compare_and_write": false, 00:21:27.408 "abort": true, 00:21:27.408 "seek_hole": false, 00:21:27.408 "seek_data": false, 00:21:27.408 "copy": true, 00:21:27.408 "nvme_iov_md": false 00:21:27.408 }, 00:21:27.408 "memory_domains": [ 00:21:27.408 { 00:21:27.408 "dma_device_id": "system", 00:21:27.408 "dma_device_type": 1 00:21:27.408 }, 00:21:27.408 { 00:21:27.408 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:27.408 "dma_device_type": 2 00:21:27.408 } 00:21:27.408 ], 00:21:27.408 "driver_specific": {} 00:21:27.408 } 00:21:27.408 ] 00:21:27.408 05:49:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:21:27.408 05:49:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 
00:21:27.408 05:49:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:21:27.408 05:49:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:21:27.408 05:49:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:27.408 05:49:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:27.409 05:49:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:21:27.409 05:49:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:27.409 05:49:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:27.409 05:49:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:27.409 05:49:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:27.409 05:49:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:27.409 05:49:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:27.409 05:49:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:27.409 05:49:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:27.667 05:49:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:27.667 "name": "Existed_Raid", 00:21:27.667 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:27.667 "strip_size_kb": 64, 00:21:27.667 "state": "configuring", 00:21:27.667 "raid_level": "concat", 00:21:27.667 "superblock": false, 00:21:27.667 "num_base_bdevs": 4, 00:21:27.667 
"num_base_bdevs_discovered": 2, 00:21:27.667 "num_base_bdevs_operational": 4, 00:21:27.667 "base_bdevs_list": [ 00:21:27.667 { 00:21:27.667 "name": "BaseBdev1", 00:21:27.667 "uuid": "70337620-fc39-4fff-94ef-98837fcf11ba", 00:21:27.667 "is_configured": true, 00:21:27.667 "data_offset": 0, 00:21:27.667 "data_size": 65536 00:21:27.667 }, 00:21:27.667 { 00:21:27.667 "name": "BaseBdev2", 00:21:27.667 "uuid": "1b1b267c-6838-4d6d-b21c-52b7b61197e7", 00:21:27.667 "is_configured": true, 00:21:27.667 "data_offset": 0, 00:21:27.667 "data_size": 65536 00:21:27.667 }, 00:21:27.667 { 00:21:27.667 "name": "BaseBdev3", 00:21:27.667 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:27.667 "is_configured": false, 00:21:27.667 "data_offset": 0, 00:21:27.667 "data_size": 0 00:21:27.667 }, 00:21:27.667 { 00:21:27.667 "name": "BaseBdev4", 00:21:27.667 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:27.667 "is_configured": false, 00:21:27.667 "data_offset": 0, 00:21:27.667 "data_size": 0 00:21:27.667 } 00:21:27.667 ] 00:21:27.667 }' 00:21:27.667 05:49:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:27.667 05:49:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:21:28.234 05:49:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:21:28.493 [2024-07-26 05:49:43.242733] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:21:28.493 BaseBdev3 00:21:28.493 05:49:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:21:28.493 05:49:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:21:28.493 05:49:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:21:28.493 05:49:43 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:21:28.493 05:49:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:21:28.493 05:49:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:21:28.493 05:49:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:28.751 05:49:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:21:29.012 [ 00:21:29.012 { 00:21:29.012 "name": "BaseBdev3", 00:21:29.012 "aliases": [ 00:21:29.012 "ba5bf631-46dc-4ca1-840e-d9f8192e6f50" 00:21:29.012 ], 00:21:29.012 "product_name": "Malloc disk", 00:21:29.012 "block_size": 512, 00:21:29.012 "num_blocks": 65536, 00:21:29.012 "uuid": "ba5bf631-46dc-4ca1-840e-d9f8192e6f50", 00:21:29.012 "assigned_rate_limits": { 00:21:29.012 "rw_ios_per_sec": 0, 00:21:29.012 "rw_mbytes_per_sec": 0, 00:21:29.012 "r_mbytes_per_sec": 0, 00:21:29.012 "w_mbytes_per_sec": 0 00:21:29.012 }, 00:21:29.012 "claimed": true, 00:21:29.012 "claim_type": "exclusive_write", 00:21:29.012 "zoned": false, 00:21:29.012 "supported_io_types": { 00:21:29.012 "read": true, 00:21:29.012 "write": true, 00:21:29.012 "unmap": true, 00:21:29.012 "flush": true, 00:21:29.012 "reset": true, 00:21:29.012 "nvme_admin": false, 00:21:29.012 "nvme_io": false, 00:21:29.012 "nvme_io_md": false, 00:21:29.012 "write_zeroes": true, 00:21:29.012 "zcopy": true, 00:21:29.012 "get_zone_info": false, 00:21:29.012 "zone_management": false, 00:21:29.012 "zone_append": false, 00:21:29.012 "compare": false, 00:21:29.012 "compare_and_write": false, 00:21:29.012 "abort": true, 00:21:29.012 "seek_hole": false, 00:21:29.012 "seek_data": false, 00:21:29.012 "copy": 
true, 00:21:29.012 "nvme_iov_md": false 00:21:29.012 }, 00:21:29.012 "memory_domains": [ 00:21:29.012 { 00:21:29.012 "dma_device_id": "system", 00:21:29.012 "dma_device_type": 1 00:21:29.012 }, 00:21:29.012 { 00:21:29.012 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:29.012 "dma_device_type": 2 00:21:29.012 } 00:21:29.012 ], 00:21:29.012 "driver_specific": {} 00:21:29.012 } 00:21:29.012 ] 00:21:29.012 05:49:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:21:29.012 05:49:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:21:29.012 05:49:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:21:29.012 05:49:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:21:29.012 05:49:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:29.012 05:49:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:29.012 05:49:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:21:29.012 05:49:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:29.012 05:49:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:29.012 05:49:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:29.012 05:49:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:29.012 05:49:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:29.012 05:49:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:29.012 05:49:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:29.012 05:49:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:29.311 05:49:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:29.311 "name": "Existed_Raid", 00:21:29.311 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:29.311 "strip_size_kb": 64, 00:21:29.311 "state": "configuring", 00:21:29.311 "raid_level": "concat", 00:21:29.311 "superblock": false, 00:21:29.311 "num_base_bdevs": 4, 00:21:29.311 "num_base_bdevs_discovered": 3, 00:21:29.311 "num_base_bdevs_operational": 4, 00:21:29.311 "base_bdevs_list": [ 00:21:29.311 { 00:21:29.311 "name": "BaseBdev1", 00:21:29.311 "uuid": "70337620-fc39-4fff-94ef-98837fcf11ba", 00:21:29.311 "is_configured": true, 00:21:29.311 "data_offset": 0, 00:21:29.311 "data_size": 65536 00:21:29.311 }, 00:21:29.311 { 00:21:29.311 "name": "BaseBdev2", 00:21:29.311 "uuid": "1b1b267c-6838-4d6d-b21c-52b7b61197e7", 00:21:29.311 "is_configured": true, 00:21:29.311 "data_offset": 0, 00:21:29.311 "data_size": 65536 00:21:29.311 }, 00:21:29.311 { 00:21:29.311 "name": "BaseBdev3", 00:21:29.311 "uuid": "ba5bf631-46dc-4ca1-840e-d9f8192e6f50", 00:21:29.311 "is_configured": true, 00:21:29.311 "data_offset": 0, 00:21:29.311 "data_size": 65536 00:21:29.311 }, 00:21:29.311 { 00:21:29.311 "name": "BaseBdev4", 00:21:29.311 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:29.311 "is_configured": false, 00:21:29.311 "data_offset": 0, 00:21:29.311 "data_size": 0 00:21:29.311 } 00:21:29.311 ] 00:21:29.311 }' 00:21:29.311 05:49:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:29.311 05:49:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:21:29.570 05:49:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:21:29.829 [2024-07-26 05:49:44.629866] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:21:29.829 [2024-07-26 05:49:44.629904] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x12d7350 00:21:29.829 [2024-07-26 05:49:44.629912] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 262144, blocklen 512 00:21:29.829 [2024-07-26 05:49:44.630166] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x12d7020 00:21:29.829 [2024-07-26 05:49:44.630289] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x12d7350 00:21:29.829 [2024-07-26 05:49:44.630299] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x12d7350 00:21:29.829 [2024-07-26 05:49:44.630464] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:29.829 BaseBdev4 00:21:29.829 05:49:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:21:29.829 05:49:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:21:29.829 05:49:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:21:29.829 05:49:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:21:29.829 05:49:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:21:29.829 05:49:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:21:29.829 05:49:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:30.088 05:49:44 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:21:30.088 [ 00:21:30.088 { 00:21:30.088 "name": "BaseBdev4", 00:21:30.088 "aliases": [ 00:21:30.088 "b47de32c-b4a2-478b-ac06-c3856cfc4eca" 00:21:30.088 ], 00:21:30.088 "product_name": "Malloc disk", 00:21:30.088 "block_size": 512, 00:21:30.088 "num_blocks": 65536, 00:21:30.088 "uuid": "b47de32c-b4a2-478b-ac06-c3856cfc4eca", 00:21:30.088 "assigned_rate_limits": { 00:21:30.088 "rw_ios_per_sec": 0, 00:21:30.088 "rw_mbytes_per_sec": 0, 00:21:30.088 "r_mbytes_per_sec": 0, 00:21:30.088 "w_mbytes_per_sec": 0 00:21:30.088 }, 00:21:30.088 "claimed": true, 00:21:30.088 "claim_type": "exclusive_write", 00:21:30.088 "zoned": false, 00:21:30.088 "supported_io_types": { 00:21:30.088 "read": true, 00:21:30.088 "write": true, 00:21:30.088 "unmap": true, 00:21:30.088 "flush": true, 00:21:30.088 "reset": true, 00:21:30.088 "nvme_admin": false, 00:21:30.088 "nvme_io": false, 00:21:30.088 "nvme_io_md": false, 00:21:30.088 "write_zeroes": true, 00:21:30.088 "zcopy": true, 00:21:30.088 "get_zone_info": false, 00:21:30.088 "zone_management": false, 00:21:30.088 "zone_append": false, 00:21:30.088 "compare": false, 00:21:30.088 "compare_and_write": false, 00:21:30.088 "abort": true, 00:21:30.088 "seek_hole": false, 00:21:30.088 "seek_data": false, 00:21:30.088 "copy": true, 00:21:30.088 "nvme_iov_md": false 00:21:30.088 }, 00:21:30.088 "memory_domains": [ 00:21:30.088 { 00:21:30.088 "dma_device_id": "system", 00:21:30.088 "dma_device_type": 1 00:21:30.088 }, 00:21:30.088 { 00:21:30.088 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:30.088 "dma_device_type": 2 00:21:30.088 } 00:21:30.088 ], 00:21:30.088 "driver_specific": {} 00:21:30.088 } 00:21:30.088 ] 00:21:30.088 05:49:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:21:30.088 05:49:44 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@265 -- # (( i++ )) 00:21:30.088 05:49:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:21:30.088 05:49:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 4 00:21:30.088 05:49:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:30.088 05:49:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:30.088 05:49:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:21:30.088 05:49:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:30.088 05:49:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:30.088 05:49:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:30.088 05:49:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:30.088 05:49:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:30.088 05:49:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:30.088 05:49:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:30.088 05:49:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:30.362 05:49:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:30.362 "name": "Existed_Raid", 00:21:30.362 "uuid": "bdf8063b-0e87-4cff-abc0-35c6b4774b1d", 00:21:30.362 "strip_size_kb": 64, 00:21:30.362 "state": "online", 00:21:30.362 "raid_level": "concat", 00:21:30.362 "superblock": false, 00:21:30.362 
"num_base_bdevs": 4, 00:21:30.362 "num_base_bdevs_discovered": 4, 00:21:30.362 "num_base_bdevs_operational": 4, 00:21:30.362 "base_bdevs_list": [ 00:21:30.362 { 00:21:30.362 "name": "BaseBdev1", 00:21:30.362 "uuid": "70337620-fc39-4fff-94ef-98837fcf11ba", 00:21:30.362 "is_configured": true, 00:21:30.362 "data_offset": 0, 00:21:30.362 "data_size": 65536 00:21:30.362 }, 00:21:30.362 { 00:21:30.362 "name": "BaseBdev2", 00:21:30.362 "uuid": "1b1b267c-6838-4d6d-b21c-52b7b61197e7", 00:21:30.362 "is_configured": true, 00:21:30.362 "data_offset": 0, 00:21:30.362 "data_size": 65536 00:21:30.362 }, 00:21:30.362 { 00:21:30.362 "name": "BaseBdev3", 00:21:30.362 "uuid": "ba5bf631-46dc-4ca1-840e-d9f8192e6f50", 00:21:30.362 "is_configured": true, 00:21:30.362 "data_offset": 0, 00:21:30.362 "data_size": 65536 00:21:30.362 }, 00:21:30.362 { 00:21:30.362 "name": "BaseBdev4", 00:21:30.362 "uuid": "b47de32c-b4a2-478b-ac06-c3856cfc4eca", 00:21:30.362 "is_configured": true, 00:21:30.362 "data_offset": 0, 00:21:30.362 "data_size": 65536 00:21:30.362 } 00:21:30.362 ] 00:21:30.362 }' 00:21:30.362 05:49:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:30.362 05:49:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:21:30.930 05:49:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:21:30.930 05:49:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:21:30.930 05:49:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:21:30.930 05:49:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:21:30.930 05:49:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:21:30.930 05:49:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:21:30.930 05:49:45 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:21:30.930 05:49:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:21:31.190 [2024-07-26 05:49:46.045942] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:21:31.190 05:49:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:21:31.190 "name": "Existed_Raid", 00:21:31.190 "aliases": [ 00:21:31.190 "bdf8063b-0e87-4cff-abc0-35c6b4774b1d" 00:21:31.190 ], 00:21:31.190 "product_name": "Raid Volume", 00:21:31.190 "block_size": 512, 00:21:31.190 "num_blocks": 262144, 00:21:31.190 "uuid": "bdf8063b-0e87-4cff-abc0-35c6b4774b1d", 00:21:31.190 "assigned_rate_limits": { 00:21:31.190 "rw_ios_per_sec": 0, 00:21:31.190 "rw_mbytes_per_sec": 0, 00:21:31.190 "r_mbytes_per_sec": 0, 00:21:31.190 "w_mbytes_per_sec": 0 00:21:31.190 }, 00:21:31.190 "claimed": false, 00:21:31.190 "zoned": false, 00:21:31.190 "supported_io_types": { 00:21:31.190 "read": true, 00:21:31.190 "write": true, 00:21:31.190 "unmap": true, 00:21:31.190 "flush": true, 00:21:31.190 "reset": true, 00:21:31.190 "nvme_admin": false, 00:21:31.190 "nvme_io": false, 00:21:31.190 "nvme_io_md": false, 00:21:31.190 "write_zeroes": true, 00:21:31.190 "zcopy": false, 00:21:31.190 "get_zone_info": false, 00:21:31.190 "zone_management": false, 00:21:31.190 "zone_append": false, 00:21:31.190 "compare": false, 00:21:31.190 "compare_and_write": false, 00:21:31.190 "abort": false, 00:21:31.190 "seek_hole": false, 00:21:31.190 "seek_data": false, 00:21:31.190 "copy": false, 00:21:31.190 "nvme_iov_md": false 00:21:31.190 }, 00:21:31.190 "memory_domains": [ 00:21:31.190 { 00:21:31.190 "dma_device_id": "system", 00:21:31.190 "dma_device_type": 1 00:21:31.190 }, 00:21:31.190 { 00:21:31.190 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:31.190 
"dma_device_type": 2 00:21:31.190 }, 00:21:31.190 { 00:21:31.190 "dma_device_id": "system", 00:21:31.190 "dma_device_type": 1 00:21:31.190 }, 00:21:31.190 { 00:21:31.190 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:31.190 "dma_device_type": 2 00:21:31.190 }, 00:21:31.190 { 00:21:31.190 "dma_device_id": "system", 00:21:31.190 "dma_device_type": 1 00:21:31.190 }, 00:21:31.190 { 00:21:31.190 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:31.190 "dma_device_type": 2 00:21:31.190 }, 00:21:31.190 { 00:21:31.190 "dma_device_id": "system", 00:21:31.190 "dma_device_type": 1 00:21:31.190 }, 00:21:31.190 { 00:21:31.190 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:31.190 "dma_device_type": 2 00:21:31.190 } 00:21:31.190 ], 00:21:31.190 "driver_specific": { 00:21:31.190 "raid": { 00:21:31.190 "uuid": "bdf8063b-0e87-4cff-abc0-35c6b4774b1d", 00:21:31.190 "strip_size_kb": 64, 00:21:31.190 "state": "online", 00:21:31.190 "raid_level": "concat", 00:21:31.190 "superblock": false, 00:21:31.190 "num_base_bdevs": 4, 00:21:31.190 "num_base_bdevs_discovered": 4, 00:21:31.190 "num_base_bdevs_operational": 4, 00:21:31.190 "base_bdevs_list": [ 00:21:31.190 { 00:21:31.190 "name": "BaseBdev1", 00:21:31.190 "uuid": "70337620-fc39-4fff-94ef-98837fcf11ba", 00:21:31.190 "is_configured": true, 00:21:31.190 "data_offset": 0, 00:21:31.190 "data_size": 65536 00:21:31.190 }, 00:21:31.190 { 00:21:31.190 "name": "BaseBdev2", 00:21:31.190 "uuid": "1b1b267c-6838-4d6d-b21c-52b7b61197e7", 00:21:31.190 "is_configured": true, 00:21:31.190 "data_offset": 0, 00:21:31.190 "data_size": 65536 00:21:31.190 }, 00:21:31.190 { 00:21:31.190 "name": "BaseBdev3", 00:21:31.190 "uuid": "ba5bf631-46dc-4ca1-840e-d9f8192e6f50", 00:21:31.190 "is_configured": true, 00:21:31.190 "data_offset": 0, 00:21:31.190 "data_size": 65536 00:21:31.190 }, 00:21:31.190 { 00:21:31.190 "name": "BaseBdev4", 00:21:31.190 "uuid": "b47de32c-b4a2-478b-ac06-c3856cfc4eca", 00:21:31.190 "is_configured": true, 00:21:31.190 "data_offset": 0, 
00:21:31.190 "data_size": 65536 00:21:31.190 } 00:21:31.190 ] 00:21:31.190 } 00:21:31.190 } 00:21:31.190 }' 00:21:31.190 05:49:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:21:31.449 05:49:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:21:31.449 BaseBdev2 00:21:31.449 BaseBdev3 00:21:31.450 BaseBdev4' 00:21:31.450 05:49:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:31.450 05:49:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:31.450 05:49:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:21:31.709 05:49:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:31.709 "name": "BaseBdev1", 00:21:31.709 "aliases": [ 00:21:31.709 "70337620-fc39-4fff-94ef-98837fcf11ba" 00:21:31.709 ], 00:21:31.709 "product_name": "Malloc disk", 00:21:31.709 "block_size": 512, 00:21:31.709 "num_blocks": 65536, 00:21:31.709 "uuid": "70337620-fc39-4fff-94ef-98837fcf11ba", 00:21:31.709 "assigned_rate_limits": { 00:21:31.709 "rw_ios_per_sec": 0, 00:21:31.709 "rw_mbytes_per_sec": 0, 00:21:31.709 "r_mbytes_per_sec": 0, 00:21:31.709 "w_mbytes_per_sec": 0 00:21:31.709 }, 00:21:31.709 "claimed": true, 00:21:31.709 "claim_type": "exclusive_write", 00:21:31.709 "zoned": false, 00:21:31.709 "supported_io_types": { 00:21:31.709 "read": true, 00:21:31.709 "write": true, 00:21:31.709 "unmap": true, 00:21:31.709 "flush": true, 00:21:31.709 "reset": true, 00:21:31.709 "nvme_admin": false, 00:21:31.709 "nvme_io": false, 00:21:31.709 "nvme_io_md": false, 00:21:31.709 "write_zeroes": true, 00:21:31.709 "zcopy": true, 00:21:31.709 "get_zone_info": false, 00:21:31.709 "zone_management": 
false, 00:21:31.709 "zone_append": false, 00:21:31.709 "compare": false, 00:21:31.709 "compare_and_write": false, 00:21:31.709 "abort": true, 00:21:31.709 "seek_hole": false, 00:21:31.709 "seek_data": false, 00:21:31.709 "copy": true, 00:21:31.709 "nvme_iov_md": false 00:21:31.709 }, 00:21:31.709 "memory_domains": [ 00:21:31.709 { 00:21:31.709 "dma_device_id": "system", 00:21:31.709 "dma_device_type": 1 00:21:31.709 }, 00:21:31.709 { 00:21:31.709 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:31.709 "dma_device_type": 2 00:21:31.709 } 00:21:31.709 ], 00:21:31.709 "driver_specific": {} 00:21:31.709 }' 00:21:31.709 05:49:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:31.709 05:49:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:31.709 05:49:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:31.709 05:49:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:31.709 05:49:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:31.709 05:49:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:31.709 05:49:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:31.709 05:49:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:31.709 05:49:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:31.709 05:49:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:31.968 05:49:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:31.968 05:49:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:31.968 05:49:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:31.968 05:49:46 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:21:31.968 05:49:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:31.968 05:49:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:31.968 "name": "BaseBdev2", 00:21:31.968 "aliases": [ 00:21:31.968 "1b1b267c-6838-4d6d-b21c-52b7b61197e7" 00:21:31.968 ], 00:21:31.968 "product_name": "Malloc disk", 00:21:31.968 "block_size": 512, 00:21:31.968 "num_blocks": 65536, 00:21:31.968 "uuid": "1b1b267c-6838-4d6d-b21c-52b7b61197e7", 00:21:31.968 "assigned_rate_limits": { 00:21:31.968 "rw_ios_per_sec": 0, 00:21:31.968 "rw_mbytes_per_sec": 0, 00:21:31.968 "r_mbytes_per_sec": 0, 00:21:31.968 "w_mbytes_per_sec": 0 00:21:31.968 }, 00:21:31.968 "claimed": true, 00:21:31.968 "claim_type": "exclusive_write", 00:21:31.968 "zoned": false, 00:21:31.968 "supported_io_types": { 00:21:31.968 "read": true, 00:21:31.968 "write": true, 00:21:31.968 "unmap": true, 00:21:31.968 "flush": true, 00:21:31.968 "reset": true, 00:21:31.968 "nvme_admin": false, 00:21:31.968 "nvme_io": false, 00:21:31.968 "nvme_io_md": false, 00:21:31.968 "write_zeroes": true, 00:21:31.968 "zcopy": true, 00:21:31.968 "get_zone_info": false, 00:21:31.968 "zone_management": false, 00:21:31.968 "zone_append": false, 00:21:31.968 "compare": false, 00:21:31.968 "compare_and_write": false, 00:21:31.968 "abort": true, 00:21:31.968 "seek_hole": false, 00:21:31.968 "seek_data": false, 00:21:31.968 "copy": true, 00:21:31.968 "nvme_iov_md": false 00:21:31.968 }, 00:21:31.968 "memory_domains": [ 00:21:31.968 { 00:21:31.968 "dma_device_id": "system", 00:21:31.968 "dma_device_type": 1 00:21:31.968 }, 00:21:31.968 { 00:21:31.968 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:31.968 "dma_device_type": 2 00:21:31.968 } 00:21:31.968 ], 00:21:31.968 "driver_specific": {} 00:21:31.968 
}' 00:21:31.968 05:49:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:32.227 05:49:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:32.227 05:49:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:32.227 05:49:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:32.227 05:49:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:32.227 05:49:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:32.227 05:49:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:32.227 05:49:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:32.227 05:49:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:32.227 05:49:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:32.486 05:49:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:32.486 05:49:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:32.486 05:49:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:32.486 05:49:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:21:32.486 05:49:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:32.745 05:49:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:32.745 "name": "BaseBdev3", 00:21:32.745 "aliases": [ 00:21:32.745 "ba5bf631-46dc-4ca1-840e-d9f8192e6f50" 00:21:32.745 ], 00:21:32.745 "product_name": "Malloc disk", 00:21:32.745 "block_size": 512, 00:21:32.745 "num_blocks": 65536, 
00:21:32.745 "uuid": "ba5bf631-46dc-4ca1-840e-d9f8192e6f50", 00:21:32.745 "assigned_rate_limits": { 00:21:32.745 "rw_ios_per_sec": 0, 00:21:32.745 "rw_mbytes_per_sec": 0, 00:21:32.745 "r_mbytes_per_sec": 0, 00:21:32.745 "w_mbytes_per_sec": 0 00:21:32.745 }, 00:21:32.745 "claimed": true, 00:21:32.745 "claim_type": "exclusive_write", 00:21:32.745 "zoned": false, 00:21:32.745 "supported_io_types": { 00:21:32.745 "read": true, 00:21:32.745 "write": true, 00:21:32.745 "unmap": true, 00:21:32.745 "flush": true, 00:21:32.745 "reset": true, 00:21:32.745 "nvme_admin": false, 00:21:32.745 "nvme_io": false, 00:21:32.745 "nvme_io_md": false, 00:21:32.745 "write_zeroes": true, 00:21:32.745 "zcopy": true, 00:21:32.745 "get_zone_info": false, 00:21:32.745 "zone_management": false, 00:21:32.745 "zone_append": false, 00:21:32.745 "compare": false, 00:21:32.745 "compare_and_write": false, 00:21:32.745 "abort": true, 00:21:32.745 "seek_hole": false, 00:21:32.745 "seek_data": false, 00:21:32.745 "copy": true, 00:21:32.745 "nvme_iov_md": false 00:21:32.745 }, 00:21:32.745 "memory_domains": [ 00:21:32.745 { 00:21:32.745 "dma_device_id": "system", 00:21:32.745 "dma_device_type": 1 00:21:32.745 }, 00:21:32.745 { 00:21:32.745 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:32.745 "dma_device_type": 2 00:21:32.745 } 00:21:32.745 ], 00:21:32.745 "driver_specific": {} 00:21:32.745 }' 00:21:32.745 05:49:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:32.745 05:49:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:32.745 05:49:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:32.745 05:49:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:32.745 05:49:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:32.745 05:49:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 
00:21:32.745 05:49:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:32.745 05:49:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:33.004 05:49:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:33.004 05:49:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:33.004 05:49:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:33.004 05:49:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:33.004 05:49:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:33.004 05:49:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:21:33.004 05:49:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:33.264 05:49:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:33.264 "name": "BaseBdev4", 00:21:33.264 "aliases": [ 00:21:33.264 "b47de32c-b4a2-478b-ac06-c3856cfc4eca" 00:21:33.264 ], 00:21:33.264 "product_name": "Malloc disk", 00:21:33.264 "block_size": 512, 00:21:33.264 "num_blocks": 65536, 00:21:33.264 "uuid": "b47de32c-b4a2-478b-ac06-c3856cfc4eca", 00:21:33.264 "assigned_rate_limits": { 00:21:33.264 "rw_ios_per_sec": 0, 00:21:33.264 "rw_mbytes_per_sec": 0, 00:21:33.264 "r_mbytes_per_sec": 0, 00:21:33.264 "w_mbytes_per_sec": 0 00:21:33.264 }, 00:21:33.264 "claimed": true, 00:21:33.264 "claim_type": "exclusive_write", 00:21:33.264 "zoned": false, 00:21:33.264 "supported_io_types": { 00:21:33.264 "read": true, 00:21:33.264 "write": true, 00:21:33.264 "unmap": true, 00:21:33.264 "flush": true, 00:21:33.264 "reset": true, 00:21:33.264 "nvme_admin": false, 00:21:33.264 "nvme_io": false, 00:21:33.264 
"nvme_io_md": false, 00:21:33.264 "write_zeroes": true, 00:21:33.264 "zcopy": true, 00:21:33.264 "get_zone_info": false, 00:21:33.264 "zone_management": false, 00:21:33.264 "zone_append": false, 00:21:33.264 "compare": false, 00:21:33.264 "compare_and_write": false, 00:21:33.264 "abort": true, 00:21:33.264 "seek_hole": false, 00:21:33.264 "seek_data": false, 00:21:33.264 "copy": true, 00:21:33.264 "nvme_iov_md": false 00:21:33.264 }, 00:21:33.264 "memory_domains": [ 00:21:33.264 { 00:21:33.264 "dma_device_id": "system", 00:21:33.264 "dma_device_type": 1 00:21:33.264 }, 00:21:33.264 { 00:21:33.264 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:33.264 "dma_device_type": 2 00:21:33.264 } 00:21:33.264 ], 00:21:33.264 "driver_specific": {} 00:21:33.264 }' 00:21:33.264 05:49:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:33.264 05:49:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:33.264 05:49:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:33.264 05:49:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:33.264 05:49:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:33.264 05:49:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:33.264 05:49:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:33.264 05:49:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:33.523 05:49:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:33.523 05:49:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:33.523 05:49:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:33.523 05:49:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 
00:21:33.523 05:49:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:21:33.783 [2024-07-26 05:49:48.431993] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:21:33.783 [2024-07-26 05:49:48.432021] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:21:33.783 [2024-07-26 05:49:48.432068] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:33.783 05:49:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:21:33.783 05:49:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:21:33.783 05:49:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:21:33.783 05:49:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:21:33.783 05:49:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:21:33.783 05:49:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 3 00:21:33.783 05:49:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:33.783 05:49:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:21:33.783 05:49:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:21:33.783 05:49:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:33.783 05:49:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:21:33.783 05:49:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:33.783 05:49:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # 
local num_base_bdevs 00:21:33.783 05:49:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:33.783 05:49:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:33.783 05:49:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:33.783 05:49:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:33.783 05:49:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:33.783 "name": "Existed_Raid", 00:21:33.783 "uuid": "bdf8063b-0e87-4cff-abc0-35c6b4774b1d", 00:21:33.783 "strip_size_kb": 64, 00:21:33.783 "state": "offline", 00:21:33.783 "raid_level": "concat", 00:21:33.783 "superblock": false, 00:21:33.783 "num_base_bdevs": 4, 00:21:33.783 "num_base_bdevs_discovered": 3, 00:21:33.783 "num_base_bdevs_operational": 3, 00:21:33.783 "base_bdevs_list": [ 00:21:33.783 { 00:21:33.783 "name": null, 00:21:33.783 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:33.783 "is_configured": false, 00:21:33.783 "data_offset": 0, 00:21:33.783 "data_size": 65536 00:21:33.783 }, 00:21:33.783 { 00:21:33.783 "name": "BaseBdev2", 00:21:33.783 "uuid": "1b1b267c-6838-4d6d-b21c-52b7b61197e7", 00:21:33.783 "is_configured": true, 00:21:33.783 "data_offset": 0, 00:21:33.783 "data_size": 65536 00:21:33.783 }, 00:21:33.783 { 00:21:33.783 "name": "BaseBdev3", 00:21:33.783 "uuid": "ba5bf631-46dc-4ca1-840e-d9f8192e6f50", 00:21:33.783 "is_configured": true, 00:21:33.783 "data_offset": 0, 00:21:33.783 "data_size": 65536 00:21:33.783 }, 00:21:33.783 { 00:21:33.783 "name": "BaseBdev4", 00:21:33.783 "uuid": "b47de32c-b4a2-478b-ac06-c3856cfc4eca", 00:21:33.783 "is_configured": true, 00:21:33.783 "data_offset": 0, 00:21:33.783 "data_size": 65536 00:21:33.783 } 00:21:33.783 ] 00:21:33.783 }' 
00:21:33.783 05:49:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:33.783 05:49:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:21:34.351 05:49:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:21:34.351 05:49:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:21:34.351 05:49:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:21:34.351 05:49:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:34.611 05:49:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:21:34.611 05:49:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:21:34.611 05:49:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:21:34.870 [2024-07-26 05:49:49.604125] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:21:34.870 05:49:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:21:34.870 05:49:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:21:34.870 05:49:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:34.870 05:49:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:21:35.129 05:49:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:21:35.129 05:49:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' 
Existed_Raid '!=' Existed_Raid ']' 00:21:35.129 05:49:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:21:35.389 [2024-07-26 05:49:50.133881] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:21:35.389 05:49:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:21:35.389 05:49:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:21:35.389 05:49:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:35.389 05:49:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:21:35.647 05:49:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:21:35.648 05:49:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:21:35.648 05:49:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:21:35.906 [2024-07-26 05:49:50.631758] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:21:35.906 [2024-07-26 05:49:50.631804] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x12d7350 name Existed_Raid, state offline 00:21:35.906 05:49:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:21:35.906 05:49:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:21:35.906 05:49:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:35.906 05:49:50 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:21:36.164 05:49:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:21:36.164 05:49:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:21:36.164 05:49:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:21:36.164 05:49:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:21:36.164 05:49:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:21:36.164 05:49:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:21:36.424 BaseBdev2 00:21:36.424 05:49:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:21:36.424 05:49:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:21:36.424 05:49:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:21:36.424 05:49:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:21:36.424 05:49:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:21:36.424 05:49:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:21:36.424 05:49:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:36.683 05:49:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:21:36.942 [ 00:21:36.942 { 00:21:36.942 "name": 
"BaseBdev2", 00:21:36.942 "aliases": [ 00:21:36.942 "2f09d0b8-17ad-4259-a503-1e1e5597fe1a" 00:21:36.942 ], 00:21:36.942 "product_name": "Malloc disk", 00:21:36.942 "block_size": 512, 00:21:36.942 "num_blocks": 65536, 00:21:36.942 "uuid": "2f09d0b8-17ad-4259-a503-1e1e5597fe1a", 00:21:36.942 "assigned_rate_limits": { 00:21:36.942 "rw_ios_per_sec": 0, 00:21:36.942 "rw_mbytes_per_sec": 0, 00:21:36.942 "r_mbytes_per_sec": 0, 00:21:36.942 "w_mbytes_per_sec": 0 00:21:36.942 }, 00:21:36.942 "claimed": false, 00:21:36.942 "zoned": false, 00:21:36.942 "supported_io_types": { 00:21:36.942 "read": true, 00:21:36.942 "write": true, 00:21:36.942 "unmap": true, 00:21:36.942 "flush": true, 00:21:36.942 "reset": true, 00:21:36.942 "nvme_admin": false, 00:21:36.942 "nvme_io": false, 00:21:36.942 "nvme_io_md": false, 00:21:36.942 "write_zeroes": true, 00:21:36.942 "zcopy": true, 00:21:36.942 "get_zone_info": false, 00:21:36.942 "zone_management": false, 00:21:36.942 "zone_append": false, 00:21:36.942 "compare": false, 00:21:36.942 "compare_and_write": false, 00:21:36.942 "abort": true, 00:21:36.942 "seek_hole": false, 00:21:36.942 "seek_data": false, 00:21:36.942 "copy": true, 00:21:36.942 "nvme_iov_md": false 00:21:36.942 }, 00:21:36.942 "memory_domains": [ 00:21:36.942 { 00:21:36.942 "dma_device_id": "system", 00:21:36.942 "dma_device_type": 1 00:21:36.942 }, 00:21:36.942 { 00:21:36.942 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:36.942 "dma_device_type": 2 00:21:36.942 } 00:21:36.942 ], 00:21:36.942 "driver_specific": {} 00:21:36.942 } 00:21:36.942 ] 00:21:36.942 05:49:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:21:36.942 05:49:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:21:36.942 05:49:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:21:36.942 05:49:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:21:36.942 BaseBdev3 00:21:37.201 05:49:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:21:37.201 05:49:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:21:37.201 05:49:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:21:37.201 05:49:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:21:37.201 05:49:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:21:37.201 05:49:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:21:37.201 05:49:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:37.201 05:49:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:21:37.460 [ 00:21:37.460 { 00:21:37.460 "name": "BaseBdev3", 00:21:37.460 "aliases": [ 00:21:37.460 "4c7d99d4-fce7-42a1-bc7f-411fb0552869" 00:21:37.460 ], 00:21:37.460 "product_name": "Malloc disk", 00:21:37.460 "block_size": 512, 00:21:37.460 "num_blocks": 65536, 00:21:37.460 "uuid": "4c7d99d4-fce7-42a1-bc7f-411fb0552869", 00:21:37.460 "assigned_rate_limits": { 00:21:37.460 "rw_ios_per_sec": 0, 00:21:37.460 "rw_mbytes_per_sec": 0, 00:21:37.460 "r_mbytes_per_sec": 0, 00:21:37.460 "w_mbytes_per_sec": 0 00:21:37.460 }, 00:21:37.460 "claimed": false, 00:21:37.460 "zoned": false, 00:21:37.460 "supported_io_types": { 00:21:37.460 "read": true, 00:21:37.460 "write": true, 00:21:37.460 "unmap": true, 00:21:37.460 "flush": true, 00:21:37.460 
"reset": true, 00:21:37.460 "nvme_admin": false, 00:21:37.460 "nvme_io": false, 00:21:37.460 "nvme_io_md": false, 00:21:37.460 "write_zeroes": true, 00:21:37.460 "zcopy": true, 00:21:37.460 "get_zone_info": false, 00:21:37.460 "zone_management": false, 00:21:37.460 "zone_append": false, 00:21:37.460 "compare": false, 00:21:37.460 "compare_and_write": false, 00:21:37.460 "abort": true, 00:21:37.460 "seek_hole": false, 00:21:37.460 "seek_data": false, 00:21:37.460 "copy": true, 00:21:37.460 "nvme_iov_md": false 00:21:37.460 }, 00:21:37.460 "memory_domains": [ 00:21:37.460 { 00:21:37.460 "dma_device_id": "system", 00:21:37.460 "dma_device_type": 1 00:21:37.460 }, 00:21:37.460 { 00:21:37.460 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:37.460 "dma_device_type": 2 00:21:37.460 } 00:21:37.460 ], 00:21:37.460 "driver_specific": {} 00:21:37.460 } 00:21:37.460 ] 00:21:37.460 05:49:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:21:37.460 05:49:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:21:37.460 05:49:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:21:37.460 05:49:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:21:37.719 BaseBdev4 00:21:37.719 05:49:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4 00:21:37.720 05:49:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:21:37.720 05:49:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:21:37.720 05:49:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:21:37.720 05:49:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:21:37.720 05:49:52 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:21:37.720 05:49:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:37.978 05:49:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:21:38.238 [ 00:21:38.238 { 00:21:38.238 "name": "BaseBdev4", 00:21:38.238 "aliases": [ 00:21:38.238 "61ba4d70-efbc-4063-ad25-8c163c32f61e" 00:21:38.238 ], 00:21:38.238 "product_name": "Malloc disk", 00:21:38.238 "block_size": 512, 00:21:38.238 "num_blocks": 65536, 00:21:38.238 "uuid": "61ba4d70-efbc-4063-ad25-8c163c32f61e", 00:21:38.238 "assigned_rate_limits": { 00:21:38.238 "rw_ios_per_sec": 0, 00:21:38.238 "rw_mbytes_per_sec": 0, 00:21:38.238 "r_mbytes_per_sec": 0, 00:21:38.238 "w_mbytes_per_sec": 0 00:21:38.238 }, 00:21:38.238 "claimed": false, 00:21:38.238 "zoned": false, 00:21:38.238 "supported_io_types": { 00:21:38.238 "read": true, 00:21:38.238 "write": true, 00:21:38.238 "unmap": true, 00:21:38.238 "flush": true, 00:21:38.238 "reset": true, 00:21:38.238 "nvme_admin": false, 00:21:38.238 "nvme_io": false, 00:21:38.238 "nvme_io_md": false, 00:21:38.238 "write_zeroes": true, 00:21:38.238 "zcopy": true, 00:21:38.238 "get_zone_info": false, 00:21:38.238 "zone_management": false, 00:21:38.238 "zone_append": false, 00:21:38.238 "compare": false, 00:21:38.238 "compare_and_write": false, 00:21:38.238 "abort": true, 00:21:38.238 "seek_hole": false, 00:21:38.238 "seek_data": false, 00:21:38.238 "copy": true, 00:21:38.238 "nvme_iov_md": false 00:21:38.238 }, 00:21:38.238 "memory_domains": [ 00:21:38.238 { 00:21:38.238 "dma_device_id": "system", 00:21:38.238 "dma_device_type": 1 00:21:38.238 }, 00:21:38.238 { 00:21:38.238 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:21:38.238 "dma_device_type": 2 00:21:38.238 } 00:21:38.238 ], 00:21:38.238 "driver_specific": {} 00:21:38.238 } 00:21:38.238 ] 00:21:38.238 05:49:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:21:38.238 05:49:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:21:38.238 05:49:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:21:38.238 05:49:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:21:38.497 [2024-07-26 05:49:53.202559] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:21:38.497 [2024-07-26 05:49:53.202602] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:21:38.497 [2024-07-26 05:49:53.202621] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:21:38.497 [2024-07-26 05:49:53.203942] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:21:38.497 [2024-07-26 05:49:53.203984] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:21:38.497 05:49:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:21:38.497 05:49:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:38.497 05:49:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:38.497 05:49:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:21:38.497 05:49:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:38.497 
05:49:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:38.497 05:49:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:38.497 05:49:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:38.497 05:49:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:38.498 05:49:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:38.498 05:49:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:38.498 05:49:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:38.757 05:49:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:38.757 "name": "Existed_Raid", 00:21:38.757 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:38.757 "strip_size_kb": 64, 00:21:38.757 "state": "configuring", 00:21:38.757 "raid_level": "concat", 00:21:38.757 "superblock": false, 00:21:38.757 "num_base_bdevs": 4, 00:21:38.757 "num_base_bdevs_discovered": 3, 00:21:38.757 "num_base_bdevs_operational": 4, 00:21:38.757 "base_bdevs_list": [ 00:21:38.757 { 00:21:38.757 "name": "BaseBdev1", 00:21:38.757 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:38.757 "is_configured": false, 00:21:38.757 "data_offset": 0, 00:21:38.757 "data_size": 0 00:21:38.757 }, 00:21:38.757 { 00:21:38.757 "name": "BaseBdev2", 00:21:38.757 "uuid": "2f09d0b8-17ad-4259-a503-1e1e5597fe1a", 00:21:38.757 "is_configured": true, 00:21:38.757 "data_offset": 0, 00:21:38.757 "data_size": 65536 00:21:38.757 }, 00:21:38.757 { 00:21:38.757 "name": "BaseBdev3", 00:21:38.757 "uuid": "4c7d99d4-fce7-42a1-bc7f-411fb0552869", 00:21:38.757 "is_configured": true, 00:21:38.757 "data_offset": 
0, 00:21:38.757 "data_size": 65536 00:21:38.757 }, 00:21:38.757 { 00:21:38.757 "name": "BaseBdev4", 00:21:38.757 "uuid": "61ba4d70-efbc-4063-ad25-8c163c32f61e", 00:21:38.757 "is_configured": true, 00:21:38.757 "data_offset": 0, 00:21:38.757 "data_size": 65536 00:21:38.757 } 00:21:38.757 ] 00:21:38.757 }' 00:21:38.757 05:49:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:38.757 05:49:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:21:39.325 05:49:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:21:39.583 [2024-07-26 05:49:54.281391] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:21:39.583 05:49:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:21:39.584 05:49:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:39.584 05:49:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:39.584 05:49:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:21:39.584 05:49:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:39.584 05:49:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:39.584 05:49:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:39.584 05:49:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:39.584 05:49:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:39.584 05:49:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 
00:21:39.584 05:49:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:39.584 05:49:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:39.843 05:49:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:39.843 "name": "Existed_Raid", 00:21:39.843 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:39.843 "strip_size_kb": 64, 00:21:39.843 "state": "configuring", 00:21:39.843 "raid_level": "concat", 00:21:39.843 "superblock": false, 00:21:39.843 "num_base_bdevs": 4, 00:21:39.843 "num_base_bdevs_discovered": 2, 00:21:39.843 "num_base_bdevs_operational": 4, 00:21:39.843 "base_bdevs_list": [ 00:21:39.843 { 00:21:39.843 "name": "BaseBdev1", 00:21:39.843 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:39.843 "is_configured": false, 00:21:39.843 "data_offset": 0, 00:21:39.843 "data_size": 0 00:21:39.843 }, 00:21:39.843 { 00:21:39.843 "name": null, 00:21:39.843 "uuid": "2f09d0b8-17ad-4259-a503-1e1e5597fe1a", 00:21:39.843 "is_configured": false, 00:21:39.843 "data_offset": 0, 00:21:39.843 "data_size": 65536 00:21:39.843 }, 00:21:39.843 { 00:21:39.843 "name": "BaseBdev3", 00:21:39.843 "uuid": "4c7d99d4-fce7-42a1-bc7f-411fb0552869", 00:21:39.843 "is_configured": true, 00:21:39.843 "data_offset": 0, 00:21:39.843 "data_size": 65536 00:21:39.843 }, 00:21:39.843 { 00:21:39.843 "name": "BaseBdev4", 00:21:39.843 "uuid": "61ba4d70-efbc-4063-ad25-8c163c32f61e", 00:21:39.843 "is_configured": true, 00:21:39.843 "data_offset": 0, 00:21:39.843 "data_size": 65536 00:21:39.843 } 00:21:39.843 ] 00:21:39.843 }' 00:21:39.843 05:49:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:39.843 05:49:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:21:40.410 05:49:55 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:40.410 05:49:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:21:40.669 05:49:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:21:40.669 05:49:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:21:40.928 [2024-07-26 05:49:55.617693] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:21:40.928 BaseBdev1 00:21:40.928 05:49:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:21:40.928 05:49:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:21:40.928 05:49:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:21:40.928 05:49:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:21:40.928 05:49:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:21:40.928 05:49:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:21:40.928 05:49:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:41.187 05:49:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:21:41.447 [ 00:21:41.447 { 00:21:41.447 "name": "BaseBdev1", 00:21:41.447 "aliases": [ 00:21:41.447 
"8d2ec57f-60da-4add-9e72-de559f669453" 00:21:41.447 ], 00:21:41.447 "product_name": "Malloc disk", 00:21:41.447 "block_size": 512, 00:21:41.447 "num_blocks": 65536, 00:21:41.447 "uuid": "8d2ec57f-60da-4add-9e72-de559f669453", 00:21:41.447 "assigned_rate_limits": { 00:21:41.447 "rw_ios_per_sec": 0, 00:21:41.447 "rw_mbytes_per_sec": 0, 00:21:41.447 "r_mbytes_per_sec": 0, 00:21:41.447 "w_mbytes_per_sec": 0 00:21:41.447 }, 00:21:41.447 "claimed": true, 00:21:41.447 "claim_type": "exclusive_write", 00:21:41.447 "zoned": false, 00:21:41.447 "supported_io_types": { 00:21:41.447 "read": true, 00:21:41.447 "write": true, 00:21:41.447 "unmap": true, 00:21:41.447 "flush": true, 00:21:41.447 "reset": true, 00:21:41.447 "nvme_admin": false, 00:21:41.447 "nvme_io": false, 00:21:41.447 "nvme_io_md": false, 00:21:41.447 "write_zeroes": true, 00:21:41.447 "zcopy": true, 00:21:41.447 "get_zone_info": false, 00:21:41.447 "zone_management": false, 00:21:41.447 "zone_append": false, 00:21:41.447 "compare": false, 00:21:41.447 "compare_and_write": false, 00:21:41.447 "abort": true, 00:21:41.447 "seek_hole": false, 00:21:41.447 "seek_data": false, 00:21:41.447 "copy": true, 00:21:41.447 "nvme_iov_md": false 00:21:41.447 }, 00:21:41.447 "memory_domains": [ 00:21:41.447 { 00:21:41.447 "dma_device_id": "system", 00:21:41.447 "dma_device_type": 1 00:21:41.447 }, 00:21:41.447 { 00:21:41.447 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:41.447 "dma_device_type": 2 00:21:41.447 } 00:21:41.447 ], 00:21:41.447 "driver_specific": {} 00:21:41.447 } 00:21:41.447 ] 00:21:41.447 05:49:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:21:41.447 05:49:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:21:41.447 05:49:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:41.447 05:49:56 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:41.447 05:49:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:21:41.447 05:49:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:41.447 05:49:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:41.447 05:49:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:41.447 05:49:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:41.447 05:49:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:41.447 05:49:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:41.447 05:49:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:41.447 05:49:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:41.706 05:49:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:41.706 "name": "Existed_Raid", 00:21:41.706 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:41.706 "strip_size_kb": 64, 00:21:41.706 "state": "configuring", 00:21:41.706 "raid_level": "concat", 00:21:41.706 "superblock": false, 00:21:41.706 "num_base_bdevs": 4, 00:21:41.706 "num_base_bdevs_discovered": 3, 00:21:41.706 "num_base_bdevs_operational": 4, 00:21:41.706 "base_bdevs_list": [ 00:21:41.706 { 00:21:41.706 "name": "BaseBdev1", 00:21:41.706 "uuid": "8d2ec57f-60da-4add-9e72-de559f669453", 00:21:41.706 "is_configured": true, 00:21:41.706 "data_offset": 0, 00:21:41.706 "data_size": 65536 00:21:41.706 }, 00:21:41.706 { 00:21:41.706 "name": null, 00:21:41.706 "uuid": "2f09d0b8-17ad-4259-a503-1e1e5597fe1a", 
00:21:41.706 "is_configured": false, 00:21:41.706 "data_offset": 0, 00:21:41.706 "data_size": 65536 00:21:41.706 }, 00:21:41.706 { 00:21:41.706 "name": "BaseBdev3", 00:21:41.706 "uuid": "4c7d99d4-fce7-42a1-bc7f-411fb0552869", 00:21:41.706 "is_configured": true, 00:21:41.706 "data_offset": 0, 00:21:41.706 "data_size": 65536 00:21:41.706 }, 00:21:41.706 { 00:21:41.706 "name": "BaseBdev4", 00:21:41.706 "uuid": "61ba4d70-efbc-4063-ad25-8c163c32f61e", 00:21:41.706 "is_configured": true, 00:21:41.706 "data_offset": 0, 00:21:41.706 "data_size": 65536 00:21:41.706 } 00:21:41.706 ] 00:21:41.706 }' 00:21:41.706 05:49:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:41.706 05:49:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:21:42.273 05:49:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:21:42.273 05:49:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:42.532 05:49:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:21:42.532 05:49:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:21:42.533 [2024-07-26 05:49:57.438532] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:21:42.792 05:49:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:21:42.792 05:49:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:42.792 05:49:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:42.792 05:49:57 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:21:42.792 05:49:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:42.792 05:49:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:42.792 05:49:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:42.792 05:49:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:42.792 05:49:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:42.792 05:49:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:42.792 05:49:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:42.792 05:49:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:43.051 05:49:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:43.052 "name": "Existed_Raid", 00:21:43.052 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:43.052 "strip_size_kb": 64, 00:21:43.052 "state": "configuring", 00:21:43.052 "raid_level": "concat", 00:21:43.052 "superblock": false, 00:21:43.052 "num_base_bdevs": 4, 00:21:43.052 "num_base_bdevs_discovered": 2, 00:21:43.052 "num_base_bdevs_operational": 4, 00:21:43.052 "base_bdevs_list": [ 00:21:43.052 { 00:21:43.052 "name": "BaseBdev1", 00:21:43.052 "uuid": "8d2ec57f-60da-4add-9e72-de559f669453", 00:21:43.052 "is_configured": true, 00:21:43.052 "data_offset": 0, 00:21:43.052 "data_size": 65536 00:21:43.052 }, 00:21:43.052 { 00:21:43.052 "name": null, 00:21:43.052 "uuid": "2f09d0b8-17ad-4259-a503-1e1e5597fe1a", 00:21:43.052 "is_configured": false, 00:21:43.052 "data_offset": 0, 00:21:43.052 
"data_size": 65536 00:21:43.052 }, 00:21:43.052 { 00:21:43.052 "name": null, 00:21:43.052 "uuid": "4c7d99d4-fce7-42a1-bc7f-411fb0552869", 00:21:43.052 "is_configured": false, 00:21:43.052 "data_offset": 0, 00:21:43.052 "data_size": 65536 00:21:43.052 }, 00:21:43.052 { 00:21:43.052 "name": "BaseBdev4", 00:21:43.052 "uuid": "61ba4d70-efbc-4063-ad25-8c163c32f61e", 00:21:43.052 "is_configured": true, 00:21:43.052 "data_offset": 0, 00:21:43.052 "data_size": 65536 00:21:43.052 } 00:21:43.052 ] 00:21:43.052 }' 00:21:43.052 05:49:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:43.052 05:49:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:21:43.694 05:49:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:43.694 05:49:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:21:43.694 05:49:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:21:43.694 05:49:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:21:43.953 [2024-07-26 05:49:58.713923] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:21:43.953 05:49:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:21:43.953 05:49:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:43.953 05:49:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:43.953 05:49:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local 
raid_level=concat 00:21:43.953 05:49:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:43.953 05:49:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:43.953 05:49:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:43.953 05:49:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:43.953 05:49:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:43.953 05:49:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:43.953 05:49:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:43.953 05:49:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:44.212 05:49:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:44.212 "name": "Existed_Raid", 00:21:44.212 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:44.212 "strip_size_kb": 64, 00:21:44.212 "state": "configuring", 00:21:44.212 "raid_level": "concat", 00:21:44.212 "superblock": false, 00:21:44.212 "num_base_bdevs": 4, 00:21:44.212 "num_base_bdevs_discovered": 3, 00:21:44.212 "num_base_bdevs_operational": 4, 00:21:44.212 "base_bdevs_list": [ 00:21:44.212 { 00:21:44.212 "name": "BaseBdev1", 00:21:44.212 "uuid": "8d2ec57f-60da-4add-9e72-de559f669453", 00:21:44.212 "is_configured": true, 00:21:44.212 "data_offset": 0, 00:21:44.212 "data_size": 65536 00:21:44.212 }, 00:21:44.212 { 00:21:44.212 "name": null, 00:21:44.212 "uuid": "2f09d0b8-17ad-4259-a503-1e1e5597fe1a", 00:21:44.212 "is_configured": false, 00:21:44.212 "data_offset": 0, 00:21:44.212 "data_size": 65536 00:21:44.212 }, 00:21:44.212 { 00:21:44.212 "name": 
"BaseBdev3", 00:21:44.212 "uuid": "4c7d99d4-fce7-42a1-bc7f-411fb0552869", 00:21:44.212 "is_configured": true, 00:21:44.212 "data_offset": 0, 00:21:44.212 "data_size": 65536 00:21:44.212 }, 00:21:44.212 { 00:21:44.212 "name": "BaseBdev4", 00:21:44.212 "uuid": "61ba4d70-efbc-4063-ad25-8c163c32f61e", 00:21:44.212 "is_configured": true, 00:21:44.212 "data_offset": 0, 00:21:44.212 "data_size": 65536 00:21:44.212 } 00:21:44.212 ] 00:21:44.212 }' 00:21:44.212 05:49:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:44.212 05:49:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:21:44.780 05:49:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:44.780 05:49:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:21:45.039 05:49:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:21:45.039 05:49:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:21:45.298 [2024-07-26 05:50:00.081571] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:21:45.298 05:50:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:21:45.298 05:50:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:45.298 05:50:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:45.298 05:50:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:21:45.298 05:50:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local 
strip_size=64 00:21:45.298 05:50:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:45.298 05:50:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:45.298 05:50:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:45.298 05:50:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:45.298 05:50:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:45.298 05:50:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:45.298 05:50:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:45.557 05:50:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:45.557 "name": "Existed_Raid", 00:21:45.557 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:45.557 "strip_size_kb": 64, 00:21:45.557 "state": "configuring", 00:21:45.557 "raid_level": "concat", 00:21:45.557 "superblock": false, 00:21:45.557 "num_base_bdevs": 4, 00:21:45.557 "num_base_bdevs_discovered": 2, 00:21:45.557 "num_base_bdevs_operational": 4, 00:21:45.557 "base_bdevs_list": [ 00:21:45.557 { 00:21:45.557 "name": null, 00:21:45.557 "uuid": "8d2ec57f-60da-4add-9e72-de559f669453", 00:21:45.557 "is_configured": false, 00:21:45.557 "data_offset": 0, 00:21:45.557 "data_size": 65536 00:21:45.557 }, 00:21:45.557 { 00:21:45.557 "name": null, 00:21:45.557 "uuid": "2f09d0b8-17ad-4259-a503-1e1e5597fe1a", 00:21:45.557 "is_configured": false, 00:21:45.557 "data_offset": 0, 00:21:45.557 "data_size": 65536 00:21:45.557 }, 00:21:45.557 { 00:21:45.557 "name": "BaseBdev3", 00:21:45.557 "uuid": "4c7d99d4-fce7-42a1-bc7f-411fb0552869", 00:21:45.557 "is_configured": true, 
00:21:45.557 "data_offset": 0, 00:21:45.557 "data_size": 65536 00:21:45.557 }, 00:21:45.557 { 00:21:45.557 "name": "BaseBdev4", 00:21:45.557 "uuid": "61ba4d70-efbc-4063-ad25-8c163c32f61e", 00:21:45.557 "is_configured": true, 00:21:45.557 "data_offset": 0, 00:21:45.557 "data_size": 65536 00:21:45.557 } 00:21:45.557 ] 00:21:45.557 }' 00:21:45.557 05:50:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:45.557 05:50:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:21:46.124 05:50:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:46.124 05:50:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:21:46.383 05:50:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:21:46.383 05:50:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:21:46.643 [2024-07-26 05:50:01.443744] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:21:46.643 05:50:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:21:46.643 05:50:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:46.643 05:50:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:46.643 05:50:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:21:46.643 05:50:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:46.643 05:50:01 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:46.643 05:50:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:46.643 05:50:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:46.643 05:50:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:46.643 05:50:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:46.643 05:50:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:46.643 05:50:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:46.901 05:50:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:46.901 "name": "Existed_Raid", 00:21:46.901 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:46.901 "strip_size_kb": 64, 00:21:46.901 "state": "configuring", 00:21:46.901 "raid_level": "concat", 00:21:46.901 "superblock": false, 00:21:46.901 "num_base_bdevs": 4, 00:21:46.901 "num_base_bdevs_discovered": 3, 00:21:46.901 "num_base_bdevs_operational": 4, 00:21:46.901 "base_bdevs_list": [ 00:21:46.901 { 00:21:46.901 "name": null, 00:21:46.901 "uuid": "8d2ec57f-60da-4add-9e72-de559f669453", 00:21:46.901 "is_configured": false, 00:21:46.901 "data_offset": 0, 00:21:46.901 "data_size": 65536 00:21:46.901 }, 00:21:46.901 { 00:21:46.901 "name": "BaseBdev2", 00:21:46.901 "uuid": "2f09d0b8-17ad-4259-a503-1e1e5597fe1a", 00:21:46.902 "is_configured": true, 00:21:46.902 "data_offset": 0, 00:21:46.902 "data_size": 65536 00:21:46.902 }, 00:21:46.902 { 00:21:46.902 "name": "BaseBdev3", 00:21:46.902 "uuid": "4c7d99d4-fce7-42a1-bc7f-411fb0552869", 00:21:46.902 "is_configured": true, 00:21:46.902 "data_offset": 0, 00:21:46.902 "data_size": 65536 00:21:46.902 
}, 00:21:46.902 { 00:21:46.902 "name": "BaseBdev4", 00:21:46.902 "uuid": "61ba4d70-efbc-4063-ad25-8c163c32f61e", 00:21:46.902 "is_configured": true, 00:21:46.902 "data_offset": 0, 00:21:46.902 "data_size": 65536 00:21:46.902 } 00:21:46.902 ] 00:21:46.902 }' 00:21:46.902 05:50:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:46.902 05:50:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:21:47.469 05:50:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:47.469 05:50:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:21:47.728 05:50:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:21:47.728 05:50:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:47.728 05:50:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:21:47.996 05:50:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 8d2ec57f-60da-4add-9e72-de559f669453 00:21:48.258 [2024-07-26 05:50:02.960341] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:21:48.258 [2024-07-26 05:50:02.960382] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x12db040 00:21:48.258 [2024-07-26 05:50:02.960390] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 262144, blocklen 512 00:21:48.258 [2024-07-26 05:50:02.960589] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x12d6a70 00:21:48.259 
[2024-07-26 05:50:02.960717] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x12db040 00:21:48.259 [2024-07-26 05:50:02.960728] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x12db040 00:21:48.259 [2024-07-26 05:50:02.960894] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:48.259 NewBaseBdev 00:21:48.259 05:50:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:21:48.259 05:50:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:21:48.259 05:50:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:21:48.259 05:50:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:21:48.259 05:50:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:21:48.259 05:50:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:21:48.259 05:50:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:48.517 05:50:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:21:48.776 [ 00:21:48.776 { 00:21:48.776 "name": "NewBaseBdev", 00:21:48.776 "aliases": [ 00:21:48.776 "8d2ec57f-60da-4add-9e72-de559f669453" 00:21:48.776 ], 00:21:48.776 "product_name": "Malloc disk", 00:21:48.776 "block_size": 512, 00:21:48.776 "num_blocks": 65536, 00:21:48.776 "uuid": "8d2ec57f-60da-4add-9e72-de559f669453", 00:21:48.776 "assigned_rate_limits": { 00:21:48.776 "rw_ios_per_sec": 0, 00:21:48.776 "rw_mbytes_per_sec": 0, 00:21:48.776 "r_mbytes_per_sec": 0, 00:21:48.776 
"w_mbytes_per_sec": 0 00:21:48.776 }, 00:21:48.776 "claimed": true, 00:21:48.776 "claim_type": "exclusive_write", 00:21:48.776 "zoned": false, 00:21:48.776 "supported_io_types": { 00:21:48.776 "read": true, 00:21:48.776 "write": true, 00:21:48.776 "unmap": true, 00:21:48.776 "flush": true, 00:21:48.776 "reset": true, 00:21:48.776 "nvme_admin": false, 00:21:48.776 "nvme_io": false, 00:21:48.776 "nvme_io_md": false, 00:21:48.776 "write_zeroes": true, 00:21:48.776 "zcopy": true, 00:21:48.776 "get_zone_info": false, 00:21:48.777 "zone_management": false, 00:21:48.777 "zone_append": false, 00:21:48.777 "compare": false, 00:21:48.777 "compare_and_write": false, 00:21:48.777 "abort": true, 00:21:48.777 "seek_hole": false, 00:21:48.777 "seek_data": false, 00:21:48.777 "copy": true, 00:21:48.777 "nvme_iov_md": false 00:21:48.777 }, 00:21:48.777 "memory_domains": [ 00:21:48.777 { 00:21:48.777 "dma_device_id": "system", 00:21:48.777 "dma_device_type": 1 00:21:48.777 }, 00:21:48.777 { 00:21:48.777 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:48.777 "dma_device_type": 2 00:21:48.777 } 00:21:48.777 ], 00:21:48.777 "driver_specific": {} 00:21:48.777 } 00:21:48.777 ] 00:21:48.777 05:50:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:21:48.777 05:50:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online concat 64 4 00:21:48.777 05:50:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:48.777 05:50:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:48.777 05:50:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:21:48.777 05:50:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:48.777 05:50:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 
00:21:48.777 05:50:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:48.777 05:50:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:48.777 05:50:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:48.777 05:50:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:48.777 05:50:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:48.777 05:50:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:49.042 05:50:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:49.042 "name": "Existed_Raid", 00:21:49.042 "uuid": "f8cb6f9a-fc35-49c0-a625-5169849872ac", 00:21:49.042 "strip_size_kb": 64, 00:21:49.042 "state": "online", 00:21:49.042 "raid_level": "concat", 00:21:49.042 "superblock": false, 00:21:49.042 "num_base_bdevs": 4, 00:21:49.042 "num_base_bdevs_discovered": 4, 00:21:49.042 "num_base_bdevs_operational": 4, 00:21:49.042 "base_bdevs_list": [ 00:21:49.042 { 00:21:49.042 "name": "NewBaseBdev", 00:21:49.042 "uuid": "8d2ec57f-60da-4add-9e72-de559f669453", 00:21:49.042 "is_configured": true, 00:21:49.042 "data_offset": 0, 00:21:49.042 "data_size": 65536 00:21:49.042 }, 00:21:49.042 { 00:21:49.042 "name": "BaseBdev2", 00:21:49.042 "uuid": "2f09d0b8-17ad-4259-a503-1e1e5597fe1a", 00:21:49.042 "is_configured": true, 00:21:49.042 "data_offset": 0, 00:21:49.042 "data_size": 65536 00:21:49.042 }, 00:21:49.042 { 00:21:49.042 "name": "BaseBdev3", 00:21:49.042 "uuid": "4c7d99d4-fce7-42a1-bc7f-411fb0552869", 00:21:49.042 "is_configured": true, 00:21:49.042 "data_offset": 0, 00:21:49.042 "data_size": 65536 00:21:49.042 }, 00:21:49.042 { 00:21:49.042 "name": "BaseBdev4", 
00:21:49.042 "uuid": "61ba4d70-efbc-4063-ad25-8c163c32f61e", 00:21:49.042 "is_configured": true, 00:21:49.042 "data_offset": 0, 00:21:49.042 "data_size": 65536 00:21:49.042 } 00:21:49.042 ] 00:21:49.042 }' 00:21:49.042 05:50:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:49.042 05:50:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:21:49.610 05:50:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:21:49.610 05:50:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:21:49.610 05:50:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:21:49.610 05:50:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:21:49.610 05:50:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:21:49.610 05:50:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:21:49.610 05:50:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:21:49.610 05:50:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:21:49.869 [2024-07-26 05:50:04.560936] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:21:49.869 05:50:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:21:49.869 "name": "Existed_Raid", 00:21:49.869 "aliases": [ 00:21:49.869 "f8cb6f9a-fc35-49c0-a625-5169849872ac" 00:21:49.869 ], 00:21:49.869 "product_name": "Raid Volume", 00:21:49.869 "block_size": 512, 00:21:49.869 "num_blocks": 262144, 00:21:49.869 "uuid": "f8cb6f9a-fc35-49c0-a625-5169849872ac", 00:21:49.869 "assigned_rate_limits": { 00:21:49.869 "rw_ios_per_sec": 0, 00:21:49.869 
"rw_mbytes_per_sec": 0, 00:21:49.869 "r_mbytes_per_sec": 0, 00:21:49.869 "w_mbytes_per_sec": 0 00:21:49.869 }, 00:21:49.869 "claimed": false, 00:21:49.869 "zoned": false, 00:21:49.869 "supported_io_types": { 00:21:49.869 "read": true, 00:21:49.869 "write": true, 00:21:49.869 "unmap": true, 00:21:49.869 "flush": true, 00:21:49.869 "reset": true, 00:21:49.869 "nvme_admin": false, 00:21:49.869 "nvme_io": false, 00:21:49.869 "nvme_io_md": false, 00:21:49.869 "write_zeroes": true, 00:21:49.869 "zcopy": false, 00:21:49.869 "get_zone_info": false, 00:21:49.869 "zone_management": false, 00:21:49.869 "zone_append": false, 00:21:49.869 "compare": false, 00:21:49.869 "compare_and_write": false, 00:21:49.869 "abort": false, 00:21:49.869 "seek_hole": false, 00:21:49.869 "seek_data": false, 00:21:49.869 "copy": false, 00:21:49.869 "nvme_iov_md": false 00:21:49.869 }, 00:21:49.869 "memory_domains": [ 00:21:49.869 { 00:21:49.869 "dma_device_id": "system", 00:21:49.869 "dma_device_type": 1 00:21:49.869 }, 00:21:49.869 { 00:21:49.869 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:49.869 "dma_device_type": 2 00:21:49.869 }, 00:21:49.869 { 00:21:49.869 "dma_device_id": "system", 00:21:49.869 "dma_device_type": 1 00:21:49.869 }, 00:21:49.869 { 00:21:49.869 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:49.869 "dma_device_type": 2 00:21:49.869 }, 00:21:49.869 { 00:21:49.869 "dma_device_id": "system", 00:21:49.869 "dma_device_type": 1 00:21:49.869 }, 00:21:49.869 { 00:21:49.869 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:49.869 "dma_device_type": 2 00:21:49.869 }, 00:21:49.869 { 00:21:49.869 "dma_device_id": "system", 00:21:49.869 "dma_device_type": 1 00:21:49.869 }, 00:21:49.869 { 00:21:49.869 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:49.869 "dma_device_type": 2 00:21:49.869 } 00:21:49.869 ], 00:21:49.869 "driver_specific": { 00:21:49.869 "raid": { 00:21:49.869 "uuid": "f8cb6f9a-fc35-49c0-a625-5169849872ac", 00:21:49.869 "strip_size_kb": 64, 00:21:49.869 "state": "online", 
00:21:49.869 "raid_level": "concat", 00:21:49.869 "superblock": false, 00:21:49.869 "num_base_bdevs": 4, 00:21:49.869 "num_base_bdevs_discovered": 4, 00:21:49.869 "num_base_bdevs_operational": 4, 00:21:49.869 "base_bdevs_list": [ 00:21:49.869 { 00:21:49.869 "name": "NewBaseBdev", 00:21:49.869 "uuid": "8d2ec57f-60da-4add-9e72-de559f669453", 00:21:49.869 "is_configured": true, 00:21:49.869 "data_offset": 0, 00:21:49.869 "data_size": 65536 00:21:49.869 }, 00:21:49.869 { 00:21:49.869 "name": "BaseBdev2", 00:21:49.869 "uuid": "2f09d0b8-17ad-4259-a503-1e1e5597fe1a", 00:21:49.869 "is_configured": true, 00:21:49.869 "data_offset": 0, 00:21:49.869 "data_size": 65536 00:21:49.869 }, 00:21:49.869 { 00:21:49.869 "name": "BaseBdev3", 00:21:49.869 "uuid": "4c7d99d4-fce7-42a1-bc7f-411fb0552869", 00:21:49.869 "is_configured": true, 00:21:49.869 "data_offset": 0, 00:21:49.869 "data_size": 65536 00:21:49.869 }, 00:21:49.869 { 00:21:49.869 "name": "BaseBdev4", 00:21:49.869 "uuid": "61ba4d70-efbc-4063-ad25-8c163c32f61e", 00:21:49.869 "is_configured": true, 00:21:49.869 "data_offset": 0, 00:21:49.869 "data_size": 65536 00:21:49.869 } 00:21:49.869 ] 00:21:49.869 } 00:21:49.869 } 00:21:49.869 }' 00:21:49.869 05:50:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:21:49.869 05:50:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:21:49.869 BaseBdev2 00:21:49.869 BaseBdev3 00:21:49.869 BaseBdev4' 00:21:49.869 05:50:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:49.869 05:50:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:49.869 05:50:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:21:50.127 05:50:04 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:50.127 "name": "NewBaseBdev", 00:21:50.127 "aliases": [ 00:21:50.127 "8d2ec57f-60da-4add-9e72-de559f669453" 00:21:50.127 ], 00:21:50.127 "product_name": "Malloc disk", 00:21:50.127 "block_size": 512, 00:21:50.127 "num_blocks": 65536, 00:21:50.127 "uuid": "8d2ec57f-60da-4add-9e72-de559f669453", 00:21:50.127 "assigned_rate_limits": { 00:21:50.127 "rw_ios_per_sec": 0, 00:21:50.127 "rw_mbytes_per_sec": 0, 00:21:50.127 "r_mbytes_per_sec": 0, 00:21:50.127 "w_mbytes_per_sec": 0 00:21:50.127 }, 00:21:50.127 "claimed": true, 00:21:50.127 "claim_type": "exclusive_write", 00:21:50.127 "zoned": false, 00:21:50.127 "supported_io_types": { 00:21:50.127 "read": true, 00:21:50.127 "write": true, 00:21:50.127 "unmap": true, 00:21:50.127 "flush": true, 00:21:50.127 "reset": true, 00:21:50.127 "nvme_admin": false, 00:21:50.127 "nvme_io": false, 00:21:50.127 "nvme_io_md": false, 00:21:50.127 "write_zeroes": true, 00:21:50.127 "zcopy": true, 00:21:50.127 "get_zone_info": false, 00:21:50.127 "zone_management": false, 00:21:50.128 "zone_append": false, 00:21:50.128 "compare": false, 00:21:50.128 "compare_and_write": false, 00:21:50.128 "abort": true, 00:21:50.128 "seek_hole": false, 00:21:50.128 "seek_data": false, 00:21:50.128 "copy": true, 00:21:50.128 "nvme_iov_md": false 00:21:50.128 }, 00:21:50.128 "memory_domains": [ 00:21:50.128 { 00:21:50.128 "dma_device_id": "system", 00:21:50.128 "dma_device_type": 1 00:21:50.128 }, 00:21:50.128 { 00:21:50.128 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:50.128 "dma_device_type": 2 00:21:50.128 } 00:21:50.128 ], 00:21:50.128 "driver_specific": {} 00:21:50.128 }' 00:21:50.128 05:50:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:50.128 05:50:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:50.128 05:50:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 
== 512 ]] 00:21:50.128 05:50:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:50.128 05:50:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:50.128 05:50:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:50.128 05:50:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:50.387 05:50:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:50.387 05:50:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:50.387 05:50:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:50.387 05:50:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:50.387 05:50:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:50.387 05:50:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:50.387 05:50:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:21:50.387 05:50:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:50.646 05:50:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:50.646 "name": "BaseBdev2", 00:21:50.646 "aliases": [ 00:21:50.646 "2f09d0b8-17ad-4259-a503-1e1e5597fe1a" 00:21:50.646 ], 00:21:50.646 "product_name": "Malloc disk", 00:21:50.646 "block_size": 512, 00:21:50.646 "num_blocks": 65536, 00:21:50.646 "uuid": "2f09d0b8-17ad-4259-a503-1e1e5597fe1a", 00:21:50.646 "assigned_rate_limits": { 00:21:50.646 "rw_ios_per_sec": 0, 00:21:50.646 "rw_mbytes_per_sec": 0, 00:21:50.646 "r_mbytes_per_sec": 0, 00:21:50.646 "w_mbytes_per_sec": 0 00:21:50.646 }, 00:21:50.646 "claimed": true, 00:21:50.646 
"claim_type": "exclusive_write", 00:21:50.646 "zoned": false, 00:21:50.646 "supported_io_types": { 00:21:50.646 "read": true, 00:21:50.646 "write": true, 00:21:50.646 "unmap": true, 00:21:50.646 "flush": true, 00:21:50.646 "reset": true, 00:21:50.646 "nvme_admin": false, 00:21:50.646 "nvme_io": false, 00:21:50.646 "nvme_io_md": false, 00:21:50.646 "write_zeroes": true, 00:21:50.646 "zcopy": true, 00:21:50.646 "get_zone_info": false, 00:21:50.646 "zone_management": false, 00:21:50.646 "zone_append": false, 00:21:50.646 "compare": false, 00:21:50.646 "compare_and_write": false, 00:21:50.646 "abort": true, 00:21:50.646 "seek_hole": false, 00:21:50.646 "seek_data": false, 00:21:50.646 "copy": true, 00:21:50.646 "nvme_iov_md": false 00:21:50.646 }, 00:21:50.646 "memory_domains": [ 00:21:50.646 { 00:21:50.646 "dma_device_id": "system", 00:21:50.646 "dma_device_type": 1 00:21:50.646 }, 00:21:50.646 { 00:21:50.646 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:50.646 "dma_device_type": 2 00:21:50.646 } 00:21:50.646 ], 00:21:50.646 "driver_specific": {} 00:21:50.646 }' 00:21:50.646 05:50:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:50.646 05:50:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:50.646 05:50:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:50.646 05:50:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:50.646 05:50:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:50.646 05:50:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:50.646 05:50:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:50.646 05:50:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:50.904 05:50:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == 
null ]] 00:21:50.904 05:50:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:50.904 05:50:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:50.904 05:50:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:50.904 05:50:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:50.904 05:50:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:21:50.904 05:50:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:51.163 05:50:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:51.163 "name": "BaseBdev3", 00:21:51.163 "aliases": [ 00:21:51.163 "4c7d99d4-fce7-42a1-bc7f-411fb0552869" 00:21:51.163 ], 00:21:51.163 "product_name": "Malloc disk", 00:21:51.163 "block_size": 512, 00:21:51.163 "num_blocks": 65536, 00:21:51.163 "uuid": "4c7d99d4-fce7-42a1-bc7f-411fb0552869", 00:21:51.163 "assigned_rate_limits": { 00:21:51.163 "rw_ios_per_sec": 0, 00:21:51.163 "rw_mbytes_per_sec": 0, 00:21:51.163 "r_mbytes_per_sec": 0, 00:21:51.163 "w_mbytes_per_sec": 0 00:21:51.163 }, 00:21:51.163 "claimed": true, 00:21:51.163 "claim_type": "exclusive_write", 00:21:51.163 "zoned": false, 00:21:51.163 "supported_io_types": { 00:21:51.163 "read": true, 00:21:51.163 "write": true, 00:21:51.163 "unmap": true, 00:21:51.163 "flush": true, 00:21:51.163 "reset": true, 00:21:51.163 "nvme_admin": false, 00:21:51.163 "nvme_io": false, 00:21:51.163 "nvme_io_md": false, 00:21:51.163 "write_zeroes": true, 00:21:51.163 "zcopy": true, 00:21:51.163 "get_zone_info": false, 00:21:51.163 "zone_management": false, 00:21:51.163 "zone_append": false, 00:21:51.163 "compare": false, 00:21:51.163 "compare_and_write": false, 00:21:51.163 "abort": true, 00:21:51.163 
"seek_hole": false, 00:21:51.163 "seek_data": false, 00:21:51.163 "copy": true, 00:21:51.163 "nvme_iov_md": false 00:21:51.163 }, 00:21:51.163 "memory_domains": [ 00:21:51.163 { 00:21:51.163 "dma_device_id": "system", 00:21:51.163 "dma_device_type": 1 00:21:51.163 }, 00:21:51.163 { 00:21:51.163 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:51.163 "dma_device_type": 2 00:21:51.163 } 00:21:51.163 ], 00:21:51.163 "driver_specific": {} 00:21:51.163 }' 00:21:51.163 05:50:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:51.163 05:50:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:51.163 05:50:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:51.163 05:50:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:51.163 05:50:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:51.163 05:50:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:51.163 05:50:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:51.422 05:50:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:51.422 05:50:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:51.422 05:50:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:51.422 05:50:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:51.422 05:50:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:51.422 05:50:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:51.422 05:50:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b 
BaseBdev4 00:21:51.422 05:50:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:51.680 05:50:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:51.680 "name": "BaseBdev4", 00:21:51.680 "aliases": [ 00:21:51.680 "61ba4d70-efbc-4063-ad25-8c163c32f61e" 00:21:51.680 ], 00:21:51.680 "product_name": "Malloc disk", 00:21:51.680 "block_size": 512, 00:21:51.680 "num_blocks": 65536, 00:21:51.680 "uuid": "61ba4d70-efbc-4063-ad25-8c163c32f61e", 00:21:51.680 "assigned_rate_limits": { 00:21:51.680 "rw_ios_per_sec": 0, 00:21:51.680 "rw_mbytes_per_sec": 0, 00:21:51.680 "r_mbytes_per_sec": 0, 00:21:51.680 "w_mbytes_per_sec": 0 00:21:51.680 }, 00:21:51.680 "claimed": true, 00:21:51.680 "claim_type": "exclusive_write", 00:21:51.680 "zoned": false, 00:21:51.680 "supported_io_types": { 00:21:51.680 "read": true, 00:21:51.680 "write": true, 00:21:51.680 "unmap": true, 00:21:51.680 "flush": true, 00:21:51.680 "reset": true, 00:21:51.680 "nvme_admin": false, 00:21:51.680 "nvme_io": false, 00:21:51.680 "nvme_io_md": false, 00:21:51.680 "write_zeroes": true, 00:21:51.680 "zcopy": true, 00:21:51.680 "get_zone_info": false, 00:21:51.680 "zone_management": false, 00:21:51.680 "zone_append": false, 00:21:51.680 "compare": false, 00:21:51.680 "compare_and_write": false, 00:21:51.680 "abort": true, 00:21:51.680 "seek_hole": false, 00:21:51.680 "seek_data": false, 00:21:51.680 "copy": true, 00:21:51.680 "nvme_iov_md": false 00:21:51.680 }, 00:21:51.680 "memory_domains": [ 00:21:51.680 { 00:21:51.681 "dma_device_id": "system", 00:21:51.681 "dma_device_type": 1 00:21:51.681 }, 00:21:51.681 { 00:21:51.681 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:51.681 "dma_device_type": 2 00:21:51.681 } 00:21:51.681 ], 00:21:51.681 "driver_specific": {} 00:21:51.681 }' 00:21:51.681 05:50:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:51.681 05:50:06 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:51.939 05:50:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:51.939 05:50:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:51.939 05:50:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:51.939 05:50:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:51.939 05:50:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:51.939 05:50:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:51.939 05:50:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:51.939 05:50:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:51.939 05:50:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:52.197 05:50:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:52.197 05:50:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:21:52.197 [2024-07-26 05:50:07.087325] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:21:52.197 [2024-07-26 05:50:07.087353] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:21:52.197 [2024-07-26 05:50:07.087406] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:52.197 [2024-07-26 05:50:07.087465] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:21:52.197 [2024-07-26 05:50:07.087477] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x12db040 name Existed_Raid, state offline 00:21:52.456 05:50:07 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 1203059 00:21:52.456 05:50:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 1203059 ']' 00:21:52.456 05:50:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 1203059 00:21:52.456 05:50:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:21:52.456 05:50:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:21:52.456 05:50:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1203059 00:21:52.456 05:50:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:21:52.456 05:50:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:21:52.456 05:50:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1203059' 00:21:52.456 killing process with pid 1203059 00:21:52.456 05:50:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 1203059 00:21:52.456 [2024-07-26 05:50:07.158996] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:21:52.456 05:50:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 1203059 00:21:52.456 [2024-07-26 05:50:07.197258] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:21:52.715 05:50:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:21:52.715 00:21:52.715 real 0m31.208s 00:21:52.715 user 0m57.212s 00:21:52.715 sys 0m5.684s 00:21:52.715 05:50:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:21:52.715 05:50:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:21:52.715 ************************************ 00:21:52.715 END TEST 
raid_state_function_test 00:21:52.715 ************************************ 00:21:52.715 05:50:07 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:21:52.715 05:50:07 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test concat 4 true 00:21:52.715 05:50:07 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:21:52.715 05:50:07 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:21:52.715 05:50:07 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:21:52.715 ************************************ 00:21:52.715 START TEST raid_state_function_test_sb 00:21:52.715 ************************************ 00:21:52.715 05:50:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test concat 4 true 00:21:52.715 05:50:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:21:52.715 05:50:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:21:52.715 05:50:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:21:52.716 05:50:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:21:52.716 05:50:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:21:52.716 05:50:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:21:52.716 05:50:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:21:52.716 05:50:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:21:52.716 05:50:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:21:52.716 05:50:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:21:52.716 05:50:07 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@224 -- # (( i++ )) 00:21:52.716 05:50:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:21:52.716 05:50:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:21:52.716 05:50:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:21:52.716 05:50:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:21:52.716 05:50:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:21:52.716 05:50:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:21:52.716 05:50:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:21:52.716 05:50:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:21:52.716 05:50:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:21:52.716 05:50:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:21:52.716 05:50:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:21:52.716 05:50:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:21:52.716 05:50:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:21:52.716 05:50:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:21:52.716 05:50:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:21:52.716 05:50:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:21:52.716 05:50:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:21:52.716 05:50:07 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:21:52.716 05:50:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=1207757 00:21:52.716 05:50:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1207757' 00:21:52.716 Process raid pid: 1207757 00:21:52.716 05:50:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:21:52.716 05:50:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 1207757 /var/tmp/spdk-raid.sock 00:21:52.716 05:50:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 -- # '[' -z 1207757 ']' 00:21:52.716 05:50:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:21:52.716 05:50:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:21:52.716 05:50:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:21:52.716 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:21:52.716 05:50:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:21:52.716 05:50:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:52.716 [2024-07-26 05:50:07.587537] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 
00:21:52.716 [2024-07-26 05:50:07.587607] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:21:52.975 [2024-07-26 05:50:07.717794] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:21:52.975 [2024-07-26 05:50:07.821100] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:21:53.233 [2024-07-26 05:50:07.889868] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:53.233 [2024-07-26 05:50:07.889899] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:53.801 05:50:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:21:53.801 05:50:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:21:53.801 05:50:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:21:54.060 [2024-07-26 05:50:08.751755] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:21:54.060 [2024-07-26 05:50:08.751794] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:21:54.060 [2024-07-26 05:50:08.751805] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:21:54.060 [2024-07-26 05:50:08.751817] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:21:54.060 [2024-07-26 05:50:08.751825] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:21:54.060 [2024-07-26 05:50:08.751836] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist 
now 00:21:54.060 [2024-07-26 05:50:08.751845] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:21:54.060 [2024-07-26 05:50:08.751856] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:21:54.060 05:50:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:21:54.060 05:50:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:54.060 05:50:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:54.060 05:50:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:21:54.060 05:50:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:54.060 05:50:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:54.060 05:50:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:54.060 05:50:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:54.060 05:50:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:54.060 05:50:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:54.060 05:50:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:54.060 05:50:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:54.318 05:50:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:54.318 "name": "Existed_Raid", 00:21:54.318 "uuid": 
"cbf12267-045e-457f-a5ca-c86acb52432e", 00:21:54.318 "strip_size_kb": 64, 00:21:54.318 "state": "configuring", 00:21:54.318 "raid_level": "concat", 00:21:54.318 "superblock": true, 00:21:54.318 "num_base_bdevs": 4, 00:21:54.318 "num_base_bdevs_discovered": 0, 00:21:54.318 "num_base_bdevs_operational": 4, 00:21:54.318 "base_bdevs_list": [ 00:21:54.318 { 00:21:54.318 "name": "BaseBdev1", 00:21:54.318 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:54.318 "is_configured": false, 00:21:54.318 "data_offset": 0, 00:21:54.318 "data_size": 0 00:21:54.318 }, 00:21:54.318 { 00:21:54.318 "name": "BaseBdev2", 00:21:54.318 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:54.318 "is_configured": false, 00:21:54.318 "data_offset": 0, 00:21:54.318 "data_size": 0 00:21:54.318 }, 00:21:54.318 { 00:21:54.318 "name": "BaseBdev3", 00:21:54.318 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:54.319 "is_configured": false, 00:21:54.319 "data_offset": 0, 00:21:54.319 "data_size": 0 00:21:54.319 }, 00:21:54.319 { 00:21:54.319 "name": "BaseBdev4", 00:21:54.319 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:54.319 "is_configured": false, 00:21:54.319 "data_offset": 0, 00:21:54.319 "data_size": 0 00:21:54.319 } 00:21:54.319 ] 00:21:54.319 }' 00:21:54.319 05:50:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:54.319 05:50:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:54.885 05:50:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:21:55.144 [2024-07-26 05:50:09.830443] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:21:55.144 [2024-07-26 05:50:09.830474] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x106faa0 name Existed_Raid, state configuring 00:21:55.144 05:50:09 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:21:55.403 [2024-07-26 05:50:10.079133] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:21:55.403 [2024-07-26 05:50:10.079164] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:21:55.403 [2024-07-26 05:50:10.079174] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:21:55.403 [2024-07-26 05:50:10.079185] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:21:55.403 [2024-07-26 05:50:10.079194] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:21:55.403 [2024-07-26 05:50:10.079205] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:21:55.403 [2024-07-26 05:50:10.079213] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:21:55.403 [2024-07-26 05:50:10.079224] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:21:55.403 05:50:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:21:55.662 [2024-07-26 05:50:10.329701] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:21:55.662 BaseBdev1 00:21:55.662 05:50:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:21:55.662 05:50:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:21:55.662 05:50:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 
-- # local bdev_timeout= 00:21:55.662 05:50:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:21:55.662 05:50:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:21:55.662 05:50:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:21:55.662 05:50:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:55.921 05:50:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:21:55.921 [ 00:21:55.921 { 00:21:55.921 "name": "BaseBdev1", 00:21:55.922 "aliases": [ 00:21:55.922 "a4cd56b9-4569-47ca-aa06-8bd9608ce900" 00:21:55.922 ], 00:21:55.922 "product_name": "Malloc disk", 00:21:55.922 "block_size": 512, 00:21:55.922 "num_blocks": 65536, 00:21:55.922 "uuid": "a4cd56b9-4569-47ca-aa06-8bd9608ce900", 00:21:55.922 "assigned_rate_limits": { 00:21:55.922 "rw_ios_per_sec": 0, 00:21:55.922 "rw_mbytes_per_sec": 0, 00:21:55.922 "r_mbytes_per_sec": 0, 00:21:55.922 "w_mbytes_per_sec": 0 00:21:55.922 }, 00:21:55.922 "claimed": true, 00:21:55.922 "claim_type": "exclusive_write", 00:21:55.922 "zoned": false, 00:21:55.922 "supported_io_types": { 00:21:55.922 "read": true, 00:21:55.922 "write": true, 00:21:55.922 "unmap": true, 00:21:55.922 "flush": true, 00:21:55.922 "reset": true, 00:21:55.922 "nvme_admin": false, 00:21:55.922 "nvme_io": false, 00:21:55.922 "nvme_io_md": false, 00:21:55.922 "write_zeroes": true, 00:21:55.922 "zcopy": true, 00:21:55.922 "get_zone_info": false, 00:21:55.922 "zone_management": false, 00:21:55.922 "zone_append": false, 00:21:55.922 "compare": false, 00:21:55.922 "compare_and_write": false, 00:21:55.922 "abort": true, 00:21:55.922 "seek_hole": 
false, 00:21:55.922 "seek_data": false, 00:21:55.922 "copy": true, 00:21:55.922 "nvme_iov_md": false 00:21:55.922 }, 00:21:55.922 "memory_domains": [ 00:21:55.922 { 00:21:55.922 "dma_device_id": "system", 00:21:55.922 "dma_device_type": 1 00:21:55.922 }, 00:21:55.922 { 00:21:55.922 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:55.922 "dma_device_type": 2 00:21:55.922 } 00:21:55.922 ], 00:21:55.922 "driver_specific": {} 00:21:55.922 } 00:21:55.922 ] 00:21:56.180 05:50:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:21:56.180 05:50:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:21:56.180 05:50:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:56.180 05:50:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:56.180 05:50:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:21:56.180 05:50:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:56.180 05:50:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:56.180 05:50:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:56.180 05:50:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:56.180 05:50:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:56.180 05:50:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:56.180 05:50:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:56.180 05:50:10 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:56.180 05:50:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:56.180 "name": "Existed_Raid", 00:21:56.180 "uuid": "8eeeef8a-f7f4-45d6-a26a-88de2eaaac5c", 00:21:56.180 "strip_size_kb": 64, 00:21:56.180 "state": "configuring", 00:21:56.180 "raid_level": "concat", 00:21:56.180 "superblock": true, 00:21:56.180 "num_base_bdevs": 4, 00:21:56.180 "num_base_bdevs_discovered": 1, 00:21:56.180 "num_base_bdevs_operational": 4, 00:21:56.180 "base_bdevs_list": [ 00:21:56.180 { 00:21:56.180 "name": "BaseBdev1", 00:21:56.180 "uuid": "a4cd56b9-4569-47ca-aa06-8bd9608ce900", 00:21:56.180 "is_configured": true, 00:21:56.180 "data_offset": 2048, 00:21:56.180 "data_size": 63488 00:21:56.180 }, 00:21:56.180 { 00:21:56.180 "name": "BaseBdev2", 00:21:56.180 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:56.180 "is_configured": false, 00:21:56.180 "data_offset": 0, 00:21:56.180 "data_size": 0 00:21:56.180 }, 00:21:56.180 { 00:21:56.180 "name": "BaseBdev3", 00:21:56.180 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:56.180 "is_configured": false, 00:21:56.180 "data_offset": 0, 00:21:56.180 "data_size": 0 00:21:56.180 }, 00:21:56.180 { 00:21:56.180 "name": "BaseBdev4", 00:21:56.180 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:56.180 "is_configured": false, 00:21:56.180 "data_offset": 0, 00:21:56.180 "data_size": 0 00:21:56.180 } 00:21:56.180 ] 00:21:56.180 }' 00:21:56.181 05:50:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:56.181 05:50:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:57.116 05:50:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:21:57.116 [2024-07-26 
05:50:11.949974] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:21:57.116 [2024-07-26 05:50:11.950016] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x106f310 name Existed_Raid, state configuring 00:21:57.117 05:50:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:21:57.376 [2024-07-26 05:50:12.194678] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:21:57.376 [2024-07-26 05:50:12.196113] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:21:57.376 [2024-07-26 05:50:12.196146] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:21:57.376 [2024-07-26 05:50:12.196156] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:21:57.376 [2024-07-26 05:50:12.196167] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:21:57.376 [2024-07-26 05:50:12.196177] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:21:57.376 [2024-07-26 05:50:12.196187] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:21:57.376 05:50:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:21:57.376 05:50:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:21:57.376 05:50:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:21:57.376 05:50:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:57.376 05:50:12 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:57.376 05:50:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:21:57.376 05:50:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:57.376 05:50:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:57.376 05:50:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:57.376 05:50:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:57.376 05:50:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:57.376 05:50:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:57.376 05:50:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:57.376 05:50:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:57.646 05:50:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:57.646 "name": "Existed_Raid", 00:21:57.646 "uuid": "cf8ee40a-c4f2-4caf-a07b-2b8bff6b5e5a", 00:21:57.646 "strip_size_kb": 64, 00:21:57.646 "state": "configuring", 00:21:57.646 "raid_level": "concat", 00:21:57.646 "superblock": true, 00:21:57.646 "num_base_bdevs": 4, 00:21:57.646 "num_base_bdevs_discovered": 1, 00:21:57.647 "num_base_bdevs_operational": 4, 00:21:57.647 "base_bdevs_list": [ 00:21:57.647 { 00:21:57.647 "name": "BaseBdev1", 00:21:57.647 "uuid": "a4cd56b9-4569-47ca-aa06-8bd9608ce900", 00:21:57.647 "is_configured": true, 00:21:57.647 "data_offset": 2048, 00:21:57.647 "data_size": 63488 00:21:57.647 }, 00:21:57.647 { 00:21:57.647 "name": "BaseBdev2", 00:21:57.647 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:21:57.647 "is_configured": false, 00:21:57.647 "data_offset": 0, 00:21:57.647 "data_size": 0 00:21:57.647 }, 00:21:57.647 { 00:21:57.647 "name": "BaseBdev3", 00:21:57.647 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:57.647 "is_configured": false, 00:21:57.647 "data_offset": 0, 00:21:57.647 "data_size": 0 00:21:57.647 }, 00:21:57.647 { 00:21:57.647 "name": "BaseBdev4", 00:21:57.647 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:57.647 "is_configured": false, 00:21:57.647 "data_offset": 0, 00:21:57.647 "data_size": 0 00:21:57.647 } 00:21:57.647 ] 00:21:57.647 }' 00:21:57.647 05:50:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:57.647 05:50:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:58.213 05:50:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:21:58.472 [2024-07-26 05:50:13.292970] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:21:58.473 BaseBdev2 00:21:58.473 05:50:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:21:58.473 05:50:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:21:58.473 05:50:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:21:58.473 05:50:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:21:58.473 05:50:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:21:58.473 05:50:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:21:58.473 05:50:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:58.730 05:50:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:21:58.988 [ 00:21:58.988 { 00:21:58.988 "name": "BaseBdev2", 00:21:58.988 "aliases": [ 00:21:58.988 "af04bb76-b262-413d-88b2-cfe8939979f1" 00:21:58.988 ], 00:21:58.988 "product_name": "Malloc disk", 00:21:58.988 "block_size": 512, 00:21:58.988 "num_blocks": 65536, 00:21:58.988 "uuid": "af04bb76-b262-413d-88b2-cfe8939979f1", 00:21:58.988 "assigned_rate_limits": { 00:21:58.988 "rw_ios_per_sec": 0, 00:21:58.988 "rw_mbytes_per_sec": 0, 00:21:58.988 "r_mbytes_per_sec": 0, 00:21:58.988 "w_mbytes_per_sec": 0 00:21:58.988 }, 00:21:58.988 "claimed": true, 00:21:58.988 "claim_type": "exclusive_write", 00:21:58.988 "zoned": false, 00:21:58.988 "supported_io_types": { 00:21:58.988 "read": true, 00:21:58.988 "write": true, 00:21:58.988 "unmap": true, 00:21:58.988 "flush": true, 00:21:58.988 "reset": true, 00:21:58.988 "nvme_admin": false, 00:21:58.988 "nvme_io": false, 00:21:58.988 "nvme_io_md": false, 00:21:58.988 "write_zeroes": true, 00:21:58.988 "zcopy": true, 00:21:58.988 "get_zone_info": false, 00:21:58.988 "zone_management": false, 00:21:58.988 "zone_append": false, 00:21:58.988 "compare": false, 00:21:58.988 "compare_and_write": false, 00:21:58.988 "abort": true, 00:21:58.988 "seek_hole": false, 00:21:58.988 "seek_data": false, 00:21:58.988 "copy": true, 00:21:58.988 "nvme_iov_md": false 00:21:58.988 }, 00:21:58.988 "memory_domains": [ 00:21:58.988 { 00:21:58.988 "dma_device_id": "system", 00:21:58.988 "dma_device_type": 1 00:21:58.988 }, 00:21:58.988 { 00:21:58.988 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:58.988 "dma_device_type": 2 00:21:58.988 } 00:21:58.988 ], 00:21:58.988 "driver_specific": {} 00:21:58.988 } 00:21:58.988 ] 
00:21:58.988 05:50:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:21:58.988 05:50:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:21:58.988 05:50:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:21:58.988 05:50:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:21:58.988 05:50:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:58.988 05:50:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:58.988 05:50:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:21:58.988 05:50:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:58.988 05:50:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:58.988 05:50:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:58.988 05:50:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:58.988 05:50:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:58.988 05:50:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:58.988 05:50:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:58.988 05:50:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:59.247 05:50:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:59.247 "name": "Existed_Raid", 
00:21:59.247 "uuid": "cf8ee40a-c4f2-4caf-a07b-2b8bff6b5e5a", 00:21:59.247 "strip_size_kb": 64, 00:21:59.247 "state": "configuring", 00:21:59.247 "raid_level": "concat", 00:21:59.247 "superblock": true, 00:21:59.247 "num_base_bdevs": 4, 00:21:59.247 "num_base_bdevs_discovered": 2, 00:21:59.247 "num_base_bdevs_operational": 4, 00:21:59.247 "base_bdevs_list": [ 00:21:59.247 { 00:21:59.247 "name": "BaseBdev1", 00:21:59.247 "uuid": "a4cd56b9-4569-47ca-aa06-8bd9608ce900", 00:21:59.247 "is_configured": true, 00:21:59.247 "data_offset": 2048, 00:21:59.247 "data_size": 63488 00:21:59.247 }, 00:21:59.247 { 00:21:59.247 "name": "BaseBdev2", 00:21:59.247 "uuid": "af04bb76-b262-413d-88b2-cfe8939979f1", 00:21:59.247 "is_configured": true, 00:21:59.247 "data_offset": 2048, 00:21:59.247 "data_size": 63488 00:21:59.247 }, 00:21:59.247 { 00:21:59.247 "name": "BaseBdev3", 00:21:59.247 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:59.247 "is_configured": false, 00:21:59.247 "data_offset": 0, 00:21:59.247 "data_size": 0 00:21:59.247 }, 00:21:59.247 { 00:21:59.247 "name": "BaseBdev4", 00:21:59.247 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:59.247 "is_configured": false, 00:21:59.247 "data_offset": 0, 00:21:59.247 "data_size": 0 00:21:59.247 } 00:21:59.247 ] 00:21:59.247 }' 00:21:59.247 05:50:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:59.247 05:50:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:00.183 05:50:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:22:00.442 [2024-07-26 05:50:15.153367] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:22:00.442 BaseBdev3 00:22:00.442 05:50:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:22:00.442 
05:50:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:22:00.442 05:50:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:22:00.442 05:50:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:22:00.442 05:50:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:22:00.442 05:50:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:22:00.442 05:50:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:22:00.700 05:50:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:22:00.958 [ 00:22:00.958 { 00:22:00.958 "name": "BaseBdev3", 00:22:00.958 "aliases": [ 00:22:00.958 "500b58a0-bbe5-41ec-8c88-f31cae07babe" 00:22:00.958 ], 00:22:00.958 "product_name": "Malloc disk", 00:22:00.958 "block_size": 512, 00:22:00.958 "num_blocks": 65536, 00:22:00.958 "uuid": "500b58a0-bbe5-41ec-8c88-f31cae07babe", 00:22:00.958 "assigned_rate_limits": { 00:22:00.958 "rw_ios_per_sec": 0, 00:22:00.958 "rw_mbytes_per_sec": 0, 00:22:00.958 "r_mbytes_per_sec": 0, 00:22:00.958 "w_mbytes_per_sec": 0 00:22:00.958 }, 00:22:00.958 "claimed": true, 00:22:00.958 "claim_type": "exclusive_write", 00:22:00.958 "zoned": false, 00:22:00.958 "supported_io_types": { 00:22:00.958 "read": true, 00:22:00.958 "write": true, 00:22:00.958 "unmap": true, 00:22:00.958 "flush": true, 00:22:00.958 "reset": true, 00:22:00.958 "nvme_admin": false, 00:22:00.958 "nvme_io": false, 00:22:00.958 "nvme_io_md": false, 00:22:00.958 "write_zeroes": true, 00:22:00.958 "zcopy": true, 00:22:00.958 "get_zone_info": 
false, 00:22:00.958 "zone_management": false, 00:22:00.958 "zone_append": false, 00:22:00.958 "compare": false, 00:22:00.958 "compare_and_write": false, 00:22:00.958 "abort": true, 00:22:00.958 "seek_hole": false, 00:22:00.958 "seek_data": false, 00:22:00.959 "copy": true, 00:22:00.959 "nvme_iov_md": false 00:22:00.959 }, 00:22:00.959 "memory_domains": [ 00:22:00.959 { 00:22:00.959 "dma_device_id": "system", 00:22:00.959 "dma_device_type": 1 00:22:00.959 }, 00:22:00.959 { 00:22:00.959 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:00.959 "dma_device_type": 2 00:22:00.959 } 00:22:00.959 ], 00:22:00.959 "driver_specific": {} 00:22:00.959 } 00:22:00.959 ] 00:22:00.959 05:50:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:22:00.959 05:50:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:22:00.959 05:50:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:22:00.959 05:50:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:22:00.959 05:50:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:00.959 05:50:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:00.959 05:50:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:22:00.959 05:50:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:22:00.959 05:50:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:00.959 05:50:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:00.959 05:50:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:00.959 05:50:15 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:00.959 05:50:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:00.959 05:50:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:00.959 05:50:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:01.217 05:50:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:01.217 "name": "Existed_Raid", 00:22:01.217 "uuid": "cf8ee40a-c4f2-4caf-a07b-2b8bff6b5e5a", 00:22:01.217 "strip_size_kb": 64, 00:22:01.217 "state": "configuring", 00:22:01.217 "raid_level": "concat", 00:22:01.217 "superblock": true, 00:22:01.217 "num_base_bdevs": 4, 00:22:01.217 "num_base_bdevs_discovered": 3, 00:22:01.217 "num_base_bdevs_operational": 4, 00:22:01.217 "base_bdevs_list": [ 00:22:01.217 { 00:22:01.217 "name": "BaseBdev1", 00:22:01.217 "uuid": "a4cd56b9-4569-47ca-aa06-8bd9608ce900", 00:22:01.217 "is_configured": true, 00:22:01.217 "data_offset": 2048, 00:22:01.217 "data_size": 63488 00:22:01.217 }, 00:22:01.217 { 00:22:01.217 "name": "BaseBdev2", 00:22:01.217 "uuid": "af04bb76-b262-413d-88b2-cfe8939979f1", 00:22:01.217 "is_configured": true, 00:22:01.217 "data_offset": 2048, 00:22:01.217 "data_size": 63488 00:22:01.217 }, 00:22:01.217 { 00:22:01.217 "name": "BaseBdev3", 00:22:01.217 "uuid": "500b58a0-bbe5-41ec-8c88-f31cae07babe", 00:22:01.217 "is_configured": true, 00:22:01.217 "data_offset": 2048, 00:22:01.217 "data_size": 63488 00:22:01.217 }, 00:22:01.217 { 00:22:01.217 "name": "BaseBdev4", 00:22:01.217 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:01.217 "is_configured": false, 00:22:01.217 "data_offset": 0, 00:22:01.217 "data_size": 0 00:22:01.217 } 00:22:01.217 ] 00:22:01.217 }' 00:22:01.217 
05:50:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:01.217 05:50:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:01.784 05:50:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:22:02.043 [2024-07-26 05:50:16.753089] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:22:02.043 [2024-07-26 05:50:16.753263] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1070350 00:22:02.043 [2024-07-26 05:50:16.753276] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:22:02.043 [2024-07-26 05:50:16.753450] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1070020 00:22:02.043 [2024-07-26 05:50:16.753568] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1070350 00:22:02.043 [2024-07-26 05:50:16.753578] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1070350 00:22:02.043 [2024-07-26 05:50:16.753679] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:02.043 BaseBdev4 00:22:02.043 05:50:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:22:02.044 05:50:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:22:02.044 05:50:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:22:02.044 05:50:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:22:02.044 05:50:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:22:02.044 05:50:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # 
bdev_timeout=2000 00:22:02.044 05:50:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:22:02.302 05:50:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:22:02.561 [ 00:22:02.561 { 00:22:02.561 "name": "BaseBdev4", 00:22:02.561 "aliases": [ 00:22:02.561 "853a5fbe-4d51-4931-8fb6-96a846cc0e5f" 00:22:02.561 ], 00:22:02.561 "product_name": "Malloc disk", 00:22:02.561 "block_size": 512, 00:22:02.561 "num_blocks": 65536, 00:22:02.561 "uuid": "853a5fbe-4d51-4931-8fb6-96a846cc0e5f", 00:22:02.561 "assigned_rate_limits": { 00:22:02.561 "rw_ios_per_sec": 0, 00:22:02.561 "rw_mbytes_per_sec": 0, 00:22:02.561 "r_mbytes_per_sec": 0, 00:22:02.561 "w_mbytes_per_sec": 0 00:22:02.561 }, 00:22:02.561 "claimed": true, 00:22:02.561 "claim_type": "exclusive_write", 00:22:02.561 "zoned": false, 00:22:02.561 "supported_io_types": { 00:22:02.561 "read": true, 00:22:02.561 "write": true, 00:22:02.561 "unmap": true, 00:22:02.561 "flush": true, 00:22:02.561 "reset": true, 00:22:02.561 "nvme_admin": false, 00:22:02.561 "nvme_io": false, 00:22:02.561 "nvme_io_md": false, 00:22:02.561 "write_zeroes": true, 00:22:02.561 "zcopy": true, 00:22:02.561 "get_zone_info": false, 00:22:02.561 "zone_management": false, 00:22:02.561 "zone_append": false, 00:22:02.561 "compare": false, 00:22:02.561 "compare_and_write": false, 00:22:02.561 "abort": true, 00:22:02.561 "seek_hole": false, 00:22:02.561 "seek_data": false, 00:22:02.561 "copy": true, 00:22:02.561 "nvme_iov_md": false 00:22:02.561 }, 00:22:02.561 "memory_domains": [ 00:22:02.561 { 00:22:02.561 "dma_device_id": "system", 00:22:02.561 "dma_device_type": 1 00:22:02.561 }, 00:22:02.561 { 00:22:02.561 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:02.561 
"dma_device_type": 2 00:22:02.561 } 00:22:02.561 ], 00:22:02.561 "driver_specific": {} 00:22:02.561 } 00:22:02.561 ] 00:22:02.561 05:50:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:22:02.561 05:50:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:22:02.561 05:50:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:22:02.561 05:50:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 4 00:22:02.561 05:50:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:02.561 05:50:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:02.561 05:50:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:22:02.561 05:50:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:22:02.561 05:50:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:02.561 05:50:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:02.561 05:50:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:02.561 05:50:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:02.561 05:50:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:02.561 05:50:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:02.561 05:50:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:02.820 05:50:17 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:02.820 "name": "Existed_Raid", 00:22:02.820 "uuid": "cf8ee40a-c4f2-4caf-a07b-2b8bff6b5e5a", 00:22:02.820 "strip_size_kb": 64, 00:22:02.820 "state": "online", 00:22:02.820 "raid_level": "concat", 00:22:02.820 "superblock": true, 00:22:02.820 "num_base_bdevs": 4, 00:22:02.820 "num_base_bdevs_discovered": 4, 00:22:02.820 "num_base_bdevs_operational": 4, 00:22:02.820 "base_bdevs_list": [ 00:22:02.820 { 00:22:02.820 "name": "BaseBdev1", 00:22:02.820 "uuid": "a4cd56b9-4569-47ca-aa06-8bd9608ce900", 00:22:02.820 "is_configured": true, 00:22:02.820 "data_offset": 2048, 00:22:02.820 "data_size": 63488 00:22:02.820 }, 00:22:02.820 { 00:22:02.820 "name": "BaseBdev2", 00:22:02.820 "uuid": "af04bb76-b262-413d-88b2-cfe8939979f1", 00:22:02.820 "is_configured": true, 00:22:02.820 "data_offset": 2048, 00:22:02.820 "data_size": 63488 00:22:02.820 }, 00:22:02.820 { 00:22:02.820 "name": "BaseBdev3", 00:22:02.820 "uuid": "500b58a0-bbe5-41ec-8c88-f31cae07babe", 00:22:02.820 "is_configured": true, 00:22:02.820 "data_offset": 2048, 00:22:02.820 "data_size": 63488 00:22:02.820 }, 00:22:02.820 { 00:22:02.820 "name": "BaseBdev4", 00:22:02.820 "uuid": "853a5fbe-4d51-4931-8fb6-96a846cc0e5f", 00:22:02.820 "is_configured": true, 00:22:02.820 "data_offset": 2048, 00:22:02.820 "data_size": 63488 00:22:02.820 } 00:22:02.820 ] 00:22:02.820 }' 00:22:02.820 05:50:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:02.820 05:50:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:03.388 05:50:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:22:03.388 05:50:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:22:03.388 05:50:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 
00:22:03.389 05:50:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:22:03.389 05:50:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:22:03.389 05:50:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:22:03.389 05:50:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:22:03.389 05:50:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:22:03.647 [2024-07-26 05:50:18.365688] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:22:03.647 05:50:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:22:03.647 "name": "Existed_Raid", 00:22:03.647 "aliases": [ 00:22:03.647 "cf8ee40a-c4f2-4caf-a07b-2b8bff6b5e5a" 00:22:03.647 ], 00:22:03.647 "product_name": "Raid Volume", 00:22:03.647 "block_size": 512, 00:22:03.647 "num_blocks": 253952, 00:22:03.647 "uuid": "cf8ee40a-c4f2-4caf-a07b-2b8bff6b5e5a", 00:22:03.647 "assigned_rate_limits": { 00:22:03.647 "rw_ios_per_sec": 0, 00:22:03.647 "rw_mbytes_per_sec": 0, 00:22:03.647 "r_mbytes_per_sec": 0, 00:22:03.647 "w_mbytes_per_sec": 0 00:22:03.647 }, 00:22:03.647 "claimed": false, 00:22:03.647 "zoned": false, 00:22:03.647 "supported_io_types": { 00:22:03.647 "read": true, 00:22:03.647 "write": true, 00:22:03.647 "unmap": true, 00:22:03.647 "flush": true, 00:22:03.647 "reset": true, 00:22:03.647 "nvme_admin": false, 00:22:03.647 "nvme_io": false, 00:22:03.647 "nvme_io_md": false, 00:22:03.647 "write_zeroes": true, 00:22:03.647 "zcopy": false, 00:22:03.647 "get_zone_info": false, 00:22:03.647 "zone_management": false, 00:22:03.647 "zone_append": false, 00:22:03.647 "compare": false, 00:22:03.647 "compare_and_write": false, 00:22:03.647 "abort": false, 00:22:03.647 "seek_hole": 
false, 00:22:03.647 "seek_data": false, 00:22:03.647 "copy": false, 00:22:03.647 "nvme_iov_md": false 00:22:03.647 }, 00:22:03.647 "memory_domains": [ 00:22:03.647 { 00:22:03.647 "dma_device_id": "system", 00:22:03.647 "dma_device_type": 1 00:22:03.647 }, 00:22:03.647 { 00:22:03.647 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:03.647 "dma_device_type": 2 00:22:03.647 }, 00:22:03.647 { 00:22:03.647 "dma_device_id": "system", 00:22:03.647 "dma_device_type": 1 00:22:03.647 }, 00:22:03.647 { 00:22:03.647 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:03.647 "dma_device_type": 2 00:22:03.647 }, 00:22:03.647 { 00:22:03.647 "dma_device_id": "system", 00:22:03.647 "dma_device_type": 1 00:22:03.647 }, 00:22:03.647 { 00:22:03.647 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:03.647 "dma_device_type": 2 00:22:03.647 }, 00:22:03.647 { 00:22:03.648 "dma_device_id": "system", 00:22:03.648 "dma_device_type": 1 00:22:03.648 }, 00:22:03.648 { 00:22:03.648 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:03.648 "dma_device_type": 2 00:22:03.648 } 00:22:03.648 ], 00:22:03.648 "driver_specific": { 00:22:03.648 "raid": { 00:22:03.648 "uuid": "cf8ee40a-c4f2-4caf-a07b-2b8bff6b5e5a", 00:22:03.648 "strip_size_kb": 64, 00:22:03.648 "state": "online", 00:22:03.648 "raid_level": "concat", 00:22:03.648 "superblock": true, 00:22:03.648 "num_base_bdevs": 4, 00:22:03.648 "num_base_bdevs_discovered": 4, 00:22:03.648 "num_base_bdevs_operational": 4, 00:22:03.648 "base_bdevs_list": [ 00:22:03.648 { 00:22:03.648 "name": "BaseBdev1", 00:22:03.648 "uuid": "a4cd56b9-4569-47ca-aa06-8bd9608ce900", 00:22:03.648 "is_configured": true, 00:22:03.648 "data_offset": 2048, 00:22:03.648 "data_size": 63488 00:22:03.648 }, 00:22:03.648 { 00:22:03.648 "name": "BaseBdev2", 00:22:03.648 "uuid": "af04bb76-b262-413d-88b2-cfe8939979f1", 00:22:03.648 "is_configured": true, 00:22:03.648 "data_offset": 2048, 00:22:03.648 "data_size": 63488 00:22:03.648 }, 00:22:03.648 { 00:22:03.648 "name": "BaseBdev3", 00:22:03.648 
"uuid": "500b58a0-bbe5-41ec-8c88-f31cae07babe", 00:22:03.648 "is_configured": true, 00:22:03.648 "data_offset": 2048, 00:22:03.648 "data_size": 63488 00:22:03.648 }, 00:22:03.648 { 00:22:03.648 "name": "BaseBdev4", 00:22:03.648 "uuid": "853a5fbe-4d51-4931-8fb6-96a846cc0e5f", 00:22:03.648 "is_configured": true, 00:22:03.648 "data_offset": 2048, 00:22:03.648 "data_size": 63488 00:22:03.648 } 00:22:03.648 ] 00:22:03.648 } 00:22:03.648 } 00:22:03.648 }' 00:22:03.648 05:50:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:22:03.648 05:50:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:22:03.648 BaseBdev2 00:22:03.648 BaseBdev3 00:22:03.648 BaseBdev4' 00:22:03.648 05:50:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:03.648 05:50:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:22:03.648 05:50:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:03.906 05:50:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:03.906 "name": "BaseBdev1", 00:22:03.906 "aliases": [ 00:22:03.906 "a4cd56b9-4569-47ca-aa06-8bd9608ce900" 00:22:03.906 ], 00:22:03.906 "product_name": "Malloc disk", 00:22:03.906 "block_size": 512, 00:22:03.906 "num_blocks": 65536, 00:22:03.906 "uuid": "a4cd56b9-4569-47ca-aa06-8bd9608ce900", 00:22:03.906 "assigned_rate_limits": { 00:22:03.906 "rw_ios_per_sec": 0, 00:22:03.906 "rw_mbytes_per_sec": 0, 00:22:03.906 "r_mbytes_per_sec": 0, 00:22:03.906 "w_mbytes_per_sec": 0 00:22:03.906 }, 00:22:03.906 "claimed": true, 00:22:03.906 "claim_type": "exclusive_write", 00:22:03.906 "zoned": false, 00:22:03.906 "supported_io_types": { 
00:22:03.906 "read": true, 00:22:03.906 "write": true, 00:22:03.906 "unmap": true, 00:22:03.906 "flush": true, 00:22:03.906 "reset": true, 00:22:03.906 "nvme_admin": false, 00:22:03.906 "nvme_io": false, 00:22:03.906 "nvme_io_md": false, 00:22:03.906 "write_zeroes": true, 00:22:03.906 "zcopy": true, 00:22:03.906 "get_zone_info": false, 00:22:03.906 "zone_management": false, 00:22:03.906 "zone_append": false, 00:22:03.906 "compare": false, 00:22:03.906 "compare_and_write": false, 00:22:03.906 "abort": true, 00:22:03.906 "seek_hole": false, 00:22:03.906 "seek_data": false, 00:22:03.906 "copy": true, 00:22:03.906 "nvme_iov_md": false 00:22:03.906 }, 00:22:03.906 "memory_domains": [ 00:22:03.906 { 00:22:03.906 "dma_device_id": "system", 00:22:03.906 "dma_device_type": 1 00:22:03.906 }, 00:22:03.906 { 00:22:03.906 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:03.906 "dma_device_type": 2 00:22:03.906 } 00:22:03.906 ], 00:22:03.906 "driver_specific": {} 00:22:03.906 }' 00:22:03.906 05:50:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:03.906 05:50:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:03.906 05:50:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:03.906 05:50:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:04.164 05:50:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:04.164 05:50:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:04.164 05:50:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:04.164 05:50:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:04.164 05:50:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:04.164 05:50:18 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:04.164 05:50:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:04.423 05:50:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:04.423 05:50:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:04.423 05:50:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:22:04.423 05:50:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:04.423 05:50:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:04.423 "name": "BaseBdev2", 00:22:04.423 "aliases": [ 00:22:04.423 "af04bb76-b262-413d-88b2-cfe8939979f1" 00:22:04.423 ], 00:22:04.423 "product_name": "Malloc disk", 00:22:04.423 "block_size": 512, 00:22:04.423 "num_blocks": 65536, 00:22:04.423 "uuid": "af04bb76-b262-413d-88b2-cfe8939979f1", 00:22:04.423 "assigned_rate_limits": { 00:22:04.423 "rw_ios_per_sec": 0, 00:22:04.423 "rw_mbytes_per_sec": 0, 00:22:04.423 "r_mbytes_per_sec": 0, 00:22:04.423 "w_mbytes_per_sec": 0 00:22:04.423 }, 00:22:04.423 "claimed": true, 00:22:04.423 "claim_type": "exclusive_write", 00:22:04.423 "zoned": false, 00:22:04.423 "supported_io_types": { 00:22:04.423 "read": true, 00:22:04.423 "write": true, 00:22:04.423 "unmap": true, 00:22:04.423 "flush": true, 00:22:04.423 "reset": true, 00:22:04.423 "nvme_admin": false, 00:22:04.423 "nvme_io": false, 00:22:04.423 "nvme_io_md": false, 00:22:04.423 "write_zeroes": true, 00:22:04.423 "zcopy": true, 00:22:04.423 "get_zone_info": false, 00:22:04.423 "zone_management": false, 00:22:04.423 "zone_append": false, 00:22:04.423 "compare": false, 00:22:04.423 "compare_and_write": false, 00:22:04.423 "abort": true, 00:22:04.423 "seek_hole": false, 00:22:04.423 "seek_data": 
false, 00:22:04.423 "copy": true, 00:22:04.423 "nvme_iov_md": false 00:22:04.424 }, 00:22:04.424 "memory_domains": [ 00:22:04.424 { 00:22:04.424 "dma_device_id": "system", 00:22:04.424 "dma_device_type": 1 00:22:04.424 }, 00:22:04.424 { 00:22:04.424 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:04.424 "dma_device_type": 2 00:22:04.424 } 00:22:04.424 ], 00:22:04.424 "driver_specific": {} 00:22:04.424 }' 00:22:04.424 05:50:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:04.682 05:50:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:04.682 05:50:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:04.682 05:50:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:04.682 05:50:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:04.682 05:50:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:04.682 05:50:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:04.682 05:50:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:04.941 05:50:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:04.941 05:50:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:04.941 05:50:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:04.941 05:50:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:04.941 05:50:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:04.941 05:50:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b 
BaseBdev3 00:22:04.941 05:50:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:05.199 05:50:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:05.199 "name": "BaseBdev3", 00:22:05.199 "aliases": [ 00:22:05.199 "500b58a0-bbe5-41ec-8c88-f31cae07babe" 00:22:05.199 ], 00:22:05.199 "product_name": "Malloc disk", 00:22:05.199 "block_size": 512, 00:22:05.199 "num_blocks": 65536, 00:22:05.199 "uuid": "500b58a0-bbe5-41ec-8c88-f31cae07babe", 00:22:05.199 "assigned_rate_limits": { 00:22:05.199 "rw_ios_per_sec": 0, 00:22:05.199 "rw_mbytes_per_sec": 0, 00:22:05.199 "r_mbytes_per_sec": 0, 00:22:05.199 "w_mbytes_per_sec": 0 00:22:05.199 }, 00:22:05.199 "claimed": true, 00:22:05.199 "claim_type": "exclusive_write", 00:22:05.199 "zoned": false, 00:22:05.199 "supported_io_types": { 00:22:05.199 "read": true, 00:22:05.199 "write": true, 00:22:05.199 "unmap": true, 00:22:05.199 "flush": true, 00:22:05.199 "reset": true, 00:22:05.199 "nvme_admin": false, 00:22:05.199 "nvme_io": false, 00:22:05.199 "nvme_io_md": false, 00:22:05.199 "write_zeroes": true, 00:22:05.199 "zcopy": true, 00:22:05.199 "get_zone_info": false, 00:22:05.199 "zone_management": false, 00:22:05.199 "zone_append": false, 00:22:05.199 "compare": false, 00:22:05.199 "compare_and_write": false, 00:22:05.199 "abort": true, 00:22:05.199 "seek_hole": false, 00:22:05.199 "seek_data": false, 00:22:05.199 "copy": true, 00:22:05.199 "nvme_iov_md": false 00:22:05.199 }, 00:22:05.199 "memory_domains": [ 00:22:05.199 { 00:22:05.199 "dma_device_id": "system", 00:22:05.199 "dma_device_type": 1 00:22:05.199 }, 00:22:05.199 { 00:22:05.199 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:05.199 "dma_device_type": 2 00:22:05.199 } 00:22:05.199 ], 00:22:05.199 "driver_specific": {} 00:22:05.199 }' 00:22:05.199 05:50:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:05.199 05:50:19 bdev_raid.raid_state_function_test_sb 
-- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:05.199 05:50:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:05.199 05:50:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:05.199 05:50:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:05.199 05:50:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:05.199 05:50:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:05.458 05:50:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:05.458 05:50:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:05.458 05:50:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:05.458 05:50:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:05.458 05:50:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:05.458 05:50:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:05.458 05:50:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:22:05.458 05:50:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:05.716 05:50:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:05.716 "name": "BaseBdev4", 00:22:05.716 "aliases": [ 00:22:05.716 "853a5fbe-4d51-4931-8fb6-96a846cc0e5f" 00:22:05.716 ], 00:22:05.716 "product_name": "Malloc disk", 00:22:05.716 "block_size": 512, 00:22:05.716 "num_blocks": 65536, 00:22:05.716 "uuid": "853a5fbe-4d51-4931-8fb6-96a846cc0e5f", 00:22:05.716 "assigned_rate_limits": { 00:22:05.716 
"rw_ios_per_sec": 0, 00:22:05.716 "rw_mbytes_per_sec": 0, 00:22:05.716 "r_mbytes_per_sec": 0, 00:22:05.716 "w_mbytes_per_sec": 0 00:22:05.716 }, 00:22:05.717 "claimed": true, 00:22:05.717 "claim_type": "exclusive_write", 00:22:05.717 "zoned": false, 00:22:05.717 "supported_io_types": { 00:22:05.717 "read": true, 00:22:05.717 "write": true, 00:22:05.717 "unmap": true, 00:22:05.717 "flush": true, 00:22:05.717 "reset": true, 00:22:05.717 "nvme_admin": false, 00:22:05.717 "nvme_io": false, 00:22:05.717 "nvme_io_md": false, 00:22:05.717 "write_zeroes": true, 00:22:05.717 "zcopy": true, 00:22:05.717 "get_zone_info": false, 00:22:05.717 "zone_management": false, 00:22:05.717 "zone_append": false, 00:22:05.717 "compare": false, 00:22:05.717 "compare_and_write": false, 00:22:05.717 "abort": true, 00:22:05.717 "seek_hole": false, 00:22:05.717 "seek_data": false, 00:22:05.717 "copy": true, 00:22:05.717 "nvme_iov_md": false 00:22:05.717 }, 00:22:05.717 "memory_domains": [ 00:22:05.717 { 00:22:05.717 "dma_device_id": "system", 00:22:05.717 "dma_device_type": 1 00:22:05.717 }, 00:22:05.717 { 00:22:05.717 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:05.717 "dma_device_type": 2 00:22:05.717 } 00:22:05.717 ], 00:22:05.717 "driver_specific": {} 00:22:05.717 }' 00:22:05.717 05:50:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:05.717 05:50:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:05.717 05:50:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:05.717 05:50:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:05.975 05:50:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:05.976 05:50:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:05.976 05:50:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq 
.md_interleave 00:22:05.976 05:50:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:05.976 05:50:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:05.976 05:50:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:05.976 05:50:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:05.976 05:50:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:05.976 05:50:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:22:06.235 [2024-07-26 05:50:21.096664] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:22:06.235 [2024-07-26 05:50:21.096691] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:22:06.235 [2024-07-26 05:50:21.096739] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:22:06.235 05:50:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:22:06.235 05:50:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:22:06.235 05:50:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:22:06.235 05:50:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:22:06.235 05:50:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:22:06.235 05:50:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 3 00:22:06.235 05:50:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:06.235 05:50:21 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:22:06.235 05:50:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:22:06.235 05:50:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:22:06.235 05:50:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:22:06.235 05:50:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:06.235 05:50:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:06.235 05:50:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:06.235 05:50:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:06.235 05:50:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:06.235 05:50:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:06.494 05:50:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:06.494 "name": "Existed_Raid", 00:22:06.494 "uuid": "cf8ee40a-c4f2-4caf-a07b-2b8bff6b5e5a", 00:22:06.494 "strip_size_kb": 64, 00:22:06.494 "state": "offline", 00:22:06.494 "raid_level": "concat", 00:22:06.494 "superblock": true, 00:22:06.494 "num_base_bdevs": 4, 00:22:06.494 "num_base_bdevs_discovered": 3, 00:22:06.494 "num_base_bdevs_operational": 3, 00:22:06.494 "base_bdevs_list": [ 00:22:06.494 { 00:22:06.494 "name": null, 00:22:06.494 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:06.494 "is_configured": false, 00:22:06.494 "data_offset": 2048, 00:22:06.494 "data_size": 63488 00:22:06.494 }, 00:22:06.494 { 00:22:06.494 "name": "BaseBdev2", 00:22:06.494 "uuid": 
"af04bb76-b262-413d-88b2-cfe8939979f1", 00:22:06.494 "is_configured": true, 00:22:06.494 "data_offset": 2048, 00:22:06.494 "data_size": 63488 00:22:06.494 }, 00:22:06.494 { 00:22:06.494 "name": "BaseBdev3", 00:22:06.494 "uuid": "500b58a0-bbe5-41ec-8c88-f31cae07babe", 00:22:06.494 "is_configured": true, 00:22:06.494 "data_offset": 2048, 00:22:06.494 "data_size": 63488 00:22:06.494 }, 00:22:06.494 { 00:22:06.494 "name": "BaseBdev4", 00:22:06.494 "uuid": "853a5fbe-4d51-4931-8fb6-96a846cc0e5f", 00:22:06.494 "is_configured": true, 00:22:06.494 "data_offset": 2048, 00:22:06.494 "data_size": 63488 00:22:06.494 } 00:22:06.494 ] 00:22:06.494 }' 00:22:06.494 05:50:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:06.494 05:50:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:07.062 05:50:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:22:07.062 05:50:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:22:07.062 05:50:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:07.062 05:50:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:22:07.322 05:50:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:22:07.322 05:50:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:22:07.322 05:50:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:22:07.582 [2024-07-26 05:50:22.361127] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:22:07.582 05:50:22 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:22:07.582 05:50:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:22:07.582 05:50:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:07.582 05:50:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:22:07.841 05:50:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:22:07.841 05:50:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:22:07.841 05:50:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:22:08.100 [2024-07-26 05:50:22.866884] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:22:08.100 05:50:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:22:08.100 05:50:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:22:08.100 05:50:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:08.100 05:50:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:22:08.359 05:50:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:22:08.359 05:50:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:22:08.359 05:50:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:22:08.929 [2024-07-26 05:50:23.629296] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:22:08.929 [2024-07-26 05:50:23.629341] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1070350 name Existed_Raid, state offline 00:22:08.929 05:50:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:22:08.929 05:50:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:22:08.929 05:50:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:08.929 05:50:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:22:09.188 05:50:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:22:09.188 05:50:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:22:09.188 05:50:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:22:09.188 05:50:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:22:09.188 05:50:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:22:09.188 05:50:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:22:09.756 BaseBdev2 00:22:09.756 05:50:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:22:09.756 05:50:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:22:09.756 05:50:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:22:09.756 05:50:24 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:22:09.756 05:50:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:22:09.756 05:50:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:22:09.756 05:50:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:22:10.015 05:50:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:22:10.015 [ 00:22:10.015 { 00:22:10.015 "name": "BaseBdev2", 00:22:10.015 "aliases": [ 00:22:10.015 "09257d03-4149-4900-8810-1913d590e908" 00:22:10.015 ], 00:22:10.015 "product_name": "Malloc disk", 00:22:10.015 "block_size": 512, 00:22:10.015 "num_blocks": 65536, 00:22:10.015 "uuid": "09257d03-4149-4900-8810-1913d590e908", 00:22:10.015 "assigned_rate_limits": { 00:22:10.015 "rw_ios_per_sec": 0, 00:22:10.015 "rw_mbytes_per_sec": 0, 00:22:10.015 "r_mbytes_per_sec": 0, 00:22:10.015 "w_mbytes_per_sec": 0 00:22:10.015 }, 00:22:10.015 "claimed": false, 00:22:10.015 "zoned": false, 00:22:10.015 "supported_io_types": { 00:22:10.015 "read": true, 00:22:10.015 "write": true, 00:22:10.015 "unmap": true, 00:22:10.015 "flush": true, 00:22:10.015 "reset": true, 00:22:10.015 "nvme_admin": false, 00:22:10.015 "nvme_io": false, 00:22:10.015 "nvme_io_md": false, 00:22:10.015 "write_zeroes": true, 00:22:10.015 "zcopy": true, 00:22:10.015 "get_zone_info": false, 00:22:10.015 "zone_management": false, 00:22:10.015 "zone_append": false, 00:22:10.015 "compare": false, 00:22:10.015 "compare_and_write": false, 00:22:10.015 "abort": true, 00:22:10.015 "seek_hole": false, 00:22:10.015 "seek_data": false, 00:22:10.015 "copy": true, 00:22:10.015 "nvme_iov_md": 
false 00:22:10.015 }, 00:22:10.015 "memory_domains": [ 00:22:10.015 { 00:22:10.015 "dma_device_id": "system", 00:22:10.015 "dma_device_type": 1 00:22:10.015 }, 00:22:10.015 { 00:22:10.015 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:10.015 "dma_device_type": 2 00:22:10.015 } 00:22:10.015 ], 00:22:10.015 "driver_specific": {} 00:22:10.016 } 00:22:10.016 ] 00:22:10.016 05:50:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:22:10.016 05:50:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:22:10.016 05:50:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:22:10.016 05:50:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:22:10.274 BaseBdev3 00:22:10.275 05:50:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:22:10.275 05:50:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:22:10.275 05:50:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:22:10.275 05:50:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:22:10.275 05:50:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:22:10.275 05:50:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:22:10.275 05:50:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:22:10.534 05:50:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:22:10.793 [ 00:22:10.793 { 00:22:10.793 "name": "BaseBdev3", 00:22:10.793 "aliases": [ 00:22:10.793 "c2b3c81e-2699-4c7b-a440-c65264f165de" 00:22:10.793 ], 00:22:10.793 "product_name": "Malloc disk", 00:22:10.793 "block_size": 512, 00:22:10.793 "num_blocks": 65536, 00:22:10.793 "uuid": "c2b3c81e-2699-4c7b-a440-c65264f165de", 00:22:10.793 "assigned_rate_limits": { 00:22:10.793 "rw_ios_per_sec": 0, 00:22:10.793 "rw_mbytes_per_sec": 0, 00:22:10.793 "r_mbytes_per_sec": 0, 00:22:10.793 "w_mbytes_per_sec": 0 00:22:10.793 }, 00:22:10.793 "claimed": false, 00:22:10.793 "zoned": false, 00:22:10.793 "supported_io_types": { 00:22:10.793 "read": true, 00:22:10.793 "write": true, 00:22:10.793 "unmap": true, 00:22:10.793 "flush": true, 00:22:10.793 "reset": true, 00:22:10.793 "nvme_admin": false, 00:22:10.793 "nvme_io": false, 00:22:10.793 "nvme_io_md": false, 00:22:10.793 "write_zeroes": true, 00:22:10.793 "zcopy": true, 00:22:10.793 "get_zone_info": false, 00:22:10.793 "zone_management": false, 00:22:10.793 "zone_append": false, 00:22:10.793 "compare": false, 00:22:10.793 "compare_and_write": false, 00:22:10.793 "abort": true, 00:22:10.793 "seek_hole": false, 00:22:10.793 "seek_data": false, 00:22:10.793 "copy": true, 00:22:10.793 "nvme_iov_md": false 00:22:10.793 }, 00:22:10.793 "memory_domains": [ 00:22:10.793 { 00:22:10.793 "dma_device_id": "system", 00:22:10.793 "dma_device_type": 1 00:22:10.793 }, 00:22:10.793 { 00:22:10.793 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:10.793 "dma_device_type": 2 00:22:10.793 } 00:22:10.793 ], 00:22:10.793 "driver_specific": {} 00:22:10.793 } 00:22:10.793 ] 00:22:10.793 05:50:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:22:10.793 05:50:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:22:10.793 05:50:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 
00:22:10.793 05:50:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:22:11.052 BaseBdev4 00:22:11.052 05:50:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4 00:22:11.052 05:50:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:22:11.052 05:50:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:22:11.052 05:50:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:22:11.052 05:50:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:22:11.052 05:50:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:22:11.052 05:50:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:22:11.328 05:50:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:22:11.624 [ 00:22:11.624 { 00:22:11.624 "name": "BaseBdev4", 00:22:11.624 "aliases": [ 00:22:11.624 "dc90c97d-f5e1-4c93-b1fb-673abe51702f" 00:22:11.624 ], 00:22:11.624 "product_name": "Malloc disk", 00:22:11.624 "block_size": 512, 00:22:11.624 "num_blocks": 65536, 00:22:11.624 "uuid": "dc90c97d-f5e1-4c93-b1fb-673abe51702f", 00:22:11.624 "assigned_rate_limits": { 00:22:11.624 "rw_ios_per_sec": 0, 00:22:11.624 "rw_mbytes_per_sec": 0, 00:22:11.624 "r_mbytes_per_sec": 0, 00:22:11.624 "w_mbytes_per_sec": 0 00:22:11.624 }, 00:22:11.624 "claimed": false, 00:22:11.624 "zoned": false, 00:22:11.624 "supported_io_types": { 00:22:11.624 
"read": true, 00:22:11.624 "write": true, 00:22:11.624 "unmap": true, 00:22:11.624 "flush": true, 00:22:11.624 "reset": true, 00:22:11.624 "nvme_admin": false, 00:22:11.624 "nvme_io": false, 00:22:11.624 "nvme_io_md": false, 00:22:11.624 "write_zeroes": true, 00:22:11.624 "zcopy": true, 00:22:11.624 "get_zone_info": false, 00:22:11.624 "zone_management": false, 00:22:11.624 "zone_append": false, 00:22:11.624 "compare": false, 00:22:11.624 "compare_and_write": false, 00:22:11.624 "abort": true, 00:22:11.624 "seek_hole": false, 00:22:11.624 "seek_data": false, 00:22:11.624 "copy": true, 00:22:11.624 "nvme_iov_md": false 00:22:11.624 }, 00:22:11.624 "memory_domains": [ 00:22:11.624 { 00:22:11.624 "dma_device_id": "system", 00:22:11.624 "dma_device_type": 1 00:22:11.624 }, 00:22:11.624 { 00:22:11.624 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:11.624 "dma_device_type": 2 00:22:11.624 } 00:22:11.624 ], 00:22:11.624 "driver_specific": {} 00:22:11.624 } 00:22:11.624 ] 00:22:11.624 05:50:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:22:11.624 05:50:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:22:11.624 05:50:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:22:11.624 05:50:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:22:11.885 [2024-07-26 05:50:26.525212] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:22:11.885 [2024-07-26 05:50:26.525256] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:22:11.885 [2024-07-26 05:50:26.525278] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:22:11.885 [2024-07-26 
05:50:26.526654] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:22:11.885 [2024-07-26 05:50:26.526699] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:22:11.885 05:50:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:22:11.885 05:50:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:11.885 05:50:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:11.885 05:50:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:22:11.885 05:50:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:22:11.885 05:50:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:11.885 05:50:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:11.885 05:50:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:11.885 05:50:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:11.885 05:50:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:11.885 05:50:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:11.885 05:50:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:12.143 05:50:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:12.143 "name": "Existed_Raid", 00:22:12.144 "uuid": "95a2961a-4eb5-4314-8794-54f4aae35062", 00:22:12.144 "strip_size_kb": 64, 
00:22:12.144 "state": "configuring", 00:22:12.144 "raid_level": "concat", 00:22:12.144 "superblock": true, 00:22:12.144 "num_base_bdevs": 4, 00:22:12.144 "num_base_bdevs_discovered": 3, 00:22:12.144 "num_base_bdevs_operational": 4, 00:22:12.144 "base_bdevs_list": [ 00:22:12.144 { 00:22:12.144 "name": "BaseBdev1", 00:22:12.144 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:12.144 "is_configured": false, 00:22:12.144 "data_offset": 0, 00:22:12.144 "data_size": 0 00:22:12.144 }, 00:22:12.144 { 00:22:12.144 "name": "BaseBdev2", 00:22:12.144 "uuid": "09257d03-4149-4900-8810-1913d590e908", 00:22:12.144 "is_configured": true, 00:22:12.144 "data_offset": 2048, 00:22:12.144 "data_size": 63488 00:22:12.144 }, 00:22:12.144 { 00:22:12.144 "name": "BaseBdev3", 00:22:12.144 "uuid": "c2b3c81e-2699-4c7b-a440-c65264f165de", 00:22:12.144 "is_configured": true, 00:22:12.144 "data_offset": 2048, 00:22:12.144 "data_size": 63488 00:22:12.144 }, 00:22:12.144 { 00:22:12.144 "name": "BaseBdev4", 00:22:12.144 "uuid": "dc90c97d-f5e1-4c93-b1fb-673abe51702f", 00:22:12.144 "is_configured": true, 00:22:12.144 "data_offset": 2048, 00:22:12.144 "data_size": 63488 00:22:12.144 } 00:22:12.144 ] 00:22:12.144 }' 00:22:12.144 05:50:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:12.144 05:50:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:12.711 05:50:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:22:13.279 [2024-07-26 05:50:27.880744] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:22:13.279 05:50:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:22:13.279 05:50:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local 
raid_bdev_name=Existed_Raid 00:22:13.279 05:50:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:13.279 05:50:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:22:13.279 05:50:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:22:13.279 05:50:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:13.279 05:50:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:13.279 05:50:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:13.279 05:50:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:13.279 05:50:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:13.279 05:50:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:13.279 05:50:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:13.279 05:50:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:13.279 "name": "Existed_Raid", 00:22:13.279 "uuid": "95a2961a-4eb5-4314-8794-54f4aae35062", 00:22:13.279 "strip_size_kb": 64, 00:22:13.279 "state": "configuring", 00:22:13.279 "raid_level": "concat", 00:22:13.279 "superblock": true, 00:22:13.279 "num_base_bdevs": 4, 00:22:13.279 "num_base_bdevs_discovered": 2, 00:22:13.279 "num_base_bdevs_operational": 4, 00:22:13.279 "base_bdevs_list": [ 00:22:13.279 { 00:22:13.279 "name": "BaseBdev1", 00:22:13.279 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:13.279 "is_configured": false, 00:22:13.279 "data_offset": 0, 00:22:13.279 "data_size": 0 
00:22:13.279 }, 00:22:13.279 { 00:22:13.279 "name": null, 00:22:13.279 "uuid": "09257d03-4149-4900-8810-1913d590e908", 00:22:13.279 "is_configured": false, 00:22:13.279 "data_offset": 2048, 00:22:13.279 "data_size": 63488 00:22:13.279 }, 00:22:13.279 { 00:22:13.279 "name": "BaseBdev3", 00:22:13.279 "uuid": "c2b3c81e-2699-4c7b-a440-c65264f165de", 00:22:13.279 "is_configured": true, 00:22:13.279 "data_offset": 2048, 00:22:13.279 "data_size": 63488 00:22:13.279 }, 00:22:13.279 { 00:22:13.279 "name": "BaseBdev4", 00:22:13.279 "uuid": "dc90c97d-f5e1-4c93-b1fb-673abe51702f", 00:22:13.279 "is_configured": true, 00:22:13.279 "data_offset": 2048, 00:22:13.279 "data_size": 63488 00:22:13.279 } 00:22:13.279 ] 00:22:13.279 }' 00:22:13.279 05:50:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:13.279 05:50:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:14.217 05:50:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:14.217 05:50:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:22:14.217 05:50:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:22:14.217 05:50:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:22:14.476 [2024-07-26 05:50:29.252973] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:22:14.476 BaseBdev1 00:22:14.476 05:50:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:22:14.476 05:50:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 
00:22:14.476 05:50:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:22:14.476 05:50:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:22:14.476 05:50:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:22:14.476 05:50:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:22:14.476 05:50:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:22:14.735 05:50:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:22:14.995 [ 00:22:14.995 { 00:22:14.995 "name": "BaseBdev1", 00:22:14.995 "aliases": [ 00:22:14.995 "1864cb04-b446-4940-adb7-9eef993dac09" 00:22:14.995 ], 00:22:14.995 "product_name": "Malloc disk", 00:22:14.995 "block_size": 512, 00:22:14.995 "num_blocks": 65536, 00:22:14.995 "uuid": "1864cb04-b446-4940-adb7-9eef993dac09", 00:22:14.995 "assigned_rate_limits": { 00:22:14.995 "rw_ios_per_sec": 0, 00:22:14.995 "rw_mbytes_per_sec": 0, 00:22:14.995 "r_mbytes_per_sec": 0, 00:22:14.995 "w_mbytes_per_sec": 0 00:22:14.995 }, 00:22:14.995 "claimed": true, 00:22:14.995 "claim_type": "exclusive_write", 00:22:14.995 "zoned": false, 00:22:14.995 "supported_io_types": { 00:22:14.995 "read": true, 00:22:14.995 "write": true, 00:22:14.995 "unmap": true, 00:22:14.995 "flush": true, 00:22:14.995 "reset": true, 00:22:14.995 "nvme_admin": false, 00:22:14.995 "nvme_io": false, 00:22:14.995 "nvme_io_md": false, 00:22:14.995 "write_zeroes": true, 00:22:14.995 "zcopy": true, 00:22:14.995 "get_zone_info": false, 00:22:14.995 "zone_management": false, 00:22:14.995 "zone_append": false, 00:22:14.995 "compare": false, 
00:22:14.995 "compare_and_write": false, 00:22:14.995 "abort": true, 00:22:14.995 "seek_hole": false, 00:22:14.995 "seek_data": false, 00:22:14.995 "copy": true, 00:22:14.995 "nvme_iov_md": false 00:22:14.995 }, 00:22:14.995 "memory_domains": [ 00:22:14.995 { 00:22:14.995 "dma_device_id": "system", 00:22:14.995 "dma_device_type": 1 00:22:14.995 }, 00:22:14.995 { 00:22:14.995 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:14.995 "dma_device_type": 2 00:22:14.995 } 00:22:14.995 ], 00:22:14.995 "driver_specific": {} 00:22:14.995 } 00:22:14.995 ] 00:22:14.995 05:50:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:22:14.995 05:50:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:22:14.995 05:50:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:14.995 05:50:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:14.995 05:50:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:22:14.995 05:50:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:22:14.995 05:50:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:14.995 05:50:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:14.995 05:50:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:14.995 05:50:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:14.995 05:50:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:14.995 05:50:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
-s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:14.995 05:50:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:15.254 05:50:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:15.254 "name": "Existed_Raid", 00:22:15.254 "uuid": "95a2961a-4eb5-4314-8794-54f4aae35062", 00:22:15.254 "strip_size_kb": 64, 00:22:15.254 "state": "configuring", 00:22:15.254 "raid_level": "concat", 00:22:15.254 "superblock": true, 00:22:15.254 "num_base_bdevs": 4, 00:22:15.255 "num_base_bdevs_discovered": 3, 00:22:15.255 "num_base_bdevs_operational": 4, 00:22:15.255 "base_bdevs_list": [ 00:22:15.255 { 00:22:15.255 "name": "BaseBdev1", 00:22:15.255 "uuid": "1864cb04-b446-4940-adb7-9eef993dac09", 00:22:15.255 "is_configured": true, 00:22:15.255 "data_offset": 2048, 00:22:15.255 "data_size": 63488 00:22:15.255 }, 00:22:15.255 { 00:22:15.255 "name": null, 00:22:15.255 "uuid": "09257d03-4149-4900-8810-1913d590e908", 00:22:15.255 "is_configured": false, 00:22:15.255 "data_offset": 2048, 00:22:15.255 "data_size": 63488 00:22:15.255 }, 00:22:15.255 { 00:22:15.255 "name": "BaseBdev3", 00:22:15.255 "uuid": "c2b3c81e-2699-4c7b-a440-c65264f165de", 00:22:15.255 "is_configured": true, 00:22:15.255 "data_offset": 2048, 00:22:15.255 "data_size": 63488 00:22:15.255 }, 00:22:15.255 { 00:22:15.255 "name": "BaseBdev4", 00:22:15.255 "uuid": "dc90c97d-f5e1-4c93-b1fb-673abe51702f", 00:22:15.255 "is_configured": true, 00:22:15.255 "data_offset": 2048, 00:22:15.255 "data_size": 63488 00:22:15.255 } 00:22:15.255 ] 00:22:15.255 }' 00:22:15.255 05:50:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:15.255 05:50:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:15.823 05:50:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:15.823 05:50:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:22:16.082 05:50:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:22:16.082 05:50:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:22:16.340 [2024-07-26 05:50:31.065798] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:22:16.340 05:50:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:22:16.340 05:50:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:16.340 05:50:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:16.340 05:50:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:22:16.340 05:50:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:22:16.340 05:50:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:16.340 05:50:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:16.340 05:50:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:16.340 05:50:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:16.340 05:50:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:16.340 05:50:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:22:16.340 05:50:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:16.599 05:50:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:16.599 "name": "Existed_Raid", 00:22:16.599 "uuid": "95a2961a-4eb5-4314-8794-54f4aae35062", 00:22:16.599 "strip_size_kb": 64, 00:22:16.599 "state": "configuring", 00:22:16.599 "raid_level": "concat", 00:22:16.599 "superblock": true, 00:22:16.599 "num_base_bdevs": 4, 00:22:16.599 "num_base_bdevs_discovered": 2, 00:22:16.599 "num_base_bdevs_operational": 4, 00:22:16.599 "base_bdevs_list": [ 00:22:16.599 { 00:22:16.599 "name": "BaseBdev1", 00:22:16.599 "uuid": "1864cb04-b446-4940-adb7-9eef993dac09", 00:22:16.599 "is_configured": true, 00:22:16.599 "data_offset": 2048, 00:22:16.599 "data_size": 63488 00:22:16.599 }, 00:22:16.599 { 00:22:16.599 "name": null, 00:22:16.599 "uuid": "09257d03-4149-4900-8810-1913d590e908", 00:22:16.599 "is_configured": false, 00:22:16.599 "data_offset": 2048, 00:22:16.599 "data_size": 63488 00:22:16.599 }, 00:22:16.599 { 00:22:16.599 "name": null, 00:22:16.599 "uuid": "c2b3c81e-2699-4c7b-a440-c65264f165de", 00:22:16.599 "is_configured": false, 00:22:16.599 "data_offset": 2048, 00:22:16.599 "data_size": 63488 00:22:16.599 }, 00:22:16.599 { 00:22:16.599 "name": "BaseBdev4", 00:22:16.599 "uuid": "dc90c97d-f5e1-4c93-b1fb-673abe51702f", 00:22:16.599 "is_configured": true, 00:22:16.599 "data_offset": 2048, 00:22:16.599 "data_size": 63488 00:22:16.599 } 00:22:16.599 ] 00:22:16.599 }' 00:22:16.599 05:50:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:16.599 05:50:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:17.167 05:50:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:22:17.167 05:50:31 bdev_raid.raid_state_function_test_sb 
-- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:17.426 05:50:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:22:17.426 05:50:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:22:17.685 [2024-07-26 05:50:32.445480] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:22:17.685 05:50:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:22:17.685 05:50:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:17.685 05:50:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:17.685 05:50:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:22:17.685 05:50:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:22:17.685 05:50:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:17.685 05:50:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:17.685 05:50:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:17.685 05:50:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:17.685 05:50:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:17.685 05:50:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:22:17.685 05:50:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:17.942 05:50:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:17.942 "name": "Existed_Raid", 00:22:17.942 "uuid": "95a2961a-4eb5-4314-8794-54f4aae35062", 00:22:17.942 "strip_size_kb": 64, 00:22:17.942 "state": "configuring", 00:22:17.942 "raid_level": "concat", 00:22:17.942 "superblock": true, 00:22:17.942 "num_base_bdevs": 4, 00:22:17.942 "num_base_bdevs_discovered": 3, 00:22:17.942 "num_base_bdevs_operational": 4, 00:22:17.943 "base_bdevs_list": [ 00:22:17.943 { 00:22:17.943 "name": "BaseBdev1", 00:22:17.943 "uuid": "1864cb04-b446-4940-adb7-9eef993dac09", 00:22:17.943 "is_configured": true, 00:22:17.943 "data_offset": 2048, 00:22:17.943 "data_size": 63488 00:22:17.943 }, 00:22:17.943 { 00:22:17.943 "name": null, 00:22:17.943 "uuid": "09257d03-4149-4900-8810-1913d590e908", 00:22:17.943 "is_configured": false, 00:22:17.943 "data_offset": 2048, 00:22:17.943 "data_size": 63488 00:22:17.943 }, 00:22:17.943 { 00:22:17.943 "name": "BaseBdev3", 00:22:17.943 "uuid": "c2b3c81e-2699-4c7b-a440-c65264f165de", 00:22:17.943 "is_configured": true, 00:22:17.943 "data_offset": 2048, 00:22:17.943 "data_size": 63488 00:22:17.943 }, 00:22:17.943 { 00:22:17.943 "name": "BaseBdev4", 00:22:17.943 "uuid": "dc90c97d-f5e1-4c93-b1fb-673abe51702f", 00:22:17.943 "is_configured": true, 00:22:17.943 "data_offset": 2048, 00:22:17.943 "data_size": 63488 00:22:17.943 } 00:22:17.943 ] 00:22:17.943 }' 00:22:17.943 05:50:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:17.943 05:50:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:18.508 05:50:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:22:18.508 05:50:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:22:18.766 05:50:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:22:18.766 05:50:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:22:19.024 [2024-07-26 05:50:33.837204] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:22:19.024 05:50:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:22:19.024 05:50:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:19.024 05:50:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:19.024 05:50:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:22:19.024 05:50:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:22:19.024 05:50:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:19.024 05:50:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:19.024 05:50:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:19.024 05:50:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:19.024 05:50:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:19.025 05:50:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:19.025 
05:50:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:19.282 05:50:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:19.282 "name": "Existed_Raid", 00:22:19.282 "uuid": "95a2961a-4eb5-4314-8794-54f4aae35062", 00:22:19.282 "strip_size_kb": 64, 00:22:19.282 "state": "configuring", 00:22:19.282 "raid_level": "concat", 00:22:19.282 "superblock": true, 00:22:19.282 "num_base_bdevs": 4, 00:22:19.282 "num_base_bdevs_discovered": 2, 00:22:19.283 "num_base_bdevs_operational": 4, 00:22:19.283 "base_bdevs_list": [ 00:22:19.283 { 00:22:19.283 "name": null, 00:22:19.283 "uuid": "1864cb04-b446-4940-adb7-9eef993dac09", 00:22:19.283 "is_configured": false, 00:22:19.283 "data_offset": 2048, 00:22:19.283 "data_size": 63488 00:22:19.283 }, 00:22:19.283 { 00:22:19.283 "name": null, 00:22:19.283 "uuid": "09257d03-4149-4900-8810-1913d590e908", 00:22:19.283 "is_configured": false, 00:22:19.283 "data_offset": 2048, 00:22:19.283 "data_size": 63488 00:22:19.283 }, 00:22:19.283 { 00:22:19.283 "name": "BaseBdev3", 00:22:19.283 "uuid": "c2b3c81e-2699-4c7b-a440-c65264f165de", 00:22:19.283 "is_configured": true, 00:22:19.283 "data_offset": 2048, 00:22:19.283 "data_size": 63488 00:22:19.283 }, 00:22:19.283 { 00:22:19.283 "name": "BaseBdev4", 00:22:19.283 "uuid": "dc90c97d-f5e1-4c93-b1fb-673abe51702f", 00:22:19.283 "is_configured": true, 00:22:19.283 "data_offset": 2048, 00:22:19.283 "data_size": 63488 00:22:19.283 } 00:22:19.283 ] 00:22:19.283 }' 00:22:19.283 05:50:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:19.283 05:50:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:19.848 05:50:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:22:19.848 05:50:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:20.106 05:50:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:22:20.106 05:50:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:22:20.363 [2024-07-26 05:50:35.201135] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:22:20.363 05:50:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:22:20.363 05:50:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:20.363 05:50:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:20.363 05:50:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:22:20.363 05:50:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:22:20.363 05:50:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:20.363 05:50:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:20.363 05:50:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:20.363 05:50:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:20.363 05:50:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:20.363 05:50:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:20.363 05:50:35 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:20.621 05:50:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:20.621 "name": "Existed_Raid", 00:22:20.621 "uuid": "95a2961a-4eb5-4314-8794-54f4aae35062", 00:22:20.621 "strip_size_kb": 64, 00:22:20.621 "state": "configuring", 00:22:20.621 "raid_level": "concat", 00:22:20.621 "superblock": true, 00:22:20.621 "num_base_bdevs": 4, 00:22:20.621 "num_base_bdevs_discovered": 3, 00:22:20.621 "num_base_bdevs_operational": 4, 00:22:20.621 "base_bdevs_list": [ 00:22:20.621 { 00:22:20.621 "name": null, 00:22:20.621 "uuid": "1864cb04-b446-4940-adb7-9eef993dac09", 00:22:20.621 "is_configured": false, 00:22:20.621 "data_offset": 2048, 00:22:20.621 "data_size": 63488 00:22:20.621 }, 00:22:20.621 { 00:22:20.621 "name": "BaseBdev2", 00:22:20.621 "uuid": "09257d03-4149-4900-8810-1913d590e908", 00:22:20.621 "is_configured": true, 00:22:20.621 "data_offset": 2048, 00:22:20.621 "data_size": 63488 00:22:20.621 }, 00:22:20.621 { 00:22:20.621 "name": "BaseBdev3", 00:22:20.621 "uuid": "c2b3c81e-2699-4c7b-a440-c65264f165de", 00:22:20.621 "is_configured": true, 00:22:20.621 "data_offset": 2048, 00:22:20.621 "data_size": 63488 00:22:20.621 }, 00:22:20.621 { 00:22:20.621 "name": "BaseBdev4", 00:22:20.621 "uuid": "dc90c97d-f5e1-4c93-b1fb-673abe51702f", 00:22:20.621 "is_configured": true, 00:22:20.621 "data_offset": 2048, 00:22:20.621 "data_size": 63488 00:22:20.621 } 00:22:20.621 ] 00:22:20.621 }' 00:22:20.621 05:50:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:20.621 05:50:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:21.187 05:50:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:21.187 05:50:36 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:22:21.445 05:50:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:22:21.445 05:50:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:21.445 05:50:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:22:21.702 05:50:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 1864cb04-b446-4940-adb7-9eef993dac09 00:22:21.960 [2024-07-26 05:50:36.737573] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:22:21.960 [2024-07-26 05:50:36.737753] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1072850 00:22:21.960 [2024-07-26 05:50:36.737767] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:22:21.960 [2024-07-26 05:50:36.737942] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1068d80 00:22:21.960 [2024-07-26 05:50:36.738057] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1072850 00:22:21.960 [2024-07-26 05:50:36.738067] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1072850 00:22:21.960 [2024-07-26 05:50:36.738156] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:21.960 NewBaseBdev 00:22:21.960 05:50:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:22:21.960 05:50:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:22:21.960 05:50:36 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:22:21.960 05:50:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:22:21.960 05:50:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:22:21.960 05:50:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:22:21.960 05:50:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:22:22.218 05:50:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:22:22.476 [ 00:22:22.476 { 00:22:22.476 "name": "NewBaseBdev", 00:22:22.476 "aliases": [ 00:22:22.476 "1864cb04-b446-4940-adb7-9eef993dac09" 00:22:22.476 ], 00:22:22.476 "product_name": "Malloc disk", 00:22:22.476 "block_size": 512, 00:22:22.476 "num_blocks": 65536, 00:22:22.476 "uuid": "1864cb04-b446-4940-adb7-9eef993dac09", 00:22:22.476 "assigned_rate_limits": { 00:22:22.476 "rw_ios_per_sec": 0, 00:22:22.476 "rw_mbytes_per_sec": 0, 00:22:22.476 "r_mbytes_per_sec": 0, 00:22:22.476 "w_mbytes_per_sec": 0 00:22:22.476 }, 00:22:22.476 "claimed": true, 00:22:22.476 "claim_type": "exclusive_write", 00:22:22.476 "zoned": false, 00:22:22.476 "supported_io_types": { 00:22:22.476 "read": true, 00:22:22.476 "write": true, 00:22:22.476 "unmap": true, 00:22:22.476 "flush": true, 00:22:22.476 "reset": true, 00:22:22.476 "nvme_admin": false, 00:22:22.476 "nvme_io": false, 00:22:22.476 "nvme_io_md": false, 00:22:22.476 "write_zeroes": true, 00:22:22.476 "zcopy": true, 00:22:22.476 "get_zone_info": false, 00:22:22.476 "zone_management": false, 00:22:22.476 "zone_append": false, 00:22:22.476 "compare": false, 00:22:22.476 
"compare_and_write": false, 00:22:22.476 "abort": true, 00:22:22.476 "seek_hole": false, 00:22:22.476 "seek_data": false, 00:22:22.476 "copy": true, 00:22:22.476 "nvme_iov_md": false 00:22:22.476 }, 00:22:22.476 "memory_domains": [ 00:22:22.476 { 00:22:22.476 "dma_device_id": "system", 00:22:22.476 "dma_device_type": 1 00:22:22.476 }, 00:22:22.476 { 00:22:22.476 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:22.476 "dma_device_type": 2 00:22:22.476 } 00:22:22.476 ], 00:22:22.476 "driver_specific": {} 00:22:22.476 } 00:22:22.476 ] 00:22:22.476 05:50:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:22:22.476 05:50:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online concat 64 4 00:22:22.476 05:50:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:22.476 05:50:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:22.476 05:50:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:22:22.476 05:50:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:22:22.476 05:50:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:22.476 05:50:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:22.476 05:50:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:22.476 05:50:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:22.476 05:50:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:22.476 05:50:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:22.476 05:50:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:22.733 05:50:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:22.733 "name": "Existed_Raid", 00:22:22.733 "uuid": "95a2961a-4eb5-4314-8794-54f4aae35062", 00:22:22.733 "strip_size_kb": 64, 00:22:22.733 "state": "online", 00:22:22.733 "raid_level": "concat", 00:22:22.733 "superblock": true, 00:22:22.733 "num_base_bdevs": 4, 00:22:22.733 "num_base_bdevs_discovered": 4, 00:22:22.733 "num_base_bdevs_operational": 4, 00:22:22.733 "base_bdevs_list": [ 00:22:22.733 { 00:22:22.733 "name": "NewBaseBdev", 00:22:22.733 "uuid": "1864cb04-b446-4940-adb7-9eef993dac09", 00:22:22.733 "is_configured": true, 00:22:22.733 "data_offset": 2048, 00:22:22.733 "data_size": 63488 00:22:22.733 }, 00:22:22.733 { 00:22:22.733 "name": "BaseBdev2", 00:22:22.734 "uuid": "09257d03-4149-4900-8810-1913d590e908", 00:22:22.734 "is_configured": true, 00:22:22.734 "data_offset": 2048, 00:22:22.734 "data_size": 63488 00:22:22.734 }, 00:22:22.734 { 00:22:22.734 "name": "BaseBdev3", 00:22:22.734 "uuid": "c2b3c81e-2699-4c7b-a440-c65264f165de", 00:22:22.734 "is_configured": true, 00:22:22.734 "data_offset": 2048, 00:22:22.734 "data_size": 63488 00:22:22.734 }, 00:22:22.734 { 00:22:22.734 "name": "BaseBdev4", 00:22:22.734 "uuid": "dc90c97d-f5e1-4c93-b1fb-673abe51702f", 00:22:22.734 "is_configured": true, 00:22:22.734 "data_offset": 2048, 00:22:22.734 "data_size": 63488 00:22:22.734 } 00:22:22.734 ] 00:22:22.734 }' 00:22:22.734 05:50:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:22.734 05:50:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:23.297 05:50:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:22:23.297 05:50:38 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:22:23.297 05:50:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:22:23.297 05:50:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:22:23.297 05:50:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:22:23.297 05:50:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:22:23.297 05:50:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:22:23.297 05:50:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:22:23.555 [2024-07-26 05:50:38.286043] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:22:23.555 05:50:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:22:23.555 "name": "Existed_Raid", 00:22:23.555 "aliases": [ 00:22:23.555 "95a2961a-4eb5-4314-8794-54f4aae35062" 00:22:23.555 ], 00:22:23.555 "product_name": "Raid Volume", 00:22:23.555 "block_size": 512, 00:22:23.555 "num_blocks": 253952, 00:22:23.555 "uuid": "95a2961a-4eb5-4314-8794-54f4aae35062", 00:22:23.555 "assigned_rate_limits": { 00:22:23.555 "rw_ios_per_sec": 0, 00:22:23.555 "rw_mbytes_per_sec": 0, 00:22:23.555 "r_mbytes_per_sec": 0, 00:22:23.555 "w_mbytes_per_sec": 0 00:22:23.555 }, 00:22:23.555 "claimed": false, 00:22:23.555 "zoned": false, 00:22:23.555 "supported_io_types": { 00:22:23.555 "read": true, 00:22:23.555 "write": true, 00:22:23.555 "unmap": true, 00:22:23.555 "flush": true, 00:22:23.555 "reset": true, 00:22:23.555 "nvme_admin": false, 00:22:23.555 "nvme_io": false, 00:22:23.555 "nvme_io_md": false, 00:22:23.555 "write_zeroes": true, 00:22:23.555 "zcopy": false, 00:22:23.555 
"get_zone_info": false, 00:22:23.555 "zone_management": false, 00:22:23.555 "zone_append": false, 00:22:23.555 "compare": false, 00:22:23.555 "compare_and_write": false, 00:22:23.555 "abort": false, 00:22:23.555 "seek_hole": false, 00:22:23.555 "seek_data": false, 00:22:23.555 "copy": false, 00:22:23.555 "nvme_iov_md": false 00:22:23.555 }, 00:22:23.555 "memory_domains": [ 00:22:23.555 { 00:22:23.555 "dma_device_id": "system", 00:22:23.555 "dma_device_type": 1 00:22:23.555 }, 00:22:23.555 { 00:22:23.555 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:23.555 "dma_device_type": 2 00:22:23.555 }, 00:22:23.555 { 00:22:23.555 "dma_device_id": "system", 00:22:23.555 "dma_device_type": 1 00:22:23.555 }, 00:22:23.555 { 00:22:23.555 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:23.555 "dma_device_type": 2 00:22:23.555 }, 00:22:23.555 { 00:22:23.555 "dma_device_id": "system", 00:22:23.555 "dma_device_type": 1 00:22:23.555 }, 00:22:23.555 { 00:22:23.555 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:23.555 "dma_device_type": 2 00:22:23.555 }, 00:22:23.555 { 00:22:23.555 "dma_device_id": "system", 00:22:23.555 "dma_device_type": 1 00:22:23.555 }, 00:22:23.555 { 00:22:23.555 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:23.555 "dma_device_type": 2 00:22:23.555 } 00:22:23.555 ], 00:22:23.555 "driver_specific": { 00:22:23.555 "raid": { 00:22:23.555 "uuid": "95a2961a-4eb5-4314-8794-54f4aae35062", 00:22:23.555 "strip_size_kb": 64, 00:22:23.555 "state": "online", 00:22:23.555 "raid_level": "concat", 00:22:23.555 "superblock": true, 00:22:23.555 "num_base_bdevs": 4, 00:22:23.555 "num_base_bdevs_discovered": 4, 00:22:23.555 "num_base_bdevs_operational": 4, 00:22:23.555 "base_bdevs_list": [ 00:22:23.555 { 00:22:23.555 "name": "NewBaseBdev", 00:22:23.555 "uuid": "1864cb04-b446-4940-adb7-9eef993dac09", 00:22:23.555 "is_configured": true, 00:22:23.555 "data_offset": 2048, 00:22:23.555 "data_size": 63488 00:22:23.555 }, 00:22:23.555 { 00:22:23.555 "name": "BaseBdev2", 00:22:23.555 
"uuid": "09257d03-4149-4900-8810-1913d590e908", 00:22:23.555 "is_configured": true, 00:22:23.555 "data_offset": 2048, 00:22:23.555 "data_size": 63488 00:22:23.555 }, 00:22:23.555 { 00:22:23.555 "name": "BaseBdev3", 00:22:23.555 "uuid": "c2b3c81e-2699-4c7b-a440-c65264f165de", 00:22:23.555 "is_configured": true, 00:22:23.555 "data_offset": 2048, 00:22:23.555 "data_size": 63488 00:22:23.555 }, 00:22:23.555 { 00:22:23.555 "name": "BaseBdev4", 00:22:23.555 "uuid": "dc90c97d-f5e1-4c93-b1fb-673abe51702f", 00:22:23.555 "is_configured": true, 00:22:23.555 "data_offset": 2048, 00:22:23.555 "data_size": 63488 00:22:23.555 } 00:22:23.555 ] 00:22:23.555 } 00:22:23.555 } 00:22:23.555 }' 00:22:23.555 05:50:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:22:23.555 05:50:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:22:23.555 BaseBdev2 00:22:23.555 BaseBdev3 00:22:23.555 BaseBdev4' 00:22:23.555 05:50:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:23.555 05:50:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:22:23.555 05:50:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:23.813 05:50:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:23.813 "name": "NewBaseBdev", 00:22:23.813 "aliases": [ 00:22:23.813 "1864cb04-b446-4940-adb7-9eef993dac09" 00:22:23.813 ], 00:22:23.813 "product_name": "Malloc disk", 00:22:23.813 "block_size": 512, 00:22:23.813 "num_blocks": 65536, 00:22:23.813 "uuid": "1864cb04-b446-4940-adb7-9eef993dac09", 00:22:23.813 "assigned_rate_limits": { 00:22:23.813 "rw_ios_per_sec": 0, 00:22:23.813 "rw_mbytes_per_sec": 0, 
00:22:23.813 "r_mbytes_per_sec": 0, 00:22:23.813 "w_mbytes_per_sec": 0 00:22:23.813 }, 00:22:23.813 "claimed": true, 00:22:23.813 "claim_type": "exclusive_write", 00:22:23.813 "zoned": false, 00:22:23.813 "supported_io_types": { 00:22:23.813 "read": true, 00:22:23.813 "write": true, 00:22:23.813 "unmap": true, 00:22:23.813 "flush": true, 00:22:23.813 "reset": true, 00:22:23.813 "nvme_admin": false, 00:22:23.813 "nvme_io": false, 00:22:23.813 "nvme_io_md": false, 00:22:23.813 "write_zeroes": true, 00:22:23.813 "zcopy": true, 00:22:23.813 "get_zone_info": false, 00:22:23.813 "zone_management": false, 00:22:23.813 "zone_append": false, 00:22:23.813 "compare": false, 00:22:23.813 "compare_and_write": false, 00:22:23.813 "abort": true, 00:22:23.813 "seek_hole": false, 00:22:23.813 "seek_data": false, 00:22:23.813 "copy": true, 00:22:23.813 "nvme_iov_md": false 00:22:23.813 }, 00:22:23.813 "memory_domains": [ 00:22:23.813 { 00:22:23.813 "dma_device_id": "system", 00:22:23.813 "dma_device_type": 1 00:22:23.813 }, 00:22:23.813 { 00:22:23.813 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:23.813 "dma_device_type": 2 00:22:23.813 } 00:22:23.813 ], 00:22:23.813 "driver_specific": {} 00:22:23.813 }' 00:22:23.813 05:50:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:23.813 05:50:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:23.813 05:50:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:23.813 05:50:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:24.071 05:50:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:24.071 05:50:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:24.071 05:50:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:24.071 05:50:38 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:24.071 05:50:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:24.071 05:50:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:24.071 05:50:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:24.071 05:50:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:24.071 05:50:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:24.071 05:50:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:22:24.071 05:50:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:24.329 05:50:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:24.329 "name": "BaseBdev2", 00:22:24.329 "aliases": [ 00:22:24.329 "09257d03-4149-4900-8810-1913d590e908" 00:22:24.329 ], 00:22:24.329 "product_name": "Malloc disk", 00:22:24.329 "block_size": 512, 00:22:24.329 "num_blocks": 65536, 00:22:24.329 "uuid": "09257d03-4149-4900-8810-1913d590e908", 00:22:24.329 "assigned_rate_limits": { 00:22:24.329 "rw_ios_per_sec": 0, 00:22:24.329 "rw_mbytes_per_sec": 0, 00:22:24.329 "r_mbytes_per_sec": 0, 00:22:24.329 "w_mbytes_per_sec": 0 00:22:24.329 }, 00:22:24.329 "claimed": true, 00:22:24.329 "claim_type": "exclusive_write", 00:22:24.329 "zoned": false, 00:22:24.329 "supported_io_types": { 00:22:24.329 "read": true, 00:22:24.329 "write": true, 00:22:24.329 "unmap": true, 00:22:24.329 "flush": true, 00:22:24.329 "reset": true, 00:22:24.329 "nvme_admin": false, 00:22:24.329 "nvme_io": false, 00:22:24.329 "nvme_io_md": false, 00:22:24.329 "write_zeroes": true, 00:22:24.329 "zcopy": true, 00:22:24.329 
"get_zone_info": false, 00:22:24.329 "zone_management": false, 00:22:24.329 "zone_append": false, 00:22:24.329 "compare": false, 00:22:24.329 "compare_and_write": false, 00:22:24.329 "abort": true, 00:22:24.329 "seek_hole": false, 00:22:24.329 "seek_data": false, 00:22:24.329 "copy": true, 00:22:24.329 "nvme_iov_md": false 00:22:24.329 }, 00:22:24.329 "memory_domains": [ 00:22:24.329 { 00:22:24.329 "dma_device_id": "system", 00:22:24.329 "dma_device_type": 1 00:22:24.329 }, 00:22:24.329 { 00:22:24.329 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:24.329 "dma_device_type": 2 00:22:24.329 } 00:22:24.329 ], 00:22:24.329 "driver_specific": {} 00:22:24.329 }' 00:22:24.329 05:50:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:24.329 05:50:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:24.587 05:50:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:24.587 05:50:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:24.587 05:50:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:24.587 05:50:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:24.587 05:50:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:24.587 05:50:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:24.587 05:50:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:24.587 05:50:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:24.587 05:50:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:24.845 05:50:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:24.845 05:50:39 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:24.845 05:50:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:22:24.845 05:50:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:24.845 05:50:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:24.845 "name": "BaseBdev3", 00:22:24.845 "aliases": [ 00:22:24.845 "c2b3c81e-2699-4c7b-a440-c65264f165de" 00:22:24.845 ], 00:22:24.845 "product_name": "Malloc disk", 00:22:24.845 "block_size": 512, 00:22:24.845 "num_blocks": 65536, 00:22:24.845 "uuid": "c2b3c81e-2699-4c7b-a440-c65264f165de", 00:22:24.845 "assigned_rate_limits": { 00:22:24.845 "rw_ios_per_sec": 0, 00:22:24.845 "rw_mbytes_per_sec": 0, 00:22:24.845 "r_mbytes_per_sec": 0, 00:22:24.845 "w_mbytes_per_sec": 0 00:22:24.845 }, 00:22:24.845 "claimed": true, 00:22:24.845 "claim_type": "exclusive_write", 00:22:24.845 "zoned": false, 00:22:24.845 "supported_io_types": { 00:22:24.845 "read": true, 00:22:24.845 "write": true, 00:22:24.845 "unmap": true, 00:22:24.845 "flush": true, 00:22:24.845 "reset": true, 00:22:24.845 "nvme_admin": false, 00:22:24.845 "nvme_io": false, 00:22:24.845 "nvme_io_md": false, 00:22:24.845 "write_zeroes": true, 00:22:24.845 "zcopy": true, 00:22:24.845 "get_zone_info": false, 00:22:24.845 "zone_management": false, 00:22:24.845 "zone_append": false, 00:22:24.845 "compare": false, 00:22:24.845 "compare_and_write": false, 00:22:24.845 "abort": true, 00:22:24.845 "seek_hole": false, 00:22:24.845 "seek_data": false, 00:22:24.845 "copy": true, 00:22:24.845 "nvme_iov_md": false 00:22:24.845 }, 00:22:24.845 "memory_domains": [ 00:22:24.845 { 00:22:24.845 "dma_device_id": "system", 00:22:24.845 "dma_device_type": 1 00:22:24.845 }, 00:22:24.845 { 00:22:24.845 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:24.845 
"dma_device_type": 2 00:22:24.845 } 00:22:24.845 ], 00:22:24.845 "driver_specific": {} 00:22:24.845 }' 00:22:25.103 05:50:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:25.103 05:50:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:25.103 05:50:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:25.103 05:50:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:25.103 05:50:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:25.103 05:50:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:25.103 05:50:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:25.103 05:50:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:25.361 05:50:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:25.361 05:50:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:25.361 05:50:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:25.361 05:50:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:25.361 05:50:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:25.361 05:50:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:22:25.361 05:50:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:25.620 05:50:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:25.620 "name": "BaseBdev4", 00:22:25.620 "aliases": [ 00:22:25.620 
"dc90c97d-f5e1-4c93-b1fb-673abe51702f" 00:22:25.620 ], 00:22:25.620 "product_name": "Malloc disk", 00:22:25.620 "block_size": 512, 00:22:25.620 "num_blocks": 65536, 00:22:25.620 "uuid": "dc90c97d-f5e1-4c93-b1fb-673abe51702f", 00:22:25.620 "assigned_rate_limits": { 00:22:25.620 "rw_ios_per_sec": 0, 00:22:25.620 "rw_mbytes_per_sec": 0, 00:22:25.620 "r_mbytes_per_sec": 0, 00:22:25.620 "w_mbytes_per_sec": 0 00:22:25.620 }, 00:22:25.620 "claimed": true, 00:22:25.620 "claim_type": "exclusive_write", 00:22:25.620 "zoned": false, 00:22:25.620 "supported_io_types": { 00:22:25.620 "read": true, 00:22:25.620 "write": true, 00:22:25.620 "unmap": true, 00:22:25.620 "flush": true, 00:22:25.620 "reset": true, 00:22:25.620 "nvme_admin": false, 00:22:25.620 "nvme_io": false, 00:22:25.620 "nvme_io_md": false, 00:22:25.620 "write_zeroes": true, 00:22:25.620 "zcopy": true, 00:22:25.620 "get_zone_info": false, 00:22:25.620 "zone_management": false, 00:22:25.620 "zone_append": false, 00:22:25.620 "compare": false, 00:22:25.620 "compare_and_write": false, 00:22:25.620 "abort": true, 00:22:25.620 "seek_hole": false, 00:22:25.620 "seek_data": false, 00:22:25.620 "copy": true, 00:22:25.620 "nvme_iov_md": false 00:22:25.620 }, 00:22:25.620 "memory_domains": [ 00:22:25.620 { 00:22:25.620 "dma_device_id": "system", 00:22:25.620 "dma_device_type": 1 00:22:25.620 }, 00:22:25.620 { 00:22:25.620 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:25.620 "dma_device_type": 2 00:22:25.620 } 00:22:25.620 ], 00:22:25.620 "driver_specific": {} 00:22:25.620 }' 00:22:25.620 05:50:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:25.620 05:50:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:25.620 05:50:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:25.620 05:50:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:25.620 05:50:40 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:25.880 05:50:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:25.880 05:50:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:25.880 05:50:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:25.880 05:50:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:25.880 05:50:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:25.880 05:50:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:25.880 05:50:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:25.880 05:50:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:22:26.200 [2024-07-26 05:50:40.936772] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:22:26.200 [2024-07-26 05:50:40.936800] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:22:26.200 [2024-07-26 05:50:40.936857] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:22:26.200 [2024-07-26 05:50:40.936919] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:22:26.200 [2024-07-26 05:50:40.936930] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1072850 name Existed_Raid, state offline 00:22:26.200 05:50:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 1207757 00:22:26.200 05:50:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 1207757 ']' 00:22:26.200 05:50:40 bdev_raid.raid_state_function_test_sb 
-- common/autotest_common.sh@952 -- # kill -0 1207757 00:22:26.200 05:50:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:22:26.200 05:50:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:22:26.200 05:50:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1207757 00:22:26.200 05:50:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:22:26.200 05:50:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:22:26.200 05:50:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1207757' 00:22:26.200 killing process with pid 1207757 00:22:26.200 05:50:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 1207757 00:22:26.200 [2024-07-26 05:50:41.004901] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:22:26.200 05:50:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 1207757 00:22:26.200 [2024-07-26 05:50:41.043302] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:22:26.458 05:50:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:22:26.458 00:22:26.458 real 0m33.745s 00:22:26.458 user 1m1.920s 00:22:26.458 sys 0m6.067s 00:22:26.458 05:50:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:22:26.458 05:50:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:26.458 ************************************ 00:22:26.458 END TEST raid_state_function_test_sb 00:22:26.458 ************************************ 00:22:26.458 05:50:41 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:22:26.458 05:50:41 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test 
raid_superblock_test raid_superblock_test concat 4 00:22:26.458 05:50:41 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:22:26.458 05:50:41 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:22:26.458 05:50:41 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:22:26.458 ************************************ 00:22:26.458 START TEST raid_superblock_test 00:22:26.458 ************************************ 00:22:26.458 05:50:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test concat 4 00:22:26.458 05:50:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=concat 00:22:26.458 05:50:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=4 00:22:26.458 05:50:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:22:26.458 05:50:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:22:26.458 05:50:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:22:26.458 05:50:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:22:26.458 05:50:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:22:26.459 05:50:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:22:26.459 05:50:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:22:26.459 05:50:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:22:26.459 05:50:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:22:26.459 05:50:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:22:26.459 05:50:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:22:26.459 05:50:41 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@403 -- # '[' concat '!=' raid1 ']' 00:22:26.459 05:50:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # strip_size=64 00:22:26.459 05:50:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size_create_arg='-z 64' 00:22:26.459 05:50:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:22:26.459 05:50:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=1212720 00:22:26.459 05:50:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 1212720 /var/tmp/spdk-raid.sock 00:22:26.459 05:50:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 1212720 ']' 00:22:26.459 05:50:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:22:26.459 05:50:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:22:26.459 05:50:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:22:26.459 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:22:26.459 05:50:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:22:26.459 05:50:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:22:26.717 [2024-07-26 05:50:41.405676] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 
00:22:26.717 [2024-07-26 05:50:41.405737] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1212720 ] 00:22:26.717 [2024-07-26 05:50:41.524807] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:22:26.976 [2024-07-26 05:50:41.631891] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:22:26.976 [2024-07-26 05:50:41.692780] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:26.976 [2024-07-26 05:50:41.692817] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:27.543 05:50:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:22:27.543 05:50:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:22:27.543 05:50:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:22:27.543 05:50:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:22:27.543 05:50:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:22:27.543 05:50:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:22:27.543 05:50:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:22:27.543 05:50:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:22:27.543 05:50:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:22:27.543 05:50:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:22:27.543 05:50:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:22:27.802 malloc1 00:22:27.802 05:50:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:22:28.060 [2024-07-26 05:50:42.801335] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:22:28.060 [2024-07-26 05:50:42.801380] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:28.060 [2024-07-26 05:50:42.801402] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x12f2570 00:22:28.060 [2024-07-26 05:50:42.801414] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:28.060 [2024-07-26 05:50:42.803168] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:28.060 [2024-07-26 05:50:42.803196] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:22:28.060 pt1 00:22:28.060 05:50:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:22:28.061 05:50:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:22:28.061 05:50:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:22:28.061 05:50:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:22:28.061 05:50:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:22:28.061 05:50:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:22:28.061 05:50:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:22:28.061 05:50:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:22:28.061 05:50:42 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:22:28.319 malloc2 00:22:28.319 05:50:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:22:28.578 [2024-07-26 05:50:43.299487] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:22:28.578 [2024-07-26 05:50:43.299530] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:28.578 [2024-07-26 05:50:43.299548] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x12f3970 00:22:28.578 [2024-07-26 05:50:43.299560] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:28.578 [2024-07-26 05:50:43.301185] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:28.578 [2024-07-26 05:50:43.301214] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:22:28.578 pt2 00:22:28.578 05:50:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:22:28.578 05:50:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:22:28.578 05:50:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc3 00:22:28.578 05:50:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt3 00:22:28.578 05:50:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:22:28.578 05:50:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:22:28.578 05:50:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:22:28.578 05:50:43 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:22:28.578 05:50:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:22:28.837 malloc3 00:22:28.837 05:50:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:22:29.096 [2024-07-26 05:50:43.798625] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:22:29.096 [2024-07-26 05:50:43.798677] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:29.096 [2024-07-26 05:50:43.798694] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x148a340 00:22:29.096 [2024-07-26 05:50:43.798707] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:29.096 [2024-07-26 05:50:43.800309] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:29.096 [2024-07-26 05:50:43.800338] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:22:29.096 pt3 00:22:29.096 05:50:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:22:29.096 05:50:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:22:29.096 05:50:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc4 00:22:29.096 05:50:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt4 00:22:29.096 05:50:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000004 00:22:29.096 05:50:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:22:29.096 
05:50:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:22:29.096 05:50:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:22:29.096 05:50:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc4 00:22:29.355 malloc4 00:22:29.355 05:50:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:22:29.614 [2024-07-26 05:50:44.293751] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:22:29.614 [2024-07-26 05:50:44.293797] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:29.614 [2024-07-26 05:50:44.293819] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x148cc60 00:22:29.614 [2024-07-26 05:50:44.293832] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:29.614 [2024-07-26 05:50:44.295424] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:29.614 [2024-07-26 05:50:44.295454] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:22:29.614 pt4 00:22:29.614 05:50:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:22:29.614 05:50:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:22:29.614 05:50:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'pt1 pt2 pt3 pt4' -n raid_bdev1 -s 00:22:29.873 [2024-07-26 05:50:44.530411] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 
00:22:29.873 [2024-07-26 05:50:44.531741] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:22:29.873 [2024-07-26 05:50:44.531795] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:22:29.873 [2024-07-26 05:50:44.531839] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:22:29.873 [2024-07-26 05:50:44.532015] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x12ea530 00:22:29.873 [2024-07-26 05:50:44.532026] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:22:29.873 [2024-07-26 05:50:44.532230] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x12e8770 00:22:29.873 [2024-07-26 05:50:44.532377] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x12ea530 00:22:29.873 [2024-07-26 05:50:44.532387] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x12ea530 00:22:29.873 [2024-07-26 05:50:44.532486] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:29.873 05:50:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:22:29.873 05:50:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:29.873 05:50:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:29.873 05:50:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:22:29.873 05:50:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:22:29.873 05:50:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:29.873 05:50:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:29.873 05:50:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local 
num_base_bdevs 00:22:29.873 05:50:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:29.873 05:50:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:29.873 05:50:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:29.873 05:50:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:30.132 05:50:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:30.132 "name": "raid_bdev1", 00:22:30.132 "uuid": "093bcd1c-b697-4946-9f5c-4c0f592dc6ba", 00:22:30.132 "strip_size_kb": 64, 00:22:30.132 "state": "online", 00:22:30.132 "raid_level": "concat", 00:22:30.132 "superblock": true, 00:22:30.132 "num_base_bdevs": 4, 00:22:30.132 "num_base_bdevs_discovered": 4, 00:22:30.132 "num_base_bdevs_operational": 4, 00:22:30.132 "base_bdevs_list": [ 00:22:30.132 { 00:22:30.132 "name": "pt1", 00:22:30.132 "uuid": "00000000-0000-0000-0000-000000000001", 00:22:30.132 "is_configured": true, 00:22:30.132 "data_offset": 2048, 00:22:30.132 "data_size": 63488 00:22:30.132 }, 00:22:30.132 { 00:22:30.132 "name": "pt2", 00:22:30.132 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:30.132 "is_configured": true, 00:22:30.132 "data_offset": 2048, 00:22:30.132 "data_size": 63488 00:22:30.132 }, 00:22:30.132 { 00:22:30.132 "name": "pt3", 00:22:30.132 "uuid": "00000000-0000-0000-0000-000000000003", 00:22:30.132 "is_configured": true, 00:22:30.132 "data_offset": 2048, 00:22:30.132 "data_size": 63488 00:22:30.132 }, 00:22:30.132 { 00:22:30.132 "name": "pt4", 00:22:30.132 "uuid": "00000000-0000-0000-0000-000000000004", 00:22:30.132 "is_configured": true, 00:22:30.132 "data_offset": 2048, 00:22:30.132 "data_size": 63488 00:22:30.132 } 00:22:30.132 ] 00:22:30.132 }' 00:22:30.132 05:50:44 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:30.132 05:50:44 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:22:30.699 05:50:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:22:30.699 05:50:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:22:30.699 05:50:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:22:30.699 05:50:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:22:30.699 05:50:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:22:30.699 05:50:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:22:30.699 05:50:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:22:30.699 05:50:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:22:30.958 [2024-07-26 05:50:45.637610] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:22:30.958 05:50:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:22:30.958 "name": "raid_bdev1", 00:22:30.958 "aliases": [ 00:22:30.958 "093bcd1c-b697-4946-9f5c-4c0f592dc6ba" 00:22:30.958 ], 00:22:30.958 "product_name": "Raid Volume", 00:22:30.958 "block_size": 512, 00:22:30.958 "num_blocks": 253952, 00:22:30.958 "uuid": "093bcd1c-b697-4946-9f5c-4c0f592dc6ba", 00:22:30.958 "assigned_rate_limits": { 00:22:30.958 "rw_ios_per_sec": 0, 00:22:30.958 "rw_mbytes_per_sec": 0, 00:22:30.958 "r_mbytes_per_sec": 0, 00:22:30.958 "w_mbytes_per_sec": 0 00:22:30.958 }, 00:22:30.958 "claimed": false, 00:22:30.958 "zoned": false, 00:22:30.958 "supported_io_types": { 00:22:30.958 "read": true, 00:22:30.958 "write": true, 00:22:30.958 
"unmap": true, 00:22:30.958 "flush": true, 00:22:30.958 "reset": true, 00:22:30.958 "nvme_admin": false, 00:22:30.958 "nvme_io": false, 00:22:30.958 "nvme_io_md": false, 00:22:30.958 "write_zeroes": true, 00:22:30.958 "zcopy": false, 00:22:30.958 "get_zone_info": false, 00:22:30.958 "zone_management": false, 00:22:30.958 "zone_append": false, 00:22:30.958 "compare": false, 00:22:30.958 "compare_and_write": false, 00:22:30.959 "abort": false, 00:22:30.959 "seek_hole": false, 00:22:30.959 "seek_data": false, 00:22:30.959 "copy": false, 00:22:30.959 "nvme_iov_md": false 00:22:30.959 }, 00:22:30.959 "memory_domains": [ 00:22:30.959 { 00:22:30.959 "dma_device_id": "system", 00:22:30.959 "dma_device_type": 1 00:22:30.959 }, 00:22:30.959 { 00:22:30.959 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:30.959 "dma_device_type": 2 00:22:30.959 }, 00:22:30.959 { 00:22:30.959 "dma_device_id": "system", 00:22:30.959 "dma_device_type": 1 00:22:30.959 }, 00:22:30.959 { 00:22:30.959 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:30.959 "dma_device_type": 2 00:22:30.959 }, 00:22:30.959 { 00:22:30.959 "dma_device_id": "system", 00:22:30.959 "dma_device_type": 1 00:22:30.959 }, 00:22:30.959 { 00:22:30.959 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:30.959 "dma_device_type": 2 00:22:30.959 }, 00:22:30.959 { 00:22:30.959 "dma_device_id": "system", 00:22:30.959 "dma_device_type": 1 00:22:30.959 }, 00:22:30.959 { 00:22:30.959 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:30.959 "dma_device_type": 2 00:22:30.959 } 00:22:30.959 ], 00:22:30.959 "driver_specific": { 00:22:30.959 "raid": { 00:22:30.959 "uuid": "093bcd1c-b697-4946-9f5c-4c0f592dc6ba", 00:22:30.959 "strip_size_kb": 64, 00:22:30.959 "state": "online", 00:22:30.959 "raid_level": "concat", 00:22:30.959 "superblock": true, 00:22:30.959 "num_base_bdevs": 4, 00:22:30.959 "num_base_bdevs_discovered": 4, 00:22:30.959 "num_base_bdevs_operational": 4, 00:22:30.959 "base_bdevs_list": [ 00:22:30.959 { 00:22:30.959 "name": "pt1", 
00:22:30.959 "uuid": "00000000-0000-0000-0000-000000000001", 00:22:30.959 "is_configured": true, 00:22:30.959 "data_offset": 2048, 00:22:30.959 "data_size": 63488 00:22:30.959 }, 00:22:30.959 { 00:22:30.959 "name": "pt2", 00:22:30.959 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:30.959 "is_configured": true, 00:22:30.959 "data_offset": 2048, 00:22:30.959 "data_size": 63488 00:22:30.959 }, 00:22:30.959 { 00:22:30.959 "name": "pt3", 00:22:30.959 "uuid": "00000000-0000-0000-0000-000000000003", 00:22:30.959 "is_configured": true, 00:22:30.959 "data_offset": 2048, 00:22:30.959 "data_size": 63488 00:22:30.959 }, 00:22:30.959 { 00:22:30.959 "name": "pt4", 00:22:30.959 "uuid": "00000000-0000-0000-0000-000000000004", 00:22:30.959 "is_configured": true, 00:22:30.959 "data_offset": 2048, 00:22:30.959 "data_size": 63488 00:22:30.959 } 00:22:30.959 ] 00:22:30.959 } 00:22:30.959 } 00:22:30.959 }' 00:22:30.959 05:50:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:22:30.959 05:50:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:22:30.959 pt2 00:22:30.959 pt3 00:22:30.959 pt4' 00:22:30.959 05:50:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:30.959 05:50:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:22:30.959 05:50:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:31.218 05:50:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:31.218 "name": "pt1", 00:22:31.218 "aliases": [ 00:22:31.218 "00000000-0000-0000-0000-000000000001" 00:22:31.218 ], 00:22:31.218 "product_name": "passthru", 00:22:31.218 "block_size": 512, 00:22:31.218 "num_blocks": 65536, 00:22:31.218 "uuid": 
"00000000-0000-0000-0000-000000000001", 00:22:31.218 "assigned_rate_limits": { 00:22:31.218 "rw_ios_per_sec": 0, 00:22:31.218 "rw_mbytes_per_sec": 0, 00:22:31.218 "r_mbytes_per_sec": 0, 00:22:31.218 "w_mbytes_per_sec": 0 00:22:31.218 }, 00:22:31.218 "claimed": true, 00:22:31.218 "claim_type": "exclusive_write", 00:22:31.218 "zoned": false, 00:22:31.218 "supported_io_types": { 00:22:31.218 "read": true, 00:22:31.218 "write": true, 00:22:31.218 "unmap": true, 00:22:31.218 "flush": true, 00:22:31.218 "reset": true, 00:22:31.218 "nvme_admin": false, 00:22:31.218 "nvme_io": false, 00:22:31.218 "nvme_io_md": false, 00:22:31.218 "write_zeroes": true, 00:22:31.218 "zcopy": true, 00:22:31.218 "get_zone_info": false, 00:22:31.218 "zone_management": false, 00:22:31.218 "zone_append": false, 00:22:31.218 "compare": false, 00:22:31.218 "compare_and_write": false, 00:22:31.218 "abort": true, 00:22:31.218 "seek_hole": false, 00:22:31.218 "seek_data": false, 00:22:31.218 "copy": true, 00:22:31.218 "nvme_iov_md": false 00:22:31.218 }, 00:22:31.218 "memory_domains": [ 00:22:31.218 { 00:22:31.218 "dma_device_id": "system", 00:22:31.218 "dma_device_type": 1 00:22:31.218 }, 00:22:31.218 { 00:22:31.218 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:31.218 "dma_device_type": 2 00:22:31.218 } 00:22:31.218 ], 00:22:31.218 "driver_specific": { 00:22:31.218 "passthru": { 00:22:31.218 "name": "pt1", 00:22:31.218 "base_bdev_name": "malloc1" 00:22:31.218 } 00:22:31.218 } 00:22:31.218 }' 00:22:31.218 05:50:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:31.218 05:50:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:31.218 05:50:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:31.218 05:50:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:31.218 05:50:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:31.477 05:50:46 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:31.477 05:50:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:31.477 05:50:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:31.477 05:50:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:31.477 05:50:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:31.477 05:50:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:31.477 05:50:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:31.477 05:50:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:31.477 05:50:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:22:31.477 05:50:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:31.736 05:50:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:31.736 "name": "pt2", 00:22:31.736 "aliases": [ 00:22:31.736 "00000000-0000-0000-0000-000000000002" 00:22:31.736 ], 00:22:31.736 "product_name": "passthru", 00:22:31.736 "block_size": 512, 00:22:31.736 "num_blocks": 65536, 00:22:31.736 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:31.736 "assigned_rate_limits": { 00:22:31.736 "rw_ios_per_sec": 0, 00:22:31.736 "rw_mbytes_per_sec": 0, 00:22:31.736 "r_mbytes_per_sec": 0, 00:22:31.736 "w_mbytes_per_sec": 0 00:22:31.736 }, 00:22:31.736 "claimed": true, 00:22:31.736 "claim_type": "exclusive_write", 00:22:31.736 "zoned": false, 00:22:31.736 "supported_io_types": { 00:22:31.736 "read": true, 00:22:31.736 "write": true, 00:22:31.736 "unmap": true, 00:22:31.736 "flush": true, 00:22:31.736 "reset": true, 00:22:31.736 "nvme_admin": false, 00:22:31.736 
"nvme_io": false, 00:22:31.736 "nvme_io_md": false, 00:22:31.736 "write_zeroes": true, 00:22:31.736 "zcopy": true, 00:22:31.736 "get_zone_info": false, 00:22:31.736 "zone_management": false, 00:22:31.736 "zone_append": false, 00:22:31.736 "compare": false, 00:22:31.736 "compare_and_write": false, 00:22:31.736 "abort": true, 00:22:31.736 "seek_hole": false, 00:22:31.736 "seek_data": false, 00:22:31.736 "copy": true, 00:22:31.736 "nvme_iov_md": false 00:22:31.736 }, 00:22:31.736 "memory_domains": [ 00:22:31.736 { 00:22:31.736 "dma_device_id": "system", 00:22:31.736 "dma_device_type": 1 00:22:31.736 }, 00:22:31.736 { 00:22:31.736 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:31.736 "dma_device_type": 2 00:22:31.736 } 00:22:31.736 ], 00:22:31.736 "driver_specific": { 00:22:31.736 "passthru": { 00:22:31.736 "name": "pt2", 00:22:31.736 "base_bdev_name": "malloc2" 00:22:31.736 } 00:22:31.736 } 00:22:31.736 }' 00:22:31.736 05:50:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:31.995 05:50:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:31.995 05:50:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:31.995 05:50:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:31.995 05:50:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:31.995 05:50:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:31.995 05:50:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:31.995 05:50:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:32.254 05:50:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:32.254 05:50:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:32.254 05:50:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq 
.dif_type 00:22:32.254 05:50:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:32.254 05:50:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:32.254 05:50:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:22:32.254 05:50:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:32.513 05:50:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:32.513 "name": "pt3", 00:22:32.513 "aliases": [ 00:22:32.513 "00000000-0000-0000-0000-000000000003" 00:22:32.513 ], 00:22:32.513 "product_name": "passthru", 00:22:32.513 "block_size": 512, 00:22:32.513 "num_blocks": 65536, 00:22:32.513 "uuid": "00000000-0000-0000-0000-000000000003", 00:22:32.513 "assigned_rate_limits": { 00:22:32.513 "rw_ios_per_sec": 0, 00:22:32.513 "rw_mbytes_per_sec": 0, 00:22:32.513 "r_mbytes_per_sec": 0, 00:22:32.513 "w_mbytes_per_sec": 0 00:22:32.513 }, 00:22:32.513 "claimed": true, 00:22:32.513 "claim_type": "exclusive_write", 00:22:32.513 "zoned": false, 00:22:32.513 "supported_io_types": { 00:22:32.513 "read": true, 00:22:32.513 "write": true, 00:22:32.513 "unmap": true, 00:22:32.513 "flush": true, 00:22:32.513 "reset": true, 00:22:32.513 "nvme_admin": false, 00:22:32.513 "nvme_io": false, 00:22:32.513 "nvme_io_md": false, 00:22:32.513 "write_zeroes": true, 00:22:32.513 "zcopy": true, 00:22:32.513 "get_zone_info": false, 00:22:32.513 "zone_management": false, 00:22:32.513 "zone_append": false, 00:22:32.513 "compare": false, 00:22:32.513 "compare_and_write": false, 00:22:32.513 "abort": true, 00:22:32.513 "seek_hole": false, 00:22:32.513 "seek_data": false, 00:22:32.513 "copy": true, 00:22:32.513 "nvme_iov_md": false 00:22:32.513 }, 00:22:32.513 "memory_domains": [ 00:22:32.513 { 00:22:32.513 "dma_device_id": "system", 00:22:32.513 
"dma_device_type": 1 00:22:32.513 }, 00:22:32.513 { 00:22:32.513 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:32.513 "dma_device_type": 2 00:22:32.513 } 00:22:32.513 ], 00:22:32.513 "driver_specific": { 00:22:32.513 "passthru": { 00:22:32.513 "name": "pt3", 00:22:32.513 "base_bdev_name": "malloc3" 00:22:32.513 } 00:22:32.513 } 00:22:32.513 }' 00:22:32.513 05:50:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:32.513 05:50:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:32.513 05:50:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:32.513 05:50:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:32.513 05:50:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:32.513 05:50:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:32.772 05:50:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:32.772 05:50:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:32.772 05:50:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:32.772 05:50:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:32.772 05:50:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:32.772 05:50:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:32.772 05:50:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:32.772 05:50:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:22:32.772 05:50:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:33.032 05:50:47 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:33.032 "name": "pt4", 00:22:33.032 "aliases": [ 00:22:33.032 "00000000-0000-0000-0000-000000000004" 00:22:33.032 ], 00:22:33.032 "product_name": "passthru", 00:22:33.032 "block_size": 512, 00:22:33.032 "num_blocks": 65536, 00:22:33.032 "uuid": "00000000-0000-0000-0000-000000000004", 00:22:33.032 "assigned_rate_limits": { 00:22:33.032 "rw_ios_per_sec": 0, 00:22:33.032 "rw_mbytes_per_sec": 0, 00:22:33.032 "r_mbytes_per_sec": 0, 00:22:33.032 "w_mbytes_per_sec": 0 00:22:33.032 }, 00:22:33.032 "claimed": true, 00:22:33.032 "claim_type": "exclusive_write", 00:22:33.032 "zoned": false, 00:22:33.032 "supported_io_types": { 00:22:33.032 "read": true, 00:22:33.032 "write": true, 00:22:33.032 "unmap": true, 00:22:33.032 "flush": true, 00:22:33.032 "reset": true, 00:22:33.032 "nvme_admin": false, 00:22:33.032 "nvme_io": false, 00:22:33.032 "nvme_io_md": false, 00:22:33.032 "write_zeroes": true, 00:22:33.032 "zcopy": true, 00:22:33.032 "get_zone_info": false, 00:22:33.032 "zone_management": false, 00:22:33.032 "zone_append": false, 00:22:33.032 "compare": false, 00:22:33.032 "compare_and_write": false, 00:22:33.032 "abort": true, 00:22:33.032 "seek_hole": false, 00:22:33.032 "seek_data": false, 00:22:33.032 "copy": true, 00:22:33.032 "nvme_iov_md": false 00:22:33.032 }, 00:22:33.032 "memory_domains": [ 00:22:33.032 { 00:22:33.032 "dma_device_id": "system", 00:22:33.032 "dma_device_type": 1 00:22:33.032 }, 00:22:33.032 { 00:22:33.032 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:33.032 "dma_device_type": 2 00:22:33.032 } 00:22:33.032 ], 00:22:33.032 "driver_specific": { 00:22:33.032 "passthru": { 00:22:33.032 "name": "pt4", 00:22:33.032 "base_bdev_name": "malloc4" 00:22:33.032 } 00:22:33.032 } 00:22:33.032 }' 00:22:33.032 05:50:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:33.032 05:50:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:33.291 05:50:47 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:33.291 05:50:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:33.291 05:50:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:33.291 05:50:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:33.291 05:50:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:33.291 05:50:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:33.291 05:50:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:33.291 05:50:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:33.291 05:50:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:33.549 05:50:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:33.549 05:50:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:22:33.549 05:50:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:22:33.549 [2024-07-26 05:50:48.388889] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:22:33.549 05:50:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=093bcd1c-b697-4946-9f5c-4c0f592dc6ba 00:22:33.549 05:50:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z 093bcd1c-b697-4946-9f5c-4c0f592dc6ba ']' 00:22:33.550 05:50:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:22:33.808 [2024-07-26 05:50:48.637250] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:22:33.808 
[2024-07-26 05:50:48.637272] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:22:33.808 [2024-07-26 05:50:48.637322] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:22:33.808 [2024-07-26 05:50:48.637385] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:22:33.808 [2024-07-26 05:50:48.637396] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x12ea530 name raid_bdev1, state offline 00:22:33.808 05:50:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:22:33.808 05:50:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:34.067 05:50:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:22:34.067 05:50:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:22:34.067 05:50:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:22:34.067 05:50:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:22:34.326 05:50:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:22:34.326 05:50:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:22:34.585 05:50:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:22:34.585 05:50:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:22:34.843 05:50:49 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:22:34.843 05:50:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4 00:22:35.102 05:50:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:22:35.102 05:50:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:22:35.362 05:50:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:22:35.362 05:50:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:22:35.362 05:50:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:22:35.362 05:50:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:22:35.362 05:50:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:22:35.362 05:50:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:22:35.362 05:50:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:22:35.362 05:50:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:22:35.362 05:50:50 bdev_raid.raid_superblock_test -- 
common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:22:35.362 05:50:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:22:35.362 05:50:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:22:35.362 05:50:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:22:35.362 05:50:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:22:35.621 [2024-07-26 05:50:50.353725] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:22:35.621 [2024-07-26 05:50:50.355115] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:22:35.621 [2024-07-26 05:50:50.355159] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:22:35.621 [2024-07-26 05:50:50.355192] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc4 is claimed 00:22:35.621 [2024-07-26 05:50:50.355236] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:22:35.622 [2024-07-26 05:50:50.355274] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:22:35.622 [2024-07-26 05:50:50.355297] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:22:35.622 [2024-07-26 05:50:50.355318] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc4 00:22:35.622 
[2024-07-26 05:50:50.355336] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:22:35.622 [2024-07-26 05:50:50.355352] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1495ff0 name raid_bdev1, state configuring 00:22:35.622 request: 00:22:35.622 { 00:22:35.622 "name": "raid_bdev1", 00:22:35.622 "raid_level": "concat", 00:22:35.622 "base_bdevs": [ 00:22:35.622 "malloc1", 00:22:35.622 "malloc2", 00:22:35.622 "malloc3", 00:22:35.622 "malloc4" 00:22:35.622 ], 00:22:35.622 "strip_size_kb": 64, 00:22:35.622 "superblock": false, 00:22:35.622 "method": "bdev_raid_create", 00:22:35.622 "req_id": 1 00:22:35.622 } 00:22:35.622 Got JSON-RPC error response 00:22:35.622 response: 00:22:35.622 { 00:22:35.622 "code": -17, 00:22:35.622 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:22:35.622 } 00:22:35.622 05:50:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:22:35.622 05:50:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:22:35.622 05:50:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:22:35.622 05:50:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:22:35.622 05:50:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:35.622 05:50:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:22:35.881 05:50:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:22:35.881 05:50:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:22:35.881 05:50:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 
00:22:36.140 [2024-07-26 05:50:50.838995] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:22:36.140 [2024-07-26 05:50:50.839038] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:36.140 [2024-07-26 05:50:50.839062] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x12f27a0 00:22:36.140 [2024-07-26 05:50:50.839074] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:36.140 [2024-07-26 05:50:50.840747] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:36.140 [2024-07-26 05:50:50.840776] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:22:36.140 [2024-07-26 05:50:50.840845] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:22:36.140 [2024-07-26 05:50:50.840871] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:22:36.140 pt1 00:22:36.140 05:50:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring concat 64 4 00:22:36.140 05:50:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:36.140 05:50:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:36.140 05:50:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:22:36.140 05:50:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:22:36.140 05:50:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:36.140 05:50:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:36.140 05:50:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:36.140 05:50:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local 
num_base_bdevs_discovered 00:22:36.140 05:50:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:36.140 05:50:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:36.140 05:50:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:36.399 05:50:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:36.399 "name": "raid_bdev1", 00:22:36.399 "uuid": "093bcd1c-b697-4946-9f5c-4c0f592dc6ba", 00:22:36.399 "strip_size_kb": 64, 00:22:36.399 "state": "configuring", 00:22:36.399 "raid_level": "concat", 00:22:36.399 "superblock": true, 00:22:36.399 "num_base_bdevs": 4, 00:22:36.399 "num_base_bdevs_discovered": 1, 00:22:36.399 "num_base_bdevs_operational": 4, 00:22:36.399 "base_bdevs_list": [ 00:22:36.399 { 00:22:36.399 "name": "pt1", 00:22:36.399 "uuid": "00000000-0000-0000-0000-000000000001", 00:22:36.399 "is_configured": true, 00:22:36.399 "data_offset": 2048, 00:22:36.399 "data_size": 63488 00:22:36.399 }, 00:22:36.399 { 00:22:36.399 "name": null, 00:22:36.399 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:36.399 "is_configured": false, 00:22:36.399 "data_offset": 2048, 00:22:36.399 "data_size": 63488 00:22:36.399 }, 00:22:36.399 { 00:22:36.399 "name": null, 00:22:36.399 "uuid": "00000000-0000-0000-0000-000000000003", 00:22:36.399 "is_configured": false, 00:22:36.399 "data_offset": 2048, 00:22:36.399 "data_size": 63488 00:22:36.399 }, 00:22:36.399 { 00:22:36.399 "name": null, 00:22:36.399 "uuid": "00000000-0000-0000-0000-000000000004", 00:22:36.399 "is_configured": false, 00:22:36.399 "data_offset": 2048, 00:22:36.399 "data_size": 63488 00:22:36.399 } 00:22:36.399 ] 00:22:36.399 }' 00:22:36.399 05:50:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:36.399 05:50:51 
bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:22:36.967 05:50:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 4 -gt 2 ']' 00:22:36.967 05:50:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:22:37.226 [2024-07-26 05:50:51.925876] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:22:37.226 [2024-07-26 05:50:51.925923] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:37.226 [2024-07-26 05:50:51.925940] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x12e9ea0 00:22:37.226 [2024-07-26 05:50:51.925952] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:37.226 [2024-07-26 05:50:51.926296] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:37.226 [2024-07-26 05:50:51.926314] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:22:37.226 [2024-07-26 05:50:51.926375] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:22:37.226 [2024-07-26 05:50:51.926393] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:22:37.226 pt2 00:22:37.226 05:50:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:22:37.485 [2024-07-26 05:50:52.170530] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:22:37.485 05:50:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # verify_raid_bdev_state raid_bdev1 configuring concat 64 4 00:22:37.485 05:50:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:37.485 05:50:52 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:37.485 05:50:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:22:37.485 05:50:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:22:37.485 05:50:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:37.485 05:50:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:37.485 05:50:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:37.485 05:50:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:37.485 05:50:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:37.485 05:50:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:37.485 05:50:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:37.744 05:50:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:37.744 "name": "raid_bdev1", 00:22:37.744 "uuid": "093bcd1c-b697-4946-9f5c-4c0f592dc6ba", 00:22:37.744 "strip_size_kb": 64, 00:22:37.744 "state": "configuring", 00:22:37.744 "raid_level": "concat", 00:22:37.744 "superblock": true, 00:22:37.744 "num_base_bdevs": 4, 00:22:37.744 "num_base_bdevs_discovered": 1, 00:22:37.744 "num_base_bdevs_operational": 4, 00:22:37.744 "base_bdevs_list": [ 00:22:37.744 { 00:22:37.744 "name": "pt1", 00:22:37.744 "uuid": "00000000-0000-0000-0000-000000000001", 00:22:37.744 "is_configured": true, 00:22:37.744 "data_offset": 2048, 00:22:37.744 "data_size": 63488 00:22:37.744 }, 00:22:37.744 { 00:22:37.744 "name": null, 00:22:37.744 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:37.745 
"is_configured": false, 00:22:37.745 "data_offset": 2048, 00:22:37.745 "data_size": 63488 00:22:37.745 }, 00:22:37.745 { 00:22:37.745 "name": null, 00:22:37.745 "uuid": "00000000-0000-0000-0000-000000000003", 00:22:37.745 "is_configured": false, 00:22:37.745 "data_offset": 2048, 00:22:37.745 "data_size": 63488 00:22:37.745 }, 00:22:37.745 { 00:22:37.745 "name": null, 00:22:37.745 "uuid": "00000000-0000-0000-0000-000000000004", 00:22:37.745 "is_configured": false, 00:22:37.745 "data_offset": 2048, 00:22:37.745 "data_size": 63488 00:22:37.745 } 00:22:37.745 ] 00:22:37.745 }' 00:22:37.745 05:50:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:37.745 05:50:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:22:38.311 05:50:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:22:38.311 05:50:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:22:38.311 05:50:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:22:38.582 [2024-07-26 05:50:53.265430] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:22:38.582 [2024-07-26 05:50:53.265478] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:38.582 [2024-07-26 05:50:53.265496] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x12e8ec0 00:22:38.582 [2024-07-26 05:50:53.265508] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:38.582 [2024-07-26 05:50:53.265843] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:38.582 [2024-07-26 05:50:53.265861] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:22:38.582 [2024-07-26 05:50:53.265922] 
bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:22:38.582 [2024-07-26 05:50:53.265941] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:22:38.582 pt2 00:22:38.582 05:50:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:22:38.582 05:50:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:22:38.582 05:50:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:22:38.842 [2024-07-26 05:50:53.514165] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:22:38.842 [2024-07-26 05:50:53.514207] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:38.842 [2024-07-26 05:50:53.514223] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x12e90f0 00:22:38.842 [2024-07-26 05:50:53.514236] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:38.843 [2024-07-26 05:50:53.514551] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:38.843 [2024-07-26 05:50:53.514569] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:22:38.843 [2024-07-26 05:50:53.514628] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:22:38.843 [2024-07-26 05:50:53.514656] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:22:38.843 pt3 00:22:38.843 05:50:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:22:38.843 05:50:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:22:38.843 05:50:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:22:39.101 [2024-07-26 05:50:53.766833] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:22:39.101 [2024-07-26 05:50:53.766862] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:39.101 [2024-07-26 05:50:53.766877] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x12f1af0 00:22:39.101 [2024-07-26 05:50:53.766888] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:39.101 [2024-07-26 05:50:53.767175] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:39.101 [2024-07-26 05:50:53.767193] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:22:39.101 [2024-07-26 05:50:53.767242] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt4 00:22:39.101 [2024-07-26 05:50:53.767259] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:22:39.101 [2024-07-26 05:50:53.767377] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x12eb8f0 00:22:39.101 [2024-07-26 05:50:53.767387] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:22:39.101 [2024-07-26 05:50:53.767550] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x12eb150 00:22:39.101 [2024-07-26 05:50:53.767687] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x12eb8f0 00:22:39.101 [2024-07-26 05:50:53.767697] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x12eb8f0 00:22:39.101 [2024-07-26 05:50:53.767793] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:39.101 pt4 00:22:39.101 05:50:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 
00:22:39.101 05:50:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:22:39.101 05:50:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:22:39.101 05:50:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:39.101 05:50:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:39.101 05:50:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:22:39.101 05:50:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:22:39.101 05:50:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:39.101 05:50:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:39.101 05:50:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:39.101 05:50:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:39.101 05:50:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:39.101 05:50:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:39.101 05:50:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:39.360 05:50:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:39.360 "name": "raid_bdev1", 00:22:39.360 "uuid": "093bcd1c-b697-4946-9f5c-4c0f592dc6ba", 00:22:39.360 "strip_size_kb": 64, 00:22:39.360 "state": "online", 00:22:39.360 "raid_level": "concat", 00:22:39.360 "superblock": true, 00:22:39.360 "num_base_bdevs": 4, 00:22:39.360 "num_base_bdevs_discovered": 4, 00:22:39.360 "num_base_bdevs_operational": 4, 
00:22:39.360 "base_bdevs_list": [ 00:22:39.360 { 00:22:39.360 "name": "pt1", 00:22:39.360 "uuid": "00000000-0000-0000-0000-000000000001", 00:22:39.360 "is_configured": true, 00:22:39.360 "data_offset": 2048, 00:22:39.360 "data_size": 63488 00:22:39.360 }, 00:22:39.360 { 00:22:39.360 "name": "pt2", 00:22:39.360 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:39.360 "is_configured": true, 00:22:39.360 "data_offset": 2048, 00:22:39.360 "data_size": 63488 00:22:39.360 }, 00:22:39.360 { 00:22:39.360 "name": "pt3", 00:22:39.360 "uuid": "00000000-0000-0000-0000-000000000003", 00:22:39.360 "is_configured": true, 00:22:39.360 "data_offset": 2048, 00:22:39.360 "data_size": 63488 00:22:39.360 }, 00:22:39.360 { 00:22:39.360 "name": "pt4", 00:22:39.360 "uuid": "00000000-0000-0000-0000-000000000004", 00:22:39.360 "is_configured": true, 00:22:39.360 "data_offset": 2048, 00:22:39.360 "data_size": 63488 00:22:39.360 } 00:22:39.360 ] 00:22:39.360 }' 00:22:39.360 05:50:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:39.360 05:50:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:22:39.930 05:50:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:22:39.930 05:50:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:22:39.930 05:50:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:22:39.930 05:50:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:22:39.930 05:50:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:22:39.930 05:50:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:22:39.930 05:50:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 
00:22:39.930 05:50:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:22:40.224 [2024-07-26 05:50:54.870072] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:22:40.224 05:50:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:22:40.224 "name": "raid_bdev1", 00:22:40.224 "aliases": [ 00:22:40.224 "093bcd1c-b697-4946-9f5c-4c0f592dc6ba" 00:22:40.224 ], 00:22:40.224 "product_name": "Raid Volume", 00:22:40.224 "block_size": 512, 00:22:40.224 "num_blocks": 253952, 00:22:40.224 "uuid": "093bcd1c-b697-4946-9f5c-4c0f592dc6ba", 00:22:40.224 "assigned_rate_limits": { 00:22:40.224 "rw_ios_per_sec": 0, 00:22:40.224 "rw_mbytes_per_sec": 0, 00:22:40.224 "r_mbytes_per_sec": 0, 00:22:40.224 "w_mbytes_per_sec": 0 00:22:40.224 }, 00:22:40.224 "claimed": false, 00:22:40.224 "zoned": false, 00:22:40.224 "supported_io_types": { 00:22:40.224 "read": true, 00:22:40.224 "write": true, 00:22:40.224 "unmap": true, 00:22:40.224 "flush": true, 00:22:40.224 "reset": true, 00:22:40.224 "nvme_admin": false, 00:22:40.224 "nvme_io": false, 00:22:40.224 "nvme_io_md": false, 00:22:40.224 "write_zeroes": true, 00:22:40.224 "zcopy": false, 00:22:40.224 "get_zone_info": false, 00:22:40.224 "zone_management": false, 00:22:40.224 "zone_append": false, 00:22:40.224 "compare": false, 00:22:40.224 "compare_and_write": false, 00:22:40.224 "abort": false, 00:22:40.224 "seek_hole": false, 00:22:40.224 "seek_data": false, 00:22:40.224 "copy": false, 00:22:40.224 "nvme_iov_md": false 00:22:40.224 }, 00:22:40.224 "memory_domains": [ 00:22:40.224 { 00:22:40.224 "dma_device_id": "system", 00:22:40.224 "dma_device_type": 1 00:22:40.224 }, 00:22:40.224 { 00:22:40.224 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:40.224 "dma_device_type": 2 00:22:40.224 }, 00:22:40.224 { 00:22:40.224 "dma_device_id": "system", 00:22:40.224 "dma_device_type": 1 00:22:40.224 }, 00:22:40.224 { 00:22:40.224 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 
00:22:40.224 "dma_device_type": 2 00:22:40.224 }, 00:22:40.224 { 00:22:40.224 "dma_device_id": "system", 00:22:40.224 "dma_device_type": 1 00:22:40.224 }, 00:22:40.224 { 00:22:40.224 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:40.224 "dma_device_type": 2 00:22:40.224 }, 00:22:40.224 { 00:22:40.224 "dma_device_id": "system", 00:22:40.224 "dma_device_type": 1 00:22:40.224 }, 00:22:40.224 { 00:22:40.224 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:40.224 "dma_device_type": 2 00:22:40.224 } 00:22:40.224 ], 00:22:40.224 "driver_specific": { 00:22:40.224 "raid": { 00:22:40.224 "uuid": "093bcd1c-b697-4946-9f5c-4c0f592dc6ba", 00:22:40.224 "strip_size_kb": 64, 00:22:40.224 "state": "online", 00:22:40.224 "raid_level": "concat", 00:22:40.224 "superblock": true, 00:22:40.224 "num_base_bdevs": 4, 00:22:40.224 "num_base_bdevs_discovered": 4, 00:22:40.224 "num_base_bdevs_operational": 4, 00:22:40.224 "base_bdevs_list": [ 00:22:40.224 { 00:22:40.224 "name": "pt1", 00:22:40.224 "uuid": "00000000-0000-0000-0000-000000000001", 00:22:40.224 "is_configured": true, 00:22:40.224 "data_offset": 2048, 00:22:40.224 "data_size": 63488 00:22:40.224 }, 00:22:40.224 { 00:22:40.224 "name": "pt2", 00:22:40.224 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:40.224 "is_configured": true, 00:22:40.224 "data_offset": 2048, 00:22:40.224 "data_size": 63488 00:22:40.224 }, 00:22:40.224 { 00:22:40.224 "name": "pt3", 00:22:40.224 "uuid": "00000000-0000-0000-0000-000000000003", 00:22:40.224 "is_configured": true, 00:22:40.225 "data_offset": 2048, 00:22:40.225 "data_size": 63488 00:22:40.225 }, 00:22:40.225 { 00:22:40.225 "name": "pt4", 00:22:40.225 "uuid": "00000000-0000-0000-0000-000000000004", 00:22:40.225 "is_configured": true, 00:22:40.225 "data_offset": 2048, 00:22:40.225 "data_size": 63488 00:22:40.225 } 00:22:40.225 ] 00:22:40.225 } 00:22:40.225 } 00:22:40.225 }' 00:22:40.225 05:50:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r 
'.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:22:40.225 05:50:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:22:40.225 pt2 00:22:40.225 pt3 00:22:40.225 pt4' 00:22:40.225 05:50:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:40.225 05:50:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:22:40.225 05:50:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:40.483 05:50:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:40.483 "name": "pt1", 00:22:40.483 "aliases": [ 00:22:40.483 "00000000-0000-0000-0000-000000000001" 00:22:40.483 ], 00:22:40.483 "product_name": "passthru", 00:22:40.483 "block_size": 512, 00:22:40.483 "num_blocks": 65536, 00:22:40.483 "uuid": "00000000-0000-0000-0000-000000000001", 00:22:40.483 "assigned_rate_limits": { 00:22:40.483 "rw_ios_per_sec": 0, 00:22:40.483 "rw_mbytes_per_sec": 0, 00:22:40.483 "r_mbytes_per_sec": 0, 00:22:40.483 "w_mbytes_per_sec": 0 00:22:40.483 }, 00:22:40.483 "claimed": true, 00:22:40.483 "claim_type": "exclusive_write", 00:22:40.483 "zoned": false, 00:22:40.483 "supported_io_types": { 00:22:40.483 "read": true, 00:22:40.483 "write": true, 00:22:40.483 "unmap": true, 00:22:40.483 "flush": true, 00:22:40.483 "reset": true, 00:22:40.483 "nvme_admin": false, 00:22:40.483 "nvme_io": false, 00:22:40.483 "nvme_io_md": false, 00:22:40.483 "write_zeroes": true, 00:22:40.483 "zcopy": true, 00:22:40.483 "get_zone_info": false, 00:22:40.483 "zone_management": false, 00:22:40.483 "zone_append": false, 00:22:40.483 "compare": false, 00:22:40.483 "compare_and_write": false, 00:22:40.483 "abort": true, 00:22:40.483 "seek_hole": false, 00:22:40.483 "seek_data": false, 00:22:40.483 "copy": true, 00:22:40.483 "nvme_iov_md": 
false 00:22:40.483 }, 00:22:40.483 "memory_domains": [ 00:22:40.483 { 00:22:40.483 "dma_device_id": "system", 00:22:40.483 "dma_device_type": 1 00:22:40.483 }, 00:22:40.483 { 00:22:40.483 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:40.483 "dma_device_type": 2 00:22:40.483 } 00:22:40.483 ], 00:22:40.483 "driver_specific": { 00:22:40.483 "passthru": { 00:22:40.483 "name": "pt1", 00:22:40.483 "base_bdev_name": "malloc1" 00:22:40.483 } 00:22:40.483 } 00:22:40.483 }' 00:22:40.483 05:50:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:40.483 05:50:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:40.483 05:50:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:40.483 05:50:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:40.483 05:50:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:40.483 05:50:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:40.483 05:50:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:40.741 05:50:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:40.741 05:50:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:40.741 05:50:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:40.741 05:50:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:40.741 05:50:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:40.741 05:50:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:40.741 05:50:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:40.741 05:50:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:22:41.000 05:50:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:41.000 "name": "pt2", 00:22:41.000 "aliases": [ 00:22:41.000 "00000000-0000-0000-0000-000000000002" 00:22:41.000 ], 00:22:41.000 "product_name": "passthru", 00:22:41.000 "block_size": 512, 00:22:41.000 "num_blocks": 65536, 00:22:41.000 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:41.000 "assigned_rate_limits": { 00:22:41.000 "rw_ios_per_sec": 0, 00:22:41.000 "rw_mbytes_per_sec": 0, 00:22:41.000 "r_mbytes_per_sec": 0, 00:22:41.000 "w_mbytes_per_sec": 0 00:22:41.000 }, 00:22:41.000 "claimed": true, 00:22:41.000 "claim_type": "exclusive_write", 00:22:41.000 "zoned": false, 00:22:41.000 "supported_io_types": { 00:22:41.000 "read": true, 00:22:41.000 "write": true, 00:22:41.000 "unmap": true, 00:22:41.000 "flush": true, 00:22:41.000 "reset": true, 00:22:41.000 "nvme_admin": false, 00:22:41.000 "nvme_io": false, 00:22:41.000 "nvme_io_md": false, 00:22:41.000 "write_zeroes": true, 00:22:41.000 "zcopy": true, 00:22:41.000 "get_zone_info": false, 00:22:41.000 "zone_management": false, 00:22:41.000 "zone_append": false, 00:22:41.000 "compare": false, 00:22:41.000 "compare_and_write": false, 00:22:41.000 "abort": true, 00:22:41.000 "seek_hole": false, 00:22:41.000 "seek_data": false, 00:22:41.000 "copy": true, 00:22:41.000 "nvme_iov_md": false 00:22:41.000 }, 00:22:41.000 "memory_domains": [ 00:22:41.000 { 00:22:41.000 "dma_device_id": "system", 00:22:41.000 "dma_device_type": 1 00:22:41.000 }, 00:22:41.000 { 00:22:41.000 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:41.000 "dma_device_type": 2 00:22:41.000 } 00:22:41.000 ], 00:22:41.000 "driver_specific": { 00:22:41.000 "passthru": { 00:22:41.000 "name": "pt2", 00:22:41.000 "base_bdev_name": "malloc2" 00:22:41.000 } 00:22:41.000 } 00:22:41.000 }' 00:22:41.000 05:50:55 bdev_raid.raid_superblock_test 
-- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:41.000 05:50:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:41.000 05:50:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:41.000 05:50:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:41.259 05:50:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:41.259 05:50:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:41.259 05:50:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:41.259 05:50:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:41.259 05:50:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:41.259 05:50:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:41.259 05:50:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:41.259 05:50:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:41.259 05:50:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:41.259 05:50:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:22:41.259 05:50:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:41.517 05:50:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:41.517 "name": "pt3", 00:22:41.517 "aliases": [ 00:22:41.517 "00000000-0000-0000-0000-000000000003" 00:22:41.517 ], 00:22:41.517 "product_name": "passthru", 00:22:41.517 "block_size": 512, 00:22:41.517 "num_blocks": 65536, 00:22:41.517 "uuid": "00000000-0000-0000-0000-000000000003", 00:22:41.517 "assigned_rate_limits": { 00:22:41.517 "rw_ios_per_sec": 0, 
00:22:41.517 "rw_mbytes_per_sec": 0, 00:22:41.517 "r_mbytes_per_sec": 0, 00:22:41.517 "w_mbytes_per_sec": 0 00:22:41.517 }, 00:22:41.517 "claimed": true, 00:22:41.517 "claim_type": "exclusive_write", 00:22:41.517 "zoned": false, 00:22:41.517 "supported_io_types": { 00:22:41.517 "read": true, 00:22:41.517 "write": true, 00:22:41.517 "unmap": true, 00:22:41.517 "flush": true, 00:22:41.517 "reset": true, 00:22:41.517 "nvme_admin": false, 00:22:41.517 "nvme_io": false, 00:22:41.517 "nvme_io_md": false, 00:22:41.517 "write_zeroes": true, 00:22:41.517 "zcopy": true, 00:22:41.517 "get_zone_info": false, 00:22:41.517 "zone_management": false, 00:22:41.517 "zone_append": false, 00:22:41.517 "compare": false, 00:22:41.517 "compare_and_write": false, 00:22:41.517 "abort": true, 00:22:41.517 "seek_hole": false, 00:22:41.517 "seek_data": false, 00:22:41.517 "copy": true, 00:22:41.517 "nvme_iov_md": false 00:22:41.517 }, 00:22:41.517 "memory_domains": [ 00:22:41.517 { 00:22:41.517 "dma_device_id": "system", 00:22:41.517 "dma_device_type": 1 00:22:41.517 }, 00:22:41.517 { 00:22:41.517 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:41.517 "dma_device_type": 2 00:22:41.517 } 00:22:41.517 ], 00:22:41.517 "driver_specific": { 00:22:41.517 "passthru": { 00:22:41.517 "name": "pt3", 00:22:41.517 "base_bdev_name": "malloc3" 00:22:41.517 } 00:22:41.517 } 00:22:41.517 }' 00:22:41.517 05:50:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:41.517 05:50:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:41.517 05:50:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:41.517 05:50:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:41.775 05:50:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:41.775 05:50:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:41.775 05:50:56 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:41.775 05:50:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:41.775 05:50:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:41.775 05:50:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:41.775 05:50:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:41.775 05:50:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:41.775 05:50:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:41.775 05:50:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:22:41.775 05:50:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:42.033 05:50:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:42.033 "name": "pt4", 00:22:42.033 "aliases": [ 00:22:42.033 "00000000-0000-0000-0000-000000000004" 00:22:42.033 ], 00:22:42.033 "product_name": "passthru", 00:22:42.033 "block_size": 512, 00:22:42.033 "num_blocks": 65536, 00:22:42.033 "uuid": "00000000-0000-0000-0000-000000000004", 00:22:42.033 "assigned_rate_limits": { 00:22:42.033 "rw_ios_per_sec": 0, 00:22:42.033 "rw_mbytes_per_sec": 0, 00:22:42.033 "r_mbytes_per_sec": 0, 00:22:42.033 "w_mbytes_per_sec": 0 00:22:42.033 }, 00:22:42.033 "claimed": true, 00:22:42.033 "claim_type": "exclusive_write", 00:22:42.033 "zoned": false, 00:22:42.033 "supported_io_types": { 00:22:42.033 "read": true, 00:22:42.033 "write": true, 00:22:42.033 "unmap": true, 00:22:42.033 "flush": true, 00:22:42.033 "reset": true, 00:22:42.033 "nvme_admin": false, 00:22:42.033 "nvme_io": false, 00:22:42.033 "nvme_io_md": false, 00:22:42.033 "write_zeroes": true, 00:22:42.033 "zcopy": 
true, 00:22:42.033 "get_zone_info": false, 00:22:42.033 "zone_management": false, 00:22:42.033 "zone_append": false, 00:22:42.033 "compare": false, 00:22:42.033 "compare_and_write": false, 00:22:42.033 "abort": true, 00:22:42.033 "seek_hole": false, 00:22:42.033 "seek_data": false, 00:22:42.033 "copy": true, 00:22:42.033 "nvme_iov_md": false 00:22:42.033 }, 00:22:42.033 "memory_domains": [ 00:22:42.033 { 00:22:42.033 "dma_device_id": "system", 00:22:42.033 "dma_device_type": 1 00:22:42.033 }, 00:22:42.033 { 00:22:42.033 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:42.033 "dma_device_type": 2 00:22:42.033 } 00:22:42.033 ], 00:22:42.033 "driver_specific": { 00:22:42.033 "passthru": { 00:22:42.033 "name": "pt4", 00:22:42.033 "base_bdev_name": "malloc4" 00:22:42.033 } 00:22:42.033 } 00:22:42.033 }' 00:22:42.033 05:50:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:42.292 05:50:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:42.292 05:50:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:42.292 05:50:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:42.292 05:50:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:42.292 05:50:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:42.292 05:50:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:42.292 05:50:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:42.292 05:50:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:42.292 05:50:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:42.550 05:50:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:42.550 05:50:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 
00:22:42.550 05:50:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:22:42.550 05:50:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:22:42.809 [2024-07-26 05:50:57.485013] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:22:42.809 05:50:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' 093bcd1c-b697-4946-9f5c-4c0f592dc6ba '!=' 093bcd1c-b697-4946-9f5c-4c0f592dc6ba ']' 00:22:42.809 05:50:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy concat 00:22:42.809 05:50:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:22:42.809 05:50:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:22:42.809 05:50:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 1212720 00:22:42.809 05:50:57 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 1212720 ']' 00:22:42.809 05:50:57 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 1212720 00:22:42.809 05:50:57 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:22:42.809 05:50:57 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:22:42.809 05:50:57 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1212720 00:22:42.809 05:50:57 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:22:42.809 05:50:57 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:22:42.809 05:50:57 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1212720' 00:22:42.809 killing process with pid 1212720 00:22:42.809 05:50:57 
bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 1212720 00:22:42.809 [2024-07-26 05:50:57.573194] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:22:42.809 [2024-07-26 05:50:57.573253] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:22:42.809 [2024-07-26 05:50:57.573313] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:22:42.809 [2024-07-26 05:50:57.573325] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x12eb8f0 name raid_bdev1, state offline 00:22:42.809 05:50:57 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 1212720 00:22:42.809 [2024-07-26 05:50:57.616036] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:22:43.068 05:50:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:22:43.068 00:22:43.068 real 0m16.492s 00:22:43.068 user 0m29.781s 00:22:43.068 sys 0m2.927s 00:22:43.068 05:50:57 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:22:43.068 05:50:57 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:22:43.068 ************************************ 00:22:43.068 END TEST raid_superblock_test 00:22:43.068 ************************************ 00:22:43.068 05:50:57 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:22:43.068 05:50:57 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test concat 4 read 00:22:43.068 05:50:57 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:22:43.068 05:50:57 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:22:43.068 05:50:57 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:22:43.068 ************************************ 00:22:43.068 START TEST raid_read_error_test 00:22:43.068 ************************************ 00:22:43.068 05:50:57 bdev_raid.raid_read_error_test -- 
common/autotest_common.sh@1123 -- # raid_io_error_test concat 4 read 00:22:43.068 05:50:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=concat 00:22:43.068 05:50:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=4 00:22:43.068 05:50:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:22:43.068 05:50:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:22:43.068 05:50:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:22:43.068 05:50:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:22:43.068 05:50:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:22:43.068 05:50:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:22:43.068 05:50:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:22:43.068 05:50:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:22:43.068 05:50:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:22:43.068 05:50:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:22:43.068 05:50:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:22:43.068 05:50:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:22:43.068 05:50:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev4 00:22:43.068 05:50:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:22:43.068 05:50:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:22:43.068 05:50:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:22:43.068 05:50:57 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@791 -- # local base_bdevs 00:22:43.068 05:50:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:22:43.068 05:50:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:22:43.068 05:50:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:22:43.068 05:50:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:22:43.068 05:50:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:22:43.068 05:50:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' concat '!=' raid1 ']' 00:22:43.068 05:50:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:22:43.068 05:50:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:22:43.069 05:50:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:22:43.069 05:50:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.GdFbS1GbWw 00:22:43.069 05:50:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=1215246 00:22:43.069 05:50:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 1215246 /var/tmp/spdk-raid.sock 00:22:43.069 05:50:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:22:43.069 05:50:57 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 1215246 ']' 00:22:43.069 05:50:57 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:22:43.069 05:50:57 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:22:43.069 05:50:57 bdev_raid.raid_read_error_test -- 
common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:22:43.069 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:22:43.069 05:50:57 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:22:43.069 05:50:57 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:22:43.327 [2024-07-26 05:50:57.994369] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 00:22:43.327 [2024-07-26 05:50:57.994434] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1215246 ] 00:22:43.327 [2024-07-26 05:50:58.123201] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:22:43.327 [2024-07-26 05:50:58.225093] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:22:43.585 [2024-07-26 05:50:58.282858] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:43.585 [2024-07-26 05:50:58.282893] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:44.152 05:50:58 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:22:44.152 05:50:58 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:22:44.152 05:50:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:22:44.152 05:50:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:22:44.410 BaseBdev1_malloc 00:22:44.410 05:50:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:22:44.668 true 00:22:44.669 05:50:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:22:45.234 [2024-07-26 05:50:59.915288] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:22:45.234 [2024-07-26 05:50:59.915334] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:45.234 [2024-07-26 05:50:59.915355] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1d3b0d0 00:22:45.234 [2024-07-26 05:50:59.915367] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:45.234 [2024-07-26 05:50:59.917220] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:45.234 [2024-07-26 05:50:59.917249] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:22:45.234 BaseBdev1 00:22:45.234 05:50:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:22:45.234 05:50:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:22:45.493 BaseBdev2_malloc 00:22:45.493 05:51:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:22:45.751 true 00:22:45.751 05:51:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:22:46.009 [2024-07-26 05:51:00.677900] 
vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:22:46.009 [2024-07-26 05:51:00.677945] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:46.009 [2024-07-26 05:51:00.677966] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1d3f910 00:22:46.009 [2024-07-26 05:51:00.677979] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:46.009 [2024-07-26 05:51:00.679552] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:46.009 [2024-07-26 05:51:00.679580] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:22:46.009 BaseBdev2 00:22:46.009 05:51:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:22:46.009 05:51:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:22:46.268 BaseBdev3_malloc 00:22:46.268 05:51:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:22:46.526 true 00:22:46.526 05:51:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:22:46.526 [2024-07-26 05:51:01.417479] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:22:46.526 [2024-07-26 05:51:01.417527] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:46.526 [2024-07-26 05:51:01.417547] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1d41bd0 00:22:46.526 [2024-07-26 05:51:01.417560] vbdev_passthru.c: 
696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:46.526 [2024-07-26 05:51:01.419160] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:46.526 [2024-07-26 05:51:01.419187] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:22:46.526 BaseBdev3 00:22:46.784 05:51:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:22:46.784 05:51:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:22:46.784 BaseBdev4_malloc 00:22:46.784 05:51:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:22:47.043 true 00:22:47.043 05:51:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:22:47.301 [2024-07-26 05:51:02.164024] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:22:47.301 [2024-07-26 05:51:02.164073] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:47.301 [2024-07-26 05:51:02.164096] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1d42aa0 00:22:47.301 [2024-07-26 05:51:02.164109] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:47.301 [2024-07-26 05:51:02.165756] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:47.301 [2024-07-26 05:51:02.165784] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:22:47.301 BaseBdev4 00:22:47.301 05:51:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 00:22:47.560 [2024-07-26 05:51:02.412721] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:22:47.560 [2024-07-26 05:51:02.413924] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:22:47.560 [2024-07-26 05:51:02.413992] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:22:47.560 [2024-07-26 05:51:02.414054] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:22:47.560 [2024-07-26 05:51:02.414287] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1d3cc20 00:22:47.560 [2024-07-26 05:51:02.414299] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:22:47.560 [2024-07-26 05:51:02.414486] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1b91260 00:22:47.560 [2024-07-26 05:51:02.414630] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1d3cc20 00:22:47.560 [2024-07-26 05:51:02.414648] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1d3cc20 00:22:47.560 [2024-07-26 05:51:02.414749] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:47.560 05:51:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:22:47.560 05:51:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:47.560 05:51:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:47.560 05:51:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:22:47.560 05:51:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local 
strip_size=64 00:22:47.560 05:51:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:47.560 05:51:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:47.560 05:51:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:47.560 05:51:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:47.560 05:51:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:47.560 05:51:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:47.560 05:51:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:47.819 05:51:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:47.819 "name": "raid_bdev1", 00:22:47.819 "uuid": "cd88ba4a-4ced-4640-a72b-9b46db1f78f8", 00:22:47.819 "strip_size_kb": 64, 00:22:47.819 "state": "online", 00:22:47.819 "raid_level": "concat", 00:22:47.819 "superblock": true, 00:22:47.819 "num_base_bdevs": 4, 00:22:47.819 "num_base_bdevs_discovered": 4, 00:22:47.819 "num_base_bdevs_operational": 4, 00:22:47.819 "base_bdevs_list": [ 00:22:47.819 { 00:22:47.819 "name": "BaseBdev1", 00:22:47.819 "uuid": "d0365be5-4116-5d95-8433-0943d03ebf44", 00:22:47.819 "is_configured": true, 00:22:47.819 "data_offset": 2048, 00:22:47.819 "data_size": 63488 00:22:47.819 }, 00:22:47.819 { 00:22:47.819 "name": "BaseBdev2", 00:22:47.819 "uuid": "67687ff2-8eb7-5b9c-bd75-2b04ed1081c5", 00:22:47.819 "is_configured": true, 00:22:47.819 "data_offset": 2048, 00:22:47.819 "data_size": 63488 00:22:47.819 }, 00:22:47.819 { 00:22:47.819 "name": "BaseBdev3", 00:22:47.819 "uuid": "7542933a-62ac-5ad1-adb5-b26b538f67bf", 00:22:47.819 "is_configured": true, 00:22:47.819 "data_offset": 2048, 
00:22:47.819 "data_size": 63488 00:22:47.819 }, 00:22:47.819 { 00:22:47.819 "name": "BaseBdev4", 00:22:47.819 "uuid": "3a0097b5-2f21-5d06-a2b4-29042a1c554e", 00:22:47.819 "is_configured": true, 00:22:47.819 "data_offset": 2048, 00:22:47.819 "data_size": 63488 00:22:47.819 } 00:22:47.819 ] 00:22:47.819 }' 00:22:47.819 05:51:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:47.819 05:51:02 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:22:48.753 05:51:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:22:48.753 05:51:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:22:48.753 [2024-07-26 05:51:03.395581] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1d2efc0 00:22:49.687 05:51:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:22:49.945 05:51:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:22:49.945 05:51:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ concat = \r\a\i\d\1 ]] 00:22:49.945 05:51:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=4 00:22:49.945 05:51:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:22:49.945 05:51:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:49.945 05:51:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:49.945 05:51:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:22:49.945 05:51:04 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:22:49.945 05:51:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:49.945 05:51:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:49.945 05:51:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:49.945 05:51:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:49.945 05:51:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:49.945 05:51:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:49.945 05:51:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:50.204 05:51:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:50.204 "name": "raid_bdev1", 00:22:50.204 "uuid": "cd88ba4a-4ced-4640-a72b-9b46db1f78f8", 00:22:50.204 "strip_size_kb": 64, 00:22:50.204 "state": "online", 00:22:50.204 "raid_level": "concat", 00:22:50.204 "superblock": true, 00:22:50.204 "num_base_bdevs": 4, 00:22:50.204 "num_base_bdevs_discovered": 4, 00:22:50.204 "num_base_bdevs_operational": 4, 00:22:50.204 "base_bdevs_list": [ 00:22:50.204 { 00:22:50.204 "name": "BaseBdev1", 00:22:50.204 "uuid": "d0365be5-4116-5d95-8433-0943d03ebf44", 00:22:50.204 "is_configured": true, 00:22:50.204 "data_offset": 2048, 00:22:50.204 "data_size": 63488 00:22:50.204 }, 00:22:50.204 { 00:22:50.204 "name": "BaseBdev2", 00:22:50.204 "uuid": "67687ff2-8eb7-5b9c-bd75-2b04ed1081c5", 00:22:50.204 "is_configured": true, 00:22:50.204 "data_offset": 2048, 00:22:50.204 "data_size": 63488 00:22:50.204 }, 00:22:50.204 { 00:22:50.204 "name": "BaseBdev3", 00:22:50.204 "uuid": "7542933a-62ac-5ad1-adb5-b26b538f67bf", 
00:22:50.204 "is_configured": true, 00:22:50.204 "data_offset": 2048, 00:22:50.204 "data_size": 63488 00:22:50.204 }, 00:22:50.204 { 00:22:50.204 "name": "BaseBdev4", 00:22:50.204 "uuid": "3a0097b5-2f21-5d06-a2b4-29042a1c554e", 00:22:50.204 "is_configured": true, 00:22:50.204 "data_offset": 2048, 00:22:50.204 "data_size": 63488 00:22:50.204 } 00:22:50.204 ] 00:22:50.204 }' 00:22:50.204 05:51:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:50.204 05:51:05 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:22:50.770 05:51:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:22:51.029 [2024-07-26 05:51:05.823509] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:22:51.029 [2024-07-26 05:51:05.823551] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:22:51.029 [2024-07-26 05:51:05.826723] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:22:51.029 [2024-07-26 05:51:05.826761] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:51.029 [2024-07-26 05:51:05.826803] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:22:51.029 [2024-07-26 05:51:05.826814] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1d3cc20 name raid_bdev1, state offline 00:22:51.029 0 00:22:51.029 05:51:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 1215246 00:22:51.029 05:51:05 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 1215246 ']' 00:22:51.029 05:51:05 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 1215246 00:22:51.029 05:51:05 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:22:51.029 
05:51:05 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:22:51.029 05:51:05 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1215246 00:22:51.029 05:51:05 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:22:51.029 05:51:05 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:22:51.029 05:51:05 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1215246' 00:22:51.029 killing process with pid 1215246 00:22:51.029 05:51:05 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 1215246 00:22:51.029 [2024-07-26 05:51:05.897113] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:22:51.029 05:51:05 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 1215246 00:22:51.029 [2024-07-26 05:51:05.929411] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:22:51.287 05:51:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.GdFbS1GbWw 00:22:51.287 05:51:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:22:51.287 05:51:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:22:51.287 05:51:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.41 00:22:51.287 05:51:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy concat 00:22:51.287 05:51:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:22:51.287 05:51:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:22:51.287 05:51:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.41 != \0\.\0\0 ]] 00:22:51.287 00:22:51.287 real 0m8.248s 00:22:51.287 user 0m13.302s 00:22:51.287 sys 0m1.469s 00:22:51.287 05:51:06 
bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:22:51.287 05:51:06 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:22:51.287 ************************************ 00:22:51.287 END TEST raid_read_error_test 00:22:51.287 ************************************ 00:22:51.546 05:51:06 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:22:51.546 05:51:06 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test concat 4 write 00:22:51.546 05:51:06 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:22:51.546 05:51:06 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:22:51.546 05:51:06 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:22:51.546 ************************************ 00:22:51.546 START TEST raid_write_error_test 00:22:51.546 ************************************ 00:22:51.546 05:51:06 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test concat 4 write 00:22:51.546 05:51:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=concat 00:22:51.546 05:51:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=4 00:22:51.546 05:51:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:22:51.546 05:51:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:22:51.546 05:51:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:22:51.546 05:51:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:22:51.546 05:51:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:22:51.546 05:51:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:22:51.546 05:51:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:22:51.546 
05:51:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:22:51.546 05:51:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:22:51.546 05:51:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:22:51.546 05:51:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:22:51.546 05:51:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:22:51.546 05:51:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev4 00:22:51.546 05:51:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:22:51.546 05:51:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:22:51.546 05:51:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:22:51.546 05:51:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:22:51.546 05:51:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:22:51.546 05:51:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:22:51.546 05:51:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:22:51.546 05:51:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:22:51.546 05:51:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:22:51.546 05:51:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' concat '!=' raid1 ']' 00:22:51.546 05:51:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:22:51.546 05:51:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:22:51.546 05:51:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:22:51.546 
05:51:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.ZvcG3K1WKP 00:22:51.546 05:51:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=1216918 00:22:51.547 05:51:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:22:51.547 05:51:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 1216918 /var/tmp/spdk-raid.sock 00:22:51.547 05:51:06 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 1216918 ']' 00:22:51.547 05:51:06 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:22:51.547 05:51:06 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:22:51.547 05:51:06 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:22:51.547 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:22:51.547 05:51:06 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:22:51.547 05:51:06 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:22:51.547 [2024-07-26 05:51:06.300471] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 
00:22:51.547 [2024-07-26 05:51:06.300526] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1216918 ] 00:22:51.547 [2024-07-26 05:51:06.413606] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:22:51.805 [2024-07-26 05:51:06.518992] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:22:51.805 [2024-07-26 05:51:06.576553] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:51.805 [2024-07-26 05:51:06.576586] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:52.370 05:51:07 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:22:52.370 05:51:07 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:22:52.370 05:51:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:22:52.370 05:51:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:22:52.935 BaseBdev1_malloc 00:22:52.935 05:51:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:22:53.193 true 00:22:53.193 05:51:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:22:53.760 [2024-07-26 05:51:08.438024] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:22:53.760 [2024-07-26 05:51:08.438072] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base 
bdev opened 00:22:53.760 [2024-07-26 05:51:08.438094] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x21390d0 00:22:53.760 [2024-07-26 05:51:08.438106] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:53.760 [2024-07-26 05:51:08.439966] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:53.760 [2024-07-26 05:51:08.439995] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:22:53.760 BaseBdev1 00:22:53.760 05:51:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:22:53.760 05:51:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:22:54.018 BaseBdev2_malloc 00:22:54.018 05:51:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:22:54.600 true 00:22:54.600 05:51:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:22:54.600 [2024-07-26 05:51:09.449281] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:22:54.600 [2024-07-26 05:51:09.449323] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:54.600 [2024-07-26 05:51:09.449344] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x213d910 00:22:54.600 [2024-07-26 05:51:09.449356] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:54.600 [2024-07-26 05:51:09.450802] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:54.600 [2024-07-26 05:51:09.450830] 
vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:22:54.600 BaseBdev2 00:22:54.600 05:51:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:22:54.600 05:51:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:22:54.871 BaseBdev3_malloc 00:22:54.871 05:51:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:22:55.129 true 00:22:55.129 05:51:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:22:55.388 [2024-07-26 05:51:10.104818] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:22:55.388 [2024-07-26 05:51:10.104869] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:55.388 [2024-07-26 05:51:10.104890] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x213fbd0 00:22:55.388 [2024-07-26 05:51:10.104903] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:55.388 [2024-07-26 05:51:10.106515] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:55.388 [2024-07-26 05:51:10.106544] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:22:55.388 BaseBdev3 00:22:55.388 05:51:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:22:55.388 05:51:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:22:55.646 BaseBdev4_malloc 00:22:55.646 05:51:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:22:55.646 true 00:22:55.646 05:51:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:22:55.904 [2024-07-26 05:51:10.674916] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:22:55.904 [2024-07-26 05:51:10.674970] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:55.904 [2024-07-26 05:51:10.674991] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2140aa0 00:22:55.904 [2024-07-26 05:51:10.675004] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:55.904 [2024-07-26 05:51:10.676582] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:55.904 [2024-07-26 05:51:10.676610] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:22:55.904 BaseBdev4 00:22:55.904 05:51:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 00:22:56.162 [2024-07-26 05:51:10.835389] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:22:56.162 [2024-07-26 05:51:10.836723] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:22:56.162 [2024-07-26 05:51:10.836791] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:22:56.162 [2024-07-26 05:51:10.836853] 
bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:22:56.162 [2024-07-26 05:51:10.837086] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x213ac20 00:22:56.162 [2024-07-26 05:51:10.837098] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:22:56.162 [2024-07-26 05:51:10.837298] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1f8f260 00:22:56.162 [2024-07-26 05:51:10.837446] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x213ac20 00:22:56.162 [2024-07-26 05:51:10.837456] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x213ac20 00:22:56.162 [2024-07-26 05:51:10.837562] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:56.162 05:51:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:22:56.162 05:51:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:56.162 05:51:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:56.162 05:51:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:22:56.162 05:51:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:22:56.163 05:51:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:56.163 05:51:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:56.163 05:51:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:56.163 05:51:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:56.163 05:51:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:56.163 05:51:10 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:56.163 05:51:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:56.420 05:51:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:56.420 "name": "raid_bdev1", 00:22:56.420 "uuid": "7c2da9c9-f1e3-42c5-9e3e-6144a695a1dd", 00:22:56.420 "strip_size_kb": 64, 00:22:56.420 "state": "online", 00:22:56.420 "raid_level": "concat", 00:22:56.420 "superblock": true, 00:22:56.420 "num_base_bdevs": 4, 00:22:56.420 "num_base_bdevs_discovered": 4, 00:22:56.420 "num_base_bdevs_operational": 4, 00:22:56.420 "base_bdevs_list": [ 00:22:56.420 { 00:22:56.420 "name": "BaseBdev1", 00:22:56.420 "uuid": "297ef5a9-e95e-5d82-bed3-7779edeb3964", 00:22:56.420 "is_configured": true, 00:22:56.420 "data_offset": 2048, 00:22:56.420 "data_size": 63488 00:22:56.420 }, 00:22:56.420 { 00:22:56.420 "name": "BaseBdev2", 00:22:56.420 "uuid": "0248f1cb-c34d-58ba-b6db-5f7f597499c3", 00:22:56.420 "is_configured": true, 00:22:56.420 "data_offset": 2048, 00:22:56.420 "data_size": 63488 00:22:56.420 }, 00:22:56.420 { 00:22:56.420 "name": "BaseBdev3", 00:22:56.420 "uuid": "38c7ccc5-7976-5066-97ea-5b898a5351d3", 00:22:56.420 "is_configured": true, 00:22:56.420 "data_offset": 2048, 00:22:56.420 "data_size": 63488 00:22:56.420 }, 00:22:56.420 { 00:22:56.420 "name": "BaseBdev4", 00:22:56.420 "uuid": "f64b571f-baeb-5e62-b883-e65a11214390", 00:22:56.420 "is_configured": true, 00:22:56.420 "data_offset": 2048, 00:22:56.420 "data_size": 63488 00:22:56.421 } 00:22:56.421 ] 00:22:56.421 }' 00:22:56.421 05:51:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:56.421 05:51:11 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:22:56.986 05:51:11 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@824 -- # sleep 1 00:22:56.986 05:51:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:22:56.986 [2024-07-26 05:51:11.802222] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x212cfc0 00:22:57.921 05:51:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:22:58.179 05:51:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:22:58.179 05:51:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ concat = \r\a\i\d\1 ]] 00:22:58.179 05:51:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=4 00:22:58.179 05:51:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:22:58.179 05:51:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:58.179 05:51:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:58.179 05:51:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:22:58.179 05:51:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:22:58.179 05:51:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:58.179 05:51:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:58.179 05:51:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:58.179 05:51:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:58.179 05:51:12 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:58.179 05:51:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:58.179 05:51:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:58.437 05:51:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:58.437 "name": "raid_bdev1", 00:22:58.437 "uuid": "7c2da9c9-f1e3-42c5-9e3e-6144a695a1dd", 00:22:58.437 "strip_size_kb": 64, 00:22:58.437 "state": "online", 00:22:58.437 "raid_level": "concat", 00:22:58.437 "superblock": true, 00:22:58.437 "num_base_bdevs": 4, 00:22:58.437 "num_base_bdevs_discovered": 4, 00:22:58.437 "num_base_bdevs_operational": 4, 00:22:58.437 "base_bdevs_list": [ 00:22:58.437 { 00:22:58.437 "name": "BaseBdev1", 00:22:58.437 "uuid": "297ef5a9-e95e-5d82-bed3-7779edeb3964", 00:22:58.437 "is_configured": true, 00:22:58.437 "data_offset": 2048, 00:22:58.437 "data_size": 63488 00:22:58.437 }, 00:22:58.437 { 00:22:58.437 "name": "BaseBdev2", 00:22:58.437 "uuid": "0248f1cb-c34d-58ba-b6db-5f7f597499c3", 00:22:58.437 "is_configured": true, 00:22:58.437 "data_offset": 2048, 00:22:58.437 "data_size": 63488 00:22:58.437 }, 00:22:58.437 { 00:22:58.437 "name": "BaseBdev3", 00:22:58.437 "uuid": "38c7ccc5-7976-5066-97ea-5b898a5351d3", 00:22:58.437 "is_configured": true, 00:22:58.437 "data_offset": 2048, 00:22:58.437 "data_size": 63488 00:22:58.437 }, 00:22:58.437 { 00:22:58.437 "name": "BaseBdev4", 00:22:58.437 "uuid": "f64b571f-baeb-5e62-b883-e65a11214390", 00:22:58.437 "is_configured": true, 00:22:58.437 "data_offset": 2048, 00:22:58.437 "data_size": 63488 00:22:58.437 } 00:22:58.437 ] 00:22:58.437 }' 00:22:58.437 05:51:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:58.437 05:51:13 bdev_raid.raid_write_error_test -- 
common/autotest_common.sh@10 -- # set +x 00:22:59.002 05:51:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:22:59.260 [2024-07-26 05:51:14.031961] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:22:59.260 [2024-07-26 05:51:14.031999] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:22:59.260 [2024-07-26 05:51:14.035160] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:22:59.260 [2024-07-26 05:51:14.035198] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:59.260 [2024-07-26 05:51:14.035239] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:22:59.260 [2024-07-26 05:51:14.035250] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x213ac20 name raid_bdev1, state offline 00:22:59.260 0 00:22:59.260 05:51:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 1216918 00:22:59.260 05:51:14 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 1216918 ']' 00:22:59.260 05:51:14 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 1216918 00:22:59.260 05:51:14 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:22:59.260 05:51:14 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:22:59.260 05:51:14 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1216918 00:22:59.260 05:51:14 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:22:59.260 05:51:14 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:22:59.260 05:51:14 bdev_raid.raid_write_error_test -- 
common/autotest_common.sh@966 -- # echo 'killing process with pid 1216918' 00:22:59.260 killing process with pid 1216918 00:22:59.260 05:51:14 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 1216918 00:22:59.260 [2024-07-26 05:51:14.101303] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:22:59.260 05:51:14 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 1216918 00:22:59.260 [2024-07-26 05:51:14.137244] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:22:59.517 05:51:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.ZvcG3K1WKP 00:22:59.517 05:51:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:22:59.517 05:51:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:22:59.517 05:51:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.45 00:22:59.518 05:51:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy concat 00:22:59.518 05:51:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:22:59.518 05:51:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:22:59.518 05:51:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.45 != \0\.\0\0 ]] 00:22:59.518 00:22:59.518 real 0m8.138s 00:22:59.518 user 0m13.197s 00:22:59.518 sys 0m1.351s 00:22:59.518 05:51:14 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:22:59.518 05:51:14 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:22:59.518 ************************************ 00:22:59.518 END TEST raid_write_error_test 00:22:59.518 ************************************ 00:22:59.776 05:51:14 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:22:59.776 05:51:14 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:22:59.776 
05:51:14 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test raid1 4 false 00:22:59.776 05:51:14 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:22:59.776 05:51:14 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:22:59.776 05:51:14 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:22:59.776 ************************************ 00:22:59.776 START TEST raid_state_function_test 00:22:59.776 ************************************ 00:22:59.776 05:51:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test raid1 4 false 00:22:59.776 05:51:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:22:59.776 05:51:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:22:59.776 05:51:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:22:59.776 05:51:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:22:59.776 05:51:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:22:59.776 05:51:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:22:59.776 05:51:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:22:59.776 05:51:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:22:59.776 05:51:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:22:59.776 05:51:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:22:59.776 05:51:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:22:59.776 05:51:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:22:59.776 05:51:14 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:22:59.776 05:51:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:22:59.776 05:51:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:22:59.776 05:51:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:22:59.776 05:51:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:22:59.776 05:51:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:22:59.776 05:51:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:22:59.776 05:51:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:22:59.776 05:51:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:22:59.776 05:51:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:22:59.776 05:51:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:22:59.776 05:51:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:22:59.776 05:51:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:22:59.776 05:51:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:22:59.776 05:51:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:22:59.776 05:51:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:22:59.776 05:51:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=1218082 00:22:59.776 05:51:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1218082' 00:22:59.776 Process raid pid: 1218082 00:22:59.776 05:51:14 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:22:59.776 05:51:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 1218082 /var/tmp/spdk-raid.sock 00:22:59.776 05:51:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 1218082 ']' 00:22:59.776 05:51:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:22:59.776 05:51:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:22:59.776 05:51:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:22:59.776 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:22:59.776 05:51:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:22:59.776 05:51:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:22:59.776 [2024-07-26 05:51:14.538894] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 
00:22:59.776 [2024-07-26 05:51:14.538958] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:22:59.776 [2024-07-26 05:51:14.669846] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:23:00.034 [2024-07-26 05:51:14.776460] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:23:00.034 [2024-07-26 05:51:14.844901] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:00.034 [2024-07-26 05:51:14.844933] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:00.600 05:51:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:23:00.600 05:51:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:23:00.600 05:51:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:23:00.858 [2024-07-26 05:51:15.688394] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:23:00.859 [2024-07-26 05:51:15.688433] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:23:00.859 [2024-07-26 05:51:15.688444] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:23:00.859 [2024-07-26 05:51:15.688455] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:23:00.859 [2024-07-26 05:51:15.688464] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:23:00.859 [2024-07-26 05:51:15.688474] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:23:00.859 
[2024-07-26 05:51:15.688483] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:23:00.859 [2024-07-26 05:51:15.688494] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:23:00.859 05:51:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:23:00.859 05:51:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:23:00.859 05:51:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:23:00.859 05:51:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:00.859 05:51:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:00.859 05:51:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:23:00.859 05:51:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:00.859 05:51:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:00.859 05:51:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:00.859 05:51:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:00.859 05:51:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:00.859 05:51:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:23:01.117 05:51:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:01.117 "name": "Existed_Raid", 00:23:01.117 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:01.117 "strip_size_kb": 0, 00:23:01.117 "state": 
"configuring", 00:23:01.117 "raid_level": "raid1", 00:23:01.117 "superblock": false, 00:23:01.117 "num_base_bdevs": 4, 00:23:01.117 "num_base_bdevs_discovered": 0, 00:23:01.117 "num_base_bdevs_operational": 4, 00:23:01.117 "base_bdevs_list": [ 00:23:01.117 { 00:23:01.117 "name": "BaseBdev1", 00:23:01.117 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:01.117 "is_configured": false, 00:23:01.117 "data_offset": 0, 00:23:01.117 "data_size": 0 00:23:01.117 }, 00:23:01.117 { 00:23:01.117 "name": "BaseBdev2", 00:23:01.117 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:01.117 "is_configured": false, 00:23:01.117 "data_offset": 0, 00:23:01.117 "data_size": 0 00:23:01.117 }, 00:23:01.117 { 00:23:01.117 "name": "BaseBdev3", 00:23:01.117 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:01.117 "is_configured": false, 00:23:01.117 "data_offset": 0, 00:23:01.117 "data_size": 0 00:23:01.117 }, 00:23:01.117 { 00:23:01.117 "name": "BaseBdev4", 00:23:01.117 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:01.117 "is_configured": false, 00:23:01.117 "data_offset": 0, 00:23:01.117 "data_size": 0 00:23:01.117 } 00:23:01.117 ] 00:23:01.117 }' 00:23:01.117 05:51:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:01.117 05:51:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:23:01.683 05:51:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:23:01.940 [2024-07-26 05:51:16.791188] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:23:01.940 [2024-07-26 05:51:16.791218] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2493aa0 name Existed_Raid, state configuring 00:23:01.940 05:51:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:23:02.198 [2024-07-26 05:51:17.039849] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:23:02.199 [2024-07-26 05:51:17.039877] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:23:02.199 [2024-07-26 05:51:17.039887] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:23:02.199 [2024-07-26 05:51:17.039898] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:23:02.199 [2024-07-26 05:51:17.039907] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:23:02.199 [2024-07-26 05:51:17.039918] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:23:02.199 [2024-07-26 05:51:17.039927] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:23:02.199 [2024-07-26 05:51:17.039938] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:23:02.199 05:51:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:23:02.457 [2024-07-26 05:51:17.306289] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:23:02.457 BaseBdev1 00:23:02.457 05:51:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:23:02.457 05:51:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:23:02.457 05:51:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:23:02.457 05:51:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:23:02.457 05:51:17 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:23:02.457 05:51:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:23:02.457 05:51:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:23:02.716 05:51:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:23:02.974 [ 00:23:02.974 { 00:23:02.974 "name": "BaseBdev1", 00:23:02.974 "aliases": [ 00:23:02.974 "611d158a-6cf2-4b9b-be21-18da70d544f0" 00:23:02.974 ], 00:23:02.974 "product_name": "Malloc disk", 00:23:02.974 "block_size": 512, 00:23:02.974 "num_blocks": 65536, 00:23:02.974 "uuid": "611d158a-6cf2-4b9b-be21-18da70d544f0", 00:23:02.974 "assigned_rate_limits": { 00:23:02.974 "rw_ios_per_sec": 0, 00:23:02.974 "rw_mbytes_per_sec": 0, 00:23:02.974 "r_mbytes_per_sec": 0, 00:23:02.974 "w_mbytes_per_sec": 0 00:23:02.974 }, 00:23:02.974 "claimed": true, 00:23:02.974 "claim_type": "exclusive_write", 00:23:02.974 "zoned": false, 00:23:02.974 "supported_io_types": { 00:23:02.974 "read": true, 00:23:02.974 "write": true, 00:23:02.974 "unmap": true, 00:23:02.974 "flush": true, 00:23:02.974 "reset": true, 00:23:02.974 "nvme_admin": false, 00:23:02.974 "nvme_io": false, 00:23:02.974 "nvme_io_md": false, 00:23:02.974 "write_zeroes": true, 00:23:02.974 "zcopy": true, 00:23:02.974 "get_zone_info": false, 00:23:02.974 "zone_management": false, 00:23:02.974 "zone_append": false, 00:23:02.974 "compare": false, 00:23:02.974 "compare_and_write": false, 00:23:02.974 "abort": true, 00:23:02.974 "seek_hole": false, 00:23:02.974 "seek_data": false, 00:23:02.974 "copy": true, 00:23:02.974 "nvme_iov_md": false 00:23:02.974 }, 00:23:02.974 "memory_domains": [ 00:23:02.974 { 
00:23:02.974 "dma_device_id": "system", 00:23:02.974 "dma_device_type": 1 00:23:02.974 }, 00:23:02.974 { 00:23:02.974 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:02.974 "dma_device_type": 2 00:23:02.974 } 00:23:02.974 ], 00:23:02.974 "driver_specific": {} 00:23:02.974 } 00:23:02.974 ] 00:23:02.974 05:51:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:23:02.974 05:51:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:23:02.974 05:51:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:23:02.974 05:51:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:23:02.974 05:51:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:02.974 05:51:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:02.974 05:51:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:23:02.974 05:51:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:02.974 05:51:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:02.974 05:51:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:02.974 05:51:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:02.974 05:51:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:02.974 05:51:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:23:03.233 05:51:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 
00:23:03.233 "name": "Existed_Raid", 00:23:03.233 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:03.233 "strip_size_kb": 0, 00:23:03.233 "state": "configuring", 00:23:03.233 "raid_level": "raid1", 00:23:03.233 "superblock": false, 00:23:03.233 "num_base_bdevs": 4, 00:23:03.233 "num_base_bdevs_discovered": 1, 00:23:03.233 "num_base_bdevs_operational": 4, 00:23:03.233 "base_bdevs_list": [ 00:23:03.233 { 00:23:03.233 "name": "BaseBdev1", 00:23:03.233 "uuid": "611d158a-6cf2-4b9b-be21-18da70d544f0", 00:23:03.233 "is_configured": true, 00:23:03.233 "data_offset": 0, 00:23:03.233 "data_size": 65536 00:23:03.233 }, 00:23:03.233 { 00:23:03.233 "name": "BaseBdev2", 00:23:03.233 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:03.233 "is_configured": false, 00:23:03.233 "data_offset": 0, 00:23:03.233 "data_size": 0 00:23:03.233 }, 00:23:03.233 { 00:23:03.233 "name": "BaseBdev3", 00:23:03.233 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:03.233 "is_configured": false, 00:23:03.233 "data_offset": 0, 00:23:03.233 "data_size": 0 00:23:03.233 }, 00:23:03.233 { 00:23:03.233 "name": "BaseBdev4", 00:23:03.233 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:03.233 "is_configured": false, 00:23:03.233 "data_offset": 0, 00:23:03.233 "data_size": 0 00:23:03.233 } 00:23:03.233 ] 00:23:03.233 }' 00:23:03.233 05:51:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:03.233 05:51:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:23:03.799 05:51:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:23:04.058 [2024-07-26 05:51:18.898512] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:23:04.058 [2024-07-26 05:51:18.898553] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2493310 name Existed_Raid, state configuring 
00:23:04.058 05:51:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:23:04.316 [2024-07-26 05:51:19.143183] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:23:04.316 [2024-07-26 05:51:19.144753] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:23:04.316 [2024-07-26 05:51:19.144785] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:23:04.316 [2024-07-26 05:51:19.144796] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:23:04.316 [2024-07-26 05:51:19.144807] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:23:04.316 [2024-07-26 05:51:19.144816] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:23:04.316 [2024-07-26 05:51:19.144827] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:23:04.316 05:51:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:23:04.316 05:51:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:23:04.316 05:51:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:23:04.316 05:51:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:23:04.316 05:51:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:23:04.316 05:51:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:04.316 05:51:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 
00:23:04.316 05:51:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:23:04.316 05:51:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:04.316 05:51:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:04.316 05:51:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:04.316 05:51:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:04.316 05:51:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:04.316 05:51:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:23:04.575 05:51:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:04.575 "name": "Existed_Raid", 00:23:04.575 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:04.575 "strip_size_kb": 0, 00:23:04.575 "state": "configuring", 00:23:04.575 "raid_level": "raid1", 00:23:04.575 "superblock": false, 00:23:04.575 "num_base_bdevs": 4, 00:23:04.575 "num_base_bdevs_discovered": 1, 00:23:04.575 "num_base_bdevs_operational": 4, 00:23:04.575 "base_bdevs_list": [ 00:23:04.575 { 00:23:04.575 "name": "BaseBdev1", 00:23:04.575 "uuid": "611d158a-6cf2-4b9b-be21-18da70d544f0", 00:23:04.575 "is_configured": true, 00:23:04.575 "data_offset": 0, 00:23:04.575 "data_size": 65536 00:23:04.575 }, 00:23:04.575 { 00:23:04.575 "name": "BaseBdev2", 00:23:04.575 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:04.575 "is_configured": false, 00:23:04.575 "data_offset": 0, 00:23:04.575 "data_size": 0 00:23:04.575 }, 00:23:04.575 { 00:23:04.575 "name": "BaseBdev3", 00:23:04.575 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:04.575 "is_configured": false, 00:23:04.575 
"data_offset": 0, 00:23:04.575 "data_size": 0 00:23:04.575 }, 00:23:04.575 { 00:23:04.575 "name": "BaseBdev4", 00:23:04.575 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:04.575 "is_configured": false, 00:23:04.575 "data_offset": 0, 00:23:04.575 "data_size": 0 00:23:04.575 } 00:23:04.575 ] 00:23:04.575 }' 00:23:04.575 05:51:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:04.575 05:51:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:23:05.142 05:51:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:23:05.401 [2024-07-26 05:51:20.237462] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:23:05.401 BaseBdev2 00:23:05.401 05:51:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:23:05.401 05:51:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:23:05.401 05:51:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:23:05.401 05:51:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:23:05.401 05:51:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:23:05.401 05:51:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:23:05.401 05:51:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:23:05.659 05:51:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:23:05.918 [ 
00:23:05.918 { 00:23:05.918 "name": "BaseBdev2", 00:23:05.918 "aliases": [ 00:23:05.918 "ee1d5154-3a51-4b26-b6b1-efea8e86b386" 00:23:05.918 ], 00:23:05.918 "product_name": "Malloc disk", 00:23:05.918 "block_size": 512, 00:23:05.918 "num_blocks": 65536, 00:23:05.918 "uuid": "ee1d5154-3a51-4b26-b6b1-efea8e86b386", 00:23:05.918 "assigned_rate_limits": { 00:23:05.918 "rw_ios_per_sec": 0, 00:23:05.918 "rw_mbytes_per_sec": 0, 00:23:05.918 "r_mbytes_per_sec": 0, 00:23:05.918 "w_mbytes_per_sec": 0 00:23:05.918 }, 00:23:05.918 "claimed": true, 00:23:05.918 "claim_type": "exclusive_write", 00:23:05.918 "zoned": false, 00:23:05.918 "supported_io_types": { 00:23:05.918 "read": true, 00:23:05.918 "write": true, 00:23:05.918 "unmap": true, 00:23:05.918 "flush": true, 00:23:05.918 "reset": true, 00:23:05.918 "nvme_admin": false, 00:23:05.918 "nvme_io": false, 00:23:05.918 "nvme_io_md": false, 00:23:05.918 "write_zeroes": true, 00:23:05.918 "zcopy": true, 00:23:05.918 "get_zone_info": false, 00:23:05.918 "zone_management": false, 00:23:05.918 "zone_append": false, 00:23:05.918 "compare": false, 00:23:05.918 "compare_and_write": false, 00:23:05.918 "abort": true, 00:23:05.918 "seek_hole": false, 00:23:05.918 "seek_data": false, 00:23:05.918 "copy": true, 00:23:05.918 "nvme_iov_md": false 00:23:05.918 }, 00:23:05.918 "memory_domains": [ 00:23:05.918 { 00:23:05.918 "dma_device_id": "system", 00:23:05.918 "dma_device_type": 1 00:23:05.918 }, 00:23:05.918 { 00:23:05.918 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:05.918 "dma_device_type": 2 00:23:05.918 } 00:23:05.918 ], 00:23:05.918 "driver_specific": {} 00:23:05.918 } 00:23:05.918 ] 00:23:05.918 05:51:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:23:05.918 05:51:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:23:05.918 05:51:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:23:05.918 05:51:20 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:23:05.918 05:51:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:23:05.918 05:51:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:23:05.918 05:51:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:05.918 05:51:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:05.918 05:51:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:23:05.919 05:51:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:05.919 05:51:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:05.919 05:51:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:05.919 05:51:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:05.919 05:51:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:05.919 05:51:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:23:06.177 05:51:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:06.177 "name": "Existed_Raid", 00:23:06.177 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:06.177 "strip_size_kb": 0, 00:23:06.177 "state": "configuring", 00:23:06.177 "raid_level": "raid1", 00:23:06.177 "superblock": false, 00:23:06.177 "num_base_bdevs": 4, 00:23:06.177 "num_base_bdevs_discovered": 2, 00:23:06.177 "num_base_bdevs_operational": 4, 00:23:06.177 "base_bdevs_list": [ 00:23:06.177 { 00:23:06.177 
"name": "BaseBdev1", 00:23:06.177 "uuid": "611d158a-6cf2-4b9b-be21-18da70d544f0", 00:23:06.177 "is_configured": true, 00:23:06.177 "data_offset": 0, 00:23:06.177 "data_size": 65536 00:23:06.177 }, 00:23:06.177 { 00:23:06.177 "name": "BaseBdev2", 00:23:06.177 "uuid": "ee1d5154-3a51-4b26-b6b1-efea8e86b386", 00:23:06.177 "is_configured": true, 00:23:06.177 "data_offset": 0, 00:23:06.177 "data_size": 65536 00:23:06.177 }, 00:23:06.177 { 00:23:06.177 "name": "BaseBdev3", 00:23:06.177 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:06.177 "is_configured": false, 00:23:06.177 "data_offset": 0, 00:23:06.177 "data_size": 0 00:23:06.177 }, 00:23:06.177 { 00:23:06.177 "name": "BaseBdev4", 00:23:06.177 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:06.177 "is_configured": false, 00:23:06.177 "data_offset": 0, 00:23:06.177 "data_size": 0 00:23:06.177 } 00:23:06.177 ] 00:23:06.177 }' 00:23:06.177 05:51:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:06.177 05:51:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:23:06.744 05:51:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:23:07.003 [2024-07-26 05:51:21.813087] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:23:07.003 BaseBdev3 00:23:07.003 05:51:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:23:07.003 05:51:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:23:07.003 05:51:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:23:07.003 05:51:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:23:07.003 05:51:21 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@900 -- # [[ -z '' ]] 00:23:07.003 05:51:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:23:07.003 05:51:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:23:07.261 05:51:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:23:07.519 [ 00:23:07.519 { 00:23:07.519 "name": "BaseBdev3", 00:23:07.519 "aliases": [ 00:23:07.519 "2f62ae86-bf8b-4199-bd52-7967ee5f35a2" 00:23:07.519 ], 00:23:07.519 "product_name": "Malloc disk", 00:23:07.519 "block_size": 512, 00:23:07.519 "num_blocks": 65536, 00:23:07.519 "uuid": "2f62ae86-bf8b-4199-bd52-7967ee5f35a2", 00:23:07.519 "assigned_rate_limits": { 00:23:07.519 "rw_ios_per_sec": 0, 00:23:07.519 "rw_mbytes_per_sec": 0, 00:23:07.519 "r_mbytes_per_sec": 0, 00:23:07.519 "w_mbytes_per_sec": 0 00:23:07.519 }, 00:23:07.519 "claimed": true, 00:23:07.519 "claim_type": "exclusive_write", 00:23:07.519 "zoned": false, 00:23:07.519 "supported_io_types": { 00:23:07.519 "read": true, 00:23:07.519 "write": true, 00:23:07.519 "unmap": true, 00:23:07.519 "flush": true, 00:23:07.519 "reset": true, 00:23:07.519 "nvme_admin": false, 00:23:07.519 "nvme_io": false, 00:23:07.519 "nvme_io_md": false, 00:23:07.519 "write_zeroes": true, 00:23:07.519 "zcopy": true, 00:23:07.519 "get_zone_info": false, 00:23:07.519 "zone_management": false, 00:23:07.519 "zone_append": false, 00:23:07.519 "compare": false, 00:23:07.519 "compare_and_write": false, 00:23:07.519 "abort": true, 00:23:07.519 "seek_hole": false, 00:23:07.519 "seek_data": false, 00:23:07.519 "copy": true, 00:23:07.519 "nvme_iov_md": false 00:23:07.519 }, 00:23:07.519 "memory_domains": [ 00:23:07.519 { 00:23:07.519 "dma_device_id": "system", 
00:23:07.519 "dma_device_type": 1 00:23:07.519 }, 00:23:07.519 { 00:23:07.519 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:07.519 "dma_device_type": 2 00:23:07.519 } 00:23:07.519 ], 00:23:07.519 "driver_specific": {} 00:23:07.519 } 00:23:07.519 ] 00:23:07.519 05:51:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:23:07.519 05:51:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:23:07.519 05:51:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:23:07.519 05:51:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:23:07.519 05:51:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:23:07.519 05:51:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:23:07.519 05:51:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:07.519 05:51:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:07.519 05:51:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:23:07.519 05:51:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:07.519 05:51:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:07.519 05:51:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:07.519 05:51:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:07.519 05:51:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:23:07.519 05:51:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:07.778 05:51:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:07.778 "name": "Existed_Raid", 00:23:07.778 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:07.778 "strip_size_kb": 0, 00:23:07.778 "state": "configuring", 00:23:07.778 "raid_level": "raid1", 00:23:07.778 "superblock": false, 00:23:07.778 "num_base_bdevs": 4, 00:23:07.778 "num_base_bdevs_discovered": 3, 00:23:07.778 "num_base_bdevs_operational": 4, 00:23:07.778 "base_bdevs_list": [ 00:23:07.778 { 00:23:07.778 "name": "BaseBdev1", 00:23:07.778 "uuid": "611d158a-6cf2-4b9b-be21-18da70d544f0", 00:23:07.778 "is_configured": true, 00:23:07.778 "data_offset": 0, 00:23:07.778 "data_size": 65536 00:23:07.778 }, 00:23:07.778 { 00:23:07.778 "name": "BaseBdev2", 00:23:07.778 "uuid": "ee1d5154-3a51-4b26-b6b1-efea8e86b386", 00:23:07.778 "is_configured": true, 00:23:07.778 "data_offset": 0, 00:23:07.778 "data_size": 65536 00:23:07.778 }, 00:23:07.778 { 00:23:07.778 "name": "BaseBdev3", 00:23:07.778 "uuid": "2f62ae86-bf8b-4199-bd52-7967ee5f35a2", 00:23:07.778 "is_configured": true, 00:23:07.778 "data_offset": 0, 00:23:07.778 "data_size": 65536 00:23:07.778 }, 00:23:07.778 { 00:23:07.778 "name": "BaseBdev4", 00:23:07.778 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:07.778 "is_configured": false, 00:23:07.778 "data_offset": 0, 00:23:07.778 "data_size": 0 00:23:07.778 } 00:23:07.778 ] 00:23:07.778 }' 00:23:07.778 05:51:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:07.778 05:51:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:23:08.344 05:51:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:23:08.614 [2024-07-26 
05:51:23.284295] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:23:08.614 [2024-07-26 05:51:23.284332] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x2494350 00:23:08.614 [2024-07-26 05:51:23.284341] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:23:08.614 [2024-07-26 05:51:23.284587] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2494020 00:23:08.614 [2024-07-26 05:51:23.284722] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2494350 00:23:08.614 [2024-07-26 05:51:23.284733] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x2494350 00:23:08.614 [2024-07-26 05:51:23.284902] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:08.614 BaseBdev4 00:23:08.614 05:51:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:23:08.614 05:51:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:23:08.614 05:51:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:23:08.614 05:51:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:23:08.614 05:51:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:23:08.614 05:51:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:23:08.614 05:51:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:23:08.614 05:51:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:23:08.900 [ 
00:23:08.900 { 00:23:08.900 "name": "BaseBdev4", 00:23:08.900 "aliases": [ 00:23:08.900 "b28cea87-151a-40fe-b978-6d6bcf843702" 00:23:08.900 ], 00:23:08.900 "product_name": "Malloc disk", 00:23:08.900 "block_size": 512, 00:23:08.900 "num_blocks": 65536, 00:23:08.900 "uuid": "b28cea87-151a-40fe-b978-6d6bcf843702", 00:23:08.900 "assigned_rate_limits": { 00:23:08.900 "rw_ios_per_sec": 0, 00:23:08.900 "rw_mbytes_per_sec": 0, 00:23:08.900 "r_mbytes_per_sec": 0, 00:23:08.900 "w_mbytes_per_sec": 0 00:23:08.900 }, 00:23:08.900 "claimed": true, 00:23:08.900 "claim_type": "exclusive_write", 00:23:08.900 "zoned": false, 00:23:08.900 "supported_io_types": { 00:23:08.900 "read": true, 00:23:08.900 "write": true, 00:23:08.900 "unmap": true, 00:23:08.900 "flush": true, 00:23:08.900 "reset": true, 00:23:08.900 "nvme_admin": false, 00:23:08.900 "nvme_io": false, 00:23:08.900 "nvme_io_md": false, 00:23:08.900 "write_zeroes": true, 00:23:08.900 "zcopy": true, 00:23:08.901 "get_zone_info": false, 00:23:08.901 "zone_management": false, 00:23:08.901 "zone_append": false, 00:23:08.901 "compare": false, 00:23:08.901 "compare_and_write": false, 00:23:08.901 "abort": true, 00:23:08.901 "seek_hole": false, 00:23:08.901 "seek_data": false, 00:23:08.901 "copy": true, 00:23:08.901 "nvme_iov_md": false 00:23:08.901 }, 00:23:08.901 "memory_domains": [ 00:23:08.901 { 00:23:08.901 "dma_device_id": "system", 00:23:08.901 "dma_device_type": 1 00:23:08.901 }, 00:23:08.901 { 00:23:08.901 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:08.901 "dma_device_type": 2 00:23:08.901 } 00:23:08.901 ], 00:23:08.901 "driver_specific": {} 00:23:08.901 } 00:23:08.901 ] 00:23:08.901 05:51:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:23:08.901 05:51:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:23:08.901 05:51:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:23:08.901 05:51:23 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 4 00:23:08.901 05:51:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:23:08.901 05:51:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:08.901 05:51:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:08.901 05:51:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:08.901 05:51:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:23:08.901 05:51:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:08.901 05:51:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:08.901 05:51:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:08.901 05:51:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:08.901 05:51:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:23:08.901 05:51:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:09.159 05:51:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:09.159 "name": "Existed_Raid", 00:23:09.159 "uuid": "0b645acf-0501-473b-bec3-e8fc9591343a", 00:23:09.159 "strip_size_kb": 0, 00:23:09.159 "state": "online", 00:23:09.159 "raid_level": "raid1", 00:23:09.159 "superblock": false, 00:23:09.159 "num_base_bdevs": 4, 00:23:09.159 "num_base_bdevs_discovered": 4, 00:23:09.159 "num_base_bdevs_operational": 4, 00:23:09.159 "base_bdevs_list": [ 00:23:09.159 { 00:23:09.159 "name": 
"BaseBdev1", 00:23:09.159 "uuid": "611d158a-6cf2-4b9b-be21-18da70d544f0", 00:23:09.159 "is_configured": true, 00:23:09.159 "data_offset": 0, 00:23:09.159 "data_size": 65536 00:23:09.159 }, 00:23:09.159 { 00:23:09.159 "name": "BaseBdev2", 00:23:09.159 "uuid": "ee1d5154-3a51-4b26-b6b1-efea8e86b386", 00:23:09.159 "is_configured": true, 00:23:09.159 "data_offset": 0, 00:23:09.159 "data_size": 65536 00:23:09.159 }, 00:23:09.159 { 00:23:09.159 "name": "BaseBdev3", 00:23:09.159 "uuid": "2f62ae86-bf8b-4199-bd52-7967ee5f35a2", 00:23:09.159 "is_configured": true, 00:23:09.159 "data_offset": 0, 00:23:09.159 "data_size": 65536 00:23:09.159 }, 00:23:09.159 { 00:23:09.159 "name": "BaseBdev4", 00:23:09.159 "uuid": "b28cea87-151a-40fe-b978-6d6bcf843702", 00:23:09.159 "is_configured": true, 00:23:09.159 "data_offset": 0, 00:23:09.159 "data_size": 65536 00:23:09.159 } 00:23:09.159 ] 00:23:09.159 }' 00:23:09.159 05:51:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:09.159 05:51:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:23:09.725 05:51:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:23:09.725 05:51:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:23:09.725 05:51:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:23:09.725 05:51:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:23:09.725 05:51:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:23:09.725 05:51:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:23:09.725 05:51:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:23:09.725 05:51:24 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:23:09.984 [2024-07-26 05:51:24.664257] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:23:09.984 05:51:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:23:09.984 "name": "Existed_Raid", 00:23:09.984 "aliases": [ 00:23:09.984 "0b645acf-0501-473b-bec3-e8fc9591343a" 00:23:09.984 ], 00:23:09.984 "product_name": "Raid Volume", 00:23:09.984 "block_size": 512, 00:23:09.984 "num_blocks": 65536, 00:23:09.984 "uuid": "0b645acf-0501-473b-bec3-e8fc9591343a", 00:23:09.984 "assigned_rate_limits": { 00:23:09.984 "rw_ios_per_sec": 0, 00:23:09.984 "rw_mbytes_per_sec": 0, 00:23:09.984 "r_mbytes_per_sec": 0, 00:23:09.984 "w_mbytes_per_sec": 0 00:23:09.984 }, 00:23:09.984 "claimed": false, 00:23:09.984 "zoned": false, 00:23:09.984 "supported_io_types": { 00:23:09.984 "read": true, 00:23:09.984 "write": true, 00:23:09.984 "unmap": false, 00:23:09.984 "flush": false, 00:23:09.984 "reset": true, 00:23:09.984 "nvme_admin": false, 00:23:09.984 "nvme_io": false, 00:23:09.984 "nvme_io_md": false, 00:23:09.984 "write_zeroes": true, 00:23:09.984 "zcopy": false, 00:23:09.984 "get_zone_info": false, 00:23:09.984 "zone_management": false, 00:23:09.984 "zone_append": false, 00:23:09.984 "compare": false, 00:23:09.984 "compare_and_write": false, 00:23:09.984 "abort": false, 00:23:09.984 "seek_hole": false, 00:23:09.984 "seek_data": false, 00:23:09.984 "copy": false, 00:23:09.984 "nvme_iov_md": false 00:23:09.984 }, 00:23:09.984 "memory_domains": [ 00:23:09.984 { 00:23:09.984 "dma_device_id": "system", 00:23:09.984 "dma_device_type": 1 00:23:09.984 }, 00:23:09.984 { 00:23:09.984 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:09.984 "dma_device_type": 2 00:23:09.984 }, 00:23:09.984 { 00:23:09.984 "dma_device_id": "system", 00:23:09.984 "dma_device_type": 1 00:23:09.984 }, 00:23:09.984 { 00:23:09.984 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 
00:23:09.984 "dma_device_type": 2 00:23:09.984 }, 00:23:09.984 { 00:23:09.984 "dma_device_id": "system", 00:23:09.984 "dma_device_type": 1 00:23:09.984 }, 00:23:09.984 { 00:23:09.984 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:09.984 "dma_device_type": 2 00:23:09.984 }, 00:23:09.984 { 00:23:09.984 "dma_device_id": "system", 00:23:09.984 "dma_device_type": 1 00:23:09.984 }, 00:23:09.984 { 00:23:09.984 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:09.984 "dma_device_type": 2 00:23:09.984 } 00:23:09.984 ], 00:23:09.984 "driver_specific": { 00:23:09.984 "raid": { 00:23:09.984 "uuid": "0b645acf-0501-473b-bec3-e8fc9591343a", 00:23:09.984 "strip_size_kb": 0, 00:23:09.984 "state": "online", 00:23:09.984 "raid_level": "raid1", 00:23:09.984 "superblock": false, 00:23:09.984 "num_base_bdevs": 4, 00:23:09.984 "num_base_bdevs_discovered": 4, 00:23:09.984 "num_base_bdevs_operational": 4, 00:23:09.984 "base_bdevs_list": [ 00:23:09.984 { 00:23:09.984 "name": "BaseBdev1", 00:23:09.984 "uuid": "611d158a-6cf2-4b9b-be21-18da70d544f0", 00:23:09.984 "is_configured": true, 00:23:09.984 "data_offset": 0, 00:23:09.984 "data_size": 65536 00:23:09.984 }, 00:23:09.984 { 00:23:09.984 "name": "BaseBdev2", 00:23:09.984 "uuid": "ee1d5154-3a51-4b26-b6b1-efea8e86b386", 00:23:09.984 "is_configured": true, 00:23:09.984 "data_offset": 0, 00:23:09.984 "data_size": 65536 00:23:09.984 }, 00:23:09.984 { 00:23:09.984 "name": "BaseBdev3", 00:23:09.984 "uuid": "2f62ae86-bf8b-4199-bd52-7967ee5f35a2", 00:23:09.984 "is_configured": true, 00:23:09.984 "data_offset": 0, 00:23:09.984 "data_size": 65536 00:23:09.984 }, 00:23:09.984 { 00:23:09.984 "name": "BaseBdev4", 00:23:09.984 "uuid": "b28cea87-151a-40fe-b978-6d6bcf843702", 00:23:09.984 "is_configured": true, 00:23:09.984 "data_offset": 0, 00:23:09.984 "data_size": 65536 00:23:09.984 } 00:23:09.985 ] 00:23:09.985 } 00:23:09.985 } 00:23:09.985 }' 00:23:09.985 05:51:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r 
'.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:23:09.985 05:51:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:23:09.985 BaseBdev2 00:23:09.985 BaseBdev3 00:23:09.985 BaseBdev4' 00:23:09.985 05:51:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:23:09.985 05:51:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:23:09.985 05:51:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:23:10.243 05:51:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:23:10.243 "name": "BaseBdev1", 00:23:10.243 "aliases": [ 00:23:10.243 "611d158a-6cf2-4b9b-be21-18da70d544f0" 00:23:10.243 ], 00:23:10.243 "product_name": "Malloc disk", 00:23:10.243 "block_size": 512, 00:23:10.243 "num_blocks": 65536, 00:23:10.243 "uuid": "611d158a-6cf2-4b9b-be21-18da70d544f0", 00:23:10.243 "assigned_rate_limits": { 00:23:10.243 "rw_ios_per_sec": 0, 00:23:10.243 "rw_mbytes_per_sec": 0, 00:23:10.243 "r_mbytes_per_sec": 0, 00:23:10.243 "w_mbytes_per_sec": 0 00:23:10.243 }, 00:23:10.243 "claimed": true, 00:23:10.243 "claim_type": "exclusive_write", 00:23:10.243 "zoned": false, 00:23:10.243 "supported_io_types": { 00:23:10.243 "read": true, 00:23:10.243 "write": true, 00:23:10.243 "unmap": true, 00:23:10.243 "flush": true, 00:23:10.243 "reset": true, 00:23:10.243 "nvme_admin": false, 00:23:10.243 "nvme_io": false, 00:23:10.243 "nvme_io_md": false, 00:23:10.243 "write_zeroes": true, 00:23:10.243 "zcopy": true, 00:23:10.243 "get_zone_info": false, 00:23:10.243 "zone_management": false, 00:23:10.243 "zone_append": false, 00:23:10.243 "compare": false, 00:23:10.243 "compare_and_write": false, 00:23:10.243 "abort": true, 00:23:10.243 "seek_hole": false, 00:23:10.243 "seek_data": 
false, 00:23:10.243 "copy": true, 00:23:10.243 "nvme_iov_md": false 00:23:10.243 }, 00:23:10.243 "memory_domains": [ 00:23:10.243 { 00:23:10.243 "dma_device_id": "system", 00:23:10.243 "dma_device_type": 1 00:23:10.243 }, 00:23:10.243 { 00:23:10.243 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:10.243 "dma_device_type": 2 00:23:10.243 } 00:23:10.243 ], 00:23:10.243 "driver_specific": {} 00:23:10.243 }' 00:23:10.243 05:51:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:10.243 05:51:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:10.243 05:51:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:23:10.243 05:51:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:10.243 05:51:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:10.243 05:51:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:23:10.243 05:51:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:10.501 05:51:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:10.501 05:51:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:23:10.501 05:51:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:10.501 05:51:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:10.501 05:51:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:23:10.501 05:51:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:23:10.501 05:51:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:23:10.501 05:51:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:23:10.759 05:51:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:23:10.759 "name": "BaseBdev2", 00:23:10.759 "aliases": [ 00:23:10.759 "ee1d5154-3a51-4b26-b6b1-efea8e86b386" 00:23:10.759 ], 00:23:10.759 "product_name": "Malloc disk", 00:23:10.759 "block_size": 512, 00:23:10.759 "num_blocks": 65536, 00:23:10.759 "uuid": "ee1d5154-3a51-4b26-b6b1-efea8e86b386", 00:23:10.759 "assigned_rate_limits": { 00:23:10.759 "rw_ios_per_sec": 0, 00:23:10.759 "rw_mbytes_per_sec": 0, 00:23:10.759 "r_mbytes_per_sec": 0, 00:23:10.759 "w_mbytes_per_sec": 0 00:23:10.759 }, 00:23:10.759 "claimed": true, 00:23:10.759 "claim_type": "exclusive_write", 00:23:10.759 "zoned": false, 00:23:10.759 "supported_io_types": { 00:23:10.759 "read": true, 00:23:10.759 "write": true, 00:23:10.759 "unmap": true, 00:23:10.759 "flush": true, 00:23:10.759 "reset": true, 00:23:10.759 "nvme_admin": false, 00:23:10.759 "nvme_io": false, 00:23:10.759 "nvme_io_md": false, 00:23:10.759 "write_zeroes": true, 00:23:10.759 "zcopy": true, 00:23:10.759 "get_zone_info": false, 00:23:10.759 "zone_management": false, 00:23:10.759 "zone_append": false, 00:23:10.759 "compare": false, 00:23:10.759 "compare_and_write": false, 00:23:10.759 "abort": true, 00:23:10.759 "seek_hole": false, 00:23:10.759 "seek_data": false, 00:23:10.759 "copy": true, 00:23:10.759 "nvme_iov_md": false 00:23:10.759 }, 00:23:10.759 "memory_domains": [ 00:23:10.759 { 00:23:10.759 "dma_device_id": "system", 00:23:10.759 "dma_device_type": 1 00:23:10.759 }, 00:23:10.759 { 00:23:10.759 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:10.759 "dma_device_type": 2 00:23:10.759 } 00:23:10.759 ], 00:23:10.759 "driver_specific": {} 00:23:10.759 }' 00:23:10.759 05:51:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:10.759 05:51:25 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:11.017 05:51:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:23:11.017 05:51:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:11.017 05:51:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:11.017 05:51:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:23:11.017 05:51:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:11.017 05:51:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:11.017 05:51:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:23:11.017 05:51:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:11.017 05:51:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:11.017 05:51:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:23:11.017 05:51:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:23:11.017 05:51:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:23:11.017 05:51:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:23:11.276 05:51:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:23:11.276 "name": "BaseBdev3", 00:23:11.276 "aliases": [ 00:23:11.276 "2f62ae86-bf8b-4199-bd52-7967ee5f35a2" 00:23:11.276 ], 00:23:11.276 "product_name": "Malloc disk", 00:23:11.276 "block_size": 512, 00:23:11.276 "num_blocks": 65536, 00:23:11.276 "uuid": "2f62ae86-bf8b-4199-bd52-7967ee5f35a2", 00:23:11.276 "assigned_rate_limits": { 00:23:11.276 "rw_ios_per_sec": 0, 00:23:11.276 
"rw_mbytes_per_sec": 0, 00:23:11.276 "r_mbytes_per_sec": 0, 00:23:11.276 "w_mbytes_per_sec": 0 00:23:11.276 }, 00:23:11.276 "claimed": true, 00:23:11.276 "claim_type": "exclusive_write", 00:23:11.276 "zoned": false, 00:23:11.276 "supported_io_types": { 00:23:11.276 "read": true, 00:23:11.276 "write": true, 00:23:11.276 "unmap": true, 00:23:11.276 "flush": true, 00:23:11.276 "reset": true, 00:23:11.276 "nvme_admin": false, 00:23:11.276 "nvme_io": false, 00:23:11.276 "nvme_io_md": false, 00:23:11.276 "write_zeroes": true, 00:23:11.276 "zcopy": true, 00:23:11.276 "get_zone_info": false, 00:23:11.276 "zone_management": false, 00:23:11.276 "zone_append": false, 00:23:11.276 "compare": false, 00:23:11.276 "compare_and_write": false, 00:23:11.276 "abort": true, 00:23:11.276 "seek_hole": false, 00:23:11.276 "seek_data": false, 00:23:11.276 "copy": true, 00:23:11.276 "nvme_iov_md": false 00:23:11.276 }, 00:23:11.276 "memory_domains": [ 00:23:11.276 { 00:23:11.276 "dma_device_id": "system", 00:23:11.276 "dma_device_type": 1 00:23:11.276 }, 00:23:11.276 { 00:23:11.276 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:11.276 "dma_device_type": 2 00:23:11.276 } 00:23:11.276 ], 00:23:11.276 "driver_specific": {} 00:23:11.276 }' 00:23:11.276 05:51:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:11.534 05:51:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:11.534 05:51:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:23:11.534 05:51:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:11.534 05:51:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:11.534 05:51:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:23:11.534 05:51:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:11.534 05:51:26 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:11.534 05:51:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:23:11.534 05:51:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:11.792 05:51:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:11.792 05:51:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:23:11.792 05:51:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:23:11.792 05:51:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:23:11.792 05:51:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:23:12.050 05:51:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:23:12.050 "name": "BaseBdev4", 00:23:12.050 "aliases": [ 00:23:12.050 "b28cea87-151a-40fe-b978-6d6bcf843702" 00:23:12.050 ], 00:23:12.050 "product_name": "Malloc disk", 00:23:12.050 "block_size": 512, 00:23:12.050 "num_blocks": 65536, 00:23:12.050 "uuid": "b28cea87-151a-40fe-b978-6d6bcf843702", 00:23:12.050 "assigned_rate_limits": { 00:23:12.050 "rw_ios_per_sec": 0, 00:23:12.050 "rw_mbytes_per_sec": 0, 00:23:12.050 "r_mbytes_per_sec": 0, 00:23:12.050 "w_mbytes_per_sec": 0 00:23:12.050 }, 00:23:12.050 "claimed": true, 00:23:12.050 "claim_type": "exclusive_write", 00:23:12.050 "zoned": false, 00:23:12.050 "supported_io_types": { 00:23:12.050 "read": true, 00:23:12.050 "write": true, 00:23:12.050 "unmap": true, 00:23:12.051 "flush": true, 00:23:12.051 "reset": true, 00:23:12.051 "nvme_admin": false, 00:23:12.051 "nvme_io": false, 00:23:12.051 "nvme_io_md": false, 00:23:12.051 "write_zeroes": true, 00:23:12.051 "zcopy": true, 00:23:12.051 "get_zone_info": false, 
00:23:12.051 "zone_management": false, 00:23:12.051 "zone_append": false, 00:23:12.051 "compare": false, 00:23:12.051 "compare_and_write": false, 00:23:12.051 "abort": true, 00:23:12.051 "seek_hole": false, 00:23:12.051 "seek_data": false, 00:23:12.051 "copy": true, 00:23:12.051 "nvme_iov_md": false 00:23:12.051 }, 00:23:12.051 "memory_domains": [ 00:23:12.051 { 00:23:12.051 "dma_device_id": "system", 00:23:12.051 "dma_device_type": 1 00:23:12.051 }, 00:23:12.051 { 00:23:12.051 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:12.051 "dma_device_type": 2 00:23:12.051 } 00:23:12.051 ], 00:23:12.051 "driver_specific": {} 00:23:12.051 }' 00:23:12.051 05:51:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:12.051 05:51:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:12.051 05:51:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:23:12.051 05:51:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:12.051 05:51:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:12.051 05:51:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:23:12.051 05:51:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:12.309 05:51:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:12.309 05:51:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:23:12.309 05:51:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:12.309 05:51:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:12.309 05:51:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:23:12.309 05:51:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:23:12.566 [2024-07-26 05:51:27.359100] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:23:12.566 05:51:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:23:12.566 05:51:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:23:12.566 05:51:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:23:12.566 05:51:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@214 -- # return 0 00:23:12.566 05:51:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:23:12.566 05:51:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:23:12.566 05:51:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:23:12.566 05:51:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:12.566 05:51:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:12.566 05:51:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:12.566 05:51:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:23:12.566 05:51:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:12.566 05:51:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:12.566 05:51:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:12.566 05:51:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:12.566 05:51:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:12.566 05:51:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:23:12.824 05:51:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:12.824 "name": "Existed_Raid", 00:23:12.824 "uuid": "0b645acf-0501-473b-bec3-e8fc9591343a", 00:23:12.824 "strip_size_kb": 0, 00:23:12.824 "state": "online", 00:23:12.824 "raid_level": "raid1", 00:23:12.824 "superblock": false, 00:23:12.824 "num_base_bdevs": 4, 00:23:12.824 "num_base_bdevs_discovered": 3, 00:23:12.824 "num_base_bdevs_operational": 3, 00:23:12.824 "base_bdevs_list": [ 00:23:12.824 { 00:23:12.824 "name": null, 00:23:12.824 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:12.824 "is_configured": false, 00:23:12.824 "data_offset": 0, 00:23:12.824 "data_size": 65536 00:23:12.824 }, 00:23:12.824 { 00:23:12.824 "name": "BaseBdev2", 00:23:12.824 "uuid": "ee1d5154-3a51-4b26-b6b1-efea8e86b386", 00:23:12.824 "is_configured": true, 00:23:12.824 "data_offset": 0, 00:23:12.824 "data_size": 65536 00:23:12.824 }, 00:23:12.824 { 00:23:12.824 "name": "BaseBdev3", 00:23:12.824 "uuid": "2f62ae86-bf8b-4199-bd52-7967ee5f35a2", 00:23:12.824 "is_configured": true, 00:23:12.824 "data_offset": 0, 00:23:12.824 "data_size": 65536 00:23:12.824 }, 00:23:12.824 { 00:23:12.824 "name": "BaseBdev4", 00:23:12.824 "uuid": "b28cea87-151a-40fe-b978-6d6bcf843702", 00:23:12.824 "is_configured": true, 00:23:12.824 "data_offset": 0, 00:23:12.824 "data_size": 65536 00:23:12.824 } 00:23:12.824 ] 00:23:12.824 }' 00:23:12.824 05:51:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:12.824 05:51:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:23:13.387 05:51:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:23:13.387 05:51:28 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:23:13.388 05:51:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:13.388 05:51:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:23:13.644 05:51:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:23:13.644 05:51:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:23:13.644 05:51:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:23:13.901 [2024-07-26 05:51:28.699615] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:23:13.901 05:51:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:23:13.901 05:51:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:23:13.901 05:51:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:13.901 05:51:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:23:14.158 05:51:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:23:14.158 05:51:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:23:14.158 05:51:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:23:14.416 [2024-07-26 05:51:29.211449] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: 
*DEBUG*: BaseBdev3 00:23:14.416 05:51:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:23:14.416 05:51:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:23:14.416 05:51:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:14.416 05:51:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:23:14.674 05:51:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:23:14.674 05:51:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:23:14.674 05:51:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:23:14.932 [2024-07-26 05:51:29.715365] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:23:14.932 [2024-07-26 05:51:29.715449] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:23:14.932 [2024-07-26 05:51:29.728020] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:23:14.932 [2024-07-26 05:51:29.728057] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:23:14.932 [2024-07-26 05:51:29.728068] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2494350 name Existed_Raid, state offline 00:23:14.932 05:51:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:23:14.932 05:51:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:23:14.932 05:51:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:14.932 05:51:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:23:15.198 05:51:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:23:15.198 05:51:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:23:15.198 05:51:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:23:15.198 05:51:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:23:15.198 05:51:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:23:15.198 05:51:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:23:15.455 BaseBdev2 00:23:15.455 05:51:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:23:15.455 05:51:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:23:15.455 05:51:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:23:15.455 05:51:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:23:15.455 05:51:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:23:15.455 05:51:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:23:15.455 05:51:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:23:15.713 05:51:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:23:15.979 [ 00:23:15.980 { 00:23:15.980 "name": "BaseBdev2", 00:23:15.980 "aliases": [ 00:23:15.980 "de51093a-fc0c-47ab-b686-0fc79e75ef6e" 00:23:15.980 ], 00:23:15.980 "product_name": "Malloc disk", 00:23:15.980 "block_size": 512, 00:23:15.980 "num_blocks": 65536, 00:23:15.980 "uuid": "de51093a-fc0c-47ab-b686-0fc79e75ef6e", 00:23:15.980 "assigned_rate_limits": { 00:23:15.980 "rw_ios_per_sec": 0, 00:23:15.980 "rw_mbytes_per_sec": 0, 00:23:15.980 "r_mbytes_per_sec": 0, 00:23:15.980 "w_mbytes_per_sec": 0 00:23:15.980 }, 00:23:15.980 "claimed": false, 00:23:15.980 "zoned": false, 00:23:15.980 "supported_io_types": { 00:23:15.980 "read": true, 00:23:15.980 "write": true, 00:23:15.980 "unmap": true, 00:23:15.980 "flush": true, 00:23:15.980 "reset": true, 00:23:15.980 "nvme_admin": false, 00:23:15.980 "nvme_io": false, 00:23:15.980 "nvme_io_md": false, 00:23:15.980 "write_zeroes": true, 00:23:15.980 "zcopy": true, 00:23:15.980 "get_zone_info": false, 00:23:15.980 "zone_management": false, 00:23:15.980 "zone_append": false, 00:23:15.980 "compare": false, 00:23:15.980 "compare_and_write": false, 00:23:15.980 "abort": true, 00:23:15.980 "seek_hole": false, 00:23:15.980 "seek_data": false, 00:23:15.980 "copy": true, 00:23:15.980 "nvme_iov_md": false 00:23:15.980 }, 00:23:15.980 "memory_domains": [ 00:23:15.980 { 00:23:15.980 "dma_device_id": "system", 00:23:15.980 "dma_device_type": 1 00:23:15.980 }, 00:23:15.980 { 00:23:15.980 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:15.980 "dma_device_type": 2 00:23:15.980 } 00:23:15.980 ], 00:23:15.980 "driver_specific": {} 00:23:15.980 } 00:23:15.980 ] 00:23:15.980 05:51:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:23:15.980 05:51:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:23:15.980 05:51:30 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:23:15.980 05:51:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:23:16.240 BaseBdev3 00:23:16.240 05:51:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:23:16.240 05:51:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:23:16.240 05:51:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:23:16.240 05:51:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:23:16.240 05:51:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:23:16.240 05:51:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:23:16.240 05:51:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:23:16.499 05:51:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:23:16.757 [ 00:23:16.757 { 00:23:16.757 "name": "BaseBdev3", 00:23:16.757 "aliases": [ 00:23:16.757 "9a9152aa-138b-4ece-837f-f7f498ae5324" 00:23:16.757 ], 00:23:16.757 "product_name": "Malloc disk", 00:23:16.757 "block_size": 512, 00:23:16.757 "num_blocks": 65536, 00:23:16.757 "uuid": "9a9152aa-138b-4ece-837f-f7f498ae5324", 00:23:16.757 "assigned_rate_limits": { 00:23:16.757 "rw_ios_per_sec": 0, 00:23:16.757 "rw_mbytes_per_sec": 0, 00:23:16.757 "r_mbytes_per_sec": 0, 00:23:16.757 "w_mbytes_per_sec": 0 00:23:16.757 }, 00:23:16.757 "claimed": false, 00:23:16.757 "zoned": false, 00:23:16.757 
"supported_io_types": { 00:23:16.757 "read": true, 00:23:16.757 "write": true, 00:23:16.757 "unmap": true, 00:23:16.757 "flush": true, 00:23:16.757 "reset": true, 00:23:16.757 "nvme_admin": false, 00:23:16.757 "nvme_io": false, 00:23:16.757 "nvme_io_md": false, 00:23:16.757 "write_zeroes": true, 00:23:16.757 "zcopy": true, 00:23:16.757 "get_zone_info": false, 00:23:16.757 "zone_management": false, 00:23:16.757 "zone_append": false, 00:23:16.757 "compare": false, 00:23:16.757 "compare_and_write": false, 00:23:16.757 "abort": true, 00:23:16.757 "seek_hole": false, 00:23:16.757 "seek_data": false, 00:23:16.757 "copy": true, 00:23:16.757 "nvme_iov_md": false 00:23:16.757 }, 00:23:16.757 "memory_domains": [ 00:23:16.757 { 00:23:16.757 "dma_device_id": "system", 00:23:16.757 "dma_device_type": 1 00:23:16.757 }, 00:23:16.757 { 00:23:16.757 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:16.757 "dma_device_type": 2 00:23:16.757 } 00:23:16.757 ], 00:23:16.757 "driver_specific": {} 00:23:16.757 } 00:23:16.757 ] 00:23:16.757 05:51:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:23:16.757 05:51:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:23:16.757 05:51:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:23:16.757 05:51:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:23:17.015 BaseBdev4 00:23:17.015 05:51:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4 00:23:17.015 05:51:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:23:17.015 05:51:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:23:17.015 05:51:31 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@899 -- # local i 00:23:17.015 05:51:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:23:17.015 05:51:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:23:17.015 05:51:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:23:17.273 05:51:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:23:17.532 [ 00:23:17.532 { 00:23:17.532 "name": "BaseBdev4", 00:23:17.532 "aliases": [ 00:23:17.532 "6adade06-315a-4f5c-a6de-7af062a327d5" 00:23:17.532 ], 00:23:17.532 "product_name": "Malloc disk", 00:23:17.532 "block_size": 512, 00:23:17.532 "num_blocks": 65536, 00:23:17.532 "uuid": "6adade06-315a-4f5c-a6de-7af062a327d5", 00:23:17.532 "assigned_rate_limits": { 00:23:17.532 "rw_ios_per_sec": 0, 00:23:17.532 "rw_mbytes_per_sec": 0, 00:23:17.532 "r_mbytes_per_sec": 0, 00:23:17.532 "w_mbytes_per_sec": 0 00:23:17.532 }, 00:23:17.532 "claimed": false, 00:23:17.532 "zoned": false, 00:23:17.532 "supported_io_types": { 00:23:17.532 "read": true, 00:23:17.532 "write": true, 00:23:17.532 "unmap": true, 00:23:17.532 "flush": true, 00:23:17.532 "reset": true, 00:23:17.532 "nvme_admin": false, 00:23:17.532 "nvme_io": false, 00:23:17.532 "nvme_io_md": false, 00:23:17.532 "write_zeroes": true, 00:23:17.532 "zcopy": true, 00:23:17.532 "get_zone_info": false, 00:23:17.532 "zone_management": false, 00:23:17.532 "zone_append": false, 00:23:17.532 "compare": false, 00:23:17.532 "compare_and_write": false, 00:23:17.532 "abort": true, 00:23:17.532 "seek_hole": false, 00:23:17.532 "seek_data": false, 00:23:17.532 "copy": true, 00:23:17.532 "nvme_iov_md": false 00:23:17.532 }, 00:23:17.532 "memory_domains": 
[ 00:23:17.532 { 00:23:17.532 "dma_device_id": "system", 00:23:17.532 "dma_device_type": 1 00:23:17.532 }, 00:23:17.532 { 00:23:17.532 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:17.532 "dma_device_type": 2 00:23:17.532 } 00:23:17.532 ], 00:23:17.532 "driver_specific": {} 00:23:17.532 } 00:23:17.532 ] 00:23:17.532 05:51:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:23:17.532 05:51:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:23:17.532 05:51:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:23:17.532 05:51:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:23:17.532 [2024-07-26 05:51:32.433363] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:23:17.532 [2024-07-26 05:51:32.433404] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:23:17.532 [2024-07-26 05:51:32.433423] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:23:17.532 [2024-07-26 05:51:32.434777] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:23:17.532 [2024-07-26 05:51:32.434817] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:23:17.790 05:51:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:23:17.790 05:51:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:23:17.790 05:51:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:23:17.790 05:51:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local 
raid_level=raid1 00:23:17.790 05:51:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:17.790 05:51:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:23:17.790 05:51:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:17.790 05:51:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:17.790 05:51:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:17.790 05:51:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:17.790 05:51:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:17.790 05:51:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:23:18.048 05:51:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:18.048 "name": "Existed_Raid", 00:23:18.048 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:18.048 "strip_size_kb": 0, 00:23:18.048 "state": "configuring", 00:23:18.048 "raid_level": "raid1", 00:23:18.048 "superblock": false, 00:23:18.048 "num_base_bdevs": 4, 00:23:18.048 "num_base_bdevs_discovered": 3, 00:23:18.048 "num_base_bdevs_operational": 4, 00:23:18.048 "base_bdevs_list": [ 00:23:18.048 { 00:23:18.048 "name": "BaseBdev1", 00:23:18.048 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:18.048 "is_configured": false, 00:23:18.048 "data_offset": 0, 00:23:18.048 "data_size": 0 00:23:18.048 }, 00:23:18.048 { 00:23:18.048 "name": "BaseBdev2", 00:23:18.048 "uuid": "de51093a-fc0c-47ab-b686-0fc79e75ef6e", 00:23:18.048 "is_configured": true, 00:23:18.048 "data_offset": 0, 00:23:18.048 "data_size": 65536 00:23:18.048 }, 00:23:18.048 { 00:23:18.048 "name": 
"BaseBdev3", 00:23:18.048 "uuid": "9a9152aa-138b-4ece-837f-f7f498ae5324", 00:23:18.048 "is_configured": true, 00:23:18.048 "data_offset": 0, 00:23:18.048 "data_size": 65536 00:23:18.048 }, 00:23:18.048 { 00:23:18.048 "name": "BaseBdev4", 00:23:18.048 "uuid": "6adade06-315a-4f5c-a6de-7af062a327d5", 00:23:18.048 "is_configured": true, 00:23:18.048 "data_offset": 0, 00:23:18.048 "data_size": 65536 00:23:18.048 } 00:23:18.048 ] 00:23:18.048 }' 00:23:18.048 05:51:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:18.048 05:51:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:23:18.613 05:51:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:23:18.871 [2024-07-26 05:51:33.524211] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:23:18.871 05:51:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:23:18.871 05:51:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:23:18.871 05:51:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:23:18.871 05:51:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:18.871 05:51:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:18.871 05:51:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:23:18.871 05:51:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:18.871 05:51:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:18.871 05:51:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # 
local num_base_bdevs_discovered 00:23:18.871 05:51:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:18.871 05:51:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:18.871 05:51:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:23:19.130 05:51:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:19.130 "name": "Existed_Raid", 00:23:19.130 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:19.130 "strip_size_kb": 0, 00:23:19.130 "state": "configuring", 00:23:19.130 "raid_level": "raid1", 00:23:19.130 "superblock": false, 00:23:19.130 "num_base_bdevs": 4, 00:23:19.130 "num_base_bdevs_discovered": 2, 00:23:19.130 "num_base_bdevs_operational": 4, 00:23:19.130 "base_bdevs_list": [ 00:23:19.130 { 00:23:19.130 "name": "BaseBdev1", 00:23:19.130 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:19.130 "is_configured": false, 00:23:19.130 "data_offset": 0, 00:23:19.130 "data_size": 0 00:23:19.130 }, 00:23:19.130 { 00:23:19.130 "name": null, 00:23:19.130 "uuid": "de51093a-fc0c-47ab-b686-0fc79e75ef6e", 00:23:19.130 "is_configured": false, 00:23:19.130 "data_offset": 0, 00:23:19.130 "data_size": 65536 00:23:19.130 }, 00:23:19.130 { 00:23:19.130 "name": "BaseBdev3", 00:23:19.130 "uuid": "9a9152aa-138b-4ece-837f-f7f498ae5324", 00:23:19.130 "is_configured": true, 00:23:19.130 "data_offset": 0, 00:23:19.130 "data_size": 65536 00:23:19.130 }, 00:23:19.130 { 00:23:19.130 "name": "BaseBdev4", 00:23:19.130 "uuid": "6adade06-315a-4f5c-a6de-7af062a327d5", 00:23:19.130 "is_configured": true, 00:23:19.130 "data_offset": 0, 00:23:19.130 "data_size": 65536 00:23:19.130 } 00:23:19.130 ] 00:23:19.130 }' 00:23:19.130 05:51:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 
00:23:19.130 05:51:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:23:19.696 05:51:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:19.696 05:51:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:23:19.954 05:51:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:23:19.954 05:51:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:23:20.212 [2024-07-26 05:51:34.875125] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:23:20.212 BaseBdev1 00:23:20.212 05:51:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:23:20.212 05:51:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:23:20.212 05:51:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:23:20.212 05:51:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:23:20.212 05:51:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:23:20.212 05:51:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:23:20.212 05:51:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:23:20.470 05:51:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 
2000 00:23:20.470 [ 00:23:20.470 { 00:23:20.470 "name": "BaseBdev1", 00:23:20.470 "aliases": [ 00:23:20.470 "f6a5117d-b359-4b6d-be23-0ea6130bc4ae" 00:23:20.470 ], 00:23:20.470 "product_name": "Malloc disk", 00:23:20.470 "block_size": 512, 00:23:20.470 "num_blocks": 65536, 00:23:20.470 "uuid": "f6a5117d-b359-4b6d-be23-0ea6130bc4ae", 00:23:20.470 "assigned_rate_limits": { 00:23:20.470 "rw_ios_per_sec": 0, 00:23:20.470 "rw_mbytes_per_sec": 0, 00:23:20.470 "r_mbytes_per_sec": 0, 00:23:20.470 "w_mbytes_per_sec": 0 00:23:20.470 }, 00:23:20.470 "claimed": true, 00:23:20.470 "claim_type": "exclusive_write", 00:23:20.470 "zoned": false, 00:23:20.470 "supported_io_types": { 00:23:20.470 "read": true, 00:23:20.470 "write": true, 00:23:20.470 "unmap": true, 00:23:20.470 "flush": true, 00:23:20.470 "reset": true, 00:23:20.470 "nvme_admin": false, 00:23:20.470 "nvme_io": false, 00:23:20.470 "nvme_io_md": false, 00:23:20.470 "write_zeroes": true, 00:23:20.470 "zcopy": true, 00:23:20.470 "get_zone_info": false, 00:23:20.470 "zone_management": false, 00:23:20.470 "zone_append": false, 00:23:20.470 "compare": false, 00:23:20.470 "compare_and_write": false, 00:23:20.470 "abort": true, 00:23:20.470 "seek_hole": false, 00:23:20.470 "seek_data": false, 00:23:20.470 "copy": true, 00:23:20.470 "nvme_iov_md": false 00:23:20.470 }, 00:23:20.470 "memory_domains": [ 00:23:20.470 { 00:23:20.470 "dma_device_id": "system", 00:23:20.470 "dma_device_type": 1 00:23:20.470 }, 00:23:20.470 { 00:23:20.470 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:20.470 "dma_device_type": 2 00:23:20.470 } 00:23:20.470 ], 00:23:20.470 "driver_specific": {} 00:23:20.470 } 00:23:20.470 ] 00:23:20.470 05:51:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:23:20.729 05:51:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:23:20.729 05:51:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- 
# local raid_bdev_name=Existed_Raid 00:23:20.729 05:51:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:23:20.729 05:51:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:20.729 05:51:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:20.729 05:51:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:23:20.729 05:51:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:20.729 05:51:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:20.729 05:51:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:20.729 05:51:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:20.729 05:51:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:20.729 05:51:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:23:20.729 05:51:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:20.729 "name": "Existed_Raid", 00:23:20.729 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:20.729 "strip_size_kb": 0, 00:23:20.729 "state": "configuring", 00:23:20.729 "raid_level": "raid1", 00:23:20.729 "superblock": false, 00:23:20.729 "num_base_bdevs": 4, 00:23:20.729 "num_base_bdevs_discovered": 3, 00:23:20.729 "num_base_bdevs_operational": 4, 00:23:20.729 "base_bdevs_list": [ 00:23:20.729 { 00:23:20.729 "name": "BaseBdev1", 00:23:20.729 "uuid": "f6a5117d-b359-4b6d-be23-0ea6130bc4ae", 00:23:20.729 "is_configured": true, 00:23:20.729 "data_offset": 0, 00:23:20.729 "data_size": 65536 00:23:20.729 }, 00:23:20.729 
{ 00:23:20.729 "name": null, 00:23:20.729 "uuid": "de51093a-fc0c-47ab-b686-0fc79e75ef6e", 00:23:20.729 "is_configured": false, 00:23:20.729 "data_offset": 0, 00:23:20.729 "data_size": 65536 00:23:20.729 }, 00:23:20.729 { 00:23:20.729 "name": "BaseBdev3", 00:23:20.729 "uuid": "9a9152aa-138b-4ece-837f-f7f498ae5324", 00:23:20.729 "is_configured": true, 00:23:20.729 "data_offset": 0, 00:23:20.729 "data_size": 65536 00:23:20.729 }, 00:23:20.729 { 00:23:20.729 "name": "BaseBdev4", 00:23:20.729 "uuid": "6adade06-315a-4f5c-a6de-7af062a327d5", 00:23:20.729 "is_configured": true, 00:23:20.729 "data_offset": 0, 00:23:20.729 "data_size": 65536 00:23:20.729 } 00:23:20.729 ] 00:23:20.729 }' 00:23:20.729 05:51:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:20.729 05:51:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:23:21.664 05:51:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:21.664 05:51:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:23:21.664 05:51:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:23:21.664 05:51:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:23:21.923 [2024-07-26 05:51:36.703988] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:23:21.923 05:51:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:23:21.923 05:51:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:23:21.923 05:51:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 
-- # local expected_state=configuring 00:23:21.923 05:51:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:21.923 05:51:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:21.923 05:51:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:23:21.923 05:51:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:21.923 05:51:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:21.923 05:51:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:21.923 05:51:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:21.923 05:51:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:21.923 05:51:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:23:22.182 05:51:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:22.182 "name": "Existed_Raid", 00:23:22.182 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:22.182 "strip_size_kb": 0, 00:23:22.182 "state": "configuring", 00:23:22.182 "raid_level": "raid1", 00:23:22.182 "superblock": false, 00:23:22.182 "num_base_bdevs": 4, 00:23:22.182 "num_base_bdevs_discovered": 2, 00:23:22.182 "num_base_bdevs_operational": 4, 00:23:22.182 "base_bdevs_list": [ 00:23:22.182 { 00:23:22.182 "name": "BaseBdev1", 00:23:22.182 "uuid": "f6a5117d-b359-4b6d-be23-0ea6130bc4ae", 00:23:22.182 "is_configured": true, 00:23:22.182 "data_offset": 0, 00:23:22.182 "data_size": 65536 00:23:22.182 }, 00:23:22.182 { 00:23:22.182 "name": null, 00:23:22.182 "uuid": "de51093a-fc0c-47ab-b686-0fc79e75ef6e", 00:23:22.182 
"is_configured": false, 00:23:22.182 "data_offset": 0, 00:23:22.182 "data_size": 65536 00:23:22.182 }, 00:23:22.182 { 00:23:22.182 "name": null, 00:23:22.182 "uuid": "9a9152aa-138b-4ece-837f-f7f498ae5324", 00:23:22.182 "is_configured": false, 00:23:22.182 "data_offset": 0, 00:23:22.182 "data_size": 65536 00:23:22.182 }, 00:23:22.182 { 00:23:22.182 "name": "BaseBdev4", 00:23:22.182 "uuid": "6adade06-315a-4f5c-a6de-7af062a327d5", 00:23:22.182 "is_configured": true, 00:23:22.182 "data_offset": 0, 00:23:22.182 "data_size": 65536 00:23:22.182 } 00:23:22.182 ] 00:23:22.182 }' 00:23:22.182 05:51:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:22.182 05:51:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:23:22.752 05:51:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:23:22.752 05:51:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:23.075 05:51:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:23:23.075 05:51:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:23:23.076 [2024-07-26 05:51:37.935288] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:23:23.076 05:51:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:23:23.076 05:51:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:23:23.076 05:51:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:23:23.076 05:51:37 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:23.076 05:51:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:23.076 05:51:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:23:23.076 05:51:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:23.076 05:51:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:23.076 05:51:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:23.076 05:51:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:23.076 05:51:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:23.076 05:51:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:23:23.334 05:51:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:23.334 "name": "Existed_Raid", 00:23:23.334 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:23.334 "strip_size_kb": 0, 00:23:23.334 "state": "configuring", 00:23:23.334 "raid_level": "raid1", 00:23:23.334 "superblock": false, 00:23:23.334 "num_base_bdevs": 4, 00:23:23.334 "num_base_bdevs_discovered": 3, 00:23:23.334 "num_base_bdevs_operational": 4, 00:23:23.334 "base_bdevs_list": [ 00:23:23.334 { 00:23:23.334 "name": "BaseBdev1", 00:23:23.334 "uuid": "f6a5117d-b359-4b6d-be23-0ea6130bc4ae", 00:23:23.334 "is_configured": true, 00:23:23.334 "data_offset": 0, 00:23:23.334 "data_size": 65536 00:23:23.334 }, 00:23:23.334 { 00:23:23.334 "name": null, 00:23:23.334 "uuid": "de51093a-fc0c-47ab-b686-0fc79e75ef6e", 00:23:23.334 "is_configured": false, 00:23:23.334 "data_offset": 0, 00:23:23.334 
"data_size": 65536 00:23:23.334 }, 00:23:23.334 { 00:23:23.334 "name": "BaseBdev3", 00:23:23.334 "uuid": "9a9152aa-138b-4ece-837f-f7f498ae5324", 00:23:23.334 "is_configured": true, 00:23:23.334 "data_offset": 0, 00:23:23.334 "data_size": 65536 00:23:23.334 }, 00:23:23.334 { 00:23:23.334 "name": "BaseBdev4", 00:23:23.334 "uuid": "6adade06-315a-4f5c-a6de-7af062a327d5", 00:23:23.334 "is_configured": true, 00:23:23.334 "data_offset": 0, 00:23:23.334 "data_size": 65536 00:23:23.334 } 00:23:23.334 ] 00:23:23.334 }' 00:23:23.334 05:51:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:23.334 05:51:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:23:23.900 05:51:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:23.900 05:51:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:23:24.158 05:51:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:23:24.158 05:51:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:23:24.417 [2024-07-26 05:51:39.258818] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:23:24.417 05:51:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:23:24.417 05:51:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:23:24.417 05:51:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:23:24.417 05:51:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:24.417 05:51:39 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:24.417 05:51:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:23:24.417 05:51:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:24.417 05:51:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:24.417 05:51:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:24.417 05:51:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:24.417 05:51:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:24.417 05:51:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:23:24.676 05:51:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:24.676 "name": "Existed_Raid", 00:23:24.676 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:24.676 "strip_size_kb": 0, 00:23:24.676 "state": "configuring", 00:23:24.676 "raid_level": "raid1", 00:23:24.676 "superblock": false, 00:23:24.676 "num_base_bdevs": 4, 00:23:24.676 "num_base_bdevs_discovered": 2, 00:23:24.676 "num_base_bdevs_operational": 4, 00:23:24.676 "base_bdevs_list": [ 00:23:24.676 { 00:23:24.676 "name": null, 00:23:24.676 "uuid": "f6a5117d-b359-4b6d-be23-0ea6130bc4ae", 00:23:24.676 "is_configured": false, 00:23:24.676 "data_offset": 0, 00:23:24.676 "data_size": 65536 00:23:24.676 }, 00:23:24.676 { 00:23:24.676 "name": null, 00:23:24.676 "uuid": "de51093a-fc0c-47ab-b686-0fc79e75ef6e", 00:23:24.676 "is_configured": false, 00:23:24.676 "data_offset": 0, 00:23:24.676 "data_size": 65536 00:23:24.676 }, 00:23:24.676 { 00:23:24.676 "name": "BaseBdev3", 00:23:24.676 "uuid": 
"9a9152aa-138b-4ece-837f-f7f498ae5324", 00:23:24.676 "is_configured": true, 00:23:24.676 "data_offset": 0, 00:23:24.676 "data_size": 65536 00:23:24.676 }, 00:23:24.676 { 00:23:24.676 "name": "BaseBdev4", 00:23:24.676 "uuid": "6adade06-315a-4f5c-a6de-7af062a327d5", 00:23:24.676 "is_configured": true, 00:23:24.676 "data_offset": 0, 00:23:24.676 "data_size": 65536 00:23:24.676 } 00:23:24.676 ] 00:23:24.676 }' 00:23:24.676 05:51:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:24.676 05:51:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:23:25.242 05:51:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:25.242 05:51:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:23:25.499 05:51:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:23:25.499 05:51:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:23:25.758 [2024-07-26 05:51:40.558591] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:23:25.758 05:51:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:23:25.758 05:51:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:23:25.758 05:51:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:23:25.758 05:51:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:25.758 05:51:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local 
strip_size=0 00:23:25.758 05:51:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:23:25.758 05:51:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:25.758 05:51:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:25.758 05:51:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:25.758 05:51:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:25.758 05:51:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:25.758 05:51:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:23:26.017 05:51:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:26.017 "name": "Existed_Raid", 00:23:26.017 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:26.017 "strip_size_kb": 0, 00:23:26.017 "state": "configuring", 00:23:26.017 "raid_level": "raid1", 00:23:26.017 "superblock": false, 00:23:26.017 "num_base_bdevs": 4, 00:23:26.017 "num_base_bdevs_discovered": 3, 00:23:26.017 "num_base_bdevs_operational": 4, 00:23:26.017 "base_bdevs_list": [ 00:23:26.017 { 00:23:26.017 "name": null, 00:23:26.017 "uuid": "f6a5117d-b359-4b6d-be23-0ea6130bc4ae", 00:23:26.017 "is_configured": false, 00:23:26.017 "data_offset": 0, 00:23:26.017 "data_size": 65536 00:23:26.017 }, 00:23:26.017 { 00:23:26.017 "name": "BaseBdev2", 00:23:26.017 "uuid": "de51093a-fc0c-47ab-b686-0fc79e75ef6e", 00:23:26.017 "is_configured": true, 00:23:26.017 "data_offset": 0, 00:23:26.017 "data_size": 65536 00:23:26.017 }, 00:23:26.017 { 00:23:26.017 "name": "BaseBdev3", 00:23:26.017 "uuid": "9a9152aa-138b-4ece-837f-f7f498ae5324", 00:23:26.017 "is_configured": true, 
00:23:26.017 "data_offset": 0, 00:23:26.017 "data_size": 65536 00:23:26.017 }, 00:23:26.017 { 00:23:26.017 "name": "BaseBdev4", 00:23:26.017 "uuid": "6adade06-315a-4f5c-a6de-7af062a327d5", 00:23:26.017 "is_configured": true, 00:23:26.017 "data_offset": 0, 00:23:26.017 "data_size": 65536 00:23:26.017 } 00:23:26.017 ] 00:23:26.017 }' 00:23:26.017 05:51:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:26.017 05:51:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:23:26.586 05:51:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:26.586 05:51:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:23:26.844 05:51:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:23:26.844 05:51:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:26.844 05:51:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:23:27.103 05:51:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u f6a5117d-b359-4b6d-be23-0ea6130bc4ae 00:23:27.362 [2024-07-26 05:51:42.134235] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:23:27.362 [2024-07-26 05:51:42.134277] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x2492610 00:23:27.362 [2024-07-26 05:51:42.134291] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:23:27.362 [2024-07-26 05:51:42.134487] bdev_raid.c: 
263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2493a70 00:23:27.362 [2024-07-26 05:51:42.134610] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2492610 00:23:27.363 [2024-07-26 05:51:42.134620] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x2492610 00:23:27.363 [2024-07-26 05:51:42.134801] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:27.363 NewBaseBdev 00:23:27.363 05:51:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:23:27.363 05:51:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:23:27.363 05:51:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:23:27.363 05:51:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:23:27.363 05:51:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:23:27.363 05:51:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:23:27.363 05:51:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:23:27.929 05:51:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:23:28.187 [ 00:23:28.188 { 00:23:28.188 "name": "NewBaseBdev", 00:23:28.188 "aliases": [ 00:23:28.188 "f6a5117d-b359-4b6d-be23-0ea6130bc4ae" 00:23:28.188 ], 00:23:28.188 "product_name": "Malloc disk", 00:23:28.188 "block_size": 512, 00:23:28.188 "num_blocks": 65536, 00:23:28.188 "uuid": "f6a5117d-b359-4b6d-be23-0ea6130bc4ae", 00:23:28.188 "assigned_rate_limits": { 00:23:28.188 "rw_ios_per_sec": 0, 
00:23:28.188 "rw_mbytes_per_sec": 0, 00:23:28.188 "r_mbytes_per_sec": 0, 00:23:28.188 "w_mbytes_per_sec": 0 00:23:28.188 }, 00:23:28.188 "claimed": true, 00:23:28.188 "claim_type": "exclusive_write", 00:23:28.188 "zoned": false, 00:23:28.188 "supported_io_types": { 00:23:28.188 "read": true, 00:23:28.188 "write": true, 00:23:28.188 "unmap": true, 00:23:28.188 "flush": true, 00:23:28.188 "reset": true, 00:23:28.188 "nvme_admin": false, 00:23:28.188 "nvme_io": false, 00:23:28.188 "nvme_io_md": false, 00:23:28.188 "write_zeroes": true, 00:23:28.188 "zcopy": true, 00:23:28.188 "get_zone_info": false, 00:23:28.188 "zone_management": false, 00:23:28.188 "zone_append": false, 00:23:28.188 "compare": false, 00:23:28.188 "compare_and_write": false, 00:23:28.188 "abort": true, 00:23:28.188 "seek_hole": false, 00:23:28.188 "seek_data": false, 00:23:28.188 "copy": true, 00:23:28.188 "nvme_iov_md": false 00:23:28.188 }, 00:23:28.188 "memory_domains": [ 00:23:28.188 { 00:23:28.188 "dma_device_id": "system", 00:23:28.188 "dma_device_type": 1 00:23:28.188 }, 00:23:28.188 { 00:23:28.188 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:28.188 "dma_device_type": 2 00:23:28.188 } 00:23:28.188 ], 00:23:28.188 "driver_specific": {} 00:23:28.188 } 00:23:28.188 ] 00:23:28.188 05:51:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:23:28.188 05:51:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid1 0 4 00:23:28.188 05:51:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:23:28.188 05:51:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:28.188 05:51:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:28.188 05:51:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:28.188 05:51:42 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:23:28.188 05:51:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:28.188 05:51:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:28.188 05:51:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:28.188 05:51:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:28.188 05:51:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:28.188 05:51:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:23:28.754 05:51:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:28.754 "name": "Existed_Raid", 00:23:28.754 "uuid": "5e81f8c4-6b19-40b3-8a73-ff43cce06ae0", 00:23:28.754 "strip_size_kb": 0, 00:23:28.754 "state": "online", 00:23:28.754 "raid_level": "raid1", 00:23:28.754 "superblock": false, 00:23:28.754 "num_base_bdevs": 4, 00:23:28.754 "num_base_bdevs_discovered": 4, 00:23:28.754 "num_base_bdevs_operational": 4, 00:23:28.754 "base_bdevs_list": [ 00:23:28.754 { 00:23:28.754 "name": "NewBaseBdev", 00:23:28.754 "uuid": "f6a5117d-b359-4b6d-be23-0ea6130bc4ae", 00:23:28.754 "is_configured": true, 00:23:28.754 "data_offset": 0, 00:23:28.754 "data_size": 65536 00:23:28.754 }, 00:23:28.754 { 00:23:28.754 "name": "BaseBdev2", 00:23:28.754 "uuid": "de51093a-fc0c-47ab-b686-0fc79e75ef6e", 00:23:28.754 "is_configured": true, 00:23:28.754 "data_offset": 0, 00:23:28.754 "data_size": 65536 00:23:28.754 }, 00:23:28.754 { 00:23:28.754 "name": "BaseBdev3", 00:23:28.754 "uuid": "9a9152aa-138b-4ece-837f-f7f498ae5324", 00:23:28.754 "is_configured": true, 00:23:28.754 "data_offset": 0, 
00:23:28.754 "data_size": 65536 00:23:28.754 }, 00:23:28.754 { 00:23:28.754 "name": "BaseBdev4", 00:23:28.754 "uuid": "6adade06-315a-4f5c-a6de-7af062a327d5", 00:23:28.754 "is_configured": true, 00:23:28.754 "data_offset": 0, 00:23:28.754 "data_size": 65536 00:23:28.754 } 00:23:28.754 ] 00:23:28.754 }' 00:23:28.754 05:51:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:28.754 05:51:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:23:29.320 05:51:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:23:29.320 05:51:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:23:29.320 05:51:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:23:29.320 05:51:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:23:29.320 05:51:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:23:29.320 05:51:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:23:29.320 05:51:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:23:29.320 05:51:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:23:29.579 [2024-07-26 05:51:44.240157] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:23:29.579 05:51:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:23:29.579 "name": "Existed_Raid", 00:23:29.579 "aliases": [ 00:23:29.579 "5e81f8c4-6b19-40b3-8a73-ff43cce06ae0" 00:23:29.579 ], 00:23:29.579 "product_name": "Raid Volume", 00:23:29.579 "block_size": 512, 00:23:29.579 "num_blocks": 65536, 00:23:29.579 "uuid": 
"5e81f8c4-6b19-40b3-8a73-ff43cce06ae0", 00:23:29.579 "assigned_rate_limits": { 00:23:29.579 "rw_ios_per_sec": 0, 00:23:29.579 "rw_mbytes_per_sec": 0, 00:23:29.579 "r_mbytes_per_sec": 0, 00:23:29.579 "w_mbytes_per_sec": 0 00:23:29.579 }, 00:23:29.579 "claimed": false, 00:23:29.579 "zoned": false, 00:23:29.579 "supported_io_types": { 00:23:29.579 "read": true, 00:23:29.579 "write": true, 00:23:29.579 "unmap": false, 00:23:29.579 "flush": false, 00:23:29.579 "reset": true, 00:23:29.579 "nvme_admin": false, 00:23:29.579 "nvme_io": false, 00:23:29.579 "nvme_io_md": false, 00:23:29.579 "write_zeroes": true, 00:23:29.579 "zcopy": false, 00:23:29.579 "get_zone_info": false, 00:23:29.579 "zone_management": false, 00:23:29.579 "zone_append": false, 00:23:29.579 "compare": false, 00:23:29.579 "compare_and_write": false, 00:23:29.579 "abort": false, 00:23:29.579 "seek_hole": false, 00:23:29.579 "seek_data": false, 00:23:29.579 "copy": false, 00:23:29.579 "nvme_iov_md": false 00:23:29.579 }, 00:23:29.579 "memory_domains": [ 00:23:29.579 { 00:23:29.579 "dma_device_id": "system", 00:23:29.579 "dma_device_type": 1 00:23:29.579 }, 00:23:29.579 { 00:23:29.579 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:29.579 "dma_device_type": 2 00:23:29.579 }, 00:23:29.579 { 00:23:29.579 "dma_device_id": "system", 00:23:29.579 "dma_device_type": 1 00:23:29.579 }, 00:23:29.579 { 00:23:29.579 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:29.579 "dma_device_type": 2 00:23:29.579 }, 00:23:29.579 { 00:23:29.579 "dma_device_id": "system", 00:23:29.579 "dma_device_type": 1 00:23:29.579 }, 00:23:29.579 { 00:23:29.579 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:29.579 "dma_device_type": 2 00:23:29.579 }, 00:23:29.579 { 00:23:29.579 "dma_device_id": "system", 00:23:29.579 "dma_device_type": 1 00:23:29.579 }, 00:23:29.579 { 00:23:29.579 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:29.579 "dma_device_type": 2 00:23:29.579 } 00:23:29.579 ], 00:23:29.579 "driver_specific": { 00:23:29.579 "raid": { 
00:23:29.579 "uuid": "5e81f8c4-6b19-40b3-8a73-ff43cce06ae0", 00:23:29.579 "strip_size_kb": 0, 00:23:29.579 "state": "online", 00:23:29.579 "raid_level": "raid1", 00:23:29.579 "superblock": false, 00:23:29.579 "num_base_bdevs": 4, 00:23:29.579 "num_base_bdevs_discovered": 4, 00:23:29.579 "num_base_bdevs_operational": 4, 00:23:29.579 "base_bdevs_list": [ 00:23:29.579 { 00:23:29.579 "name": "NewBaseBdev", 00:23:29.579 "uuid": "f6a5117d-b359-4b6d-be23-0ea6130bc4ae", 00:23:29.579 "is_configured": true, 00:23:29.579 "data_offset": 0, 00:23:29.579 "data_size": 65536 00:23:29.579 }, 00:23:29.579 { 00:23:29.579 "name": "BaseBdev2", 00:23:29.579 "uuid": "de51093a-fc0c-47ab-b686-0fc79e75ef6e", 00:23:29.579 "is_configured": true, 00:23:29.579 "data_offset": 0, 00:23:29.579 "data_size": 65536 00:23:29.579 }, 00:23:29.579 { 00:23:29.579 "name": "BaseBdev3", 00:23:29.579 "uuid": "9a9152aa-138b-4ece-837f-f7f498ae5324", 00:23:29.579 "is_configured": true, 00:23:29.579 "data_offset": 0, 00:23:29.579 "data_size": 65536 00:23:29.579 }, 00:23:29.579 { 00:23:29.579 "name": "BaseBdev4", 00:23:29.579 "uuid": "6adade06-315a-4f5c-a6de-7af062a327d5", 00:23:29.579 "is_configured": true, 00:23:29.579 "data_offset": 0, 00:23:29.579 "data_size": 65536 00:23:29.579 } 00:23:29.579 ] 00:23:29.579 } 00:23:29.579 } 00:23:29.579 }' 00:23:29.579 05:51:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:23:29.579 05:51:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:23:29.579 BaseBdev2 00:23:29.579 BaseBdev3 00:23:29.579 BaseBdev4' 00:23:29.579 05:51:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:23:29.579 05:51:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 
00:23:29.579 05:51:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:23:29.838 05:51:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:23:29.838 "name": "NewBaseBdev", 00:23:29.838 "aliases": [ 00:23:29.838 "f6a5117d-b359-4b6d-be23-0ea6130bc4ae" 00:23:29.838 ], 00:23:29.838 "product_name": "Malloc disk", 00:23:29.838 "block_size": 512, 00:23:29.838 "num_blocks": 65536, 00:23:29.838 "uuid": "f6a5117d-b359-4b6d-be23-0ea6130bc4ae", 00:23:29.838 "assigned_rate_limits": { 00:23:29.838 "rw_ios_per_sec": 0, 00:23:29.838 "rw_mbytes_per_sec": 0, 00:23:29.838 "r_mbytes_per_sec": 0, 00:23:29.838 "w_mbytes_per_sec": 0 00:23:29.838 }, 00:23:29.838 "claimed": true, 00:23:29.838 "claim_type": "exclusive_write", 00:23:29.838 "zoned": false, 00:23:29.838 "supported_io_types": { 00:23:29.838 "read": true, 00:23:29.838 "write": true, 00:23:29.838 "unmap": true, 00:23:29.838 "flush": true, 00:23:29.838 "reset": true, 00:23:29.838 "nvme_admin": false, 00:23:29.838 "nvme_io": false, 00:23:29.838 "nvme_io_md": false, 00:23:29.838 "write_zeroes": true, 00:23:29.838 "zcopy": true, 00:23:29.838 "get_zone_info": false, 00:23:29.838 "zone_management": false, 00:23:29.838 "zone_append": false, 00:23:29.838 "compare": false, 00:23:29.838 "compare_and_write": false, 00:23:29.838 "abort": true, 00:23:29.838 "seek_hole": false, 00:23:29.838 "seek_data": false, 00:23:29.838 "copy": true, 00:23:29.838 "nvme_iov_md": false 00:23:29.838 }, 00:23:29.838 "memory_domains": [ 00:23:29.838 { 00:23:29.838 "dma_device_id": "system", 00:23:29.838 "dma_device_type": 1 00:23:29.838 }, 00:23:29.838 { 00:23:29.838 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:29.838 "dma_device_type": 2 00:23:29.838 } 00:23:29.838 ], 00:23:29.838 "driver_specific": {} 00:23:29.838 }' 00:23:29.838 05:51:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:29.838 05:51:44 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:29.838 05:51:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:23:29.838 05:51:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:29.838 05:51:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:30.095 05:51:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:23:30.096 05:51:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:30.096 05:51:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:30.096 05:51:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:23:30.096 05:51:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:30.096 05:51:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:30.096 05:51:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:23:30.096 05:51:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:23:30.096 05:51:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:23:30.096 05:51:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:23:30.353 05:51:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:23:30.353 "name": "BaseBdev2", 00:23:30.353 "aliases": [ 00:23:30.353 "de51093a-fc0c-47ab-b686-0fc79e75ef6e" 00:23:30.353 ], 00:23:30.353 "product_name": "Malloc disk", 00:23:30.353 "block_size": 512, 00:23:30.353 "num_blocks": 65536, 00:23:30.353 "uuid": "de51093a-fc0c-47ab-b686-0fc79e75ef6e", 00:23:30.353 "assigned_rate_limits": { 00:23:30.353 "rw_ios_per_sec": 0, 00:23:30.353 "rw_mbytes_per_sec": 
0, 00:23:30.353 "r_mbytes_per_sec": 0, 00:23:30.354 "w_mbytes_per_sec": 0 00:23:30.354 }, 00:23:30.354 "claimed": true, 00:23:30.354 "claim_type": "exclusive_write", 00:23:30.354 "zoned": false, 00:23:30.354 "supported_io_types": { 00:23:30.354 "read": true, 00:23:30.354 "write": true, 00:23:30.354 "unmap": true, 00:23:30.354 "flush": true, 00:23:30.354 "reset": true, 00:23:30.354 "nvme_admin": false, 00:23:30.354 "nvme_io": false, 00:23:30.354 "nvme_io_md": false, 00:23:30.354 "write_zeroes": true, 00:23:30.354 "zcopy": true, 00:23:30.354 "get_zone_info": false, 00:23:30.354 "zone_management": false, 00:23:30.354 "zone_append": false, 00:23:30.354 "compare": false, 00:23:30.354 "compare_and_write": false, 00:23:30.354 "abort": true, 00:23:30.354 "seek_hole": false, 00:23:30.354 "seek_data": false, 00:23:30.354 "copy": true, 00:23:30.354 "nvme_iov_md": false 00:23:30.354 }, 00:23:30.354 "memory_domains": [ 00:23:30.354 { 00:23:30.354 "dma_device_id": "system", 00:23:30.354 "dma_device_type": 1 00:23:30.354 }, 00:23:30.354 { 00:23:30.354 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:30.354 "dma_device_type": 2 00:23:30.354 } 00:23:30.354 ], 00:23:30.354 "driver_specific": {} 00:23:30.354 }' 00:23:30.354 05:51:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:30.354 05:51:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:30.354 05:51:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:23:30.354 05:51:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:30.612 05:51:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:30.612 05:51:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:23:30.612 05:51:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:30.612 05:51:45 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:30.612 05:51:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:23:30.612 05:51:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:30.612 05:51:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:30.870 05:51:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:23:30.870 05:51:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:23:30.870 05:51:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:23:30.870 05:51:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:23:30.870 05:51:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:23:30.870 "name": "BaseBdev3", 00:23:30.870 "aliases": [ 00:23:30.870 "9a9152aa-138b-4ece-837f-f7f498ae5324" 00:23:30.870 ], 00:23:30.870 "product_name": "Malloc disk", 00:23:30.870 "block_size": 512, 00:23:30.870 "num_blocks": 65536, 00:23:30.870 "uuid": "9a9152aa-138b-4ece-837f-f7f498ae5324", 00:23:30.870 "assigned_rate_limits": { 00:23:30.870 "rw_ios_per_sec": 0, 00:23:30.870 "rw_mbytes_per_sec": 0, 00:23:30.870 "r_mbytes_per_sec": 0, 00:23:30.870 "w_mbytes_per_sec": 0 00:23:30.870 }, 00:23:30.870 "claimed": true, 00:23:30.870 "claim_type": "exclusive_write", 00:23:30.870 "zoned": false, 00:23:30.870 "supported_io_types": { 00:23:30.870 "read": true, 00:23:30.870 "write": true, 00:23:30.870 "unmap": true, 00:23:30.870 "flush": true, 00:23:30.870 "reset": true, 00:23:30.870 "nvme_admin": false, 00:23:30.870 "nvme_io": false, 00:23:30.870 "nvme_io_md": false, 00:23:30.870 "write_zeroes": true, 00:23:30.870 "zcopy": true, 00:23:30.870 "get_zone_info": false, 00:23:30.870 "zone_management": false, 
00:23:30.870 "zone_append": false, 00:23:30.870 "compare": false, 00:23:30.870 "compare_and_write": false, 00:23:30.870 "abort": true, 00:23:30.870 "seek_hole": false, 00:23:30.870 "seek_data": false, 00:23:30.870 "copy": true, 00:23:30.870 "nvme_iov_md": false 00:23:30.870 }, 00:23:30.870 "memory_domains": [ 00:23:30.870 { 00:23:30.870 "dma_device_id": "system", 00:23:30.870 "dma_device_type": 1 00:23:30.870 }, 00:23:30.870 { 00:23:30.870 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:30.870 "dma_device_type": 2 00:23:30.870 } 00:23:30.870 ], 00:23:30.870 "driver_specific": {} 00:23:30.870 }' 00:23:30.870 05:51:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:31.128 05:51:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:31.128 05:51:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:23:31.128 05:51:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:31.128 05:51:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:31.128 05:51:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:23:31.128 05:51:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:31.128 05:51:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:31.128 05:51:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:23:31.128 05:51:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:31.386 05:51:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:31.386 05:51:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:23:31.386 05:51:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:23:31.386 05:51:46 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:23:31.386 05:51:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:23:31.644 05:51:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:23:31.644 "name": "BaseBdev4", 00:23:31.644 "aliases": [ 00:23:31.644 "6adade06-315a-4f5c-a6de-7af062a327d5" 00:23:31.644 ], 00:23:31.644 "product_name": "Malloc disk", 00:23:31.644 "block_size": 512, 00:23:31.644 "num_blocks": 65536, 00:23:31.644 "uuid": "6adade06-315a-4f5c-a6de-7af062a327d5", 00:23:31.644 "assigned_rate_limits": { 00:23:31.644 "rw_ios_per_sec": 0, 00:23:31.644 "rw_mbytes_per_sec": 0, 00:23:31.644 "r_mbytes_per_sec": 0, 00:23:31.644 "w_mbytes_per_sec": 0 00:23:31.644 }, 00:23:31.644 "claimed": true, 00:23:31.644 "claim_type": "exclusive_write", 00:23:31.644 "zoned": false, 00:23:31.644 "supported_io_types": { 00:23:31.644 "read": true, 00:23:31.644 "write": true, 00:23:31.644 "unmap": true, 00:23:31.644 "flush": true, 00:23:31.644 "reset": true, 00:23:31.644 "nvme_admin": false, 00:23:31.644 "nvme_io": false, 00:23:31.644 "nvme_io_md": false, 00:23:31.644 "write_zeroes": true, 00:23:31.644 "zcopy": true, 00:23:31.644 "get_zone_info": false, 00:23:31.644 "zone_management": false, 00:23:31.644 "zone_append": false, 00:23:31.644 "compare": false, 00:23:31.644 "compare_and_write": false, 00:23:31.644 "abort": true, 00:23:31.644 "seek_hole": false, 00:23:31.644 "seek_data": false, 00:23:31.644 "copy": true, 00:23:31.644 "nvme_iov_md": false 00:23:31.644 }, 00:23:31.644 "memory_domains": [ 00:23:31.644 { 00:23:31.644 "dma_device_id": "system", 00:23:31.644 "dma_device_type": 1 00:23:31.644 }, 00:23:31.644 { 00:23:31.644 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:31.644 "dma_device_type": 2 00:23:31.644 } 00:23:31.644 ], 00:23:31.644 "driver_specific": {} 00:23:31.644 
}' 00:23:31.644 05:51:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:31.644 05:51:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:31.644 05:51:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:23:31.644 05:51:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:31.644 05:51:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:31.644 05:51:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:23:31.644 05:51:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:31.644 05:51:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:31.902 05:51:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:23:31.902 05:51:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:31.902 05:51:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:31.902 05:51:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:23:31.902 05:51:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:23:32.161 [2024-07-26 05:51:46.898921] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:23:32.161 [2024-07-26 05:51:46.898945] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:23:32.161 [2024-07-26 05:51:46.898999] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:23:32.161 [2024-07-26 05:51:46.899277] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:23:32.161 [2024-07-26 
05:51:46.899289] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2492610 name Existed_Raid, state offline 00:23:32.161 05:51:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 1218082 00:23:32.161 05:51:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 1218082 ']' 00:23:32.161 05:51:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 1218082 00:23:32.161 05:51:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:23:32.161 05:51:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:23:32.161 05:51:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1218082 00:23:32.161 05:51:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:23:32.161 05:51:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:23:32.161 05:51:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1218082' 00:23:32.161 killing process with pid 1218082 00:23:32.161 05:51:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 1218082 00:23:32.161 [2024-07-26 05:51:46.967552] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:23:32.161 05:51:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 1218082 00:23:32.161 [2024-07-26 05:51:47.003871] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:23:32.419 05:51:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:23:32.419 00:23:32.419 real 0m32.741s 00:23:32.419 user 1m0.099s 00:23:32.419 sys 0m5.851s 00:23:32.419 05:51:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:23:32.419 05:51:47 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:23:32.419 ************************************ 00:23:32.419 END TEST raid_state_function_test 00:23:32.419 ************************************ 00:23:32.419 05:51:47 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:23:32.419 05:51:47 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test raid1 4 true 00:23:32.419 05:51:47 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:23:32.419 05:51:47 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:23:32.419 05:51:47 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:23:32.419 ************************************ 00:23:32.419 START TEST raid_state_function_test_sb 00:23:32.419 ************************************ 00:23:32.419 05:51:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test raid1 4 true 00:23:32.419 05:51:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:23:32.419 05:51:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:23:32.419 05:51:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:23:32.419 05:51:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:23:32.419 05:51:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:23:32.419 05:51:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:23:32.419 05:51:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:23:32.419 05:51:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:23:32.419 05:51:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:23:32.419 05:51:47 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:23:32.419 05:51:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:23:32.419 05:51:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:23:32.419 05:51:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:23:32.419 05:51:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:23:32.419 05:51:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:23:32.419 05:51:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:23:32.419 05:51:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:23:32.419 05:51:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:23:32.419 05:51:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:23:32.419 05:51:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:23:32.419 05:51:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:23:32.419 05:51:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:23:32.419 05:51:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:23:32.419 05:51:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:23:32.419 05:51:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:23:32.419 05:51:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:23:32.419 05:51:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 
00:23:32.419 05:51:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:23:32.419 05:51:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=1222946 00:23:32.419 05:51:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1222946' 00:23:32.419 Process raid pid: 1222946 00:23:32.419 05:51:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:23:32.419 05:51:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 1222946 /var/tmp/spdk-raid.sock 00:23:32.419 05:51:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 -- # '[' -z 1222946 ']' 00:23:32.419 05:51:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:23:32.419 05:51:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:23:32.419 05:51:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:23:32.419 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:23:32.419 05:51:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:23:32.419 05:51:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:23:32.732 [2024-07-26 05:51:47.352865] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 
00:23:32.732 [2024-07-26 05:51:47.352928] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:23:32.732 [2024-07-26 05:51:47.484313] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:23:32.732 [2024-07-26 05:51:47.591167] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:23:32.990 [2024-07-26 05:51:47.656401] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:32.990 [2024-07-26 05:51:47.656437] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:33.555 05:51:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:23:33.555 05:51:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:23:33.555 05:51:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:23:33.813 [2024-07-26 05:51:48.515659] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:23:33.813 [2024-07-26 05:51:48.515696] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:23:33.813 [2024-07-26 05:51:48.515707] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:23:33.813 [2024-07-26 05:51:48.515719] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:23:33.813 [2024-07-26 05:51:48.515728] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:23:33.813 [2024-07-26 05:51:48.515739] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 
00:23:33.813 [2024-07-26 05:51:48.515747] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:23:33.813 [2024-07-26 05:51:48.515758] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:23:33.813 05:51:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:23:33.813 05:51:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:23:33.813 05:51:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:23:33.813 05:51:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:33.813 05:51:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:33.813 05:51:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:23:33.813 05:51:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:33.813 05:51:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:33.813 05:51:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:33.813 05:51:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:33.813 05:51:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:33.813 05:51:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:23:34.071 05:51:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:34.071 "name": "Existed_Raid", 00:23:34.071 "uuid": "779ba861-a08d-4b15-b263-517e4dc969be", 
00:23:34.071 "strip_size_kb": 0, 00:23:34.071 "state": "configuring", 00:23:34.072 "raid_level": "raid1", 00:23:34.072 "superblock": true, 00:23:34.072 "num_base_bdevs": 4, 00:23:34.072 "num_base_bdevs_discovered": 0, 00:23:34.072 "num_base_bdevs_operational": 4, 00:23:34.072 "base_bdevs_list": [ 00:23:34.072 { 00:23:34.072 "name": "BaseBdev1", 00:23:34.072 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:34.072 "is_configured": false, 00:23:34.072 "data_offset": 0, 00:23:34.072 "data_size": 0 00:23:34.072 }, 00:23:34.072 { 00:23:34.072 "name": "BaseBdev2", 00:23:34.072 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:34.072 "is_configured": false, 00:23:34.072 "data_offset": 0, 00:23:34.072 "data_size": 0 00:23:34.072 }, 00:23:34.072 { 00:23:34.072 "name": "BaseBdev3", 00:23:34.072 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:34.072 "is_configured": false, 00:23:34.072 "data_offset": 0, 00:23:34.072 "data_size": 0 00:23:34.072 }, 00:23:34.072 { 00:23:34.072 "name": "BaseBdev4", 00:23:34.072 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:34.072 "is_configured": false, 00:23:34.072 "data_offset": 0, 00:23:34.072 "data_size": 0 00:23:34.072 } 00:23:34.072 ] 00:23:34.072 }' 00:23:34.072 05:51:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:34.072 05:51:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:23:34.638 05:51:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:23:34.638 [2024-07-26 05:51:49.506125] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:23:34.638 [2024-07-26 05:51:49.506155] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x25ebaa0 name Existed_Raid, state configuring 00:23:34.638 05:51:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:23:34.896 [2024-07-26 05:51:49.754806] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:23:34.896 [2024-07-26 05:51:49.754833] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:23:34.896 [2024-07-26 05:51:49.754842] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:23:34.896 [2024-07-26 05:51:49.754854] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:23:34.896 [2024-07-26 05:51:49.754863] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:23:34.896 [2024-07-26 05:51:49.754875] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:23:34.896 [2024-07-26 05:51:49.754884] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:23:34.896 [2024-07-26 05:51:49.754894] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:23:34.896 05:51:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:23:35.154 [2024-07-26 05:51:49.945216] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:23:35.154 BaseBdev1 00:23:35.154 05:51:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:23:35.154 05:51:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:23:35.154 05:51:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:23:35.154 05:51:49 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:23:35.154 05:51:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:23:35.154 05:51:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:23:35.154 05:51:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:23:35.412 05:51:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:23:35.670 [ 00:23:35.670 { 00:23:35.671 "name": "BaseBdev1", 00:23:35.671 "aliases": [ 00:23:35.671 "4307a908-f41c-4065-a5e5-2a728e006c44" 00:23:35.671 ], 00:23:35.671 "product_name": "Malloc disk", 00:23:35.671 "block_size": 512, 00:23:35.671 "num_blocks": 65536, 00:23:35.671 "uuid": "4307a908-f41c-4065-a5e5-2a728e006c44", 00:23:35.671 "assigned_rate_limits": { 00:23:35.671 "rw_ios_per_sec": 0, 00:23:35.671 "rw_mbytes_per_sec": 0, 00:23:35.671 "r_mbytes_per_sec": 0, 00:23:35.671 "w_mbytes_per_sec": 0 00:23:35.671 }, 00:23:35.671 "claimed": true, 00:23:35.671 "claim_type": "exclusive_write", 00:23:35.671 "zoned": false, 00:23:35.671 "supported_io_types": { 00:23:35.671 "read": true, 00:23:35.671 "write": true, 00:23:35.671 "unmap": true, 00:23:35.671 "flush": true, 00:23:35.671 "reset": true, 00:23:35.671 "nvme_admin": false, 00:23:35.671 "nvme_io": false, 00:23:35.671 "nvme_io_md": false, 00:23:35.671 "write_zeroes": true, 00:23:35.671 "zcopy": true, 00:23:35.671 "get_zone_info": false, 00:23:35.671 "zone_management": false, 00:23:35.671 "zone_append": false, 00:23:35.671 "compare": false, 00:23:35.671 "compare_and_write": false, 00:23:35.671 "abort": true, 00:23:35.671 "seek_hole": false, 00:23:35.671 "seek_data": false, 
00:23:35.671 "copy": true, 00:23:35.671 "nvme_iov_md": false 00:23:35.671 }, 00:23:35.671 "memory_domains": [ 00:23:35.671 { 00:23:35.671 "dma_device_id": "system", 00:23:35.671 "dma_device_type": 1 00:23:35.671 }, 00:23:35.671 { 00:23:35.671 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:35.671 "dma_device_type": 2 00:23:35.671 } 00:23:35.671 ], 00:23:35.671 "driver_specific": {} 00:23:35.671 } 00:23:35.671 ] 00:23:35.671 05:51:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:23:35.671 05:51:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:23:35.671 05:51:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:23:35.671 05:51:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:23:35.671 05:51:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:35.671 05:51:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:35.671 05:51:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:23:35.671 05:51:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:35.671 05:51:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:35.671 05:51:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:35.671 05:51:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:35.671 05:51:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:35.671 05:51:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- 
# jq -r '.[] | select(.name == "Existed_Raid")' 00:23:35.929 05:51:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:35.929 "name": "Existed_Raid", 00:23:35.929 "uuid": "6c735223-0e18-4088-a52e-2966aaff4032", 00:23:35.929 "strip_size_kb": 0, 00:23:35.929 "state": "configuring", 00:23:35.929 "raid_level": "raid1", 00:23:35.929 "superblock": true, 00:23:35.929 "num_base_bdevs": 4, 00:23:35.929 "num_base_bdevs_discovered": 1, 00:23:35.929 "num_base_bdevs_operational": 4, 00:23:35.929 "base_bdevs_list": [ 00:23:35.929 { 00:23:35.929 "name": "BaseBdev1", 00:23:35.929 "uuid": "4307a908-f41c-4065-a5e5-2a728e006c44", 00:23:35.929 "is_configured": true, 00:23:35.929 "data_offset": 2048, 00:23:35.929 "data_size": 63488 00:23:35.929 }, 00:23:35.929 { 00:23:35.929 "name": "BaseBdev2", 00:23:35.929 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:35.929 "is_configured": false, 00:23:35.929 "data_offset": 0, 00:23:35.929 "data_size": 0 00:23:35.929 }, 00:23:35.929 { 00:23:35.929 "name": "BaseBdev3", 00:23:35.929 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:35.929 "is_configured": false, 00:23:35.929 "data_offset": 0, 00:23:35.929 "data_size": 0 00:23:35.929 }, 00:23:35.929 { 00:23:35.929 "name": "BaseBdev4", 00:23:35.929 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:35.929 "is_configured": false, 00:23:35.929 "data_offset": 0, 00:23:35.929 "data_size": 0 00:23:35.929 } 00:23:35.929 ] 00:23:35.929 }' 00:23:35.929 05:51:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:35.929 05:51:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:23:36.495 05:51:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:23:36.495 [2024-07-26 05:51:51.316841] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: 
Existed_Raid 00:23:36.495 [2024-07-26 05:51:51.316875] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x25eb310 name Existed_Raid, state configuring 00:23:36.495 05:51:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:23:36.753 [2024-07-26 05:51:51.561533] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:23:36.753 [2024-07-26 05:51:51.562970] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:23:36.753 [2024-07-26 05:51:51.563002] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:23:36.753 [2024-07-26 05:51:51.563012] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:23:36.753 [2024-07-26 05:51:51.563024] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:23:36.753 [2024-07-26 05:51:51.563033] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:23:36.753 [2024-07-26 05:51:51.563048] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:23:36.753 05:51:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:23:36.753 05:51:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:23:36.753 05:51:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:23:36.753 05:51:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:23:36.753 05:51:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:23:36.753 05:51:51 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:36.753 05:51:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:36.753 05:51:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:23:36.753 05:51:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:36.753 05:51:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:36.753 05:51:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:36.753 05:51:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:36.753 05:51:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:36.753 05:51:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:23:37.013 05:51:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:37.013 "name": "Existed_Raid", 00:23:37.013 "uuid": "a408faf1-e327-4094-bd79-63060f7195eb", 00:23:37.013 "strip_size_kb": 0, 00:23:37.013 "state": "configuring", 00:23:37.013 "raid_level": "raid1", 00:23:37.013 "superblock": true, 00:23:37.013 "num_base_bdevs": 4, 00:23:37.013 "num_base_bdevs_discovered": 1, 00:23:37.013 "num_base_bdevs_operational": 4, 00:23:37.013 "base_bdevs_list": [ 00:23:37.013 { 00:23:37.013 "name": "BaseBdev1", 00:23:37.013 "uuid": "4307a908-f41c-4065-a5e5-2a728e006c44", 00:23:37.013 "is_configured": true, 00:23:37.013 "data_offset": 2048, 00:23:37.013 "data_size": 63488 00:23:37.013 }, 00:23:37.013 { 00:23:37.013 "name": "BaseBdev2", 00:23:37.013 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:37.013 "is_configured": false, 
00:23:37.013 "data_offset": 0, 00:23:37.013 "data_size": 0 00:23:37.013 }, 00:23:37.013 { 00:23:37.013 "name": "BaseBdev3", 00:23:37.013 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:37.013 "is_configured": false, 00:23:37.013 "data_offset": 0, 00:23:37.013 "data_size": 0 00:23:37.013 }, 00:23:37.013 { 00:23:37.013 "name": "BaseBdev4", 00:23:37.013 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:37.013 "is_configured": false, 00:23:37.013 "data_offset": 0, 00:23:37.013 "data_size": 0 00:23:37.013 } 00:23:37.013 ] 00:23:37.013 }' 00:23:37.013 05:51:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:37.013 05:51:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:23:37.612 05:51:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:23:37.870 [2024-07-26 05:51:52.587593] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:23:37.870 BaseBdev2 00:23:37.870 05:51:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:23:37.870 05:51:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:23:37.870 05:51:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:23:37.870 05:51:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:23:37.870 05:51:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:23:37.870 05:51:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:23:37.870 05:51:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_wait_for_examine 00:23:38.128 05:51:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:23:38.128 [ 00:23:38.128 { 00:23:38.128 "name": "BaseBdev2", 00:23:38.128 "aliases": [ 00:23:38.128 "b6eb905c-4128-4c68-bef9-0b735fa623bc" 00:23:38.128 ], 00:23:38.128 "product_name": "Malloc disk", 00:23:38.128 "block_size": 512, 00:23:38.128 "num_blocks": 65536, 00:23:38.128 "uuid": "b6eb905c-4128-4c68-bef9-0b735fa623bc", 00:23:38.128 "assigned_rate_limits": { 00:23:38.128 "rw_ios_per_sec": 0, 00:23:38.128 "rw_mbytes_per_sec": 0, 00:23:38.128 "r_mbytes_per_sec": 0, 00:23:38.128 "w_mbytes_per_sec": 0 00:23:38.128 }, 00:23:38.128 "claimed": true, 00:23:38.128 "claim_type": "exclusive_write", 00:23:38.128 "zoned": false, 00:23:38.128 "supported_io_types": { 00:23:38.128 "read": true, 00:23:38.128 "write": true, 00:23:38.128 "unmap": true, 00:23:38.128 "flush": true, 00:23:38.128 "reset": true, 00:23:38.128 "nvme_admin": false, 00:23:38.128 "nvme_io": false, 00:23:38.128 "nvme_io_md": false, 00:23:38.128 "write_zeroes": true, 00:23:38.128 "zcopy": true, 00:23:38.128 "get_zone_info": false, 00:23:38.128 "zone_management": false, 00:23:38.128 "zone_append": false, 00:23:38.128 "compare": false, 00:23:38.128 "compare_and_write": false, 00:23:38.128 "abort": true, 00:23:38.128 "seek_hole": false, 00:23:38.128 "seek_data": false, 00:23:38.128 "copy": true, 00:23:38.128 "nvme_iov_md": false 00:23:38.128 }, 00:23:38.128 "memory_domains": [ 00:23:38.128 { 00:23:38.128 "dma_device_id": "system", 00:23:38.128 "dma_device_type": 1 00:23:38.128 }, 00:23:38.128 { 00:23:38.128 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:38.128 "dma_device_type": 2 00:23:38.128 } 00:23:38.128 ], 00:23:38.128 "driver_specific": {} 00:23:38.128 } 00:23:38.128 ] 00:23:38.128 05:51:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 
-- # return 0 00:23:38.128 05:51:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:23:38.128 05:51:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:23:38.128 05:51:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:23:38.128 05:51:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:23:38.128 05:51:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:23:38.128 05:51:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:38.386 05:51:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:38.386 05:51:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:23:38.386 05:51:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:38.386 05:51:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:38.386 05:51:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:38.386 05:51:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:38.387 05:51:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:38.387 05:51:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:23:38.387 05:51:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:38.387 "name": "Existed_Raid", 00:23:38.387 "uuid": "a408faf1-e327-4094-bd79-63060f7195eb", 00:23:38.387 "strip_size_kb": 0, 
00:23:38.387 "state": "configuring", 00:23:38.387 "raid_level": "raid1", 00:23:38.387 "superblock": true, 00:23:38.387 "num_base_bdevs": 4, 00:23:38.387 "num_base_bdevs_discovered": 2, 00:23:38.387 "num_base_bdevs_operational": 4, 00:23:38.387 "base_bdevs_list": [ 00:23:38.387 { 00:23:38.387 "name": "BaseBdev1", 00:23:38.387 "uuid": "4307a908-f41c-4065-a5e5-2a728e006c44", 00:23:38.387 "is_configured": true, 00:23:38.387 "data_offset": 2048, 00:23:38.387 "data_size": 63488 00:23:38.387 }, 00:23:38.387 { 00:23:38.387 "name": "BaseBdev2", 00:23:38.387 "uuid": "b6eb905c-4128-4c68-bef9-0b735fa623bc", 00:23:38.387 "is_configured": true, 00:23:38.387 "data_offset": 2048, 00:23:38.387 "data_size": 63488 00:23:38.387 }, 00:23:38.387 { 00:23:38.387 "name": "BaseBdev3", 00:23:38.387 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:38.387 "is_configured": false, 00:23:38.387 "data_offset": 0, 00:23:38.387 "data_size": 0 00:23:38.387 }, 00:23:38.387 { 00:23:38.387 "name": "BaseBdev4", 00:23:38.387 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:38.387 "is_configured": false, 00:23:38.387 "data_offset": 0, 00:23:38.387 "data_size": 0 00:23:38.387 } 00:23:38.387 ] 00:23:38.387 }' 00:23:38.387 05:51:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:38.387 05:51:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:23:38.950 05:51:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:23:39.209 [2024-07-26 05:51:53.950612] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:23:39.209 BaseBdev3 00:23:39.209 05:51:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:23:39.209 05:51:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local 
bdev_name=BaseBdev3 00:23:39.209 05:51:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:23:39.209 05:51:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:23:39.209 05:51:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:23:39.209 05:51:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:23:39.209 05:51:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:23:39.467 05:51:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:23:39.726 [ 00:23:39.726 { 00:23:39.726 "name": "BaseBdev3", 00:23:39.726 "aliases": [ 00:23:39.726 "55cdba68-9f4e-4546-a257-0566e22694c0" 00:23:39.726 ], 00:23:39.726 "product_name": "Malloc disk", 00:23:39.726 "block_size": 512, 00:23:39.726 "num_blocks": 65536, 00:23:39.726 "uuid": "55cdba68-9f4e-4546-a257-0566e22694c0", 00:23:39.726 "assigned_rate_limits": { 00:23:39.726 "rw_ios_per_sec": 0, 00:23:39.726 "rw_mbytes_per_sec": 0, 00:23:39.726 "r_mbytes_per_sec": 0, 00:23:39.726 "w_mbytes_per_sec": 0 00:23:39.726 }, 00:23:39.726 "claimed": true, 00:23:39.726 "claim_type": "exclusive_write", 00:23:39.726 "zoned": false, 00:23:39.726 "supported_io_types": { 00:23:39.726 "read": true, 00:23:39.726 "write": true, 00:23:39.726 "unmap": true, 00:23:39.726 "flush": true, 00:23:39.726 "reset": true, 00:23:39.726 "nvme_admin": false, 00:23:39.726 "nvme_io": false, 00:23:39.726 "nvme_io_md": false, 00:23:39.726 "write_zeroes": true, 00:23:39.726 "zcopy": true, 00:23:39.726 "get_zone_info": false, 00:23:39.726 "zone_management": false, 00:23:39.726 "zone_append": false, 00:23:39.726 
"compare": false, 00:23:39.726 "compare_and_write": false, 00:23:39.726 "abort": true, 00:23:39.726 "seek_hole": false, 00:23:39.726 "seek_data": false, 00:23:39.726 "copy": true, 00:23:39.726 "nvme_iov_md": false 00:23:39.726 }, 00:23:39.726 "memory_domains": [ 00:23:39.726 { 00:23:39.726 "dma_device_id": "system", 00:23:39.726 "dma_device_type": 1 00:23:39.726 }, 00:23:39.726 { 00:23:39.726 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:39.726 "dma_device_type": 2 00:23:39.726 } 00:23:39.726 ], 00:23:39.726 "driver_specific": {} 00:23:39.726 } 00:23:39.726 ] 00:23:39.726 05:51:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:23:39.726 05:51:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:23:39.726 05:51:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:23:39.726 05:51:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:23:39.726 05:51:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:23:39.726 05:51:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:23:39.726 05:51:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:39.726 05:51:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:39.726 05:51:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:23:39.726 05:51:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:39.726 05:51:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:39.726 05:51:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:39.726 05:51:54 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:39.727 05:51:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:39.727 05:51:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:23:39.985 05:51:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:39.985 "name": "Existed_Raid", 00:23:39.985 "uuid": "a408faf1-e327-4094-bd79-63060f7195eb", 00:23:39.985 "strip_size_kb": 0, 00:23:39.985 "state": "configuring", 00:23:39.985 "raid_level": "raid1", 00:23:39.985 "superblock": true, 00:23:39.985 "num_base_bdevs": 4, 00:23:39.985 "num_base_bdevs_discovered": 3, 00:23:39.985 "num_base_bdevs_operational": 4, 00:23:39.985 "base_bdevs_list": [ 00:23:39.985 { 00:23:39.985 "name": "BaseBdev1", 00:23:39.985 "uuid": "4307a908-f41c-4065-a5e5-2a728e006c44", 00:23:39.985 "is_configured": true, 00:23:39.985 "data_offset": 2048, 00:23:39.985 "data_size": 63488 00:23:39.985 }, 00:23:39.985 { 00:23:39.985 "name": "BaseBdev2", 00:23:39.985 "uuid": "b6eb905c-4128-4c68-bef9-0b735fa623bc", 00:23:39.985 "is_configured": true, 00:23:39.985 "data_offset": 2048, 00:23:39.985 "data_size": 63488 00:23:39.985 }, 00:23:39.985 { 00:23:39.985 "name": "BaseBdev3", 00:23:39.985 "uuid": "55cdba68-9f4e-4546-a257-0566e22694c0", 00:23:39.985 "is_configured": true, 00:23:39.985 "data_offset": 2048, 00:23:39.985 "data_size": 63488 00:23:39.985 }, 00:23:39.985 { 00:23:39.985 "name": "BaseBdev4", 00:23:39.985 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:39.985 "is_configured": false, 00:23:39.985 "data_offset": 0, 00:23:39.985 "data_size": 0 00:23:39.985 } 00:23:39.985 ] 00:23:39.985 }' 00:23:39.985 05:51:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:39.985 05:51:54 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:23:40.551 05:51:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:23:40.809 [2024-07-26 05:51:55.523328] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:23:40.809 [2024-07-26 05:51:55.523504] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x25ec350 00:23:40.809 [2024-07-26 05:51:55.523518] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:23:40.809 [2024-07-26 05:51:55.523704] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x25ec020 00:23:40.809 [2024-07-26 05:51:55.523828] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x25ec350 00:23:40.809 [2024-07-26 05:51:55.523839] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x25ec350 00:23:40.809 [2024-07-26 05:51:55.523932] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:40.809 BaseBdev4 00:23:40.809 05:51:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:23:40.809 05:51:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:23:40.809 05:51:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:23:40.809 05:51:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:23:40.809 05:51:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:23:40.809 05:51:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:23:40.809 05:51:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:23:41.066 05:51:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:23:41.324 [ 00:23:41.324 { 00:23:41.324 "name": "BaseBdev4", 00:23:41.324 "aliases": [ 00:23:41.324 "ebf0f46e-c65c-49ec-a8ef-1a53913f830e" 00:23:41.324 ], 00:23:41.324 "product_name": "Malloc disk", 00:23:41.324 "block_size": 512, 00:23:41.324 "num_blocks": 65536, 00:23:41.324 "uuid": "ebf0f46e-c65c-49ec-a8ef-1a53913f830e", 00:23:41.324 "assigned_rate_limits": { 00:23:41.324 "rw_ios_per_sec": 0, 00:23:41.324 "rw_mbytes_per_sec": 0, 00:23:41.324 "r_mbytes_per_sec": 0, 00:23:41.324 "w_mbytes_per_sec": 0 00:23:41.324 }, 00:23:41.324 "claimed": true, 00:23:41.324 "claim_type": "exclusive_write", 00:23:41.324 "zoned": false, 00:23:41.324 "supported_io_types": { 00:23:41.324 "read": true, 00:23:41.324 "write": true, 00:23:41.324 "unmap": true, 00:23:41.324 "flush": true, 00:23:41.324 "reset": true, 00:23:41.324 "nvme_admin": false, 00:23:41.324 "nvme_io": false, 00:23:41.324 "nvme_io_md": false, 00:23:41.324 "write_zeroes": true, 00:23:41.324 "zcopy": true, 00:23:41.324 "get_zone_info": false, 00:23:41.324 "zone_management": false, 00:23:41.324 "zone_append": false, 00:23:41.324 "compare": false, 00:23:41.324 "compare_and_write": false, 00:23:41.324 "abort": true, 00:23:41.324 "seek_hole": false, 00:23:41.324 "seek_data": false, 00:23:41.324 "copy": true, 00:23:41.324 "nvme_iov_md": false 00:23:41.324 }, 00:23:41.324 "memory_domains": [ 00:23:41.324 { 00:23:41.324 "dma_device_id": "system", 00:23:41.324 "dma_device_type": 1 00:23:41.324 }, 00:23:41.324 { 00:23:41.324 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:41.324 "dma_device_type": 2 00:23:41.324 } 00:23:41.324 ], 00:23:41.324 "driver_specific": {} 00:23:41.324 } 00:23:41.324 ] 
00:23:41.324 05:51:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:23:41.324 05:51:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:23:41.324 05:51:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:23:41.324 05:51:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 4 00:23:41.324 05:51:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:23:41.324 05:51:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:41.324 05:51:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:41.324 05:51:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:41.324 05:51:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:23:41.324 05:51:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:41.324 05:51:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:41.324 05:51:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:41.324 05:51:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:41.324 05:51:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:41.324 05:51:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:23:41.583 05:51:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:41.583 "name": "Existed_Raid", 00:23:41.583 
"uuid": "a408faf1-e327-4094-bd79-63060f7195eb", 00:23:41.583 "strip_size_kb": 0, 00:23:41.583 "state": "online", 00:23:41.583 "raid_level": "raid1", 00:23:41.583 "superblock": true, 00:23:41.583 "num_base_bdevs": 4, 00:23:41.583 "num_base_bdevs_discovered": 4, 00:23:41.583 "num_base_bdevs_operational": 4, 00:23:41.583 "base_bdevs_list": [ 00:23:41.583 { 00:23:41.583 "name": "BaseBdev1", 00:23:41.583 "uuid": "4307a908-f41c-4065-a5e5-2a728e006c44", 00:23:41.583 "is_configured": true, 00:23:41.583 "data_offset": 2048, 00:23:41.583 "data_size": 63488 00:23:41.583 }, 00:23:41.583 { 00:23:41.583 "name": "BaseBdev2", 00:23:41.583 "uuid": "b6eb905c-4128-4c68-bef9-0b735fa623bc", 00:23:41.583 "is_configured": true, 00:23:41.583 "data_offset": 2048, 00:23:41.583 "data_size": 63488 00:23:41.583 }, 00:23:41.583 { 00:23:41.583 "name": "BaseBdev3", 00:23:41.583 "uuid": "55cdba68-9f4e-4546-a257-0566e22694c0", 00:23:41.583 "is_configured": true, 00:23:41.583 "data_offset": 2048, 00:23:41.583 "data_size": 63488 00:23:41.583 }, 00:23:41.583 { 00:23:41.583 "name": "BaseBdev4", 00:23:41.583 "uuid": "ebf0f46e-c65c-49ec-a8ef-1a53913f830e", 00:23:41.583 "is_configured": true, 00:23:41.583 "data_offset": 2048, 00:23:41.583 "data_size": 63488 00:23:41.583 } 00:23:41.583 ] 00:23:41.583 }' 00:23:41.583 05:51:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:41.583 05:51:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:23:42.148 05:51:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:23:42.148 05:51:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:23:42.148 05:51:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:23:42.148 05:51:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:23:42.148 05:51:56 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:23:42.148 05:51:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:23:42.148 05:51:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:23:42.148 05:51:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:23:42.407 [2024-07-26 05:51:57.103847] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:23:42.407 05:51:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:23:42.407 "name": "Existed_Raid", 00:23:42.407 "aliases": [ 00:23:42.407 "a408faf1-e327-4094-bd79-63060f7195eb" 00:23:42.407 ], 00:23:42.407 "product_name": "Raid Volume", 00:23:42.407 "block_size": 512, 00:23:42.407 "num_blocks": 63488, 00:23:42.407 "uuid": "a408faf1-e327-4094-bd79-63060f7195eb", 00:23:42.407 "assigned_rate_limits": { 00:23:42.407 "rw_ios_per_sec": 0, 00:23:42.407 "rw_mbytes_per_sec": 0, 00:23:42.407 "r_mbytes_per_sec": 0, 00:23:42.407 "w_mbytes_per_sec": 0 00:23:42.407 }, 00:23:42.407 "claimed": false, 00:23:42.407 "zoned": false, 00:23:42.407 "supported_io_types": { 00:23:42.407 "read": true, 00:23:42.407 "write": true, 00:23:42.407 "unmap": false, 00:23:42.407 "flush": false, 00:23:42.407 "reset": true, 00:23:42.407 "nvme_admin": false, 00:23:42.407 "nvme_io": false, 00:23:42.407 "nvme_io_md": false, 00:23:42.407 "write_zeroes": true, 00:23:42.407 "zcopy": false, 00:23:42.407 "get_zone_info": false, 00:23:42.407 "zone_management": false, 00:23:42.407 "zone_append": false, 00:23:42.407 "compare": false, 00:23:42.407 "compare_and_write": false, 00:23:42.407 "abort": false, 00:23:42.407 "seek_hole": false, 00:23:42.407 "seek_data": false, 00:23:42.407 "copy": false, 00:23:42.407 "nvme_iov_md": false 00:23:42.407 }, 00:23:42.407 
"memory_domains": [ 00:23:42.407 { 00:23:42.407 "dma_device_id": "system", 00:23:42.407 "dma_device_type": 1 00:23:42.407 }, 00:23:42.407 { 00:23:42.407 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:42.407 "dma_device_type": 2 00:23:42.407 }, 00:23:42.407 { 00:23:42.407 "dma_device_id": "system", 00:23:42.407 "dma_device_type": 1 00:23:42.407 }, 00:23:42.407 { 00:23:42.407 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:42.407 "dma_device_type": 2 00:23:42.407 }, 00:23:42.407 { 00:23:42.407 "dma_device_id": "system", 00:23:42.407 "dma_device_type": 1 00:23:42.407 }, 00:23:42.407 { 00:23:42.407 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:42.407 "dma_device_type": 2 00:23:42.407 }, 00:23:42.407 { 00:23:42.407 "dma_device_id": "system", 00:23:42.407 "dma_device_type": 1 00:23:42.407 }, 00:23:42.407 { 00:23:42.407 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:42.407 "dma_device_type": 2 00:23:42.407 } 00:23:42.408 ], 00:23:42.408 "driver_specific": { 00:23:42.408 "raid": { 00:23:42.408 "uuid": "a408faf1-e327-4094-bd79-63060f7195eb", 00:23:42.408 "strip_size_kb": 0, 00:23:42.408 "state": "online", 00:23:42.408 "raid_level": "raid1", 00:23:42.408 "superblock": true, 00:23:42.408 "num_base_bdevs": 4, 00:23:42.408 "num_base_bdevs_discovered": 4, 00:23:42.408 "num_base_bdevs_operational": 4, 00:23:42.408 "base_bdevs_list": [ 00:23:42.408 { 00:23:42.408 "name": "BaseBdev1", 00:23:42.408 "uuid": "4307a908-f41c-4065-a5e5-2a728e006c44", 00:23:42.408 "is_configured": true, 00:23:42.408 "data_offset": 2048, 00:23:42.408 "data_size": 63488 00:23:42.408 }, 00:23:42.408 { 00:23:42.408 "name": "BaseBdev2", 00:23:42.408 "uuid": "b6eb905c-4128-4c68-bef9-0b735fa623bc", 00:23:42.408 "is_configured": true, 00:23:42.408 "data_offset": 2048, 00:23:42.408 "data_size": 63488 00:23:42.408 }, 00:23:42.408 { 00:23:42.408 "name": "BaseBdev3", 00:23:42.408 "uuid": "55cdba68-9f4e-4546-a257-0566e22694c0", 00:23:42.408 "is_configured": true, 00:23:42.408 "data_offset": 2048, 00:23:42.408 
"data_size": 63488 00:23:42.408 }, 00:23:42.408 { 00:23:42.408 "name": "BaseBdev4", 00:23:42.408 "uuid": "ebf0f46e-c65c-49ec-a8ef-1a53913f830e", 00:23:42.408 "is_configured": true, 00:23:42.408 "data_offset": 2048, 00:23:42.408 "data_size": 63488 00:23:42.408 } 00:23:42.408 ] 00:23:42.408 } 00:23:42.408 } 00:23:42.408 }' 00:23:42.408 05:51:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:23:42.408 05:51:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:23:42.408 BaseBdev2 00:23:42.408 BaseBdev3 00:23:42.408 BaseBdev4' 00:23:42.408 05:51:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:23:42.408 05:51:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:23:42.408 05:51:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:23:42.666 05:51:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:23:42.666 "name": "BaseBdev1", 00:23:42.666 "aliases": [ 00:23:42.666 "4307a908-f41c-4065-a5e5-2a728e006c44" 00:23:42.666 ], 00:23:42.666 "product_name": "Malloc disk", 00:23:42.666 "block_size": 512, 00:23:42.666 "num_blocks": 65536, 00:23:42.666 "uuid": "4307a908-f41c-4065-a5e5-2a728e006c44", 00:23:42.666 "assigned_rate_limits": { 00:23:42.666 "rw_ios_per_sec": 0, 00:23:42.666 "rw_mbytes_per_sec": 0, 00:23:42.666 "r_mbytes_per_sec": 0, 00:23:42.666 "w_mbytes_per_sec": 0 00:23:42.666 }, 00:23:42.666 "claimed": true, 00:23:42.666 "claim_type": "exclusive_write", 00:23:42.666 "zoned": false, 00:23:42.666 "supported_io_types": { 00:23:42.666 "read": true, 00:23:42.666 "write": true, 00:23:42.666 "unmap": true, 00:23:42.666 "flush": true, 00:23:42.666 "reset": true, 
00:23:42.666 "nvme_admin": false, 00:23:42.666 "nvme_io": false, 00:23:42.666 "nvme_io_md": false, 00:23:42.666 "write_zeroes": true, 00:23:42.666 "zcopy": true, 00:23:42.666 "get_zone_info": false, 00:23:42.666 "zone_management": false, 00:23:42.666 "zone_append": false, 00:23:42.666 "compare": false, 00:23:42.666 "compare_and_write": false, 00:23:42.666 "abort": true, 00:23:42.666 "seek_hole": false, 00:23:42.666 "seek_data": false, 00:23:42.666 "copy": true, 00:23:42.666 "nvme_iov_md": false 00:23:42.666 }, 00:23:42.666 "memory_domains": [ 00:23:42.666 { 00:23:42.666 "dma_device_id": "system", 00:23:42.666 "dma_device_type": 1 00:23:42.666 }, 00:23:42.666 { 00:23:42.666 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:42.666 "dma_device_type": 2 00:23:42.666 } 00:23:42.666 ], 00:23:42.666 "driver_specific": {} 00:23:42.666 }' 00:23:42.666 05:51:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:42.666 05:51:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:42.666 05:51:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:23:42.666 05:51:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:42.666 05:51:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:42.925 05:51:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:23:42.925 05:51:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:42.925 05:51:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:42.925 05:51:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:23:42.925 05:51:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:42.925 05:51:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq 
.dif_type 00:23:42.925 05:51:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:23:42.925 05:51:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:23:42.925 05:51:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:23:42.925 05:51:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:23:43.183 05:51:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:23:43.183 "name": "BaseBdev2", 00:23:43.183 "aliases": [ 00:23:43.183 "b6eb905c-4128-4c68-bef9-0b735fa623bc" 00:23:43.183 ], 00:23:43.183 "product_name": "Malloc disk", 00:23:43.183 "block_size": 512, 00:23:43.183 "num_blocks": 65536, 00:23:43.183 "uuid": "b6eb905c-4128-4c68-bef9-0b735fa623bc", 00:23:43.183 "assigned_rate_limits": { 00:23:43.183 "rw_ios_per_sec": 0, 00:23:43.183 "rw_mbytes_per_sec": 0, 00:23:43.183 "r_mbytes_per_sec": 0, 00:23:43.183 "w_mbytes_per_sec": 0 00:23:43.183 }, 00:23:43.183 "claimed": true, 00:23:43.183 "claim_type": "exclusive_write", 00:23:43.183 "zoned": false, 00:23:43.183 "supported_io_types": { 00:23:43.183 "read": true, 00:23:43.183 "write": true, 00:23:43.183 "unmap": true, 00:23:43.183 "flush": true, 00:23:43.183 "reset": true, 00:23:43.183 "nvme_admin": false, 00:23:43.183 "nvme_io": false, 00:23:43.183 "nvme_io_md": false, 00:23:43.183 "write_zeroes": true, 00:23:43.183 "zcopy": true, 00:23:43.183 "get_zone_info": false, 00:23:43.183 "zone_management": false, 00:23:43.183 "zone_append": false, 00:23:43.183 "compare": false, 00:23:43.183 "compare_and_write": false, 00:23:43.183 "abort": true, 00:23:43.183 "seek_hole": false, 00:23:43.183 "seek_data": false, 00:23:43.183 "copy": true, 00:23:43.183 "nvme_iov_md": false 00:23:43.183 }, 00:23:43.183 "memory_domains": [ 00:23:43.183 { 
00:23:43.183 "dma_device_id": "system", 00:23:43.183 "dma_device_type": 1 00:23:43.183 }, 00:23:43.183 { 00:23:43.183 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:43.183 "dma_device_type": 2 00:23:43.183 } 00:23:43.183 ], 00:23:43.183 "driver_specific": {} 00:23:43.183 }' 00:23:43.183 05:51:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:43.183 05:51:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:43.442 05:51:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:23:43.442 05:51:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:43.442 05:51:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:43.442 05:51:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:23:43.442 05:51:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:43.442 05:51:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:43.442 05:51:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:23:43.442 05:51:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:43.442 05:51:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:43.700 05:51:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:23:43.700 05:51:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:23:43.700 05:51:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:23:43.700 05:51:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:23:43.958 05:51:58 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:23:43.958 "name": "BaseBdev3", 00:23:43.958 "aliases": [ 00:23:43.958 "55cdba68-9f4e-4546-a257-0566e22694c0" 00:23:43.958 ], 00:23:43.958 "product_name": "Malloc disk", 00:23:43.958 "block_size": 512, 00:23:43.958 "num_blocks": 65536, 00:23:43.958 "uuid": "55cdba68-9f4e-4546-a257-0566e22694c0", 00:23:43.958 "assigned_rate_limits": { 00:23:43.958 "rw_ios_per_sec": 0, 00:23:43.958 "rw_mbytes_per_sec": 0, 00:23:43.958 "r_mbytes_per_sec": 0, 00:23:43.958 "w_mbytes_per_sec": 0 00:23:43.958 }, 00:23:43.958 "claimed": true, 00:23:43.958 "claim_type": "exclusive_write", 00:23:43.958 "zoned": false, 00:23:43.958 "supported_io_types": { 00:23:43.958 "read": true, 00:23:43.958 "write": true, 00:23:43.958 "unmap": true, 00:23:43.958 "flush": true, 00:23:43.958 "reset": true, 00:23:43.958 "nvme_admin": false, 00:23:43.958 "nvme_io": false, 00:23:43.958 "nvme_io_md": false, 00:23:43.958 "write_zeroes": true, 00:23:43.958 "zcopy": true, 00:23:43.958 "get_zone_info": false, 00:23:43.958 "zone_management": false, 00:23:43.958 "zone_append": false, 00:23:43.958 "compare": false, 00:23:43.958 "compare_and_write": false, 00:23:43.958 "abort": true, 00:23:43.958 "seek_hole": false, 00:23:43.958 "seek_data": false, 00:23:43.958 "copy": true, 00:23:43.958 "nvme_iov_md": false 00:23:43.958 }, 00:23:43.958 "memory_domains": [ 00:23:43.958 { 00:23:43.958 "dma_device_id": "system", 00:23:43.958 "dma_device_type": 1 00:23:43.959 }, 00:23:43.959 { 00:23:43.959 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:43.959 "dma_device_type": 2 00:23:43.959 } 00:23:43.959 ], 00:23:43.959 "driver_specific": {} 00:23:43.959 }' 00:23:43.959 05:51:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:43.959 05:51:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:43.959 05:51:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 
-- # [[ 512 == 512 ]] 00:23:43.959 05:51:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:43.959 05:51:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:43.959 05:51:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:23:43.959 05:51:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:43.959 05:51:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:44.217 05:51:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:23:44.217 05:51:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:44.217 05:51:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:44.217 05:51:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:23:44.217 05:51:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:23:44.217 05:51:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:23:44.217 05:51:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:23:44.475 05:51:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:23:44.475 "name": "BaseBdev4", 00:23:44.475 "aliases": [ 00:23:44.475 "ebf0f46e-c65c-49ec-a8ef-1a53913f830e" 00:23:44.475 ], 00:23:44.475 "product_name": "Malloc disk", 00:23:44.475 "block_size": 512, 00:23:44.475 "num_blocks": 65536, 00:23:44.475 "uuid": "ebf0f46e-c65c-49ec-a8ef-1a53913f830e", 00:23:44.475 "assigned_rate_limits": { 00:23:44.475 "rw_ios_per_sec": 0, 00:23:44.475 "rw_mbytes_per_sec": 0, 00:23:44.475 "r_mbytes_per_sec": 0, 00:23:44.475 "w_mbytes_per_sec": 0 
00:23:44.475 }, 00:23:44.475 "claimed": true, 00:23:44.475 "claim_type": "exclusive_write", 00:23:44.475 "zoned": false, 00:23:44.475 "supported_io_types": { 00:23:44.475 "read": true, 00:23:44.475 "write": true, 00:23:44.475 "unmap": true, 00:23:44.475 "flush": true, 00:23:44.475 "reset": true, 00:23:44.475 "nvme_admin": false, 00:23:44.475 "nvme_io": false, 00:23:44.475 "nvme_io_md": false, 00:23:44.475 "write_zeroes": true, 00:23:44.475 "zcopy": true, 00:23:44.475 "get_zone_info": false, 00:23:44.475 "zone_management": false, 00:23:44.475 "zone_append": false, 00:23:44.475 "compare": false, 00:23:44.475 "compare_and_write": false, 00:23:44.475 "abort": true, 00:23:44.475 "seek_hole": false, 00:23:44.475 "seek_data": false, 00:23:44.475 "copy": true, 00:23:44.475 "nvme_iov_md": false 00:23:44.475 }, 00:23:44.475 "memory_domains": [ 00:23:44.475 { 00:23:44.475 "dma_device_id": "system", 00:23:44.475 "dma_device_type": 1 00:23:44.475 }, 00:23:44.475 { 00:23:44.475 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:44.475 "dma_device_type": 2 00:23:44.475 } 00:23:44.475 ], 00:23:44.475 "driver_specific": {} 00:23:44.475 }' 00:23:44.475 05:51:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:44.475 05:51:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:44.475 05:51:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:23:44.475 05:51:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:44.475 05:51:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:44.734 05:51:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:23:44.734 05:51:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:44.734 05:51:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:44.734 
05:51:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:23:44.734 05:51:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:44.734 05:51:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:44.734 05:51:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:23:44.734 05:51:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:23:44.992 [2024-07-26 05:51:59.802710] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:23:44.992 05:51:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:23:44.992 05:51:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:23:44.992 05:51:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:23:44.992 05:51:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@214 -- # return 0 00:23:44.992 05:51:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:23:44.992 05:51:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:23:44.992 05:51:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:23:44.992 05:51:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:44.992 05:51:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:44.992 05:51:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:44.992 05:51:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 
00:23:44.992 05:51:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:44.992 05:51:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:44.992 05:51:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:44.992 05:51:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:44.992 05:51:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:44.992 05:51:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:23:45.252 05:52:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:45.252 "name": "Existed_Raid", 00:23:45.252 "uuid": "a408faf1-e327-4094-bd79-63060f7195eb", 00:23:45.252 "strip_size_kb": 0, 00:23:45.252 "state": "online", 00:23:45.252 "raid_level": "raid1", 00:23:45.252 "superblock": true, 00:23:45.252 "num_base_bdevs": 4, 00:23:45.252 "num_base_bdevs_discovered": 3, 00:23:45.252 "num_base_bdevs_operational": 3, 00:23:45.252 "base_bdevs_list": [ 00:23:45.252 { 00:23:45.252 "name": null, 00:23:45.252 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:45.252 "is_configured": false, 00:23:45.252 "data_offset": 2048, 00:23:45.252 "data_size": 63488 00:23:45.252 }, 00:23:45.252 { 00:23:45.252 "name": "BaseBdev2", 00:23:45.252 "uuid": "b6eb905c-4128-4c68-bef9-0b735fa623bc", 00:23:45.252 "is_configured": true, 00:23:45.252 "data_offset": 2048, 00:23:45.252 "data_size": 63488 00:23:45.252 }, 00:23:45.252 { 00:23:45.252 "name": "BaseBdev3", 00:23:45.252 "uuid": "55cdba68-9f4e-4546-a257-0566e22694c0", 00:23:45.252 "is_configured": true, 00:23:45.252 "data_offset": 2048, 00:23:45.252 "data_size": 63488 00:23:45.252 }, 00:23:45.252 { 00:23:45.253 "name": 
"BaseBdev4", 00:23:45.253 "uuid": "ebf0f46e-c65c-49ec-a8ef-1a53913f830e", 00:23:45.253 "is_configured": true, 00:23:45.253 "data_offset": 2048, 00:23:45.253 "data_size": 63488 00:23:45.253 } 00:23:45.253 ] 00:23:45.253 }' 00:23:45.253 05:52:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:45.253 05:52:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:23:45.821 05:52:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:23:45.821 05:52:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:23:45.821 05:52:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:45.822 05:52:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:23:46.082 05:52:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:23:46.082 05:52:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:23:46.082 05:52:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:23:46.340 [2024-07-26 05:52:01.112132] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:23:46.340 05:52:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:23:46.340 05:52:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:23:46.340 05:52:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:46.340 05:52:01 bdev_raid.raid_state_function_test_sb 
-- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:23:46.599 05:52:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:23:46.599 05:52:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:23:46.599 05:52:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:23:46.857 [2024-07-26 05:52:01.617913] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:23:46.857 05:52:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:23:46.857 05:52:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:23:46.857 05:52:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:46.857 05:52:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:23:47.114 05:52:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:23:47.114 05:52:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:23:47.114 05:52:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:23:47.373 [2024-07-26 05:52:02.109856] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:23:47.373 [2024-07-26 05:52:02.109938] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:23:47.373 [2024-07-26 05:52:02.122446] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:23:47.373 [2024-07-26 05:52:02.122481] bdev_raid.c: 
463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:23:47.373 [2024-07-26 05:52:02.122492] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x25ec350 name Existed_Raid, state offline 00:23:47.373 05:52:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:23:47.373 05:52:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:23:47.373 05:52:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:23:47.373 05:52:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:47.640 05:52:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:23:47.640 05:52:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:23:47.640 05:52:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:23:47.640 05:52:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:23:47.640 05:52:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:23:47.640 05:52:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:23:47.902 BaseBdev2 00:23:47.902 05:52:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:23:47.902 05:52:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:23:47.902 05:52:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:23:47.902 05:52:02 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@899 -- # local i 00:23:47.902 05:52:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:23:47.902 05:52:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:23:47.902 05:52:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:23:48.160 05:52:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:23:48.417 [ 00:23:48.417 { 00:23:48.417 "name": "BaseBdev2", 00:23:48.417 "aliases": [ 00:23:48.417 "13a89f05-424a-4105-9973-a53aef4e44b7" 00:23:48.417 ], 00:23:48.417 "product_name": "Malloc disk", 00:23:48.417 "block_size": 512, 00:23:48.417 "num_blocks": 65536, 00:23:48.417 "uuid": "13a89f05-424a-4105-9973-a53aef4e44b7", 00:23:48.417 "assigned_rate_limits": { 00:23:48.417 "rw_ios_per_sec": 0, 00:23:48.417 "rw_mbytes_per_sec": 0, 00:23:48.417 "r_mbytes_per_sec": 0, 00:23:48.417 "w_mbytes_per_sec": 0 00:23:48.417 }, 00:23:48.417 "claimed": false, 00:23:48.417 "zoned": false, 00:23:48.417 "supported_io_types": { 00:23:48.417 "read": true, 00:23:48.417 "write": true, 00:23:48.417 "unmap": true, 00:23:48.417 "flush": true, 00:23:48.417 "reset": true, 00:23:48.417 "nvme_admin": false, 00:23:48.417 "nvme_io": false, 00:23:48.417 "nvme_io_md": false, 00:23:48.417 "write_zeroes": true, 00:23:48.417 "zcopy": true, 00:23:48.417 "get_zone_info": false, 00:23:48.417 "zone_management": false, 00:23:48.417 "zone_append": false, 00:23:48.417 "compare": false, 00:23:48.418 "compare_and_write": false, 00:23:48.418 "abort": true, 00:23:48.418 "seek_hole": false, 00:23:48.418 "seek_data": false, 00:23:48.418 "copy": true, 00:23:48.418 "nvme_iov_md": false 00:23:48.418 }, 00:23:48.418 
"memory_domains": [ 00:23:48.418 { 00:23:48.418 "dma_device_id": "system", 00:23:48.418 "dma_device_type": 1 00:23:48.418 }, 00:23:48.418 { 00:23:48.418 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:48.418 "dma_device_type": 2 00:23:48.418 } 00:23:48.418 ], 00:23:48.418 "driver_specific": {} 00:23:48.418 } 00:23:48.418 ] 00:23:48.418 05:52:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:23:48.418 05:52:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:23:48.418 05:52:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:23:48.418 05:52:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:23:48.676 BaseBdev3 00:23:48.676 05:52:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:23:48.676 05:52:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:23:48.676 05:52:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:23:48.676 05:52:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:23:48.676 05:52:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:23:48.676 05:52:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:23:48.676 05:52:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:23:48.934 05:52:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b 
BaseBdev3 -t 2000 00:23:48.934 [ 00:23:48.934 { 00:23:48.934 "name": "BaseBdev3", 00:23:48.934 "aliases": [ 00:23:48.934 "d335d757-98cd-48c3-9f28-e18a76a8907b" 00:23:48.934 ], 00:23:48.934 "product_name": "Malloc disk", 00:23:48.934 "block_size": 512, 00:23:48.934 "num_blocks": 65536, 00:23:48.934 "uuid": "d335d757-98cd-48c3-9f28-e18a76a8907b", 00:23:48.934 "assigned_rate_limits": { 00:23:48.934 "rw_ios_per_sec": 0, 00:23:48.934 "rw_mbytes_per_sec": 0, 00:23:48.934 "r_mbytes_per_sec": 0, 00:23:48.934 "w_mbytes_per_sec": 0 00:23:48.934 }, 00:23:48.934 "claimed": false, 00:23:48.934 "zoned": false, 00:23:48.934 "supported_io_types": { 00:23:48.934 "read": true, 00:23:48.934 "write": true, 00:23:48.934 "unmap": true, 00:23:48.934 "flush": true, 00:23:48.934 "reset": true, 00:23:48.934 "nvme_admin": false, 00:23:48.934 "nvme_io": false, 00:23:48.934 "nvme_io_md": false, 00:23:48.934 "write_zeroes": true, 00:23:48.934 "zcopy": true, 00:23:48.934 "get_zone_info": false, 00:23:48.934 "zone_management": false, 00:23:48.934 "zone_append": false, 00:23:48.934 "compare": false, 00:23:48.934 "compare_and_write": false, 00:23:48.934 "abort": true, 00:23:48.934 "seek_hole": false, 00:23:48.934 "seek_data": false, 00:23:48.934 "copy": true, 00:23:48.934 "nvme_iov_md": false 00:23:48.934 }, 00:23:48.934 "memory_domains": [ 00:23:48.934 { 00:23:48.934 "dma_device_id": "system", 00:23:48.934 "dma_device_type": 1 00:23:48.934 }, 00:23:48.934 { 00:23:48.934 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:48.934 "dma_device_type": 2 00:23:48.934 } 00:23:48.934 ], 00:23:48.934 "driver_specific": {} 00:23:48.934 } 00:23:48.934 ] 00:23:49.192 05:52:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:23:49.192 05:52:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:23:49.192 05:52:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:23:49.192 05:52:03 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:23:49.192 BaseBdev4 00:23:49.450 05:52:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4 00:23:49.450 05:52:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:23:49.450 05:52:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:23:49.450 05:52:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:23:49.450 05:52:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:23:49.450 05:52:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:23:49.450 05:52:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:23:49.450 05:52:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:23:49.707 [ 00:23:49.707 { 00:23:49.707 "name": "BaseBdev4", 00:23:49.707 "aliases": [ 00:23:49.707 "ed4edfd3-0966-4523-8568-6b80f1121799" 00:23:49.707 ], 00:23:49.707 "product_name": "Malloc disk", 00:23:49.707 "block_size": 512, 00:23:49.707 "num_blocks": 65536, 00:23:49.707 "uuid": "ed4edfd3-0966-4523-8568-6b80f1121799", 00:23:49.707 "assigned_rate_limits": { 00:23:49.707 "rw_ios_per_sec": 0, 00:23:49.707 "rw_mbytes_per_sec": 0, 00:23:49.707 "r_mbytes_per_sec": 0, 00:23:49.707 "w_mbytes_per_sec": 0 00:23:49.707 }, 00:23:49.707 "claimed": false, 00:23:49.707 "zoned": false, 00:23:49.707 "supported_io_types": { 00:23:49.707 "read": true, 
00:23:49.707 "write": true, 00:23:49.707 "unmap": true, 00:23:49.707 "flush": true, 00:23:49.707 "reset": true, 00:23:49.707 "nvme_admin": false, 00:23:49.707 "nvme_io": false, 00:23:49.707 "nvme_io_md": false, 00:23:49.707 "write_zeroes": true, 00:23:49.707 "zcopy": true, 00:23:49.707 "get_zone_info": false, 00:23:49.707 "zone_management": false, 00:23:49.707 "zone_append": false, 00:23:49.707 "compare": false, 00:23:49.707 "compare_and_write": false, 00:23:49.707 "abort": true, 00:23:49.707 "seek_hole": false, 00:23:49.707 "seek_data": false, 00:23:49.707 "copy": true, 00:23:49.707 "nvme_iov_md": false 00:23:49.707 }, 00:23:49.707 "memory_domains": [ 00:23:49.707 { 00:23:49.707 "dma_device_id": "system", 00:23:49.707 "dma_device_type": 1 00:23:49.707 }, 00:23:49.707 { 00:23:49.707 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:49.707 "dma_device_type": 2 00:23:49.707 } 00:23:49.707 ], 00:23:49.707 "driver_specific": {} 00:23:49.707 } 00:23:49.707 ] 00:23:49.707 05:52:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:23:49.707 05:52:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:23:49.707 05:52:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:23:49.707 05:52:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:23:49.965 [2024-07-26 05:52:04.807834] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:23:49.965 [2024-07-26 05:52:04.807872] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:23:49.965 [2024-07-26 05:52:04.807890] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:23:49.965 [2024-07-26 05:52:04.809235] 
bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:23:49.965 [2024-07-26 05:52:04.809276] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:23:49.965 05:52:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:23:49.965 05:52:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:23:49.965 05:52:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:23:49.965 05:52:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:49.965 05:52:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:49.965 05:52:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:23:49.965 05:52:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:49.965 05:52:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:49.965 05:52:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:49.965 05:52:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:49.965 05:52:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:49.965 05:52:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:23:50.223 05:52:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:50.223 "name": "Existed_Raid", 00:23:50.223 "uuid": "4626200a-2f2d-4f1c-91d4-8f14db27a92a", 00:23:50.223 "strip_size_kb": 0, 00:23:50.223 "state": 
"configuring", 00:23:50.223 "raid_level": "raid1", 00:23:50.223 "superblock": true, 00:23:50.223 "num_base_bdevs": 4, 00:23:50.223 "num_base_bdevs_discovered": 3, 00:23:50.223 "num_base_bdevs_operational": 4, 00:23:50.223 "base_bdevs_list": [ 00:23:50.223 { 00:23:50.223 "name": "BaseBdev1", 00:23:50.223 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:50.223 "is_configured": false, 00:23:50.223 "data_offset": 0, 00:23:50.223 "data_size": 0 00:23:50.223 }, 00:23:50.223 { 00:23:50.223 "name": "BaseBdev2", 00:23:50.223 "uuid": "13a89f05-424a-4105-9973-a53aef4e44b7", 00:23:50.223 "is_configured": true, 00:23:50.223 "data_offset": 2048, 00:23:50.223 "data_size": 63488 00:23:50.223 }, 00:23:50.223 { 00:23:50.223 "name": "BaseBdev3", 00:23:50.223 "uuid": "d335d757-98cd-48c3-9f28-e18a76a8907b", 00:23:50.223 "is_configured": true, 00:23:50.223 "data_offset": 2048, 00:23:50.223 "data_size": 63488 00:23:50.223 }, 00:23:50.223 { 00:23:50.223 "name": "BaseBdev4", 00:23:50.223 "uuid": "ed4edfd3-0966-4523-8568-6b80f1121799", 00:23:50.223 "is_configured": true, 00:23:50.223 "data_offset": 2048, 00:23:50.223 "data_size": 63488 00:23:50.223 } 00:23:50.223 ] 00:23:50.223 }' 00:23:50.223 05:52:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:50.223 05:52:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:23:50.788 05:52:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:23:51.048 [2024-07-26 05:52:05.870609] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:23:51.048 05:52:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:23:51.048 05:52:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:23:51.048 
05:52:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:23:51.048 05:52:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:51.048 05:52:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:51.048 05:52:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:23:51.048 05:52:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:51.048 05:52:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:51.048 05:52:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:51.048 05:52:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:51.048 05:52:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:51.048 05:52:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:23:51.346 05:52:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:51.346 "name": "Existed_Raid", 00:23:51.346 "uuid": "4626200a-2f2d-4f1c-91d4-8f14db27a92a", 00:23:51.346 "strip_size_kb": 0, 00:23:51.346 "state": "configuring", 00:23:51.346 "raid_level": "raid1", 00:23:51.346 "superblock": true, 00:23:51.346 "num_base_bdevs": 4, 00:23:51.346 "num_base_bdevs_discovered": 2, 00:23:51.346 "num_base_bdevs_operational": 4, 00:23:51.346 "base_bdevs_list": [ 00:23:51.346 { 00:23:51.346 "name": "BaseBdev1", 00:23:51.346 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:51.346 "is_configured": false, 00:23:51.346 "data_offset": 0, 00:23:51.346 "data_size": 0 00:23:51.346 }, 00:23:51.346 { 00:23:51.346 
"name": null, 00:23:51.346 "uuid": "13a89f05-424a-4105-9973-a53aef4e44b7", 00:23:51.346 "is_configured": false, 00:23:51.346 "data_offset": 2048, 00:23:51.346 "data_size": 63488 00:23:51.346 }, 00:23:51.346 { 00:23:51.346 "name": "BaseBdev3", 00:23:51.346 "uuid": "d335d757-98cd-48c3-9f28-e18a76a8907b", 00:23:51.346 "is_configured": true, 00:23:51.346 "data_offset": 2048, 00:23:51.346 "data_size": 63488 00:23:51.346 }, 00:23:51.346 { 00:23:51.346 "name": "BaseBdev4", 00:23:51.346 "uuid": "ed4edfd3-0966-4523-8568-6b80f1121799", 00:23:51.346 "is_configured": true, 00:23:51.346 "data_offset": 2048, 00:23:51.346 "data_size": 63488 00:23:51.346 } 00:23:51.346 ] 00:23:51.346 }' 00:23:51.346 05:52:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:51.346 05:52:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:23:51.926 05:52:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:51.926 05:52:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:23:52.183 05:52:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:23:52.183 05:52:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:23:52.440 [2024-07-26 05:52:07.217501] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:23:52.440 BaseBdev1 00:23:52.440 05:52:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:23:52.440 05:52:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:23:52.440 05:52:07 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:23:52.440 05:52:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:23:52.440 05:52:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:23:52.440 05:52:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:23:52.440 05:52:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:23:52.697 05:52:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:23:52.954 [ 00:23:52.954 { 00:23:52.954 "name": "BaseBdev1", 00:23:52.954 "aliases": [ 00:23:52.954 "0e16982e-53d1-4082-8a0c-805df5a62da0" 00:23:52.954 ], 00:23:52.954 "product_name": "Malloc disk", 00:23:52.954 "block_size": 512, 00:23:52.954 "num_blocks": 65536, 00:23:52.954 "uuid": "0e16982e-53d1-4082-8a0c-805df5a62da0", 00:23:52.954 "assigned_rate_limits": { 00:23:52.954 "rw_ios_per_sec": 0, 00:23:52.954 "rw_mbytes_per_sec": 0, 00:23:52.954 "r_mbytes_per_sec": 0, 00:23:52.954 "w_mbytes_per_sec": 0 00:23:52.954 }, 00:23:52.954 "claimed": true, 00:23:52.954 "claim_type": "exclusive_write", 00:23:52.954 "zoned": false, 00:23:52.954 "supported_io_types": { 00:23:52.954 "read": true, 00:23:52.954 "write": true, 00:23:52.954 "unmap": true, 00:23:52.954 "flush": true, 00:23:52.954 "reset": true, 00:23:52.954 "nvme_admin": false, 00:23:52.954 "nvme_io": false, 00:23:52.954 "nvme_io_md": false, 00:23:52.954 "write_zeroes": true, 00:23:52.954 "zcopy": true, 00:23:52.954 "get_zone_info": false, 00:23:52.954 "zone_management": false, 00:23:52.954 "zone_append": false, 00:23:52.954 "compare": false, 00:23:52.954 
"compare_and_write": false, 00:23:52.954 "abort": true, 00:23:52.954 "seek_hole": false, 00:23:52.954 "seek_data": false, 00:23:52.954 "copy": true, 00:23:52.954 "nvme_iov_md": false 00:23:52.954 }, 00:23:52.954 "memory_domains": [ 00:23:52.954 { 00:23:52.954 "dma_device_id": "system", 00:23:52.954 "dma_device_type": 1 00:23:52.954 }, 00:23:52.954 { 00:23:52.954 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:52.954 "dma_device_type": 2 00:23:52.954 } 00:23:52.954 ], 00:23:52.954 "driver_specific": {} 00:23:52.954 } 00:23:52.954 ] 00:23:52.954 05:52:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:23:52.954 05:52:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:23:52.954 05:52:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:23:52.954 05:52:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:23:52.954 05:52:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:52.954 05:52:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:52.954 05:52:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:23:52.954 05:52:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:52.954 05:52:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:52.954 05:52:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:52.954 05:52:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:52.954 05:52:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:52.954 05:52:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:23:53.212 05:52:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:53.212 "name": "Existed_Raid", 00:23:53.212 "uuid": "4626200a-2f2d-4f1c-91d4-8f14db27a92a", 00:23:53.212 "strip_size_kb": 0, 00:23:53.212 "state": "configuring", 00:23:53.212 "raid_level": "raid1", 00:23:53.212 "superblock": true, 00:23:53.212 "num_base_bdevs": 4, 00:23:53.212 "num_base_bdevs_discovered": 3, 00:23:53.212 "num_base_bdevs_operational": 4, 00:23:53.212 "base_bdevs_list": [ 00:23:53.212 { 00:23:53.212 "name": "BaseBdev1", 00:23:53.212 "uuid": "0e16982e-53d1-4082-8a0c-805df5a62da0", 00:23:53.212 "is_configured": true, 00:23:53.212 "data_offset": 2048, 00:23:53.212 "data_size": 63488 00:23:53.212 }, 00:23:53.212 { 00:23:53.212 "name": null, 00:23:53.212 "uuid": "13a89f05-424a-4105-9973-a53aef4e44b7", 00:23:53.212 "is_configured": false, 00:23:53.212 "data_offset": 2048, 00:23:53.212 "data_size": 63488 00:23:53.212 }, 00:23:53.212 { 00:23:53.212 "name": "BaseBdev3", 00:23:53.212 "uuid": "d335d757-98cd-48c3-9f28-e18a76a8907b", 00:23:53.212 "is_configured": true, 00:23:53.212 "data_offset": 2048, 00:23:53.212 "data_size": 63488 00:23:53.212 }, 00:23:53.212 { 00:23:53.212 "name": "BaseBdev4", 00:23:53.212 "uuid": "ed4edfd3-0966-4523-8568-6b80f1121799", 00:23:53.212 "is_configured": true, 00:23:53.212 "data_offset": 2048, 00:23:53.212 "data_size": 63488 00:23:53.212 } 00:23:53.212 ] 00:23:53.212 }' 00:23:53.212 05:52:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:53.212 05:52:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:23:53.777 05:52:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:53.777 05:52:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:23:54.035 05:52:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:23:54.035 05:52:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:23:54.293 [2024-07-26 05:52:09.070434] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:23:54.293 05:52:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:23:54.293 05:52:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:23:54.293 05:52:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:23:54.293 05:52:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:54.293 05:52:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:54.293 05:52:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:23:54.293 05:52:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:54.293 05:52:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:54.293 05:52:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:54.293 05:52:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:54.293 05:52:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:23:54.293 05:52:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:23:54.551 05:52:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:54.551 "name": "Existed_Raid", 00:23:54.551 "uuid": "4626200a-2f2d-4f1c-91d4-8f14db27a92a", 00:23:54.551 "strip_size_kb": 0, 00:23:54.551 "state": "configuring", 00:23:54.551 "raid_level": "raid1", 00:23:54.551 "superblock": true, 00:23:54.551 "num_base_bdevs": 4, 00:23:54.551 "num_base_bdevs_discovered": 2, 00:23:54.551 "num_base_bdevs_operational": 4, 00:23:54.551 "base_bdevs_list": [ 00:23:54.551 { 00:23:54.551 "name": "BaseBdev1", 00:23:54.551 "uuid": "0e16982e-53d1-4082-8a0c-805df5a62da0", 00:23:54.551 "is_configured": true, 00:23:54.551 "data_offset": 2048, 00:23:54.551 "data_size": 63488 00:23:54.551 }, 00:23:54.551 { 00:23:54.551 "name": null, 00:23:54.551 "uuid": "13a89f05-424a-4105-9973-a53aef4e44b7", 00:23:54.551 "is_configured": false, 00:23:54.551 "data_offset": 2048, 00:23:54.551 "data_size": 63488 00:23:54.551 }, 00:23:54.551 { 00:23:54.551 "name": null, 00:23:54.551 "uuid": "d335d757-98cd-48c3-9f28-e18a76a8907b", 00:23:54.551 "is_configured": false, 00:23:54.551 "data_offset": 2048, 00:23:54.551 "data_size": 63488 00:23:54.551 }, 00:23:54.551 { 00:23:54.551 "name": "BaseBdev4", 00:23:54.551 "uuid": "ed4edfd3-0966-4523-8568-6b80f1121799", 00:23:54.551 "is_configured": true, 00:23:54.551 "data_offset": 2048, 00:23:54.551 "data_size": 63488 00:23:54.551 } 00:23:54.551 ] 00:23:54.551 }' 00:23:54.551 05:52:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:54.551 05:52:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:23:55.118 05:52:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:23:55.118 05:52:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:23:55.376 05:52:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:23:55.376 05:52:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:23:55.635 [2024-07-26 05:52:10.414021] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:23:55.635 05:52:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:23:55.635 05:52:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:23:55.635 05:52:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:23:55.635 05:52:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:55.635 05:52:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:55.635 05:52:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:23:55.635 05:52:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:55.635 05:52:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:55.635 05:52:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:55.635 05:52:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:55.635 05:52:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:23:55.635 05:52:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:23:55.894 05:52:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:55.894 "name": "Existed_Raid", 00:23:55.894 "uuid": "4626200a-2f2d-4f1c-91d4-8f14db27a92a", 00:23:55.894 "strip_size_kb": 0, 00:23:55.894 "state": "configuring", 00:23:55.894 "raid_level": "raid1", 00:23:55.894 "superblock": true, 00:23:55.894 "num_base_bdevs": 4, 00:23:55.894 "num_base_bdevs_discovered": 3, 00:23:55.894 "num_base_bdevs_operational": 4, 00:23:55.894 "base_bdevs_list": [ 00:23:55.894 { 00:23:55.894 "name": "BaseBdev1", 00:23:55.894 "uuid": "0e16982e-53d1-4082-8a0c-805df5a62da0", 00:23:55.894 "is_configured": true, 00:23:55.894 "data_offset": 2048, 00:23:55.894 "data_size": 63488 00:23:55.894 }, 00:23:55.894 { 00:23:55.894 "name": null, 00:23:55.894 "uuid": "13a89f05-424a-4105-9973-a53aef4e44b7", 00:23:55.894 "is_configured": false, 00:23:55.894 "data_offset": 2048, 00:23:55.894 "data_size": 63488 00:23:55.894 }, 00:23:55.894 { 00:23:55.894 "name": "BaseBdev3", 00:23:55.894 "uuid": "d335d757-98cd-48c3-9f28-e18a76a8907b", 00:23:55.894 "is_configured": true, 00:23:55.894 "data_offset": 2048, 00:23:55.894 "data_size": 63488 00:23:55.894 }, 00:23:55.894 { 00:23:55.894 "name": "BaseBdev4", 00:23:55.894 "uuid": "ed4edfd3-0966-4523-8568-6b80f1121799", 00:23:55.894 "is_configured": true, 00:23:55.894 "data_offset": 2048, 00:23:55.894 "data_size": 63488 00:23:55.894 } 00:23:55.894 ] 00:23:55.894 }' 00:23:55.894 05:52:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:55.894 05:52:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:23:56.460 05:52:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:23:56.460 05:52:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:23:56.715 05:52:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:23:56.715 05:52:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:23:56.970 [2024-07-26 05:52:11.681409] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:23:56.970 05:52:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:23:56.970 05:52:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:23:56.970 05:52:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:23:56.970 05:52:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:56.970 05:52:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:56.970 05:52:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:23:56.970 05:52:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:56.970 05:52:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:56.970 05:52:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:56.970 05:52:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:56.970 05:52:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:56.970 05:52:11 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:23:57.226 05:52:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:57.226 "name": "Existed_Raid", 00:23:57.226 "uuid": "4626200a-2f2d-4f1c-91d4-8f14db27a92a", 00:23:57.226 "strip_size_kb": 0, 00:23:57.226 "state": "configuring", 00:23:57.226 "raid_level": "raid1", 00:23:57.226 "superblock": true, 00:23:57.226 "num_base_bdevs": 4, 00:23:57.226 "num_base_bdevs_discovered": 2, 00:23:57.226 "num_base_bdevs_operational": 4, 00:23:57.226 "base_bdevs_list": [ 00:23:57.226 { 00:23:57.226 "name": null, 00:23:57.226 "uuid": "0e16982e-53d1-4082-8a0c-805df5a62da0", 00:23:57.226 "is_configured": false, 00:23:57.226 "data_offset": 2048, 00:23:57.226 "data_size": 63488 00:23:57.226 }, 00:23:57.226 { 00:23:57.226 "name": null, 00:23:57.226 "uuid": "13a89f05-424a-4105-9973-a53aef4e44b7", 00:23:57.226 "is_configured": false, 00:23:57.226 "data_offset": 2048, 00:23:57.226 "data_size": 63488 00:23:57.226 }, 00:23:57.226 { 00:23:57.226 "name": "BaseBdev3", 00:23:57.226 "uuid": "d335d757-98cd-48c3-9f28-e18a76a8907b", 00:23:57.226 "is_configured": true, 00:23:57.226 "data_offset": 2048, 00:23:57.226 "data_size": 63488 00:23:57.226 }, 00:23:57.226 { 00:23:57.226 "name": "BaseBdev4", 00:23:57.226 "uuid": "ed4edfd3-0966-4523-8568-6b80f1121799", 00:23:57.226 "is_configured": true, 00:23:57.226 "data_offset": 2048, 00:23:57.226 "data_size": 63488 00:23:57.226 } 00:23:57.226 ] 00:23:57.226 }' 00:23:57.226 05:52:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:57.226 05:52:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:23:57.789 05:52:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:57.789 05:52:12 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:23:58.047 05:52:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:23:58.047 05:52:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:23:58.047 [2024-07-26 05:52:12.883198] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:23:58.047 05:52:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:23:58.047 05:52:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:23:58.047 05:52:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:23:58.047 05:52:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:58.047 05:52:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:58.047 05:52:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:23:58.047 05:52:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:58.047 05:52:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:58.047 05:52:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:58.047 05:52:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:58.047 05:52:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:58.047 05:52:12 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:23:58.306 05:52:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:58.306 "name": "Existed_Raid", 00:23:58.306 "uuid": "4626200a-2f2d-4f1c-91d4-8f14db27a92a", 00:23:58.306 "strip_size_kb": 0, 00:23:58.306 "state": "configuring", 00:23:58.306 "raid_level": "raid1", 00:23:58.306 "superblock": true, 00:23:58.306 "num_base_bdevs": 4, 00:23:58.306 "num_base_bdevs_discovered": 3, 00:23:58.306 "num_base_bdevs_operational": 4, 00:23:58.306 "base_bdevs_list": [ 00:23:58.306 { 00:23:58.306 "name": null, 00:23:58.306 "uuid": "0e16982e-53d1-4082-8a0c-805df5a62da0", 00:23:58.306 "is_configured": false, 00:23:58.306 "data_offset": 2048, 00:23:58.306 "data_size": 63488 00:23:58.306 }, 00:23:58.306 { 00:23:58.306 "name": "BaseBdev2", 00:23:58.306 "uuid": "13a89f05-424a-4105-9973-a53aef4e44b7", 00:23:58.306 "is_configured": true, 00:23:58.306 "data_offset": 2048, 00:23:58.306 "data_size": 63488 00:23:58.306 }, 00:23:58.306 { 00:23:58.306 "name": "BaseBdev3", 00:23:58.306 "uuid": "d335d757-98cd-48c3-9f28-e18a76a8907b", 00:23:58.306 "is_configured": true, 00:23:58.306 "data_offset": 2048, 00:23:58.306 "data_size": 63488 00:23:58.306 }, 00:23:58.306 { 00:23:58.306 "name": "BaseBdev4", 00:23:58.306 "uuid": "ed4edfd3-0966-4523-8568-6b80f1121799", 00:23:58.306 "is_configured": true, 00:23:58.306 "data_offset": 2048, 00:23:58.306 "data_size": 63488 00:23:58.306 } 00:23:58.306 ] 00:23:58.306 }' 00:23:58.306 05:52:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:58.306 05:52:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:23:58.871 05:52:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:58.871 05:52:13 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:23:59.129 05:52:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:23:59.129 05:52:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:59.129 05:52:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:23:59.386 05:52:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 0e16982e-53d1-4082-8a0c-805df5a62da0 00:23:59.644 [2024-07-26 05:52:14.414707] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:23:59.644 [2024-07-26 05:52:14.414862] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x25ee180 00:23:59.644 [2024-07-26 05:52:14.414876] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:23:59.644 [2024-07-26 05:52:14.415051] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x25eec20 00:23:59.644 [2024-07-26 05:52:14.415181] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x25ee180 00:23:59.644 [2024-07-26 05:52:14.415193] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x25ee180 00:23:59.644 [2024-07-26 05:52:14.415288] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:59.644 NewBaseBdev 00:23:59.644 05:52:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:23:59.644 05:52:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:23:59.644 05:52:14 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:23:59.644 05:52:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:23:59.644 05:52:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:23:59.644 05:52:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:23:59.644 05:52:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:23:59.902 05:52:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:24:00.159 [ 00:24:00.159 { 00:24:00.159 "name": "NewBaseBdev", 00:24:00.159 "aliases": [ 00:24:00.159 "0e16982e-53d1-4082-8a0c-805df5a62da0" 00:24:00.159 ], 00:24:00.159 "product_name": "Malloc disk", 00:24:00.159 "block_size": 512, 00:24:00.159 "num_blocks": 65536, 00:24:00.159 "uuid": "0e16982e-53d1-4082-8a0c-805df5a62da0", 00:24:00.159 "assigned_rate_limits": { 00:24:00.159 "rw_ios_per_sec": 0, 00:24:00.159 "rw_mbytes_per_sec": 0, 00:24:00.159 "r_mbytes_per_sec": 0, 00:24:00.159 "w_mbytes_per_sec": 0 00:24:00.159 }, 00:24:00.159 "claimed": true, 00:24:00.159 "claim_type": "exclusive_write", 00:24:00.159 "zoned": false, 00:24:00.159 "supported_io_types": { 00:24:00.159 "read": true, 00:24:00.159 "write": true, 00:24:00.159 "unmap": true, 00:24:00.159 "flush": true, 00:24:00.159 "reset": true, 00:24:00.159 "nvme_admin": false, 00:24:00.159 "nvme_io": false, 00:24:00.159 "nvme_io_md": false, 00:24:00.159 "write_zeroes": true, 00:24:00.159 "zcopy": true, 00:24:00.159 "get_zone_info": false, 00:24:00.159 "zone_management": false, 00:24:00.159 "zone_append": false, 00:24:00.159 "compare": false, 00:24:00.159 
"compare_and_write": false, 00:24:00.159 "abort": true, 00:24:00.159 "seek_hole": false, 00:24:00.159 "seek_data": false, 00:24:00.159 "copy": true, 00:24:00.159 "nvme_iov_md": false 00:24:00.159 }, 00:24:00.159 "memory_domains": [ 00:24:00.159 { 00:24:00.159 "dma_device_id": "system", 00:24:00.159 "dma_device_type": 1 00:24:00.159 }, 00:24:00.159 { 00:24:00.159 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:00.159 "dma_device_type": 2 00:24:00.159 } 00:24:00.159 ], 00:24:00.159 "driver_specific": {} 00:24:00.159 } 00:24:00.159 ] 00:24:00.159 05:52:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:24:00.159 05:52:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid1 0 4 00:24:00.159 05:52:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:24:00.159 05:52:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:00.159 05:52:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:00.159 05:52:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:00.159 05:52:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:24:00.159 05:52:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:00.159 05:52:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:00.159 05:52:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:00.159 05:52:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:00.159 05:52:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:24:00.159 05:52:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:24:00.417 05:52:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:00.417 "name": "Existed_Raid", 00:24:00.417 "uuid": "4626200a-2f2d-4f1c-91d4-8f14db27a92a", 00:24:00.417 "strip_size_kb": 0, 00:24:00.417 "state": "online", 00:24:00.417 "raid_level": "raid1", 00:24:00.417 "superblock": true, 00:24:00.417 "num_base_bdevs": 4, 00:24:00.417 "num_base_bdevs_discovered": 4, 00:24:00.417 "num_base_bdevs_operational": 4, 00:24:00.417 "base_bdevs_list": [ 00:24:00.417 { 00:24:00.417 "name": "NewBaseBdev", 00:24:00.417 "uuid": "0e16982e-53d1-4082-8a0c-805df5a62da0", 00:24:00.417 "is_configured": true, 00:24:00.417 "data_offset": 2048, 00:24:00.417 "data_size": 63488 00:24:00.417 }, 00:24:00.417 { 00:24:00.417 "name": "BaseBdev2", 00:24:00.417 "uuid": "13a89f05-424a-4105-9973-a53aef4e44b7", 00:24:00.417 "is_configured": true, 00:24:00.417 "data_offset": 2048, 00:24:00.417 "data_size": 63488 00:24:00.417 }, 00:24:00.417 { 00:24:00.417 "name": "BaseBdev3", 00:24:00.417 "uuid": "d335d757-98cd-48c3-9f28-e18a76a8907b", 00:24:00.417 "is_configured": true, 00:24:00.417 "data_offset": 2048, 00:24:00.417 "data_size": 63488 00:24:00.417 }, 00:24:00.417 { 00:24:00.417 "name": "BaseBdev4", 00:24:00.417 "uuid": "ed4edfd3-0966-4523-8568-6b80f1121799", 00:24:00.417 "is_configured": true, 00:24:00.417 "data_offset": 2048, 00:24:00.417 "data_size": 63488 00:24:00.417 } 00:24:00.417 ] 00:24:00.417 }' 00:24:00.417 05:52:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:00.417 05:52:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:24:00.980 05:52:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:24:00.980 05:52:15 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:24:00.980 05:52:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:24:00.980 05:52:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:24:00.980 05:52:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:24:00.980 05:52:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:24:00.980 05:52:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:24:00.980 05:52:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:24:01.237 [2024-07-26 05:52:15.999252] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:24:01.237 05:52:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:24:01.237 "name": "Existed_Raid", 00:24:01.237 "aliases": [ 00:24:01.237 "4626200a-2f2d-4f1c-91d4-8f14db27a92a" 00:24:01.237 ], 00:24:01.237 "product_name": "Raid Volume", 00:24:01.237 "block_size": 512, 00:24:01.237 "num_blocks": 63488, 00:24:01.237 "uuid": "4626200a-2f2d-4f1c-91d4-8f14db27a92a", 00:24:01.237 "assigned_rate_limits": { 00:24:01.237 "rw_ios_per_sec": 0, 00:24:01.237 "rw_mbytes_per_sec": 0, 00:24:01.237 "r_mbytes_per_sec": 0, 00:24:01.237 "w_mbytes_per_sec": 0 00:24:01.237 }, 00:24:01.238 "claimed": false, 00:24:01.238 "zoned": false, 00:24:01.238 "supported_io_types": { 00:24:01.238 "read": true, 00:24:01.238 "write": true, 00:24:01.238 "unmap": false, 00:24:01.238 "flush": false, 00:24:01.238 "reset": true, 00:24:01.238 "nvme_admin": false, 00:24:01.238 "nvme_io": false, 00:24:01.238 "nvme_io_md": false, 00:24:01.238 "write_zeroes": true, 00:24:01.238 "zcopy": false, 00:24:01.238 
"get_zone_info": false, 00:24:01.238 "zone_management": false, 00:24:01.238 "zone_append": false, 00:24:01.238 "compare": false, 00:24:01.238 "compare_and_write": false, 00:24:01.238 "abort": false, 00:24:01.238 "seek_hole": false, 00:24:01.238 "seek_data": false, 00:24:01.238 "copy": false, 00:24:01.238 "nvme_iov_md": false 00:24:01.238 }, 00:24:01.238 "memory_domains": [ 00:24:01.238 { 00:24:01.238 "dma_device_id": "system", 00:24:01.238 "dma_device_type": 1 00:24:01.238 }, 00:24:01.238 { 00:24:01.238 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:01.238 "dma_device_type": 2 00:24:01.238 }, 00:24:01.238 { 00:24:01.238 "dma_device_id": "system", 00:24:01.238 "dma_device_type": 1 00:24:01.238 }, 00:24:01.238 { 00:24:01.238 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:01.238 "dma_device_type": 2 00:24:01.238 }, 00:24:01.238 { 00:24:01.238 "dma_device_id": "system", 00:24:01.238 "dma_device_type": 1 00:24:01.238 }, 00:24:01.238 { 00:24:01.238 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:01.238 "dma_device_type": 2 00:24:01.238 }, 00:24:01.238 { 00:24:01.238 "dma_device_id": "system", 00:24:01.238 "dma_device_type": 1 00:24:01.238 }, 00:24:01.238 { 00:24:01.238 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:01.238 "dma_device_type": 2 00:24:01.238 } 00:24:01.238 ], 00:24:01.238 "driver_specific": { 00:24:01.238 "raid": { 00:24:01.238 "uuid": "4626200a-2f2d-4f1c-91d4-8f14db27a92a", 00:24:01.238 "strip_size_kb": 0, 00:24:01.238 "state": "online", 00:24:01.238 "raid_level": "raid1", 00:24:01.238 "superblock": true, 00:24:01.238 "num_base_bdevs": 4, 00:24:01.238 "num_base_bdevs_discovered": 4, 00:24:01.238 "num_base_bdevs_operational": 4, 00:24:01.238 "base_bdevs_list": [ 00:24:01.238 { 00:24:01.238 "name": "NewBaseBdev", 00:24:01.238 "uuid": "0e16982e-53d1-4082-8a0c-805df5a62da0", 00:24:01.238 "is_configured": true, 00:24:01.238 "data_offset": 2048, 00:24:01.238 "data_size": 63488 00:24:01.238 }, 00:24:01.238 { 00:24:01.238 "name": "BaseBdev2", 00:24:01.238 
"uuid": "13a89f05-424a-4105-9973-a53aef4e44b7", 00:24:01.238 "is_configured": true, 00:24:01.238 "data_offset": 2048, 00:24:01.238 "data_size": 63488 00:24:01.238 }, 00:24:01.238 { 00:24:01.238 "name": "BaseBdev3", 00:24:01.238 "uuid": "d335d757-98cd-48c3-9f28-e18a76a8907b", 00:24:01.238 "is_configured": true, 00:24:01.238 "data_offset": 2048, 00:24:01.238 "data_size": 63488 00:24:01.238 }, 00:24:01.238 { 00:24:01.238 "name": "BaseBdev4", 00:24:01.238 "uuid": "ed4edfd3-0966-4523-8568-6b80f1121799", 00:24:01.238 "is_configured": true, 00:24:01.238 "data_offset": 2048, 00:24:01.238 "data_size": 63488 00:24:01.238 } 00:24:01.238 ] 00:24:01.238 } 00:24:01.238 } 00:24:01.238 }' 00:24:01.238 05:52:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:24:01.238 05:52:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:24:01.238 BaseBdev2 00:24:01.238 BaseBdev3 00:24:01.238 BaseBdev4' 00:24:01.238 05:52:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:24:01.238 05:52:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:24:01.238 05:52:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:24:01.495 05:52:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:24:01.495 "name": "NewBaseBdev", 00:24:01.495 "aliases": [ 00:24:01.495 "0e16982e-53d1-4082-8a0c-805df5a62da0" 00:24:01.495 ], 00:24:01.495 "product_name": "Malloc disk", 00:24:01.495 "block_size": 512, 00:24:01.495 "num_blocks": 65536, 00:24:01.495 "uuid": "0e16982e-53d1-4082-8a0c-805df5a62da0", 00:24:01.495 "assigned_rate_limits": { 00:24:01.495 "rw_ios_per_sec": 0, 00:24:01.495 "rw_mbytes_per_sec": 0, 
00:24:01.495 "r_mbytes_per_sec": 0, 00:24:01.495 "w_mbytes_per_sec": 0 00:24:01.495 }, 00:24:01.495 "claimed": true, 00:24:01.495 "claim_type": "exclusive_write", 00:24:01.495 "zoned": false, 00:24:01.495 "supported_io_types": { 00:24:01.495 "read": true, 00:24:01.495 "write": true, 00:24:01.495 "unmap": true, 00:24:01.495 "flush": true, 00:24:01.495 "reset": true, 00:24:01.495 "nvme_admin": false, 00:24:01.495 "nvme_io": false, 00:24:01.495 "nvme_io_md": false, 00:24:01.495 "write_zeroes": true, 00:24:01.495 "zcopy": true, 00:24:01.495 "get_zone_info": false, 00:24:01.495 "zone_management": false, 00:24:01.495 "zone_append": false, 00:24:01.495 "compare": false, 00:24:01.495 "compare_and_write": false, 00:24:01.495 "abort": true, 00:24:01.495 "seek_hole": false, 00:24:01.495 "seek_data": false, 00:24:01.495 "copy": true, 00:24:01.495 "nvme_iov_md": false 00:24:01.495 }, 00:24:01.495 "memory_domains": [ 00:24:01.495 { 00:24:01.495 "dma_device_id": "system", 00:24:01.495 "dma_device_type": 1 00:24:01.495 }, 00:24:01.495 { 00:24:01.495 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:01.496 "dma_device_type": 2 00:24:01.496 } 00:24:01.496 ], 00:24:01.496 "driver_specific": {} 00:24:01.496 }' 00:24:01.496 05:52:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:24:01.496 05:52:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:24:01.752 05:52:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:24:01.752 05:52:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:24:01.752 05:52:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:24:01.752 05:52:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:24:01.752 05:52:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:24:01.752 05:52:16 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:24:01.752 05:52:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:24:01.752 05:52:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:24:01.752 05:52:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:24:02.009 05:52:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:24:02.009 05:52:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:24:02.009 05:52:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:24:02.009 05:52:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:24:02.266 05:52:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:24:02.266 "name": "BaseBdev2", 00:24:02.266 "aliases": [ 00:24:02.266 "13a89f05-424a-4105-9973-a53aef4e44b7" 00:24:02.266 ], 00:24:02.267 "product_name": "Malloc disk", 00:24:02.267 "block_size": 512, 00:24:02.267 "num_blocks": 65536, 00:24:02.267 "uuid": "13a89f05-424a-4105-9973-a53aef4e44b7", 00:24:02.267 "assigned_rate_limits": { 00:24:02.267 "rw_ios_per_sec": 0, 00:24:02.267 "rw_mbytes_per_sec": 0, 00:24:02.267 "r_mbytes_per_sec": 0, 00:24:02.267 "w_mbytes_per_sec": 0 00:24:02.267 }, 00:24:02.267 "claimed": true, 00:24:02.267 "claim_type": "exclusive_write", 00:24:02.267 "zoned": false, 00:24:02.267 "supported_io_types": { 00:24:02.267 "read": true, 00:24:02.267 "write": true, 00:24:02.267 "unmap": true, 00:24:02.267 "flush": true, 00:24:02.267 "reset": true, 00:24:02.267 "nvme_admin": false, 00:24:02.267 "nvme_io": false, 00:24:02.267 "nvme_io_md": false, 00:24:02.267 "write_zeroes": true, 00:24:02.267 "zcopy": true, 00:24:02.267 
"get_zone_info": false, 00:24:02.267 "zone_management": false, 00:24:02.267 "zone_append": false, 00:24:02.267 "compare": false, 00:24:02.267 "compare_and_write": false, 00:24:02.267 "abort": true, 00:24:02.267 "seek_hole": false, 00:24:02.267 "seek_data": false, 00:24:02.267 "copy": true, 00:24:02.267 "nvme_iov_md": false 00:24:02.267 }, 00:24:02.267 "memory_domains": [ 00:24:02.267 { 00:24:02.267 "dma_device_id": "system", 00:24:02.267 "dma_device_type": 1 00:24:02.267 }, 00:24:02.267 { 00:24:02.267 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:02.267 "dma_device_type": 2 00:24:02.267 } 00:24:02.267 ], 00:24:02.267 "driver_specific": {} 00:24:02.267 }' 00:24:02.267 05:52:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:24:02.267 05:52:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:24:02.267 05:52:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:24:02.267 05:52:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:24:02.267 05:52:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:24:02.267 05:52:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:24:02.267 05:52:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:24:02.267 05:52:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:24:02.524 05:52:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:24:02.524 05:52:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:24:02.524 05:52:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:24:02.524 05:52:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:24:02.524 05:52:17 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:24:02.524 05:52:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:24:02.524 05:52:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:24:02.781 05:52:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:24:02.782 "name": "BaseBdev3", 00:24:02.782 "aliases": [ 00:24:02.782 "d335d757-98cd-48c3-9f28-e18a76a8907b" 00:24:02.782 ], 00:24:02.782 "product_name": "Malloc disk", 00:24:02.782 "block_size": 512, 00:24:02.782 "num_blocks": 65536, 00:24:02.782 "uuid": "d335d757-98cd-48c3-9f28-e18a76a8907b", 00:24:02.782 "assigned_rate_limits": { 00:24:02.782 "rw_ios_per_sec": 0, 00:24:02.782 "rw_mbytes_per_sec": 0, 00:24:02.782 "r_mbytes_per_sec": 0, 00:24:02.782 "w_mbytes_per_sec": 0 00:24:02.782 }, 00:24:02.782 "claimed": true, 00:24:02.782 "claim_type": "exclusive_write", 00:24:02.782 "zoned": false, 00:24:02.782 "supported_io_types": { 00:24:02.782 "read": true, 00:24:02.782 "write": true, 00:24:02.782 "unmap": true, 00:24:02.782 "flush": true, 00:24:02.782 "reset": true, 00:24:02.782 "nvme_admin": false, 00:24:02.782 "nvme_io": false, 00:24:02.782 "nvme_io_md": false, 00:24:02.782 "write_zeroes": true, 00:24:02.782 "zcopy": true, 00:24:02.782 "get_zone_info": false, 00:24:02.782 "zone_management": false, 00:24:02.782 "zone_append": false, 00:24:02.782 "compare": false, 00:24:02.782 "compare_and_write": false, 00:24:02.782 "abort": true, 00:24:02.782 "seek_hole": false, 00:24:02.782 "seek_data": false, 00:24:02.782 "copy": true, 00:24:02.782 "nvme_iov_md": false 00:24:02.782 }, 00:24:02.782 "memory_domains": [ 00:24:02.782 { 00:24:02.782 "dma_device_id": "system", 00:24:02.782 "dma_device_type": 1 00:24:02.782 }, 00:24:02.782 { 00:24:02.782 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:02.782 
"dma_device_type": 2 00:24:02.782 } 00:24:02.782 ], 00:24:02.782 "driver_specific": {} 00:24:02.782 }' 00:24:02.782 05:52:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:24:02.782 05:52:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:24:02.782 05:52:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:24:02.782 05:52:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:24:02.782 05:52:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:24:03.039 05:52:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:24:03.039 05:52:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:24:03.039 05:52:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:24:03.039 05:52:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:24:03.039 05:52:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:24:03.039 05:52:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:24:03.039 05:52:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:24:03.039 05:52:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:24:03.039 05:52:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:24:03.039 05:52:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:24:03.604 05:52:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:24:03.604 "name": "BaseBdev4", 00:24:03.604 "aliases": [ 00:24:03.604 
"ed4edfd3-0966-4523-8568-6b80f1121799" 00:24:03.604 ], 00:24:03.604 "product_name": "Malloc disk", 00:24:03.604 "block_size": 512, 00:24:03.604 "num_blocks": 65536, 00:24:03.604 "uuid": "ed4edfd3-0966-4523-8568-6b80f1121799", 00:24:03.604 "assigned_rate_limits": { 00:24:03.604 "rw_ios_per_sec": 0, 00:24:03.604 "rw_mbytes_per_sec": 0, 00:24:03.604 "r_mbytes_per_sec": 0, 00:24:03.604 "w_mbytes_per_sec": 0 00:24:03.604 }, 00:24:03.604 "claimed": true, 00:24:03.604 "claim_type": "exclusive_write", 00:24:03.604 "zoned": false, 00:24:03.604 "supported_io_types": { 00:24:03.604 "read": true, 00:24:03.604 "write": true, 00:24:03.604 "unmap": true, 00:24:03.604 "flush": true, 00:24:03.604 "reset": true, 00:24:03.604 "nvme_admin": false, 00:24:03.604 "nvme_io": false, 00:24:03.604 "nvme_io_md": false, 00:24:03.604 "write_zeroes": true, 00:24:03.604 "zcopy": true, 00:24:03.604 "get_zone_info": false, 00:24:03.604 "zone_management": false, 00:24:03.604 "zone_append": false, 00:24:03.604 "compare": false, 00:24:03.604 "compare_and_write": false, 00:24:03.604 "abort": true, 00:24:03.604 "seek_hole": false, 00:24:03.604 "seek_data": false, 00:24:03.604 "copy": true, 00:24:03.604 "nvme_iov_md": false 00:24:03.604 }, 00:24:03.604 "memory_domains": [ 00:24:03.604 { 00:24:03.604 "dma_device_id": "system", 00:24:03.604 "dma_device_type": 1 00:24:03.604 }, 00:24:03.604 { 00:24:03.604 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:03.604 "dma_device_type": 2 00:24:03.604 } 00:24:03.604 ], 00:24:03.604 "driver_specific": {} 00:24:03.604 }' 00:24:03.604 05:52:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:24:03.604 05:52:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:24:03.604 05:52:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:24:03.604 05:52:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:24:03.860 05:52:18 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:24:03.860 05:52:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:24:03.860 05:52:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:24:03.860 05:52:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:24:03.860 05:52:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:24:03.860 05:52:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:24:03.860 05:52:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:24:03.860 05:52:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:24:03.861 05:52:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:24:04.118 [2024-07-26 05:52:18.850496] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:24:04.118 [2024-07-26 05:52:18.850523] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:24:04.118 [2024-07-26 05:52:18.850581] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:24:04.118 [2024-07-26 05:52:18.850876] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:24:04.118 [2024-07-26 05:52:18.850889] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x25ee180 name Existed_Raid, state offline 00:24:04.118 05:52:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 1222946 00:24:04.118 05:52:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 1222946 ']' 00:24:04.118 05:52:18 bdev_raid.raid_state_function_test_sb 
-- common/autotest_common.sh@952 -- # kill -0 1222946 00:24:04.118 05:52:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:24:04.118 05:52:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:24:04.118 05:52:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1222946 00:24:04.118 05:52:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:24:04.118 05:52:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:24:04.118 05:52:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1222946' 00:24:04.118 killing process with pid 1222946 00:24:04.118 05:52:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 1222946 00:24:04.118 [2024-07-26 05:52:18.912997] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:24:04.118 05:52:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 1222946 00:24:04.118 [2024-07-26 05:52:18.955568] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:24:04.375 05:52:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:24:04.375 00:24:04.375 real 0m31.893s 00:24:04.375 user 0m58.453s 00:24:04.375 sys 0m5.827s 00:24:04.375 05:52:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:24:04.375 05:52:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:24:04.375 ************************************ 00:24:04.375 END TEST raid_state_function_test_sb 00:24:04.375 ************************************ 00:24:04.375 05:52:19 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:24:04.375 05:52:19 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test 
raid_superblock_test raid_superblock_test raid1 4 00:24:04.375 05:52:19 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:24:04.375 05:52:19 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:24:04.375 05:52:19 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:24:04.375 ************************************ 00:24:04.375 START TEST raid_superblock_test 00:24:04.375 ************************************ 00:24:04.375 05:52:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test raid1 4 00:24:04.375 05:52:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=raid1 00:24:04.375 05:52:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=4 00:24:04.375 05:52:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:24:04.375 05:52:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:24:04.375 05:52:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:24:04.375 05:52:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:24:04.375 05:52:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:24:04.375 05:52:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:24:04.375 05:52:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:24:04.375 05:52:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:24:04.375 05:52:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:24:04.375 05:52:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:24:04.375 05:52:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:24:04.375 05:52:19 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@403 -- # '[' raid1 '!=' raid1 ']' 00:24:04.375 05:52:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@407 -- # strip_size=0 00:24:04.375 05:52:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=1227689 00:24:04.375 05:52:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:24:04.375 05:52:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 1227689 /var/tmp/spdk-raid.sock 00:24:04.375 05:52:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 1227689 ']' 00:24:04.375 05:52:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:24:04.375 05:52:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:24:04.375 05:52:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:24:04.375 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:24:04.375 05:52:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:24:04.375 05:52:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:24:04.641 [2024-07-26 05:52:19.315950] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 
00:24:04.641 [2024-07-26 05:52:19.316012] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1227689 ] 00:24:04.641 [2024-07-26 05:52:19.445494] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:24:04.641 [2024-07-26 05:52:19.547793] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:24:04.915 [2024-07-26 05:52:19.607493] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:24:04.915 [2024-07-26 05:52:19.607531] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:24:05.487 05:52:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:24:05.487 05:52:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:24:05.487 05:52:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:24:05.487 05:52:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:24:05.487 05:52:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:24:05.487 05:52:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:24:05.487 05:52:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:24:05.487 05:52:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:24:05.487 05:52:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:24:05.487 05:52:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:24:05.487 05:52:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:24:05.745 malloc1 00:24:05.745 05:52:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:24:06.003 [2024-07-26 05:52:20.736283] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:24:06.003 [2024-07-26 05:52:20.736330] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:06.003 [2024-07-26 05:52:20.736352] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x238f570 00:24:06.003 [2024-07-26 05:52:20.736365] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:06.003 [2024-07-26 05:52:20.738163] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:06.003 [2024-07-26 05:52:20.738192] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:24:06.003 pt1 00:24:06.003 05:52:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:24:06.003 05:52:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:24:06.003 05:52:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:24:06.003 05:52:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:24:06.003 05:52:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:24:06.003 05:52:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:24:06.003 05:52:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:24:06.003 05:52:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:24:06.003 05:52:20 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:24:06.261 malloc2 00:24:06.261 05:52:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:24:06.519 [2024-07-26 05:52:21.227417] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:24:06.519 [2024-07-26 05:52:21.227463] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:06.519 [2024-07-26 05:52:21.227480] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2390970 00:24:06.519 [2024-07-26 05:52:21.227494] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:06.519 [2024-07-26 05:52:21.229092] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:06.519 [2024-07-26 05:52:21.229119] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:24:06.519 pt2 00:24:06.519 05:52:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:24:06.519 05:52:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:24:06.519 05:52:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc3 00:24:06.519 05:52:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt3 00:24:06.519 05:52:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:24:06.519 05:52:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:24:06.519 05:52:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:24:06.519 05:52:21 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:24:06.519 05:52:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:24:06.777 malloc3 00:24:06.777 05:52:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:24:07.035 [2024-07-26 05:52:21.713397] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:24:07.035 [2024-07-26 05:52:21.713444] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:07.035 [2024-07-26 05:52:21.713462] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2527340 00:24:07.035 [2024-07-26 05:52:21.713475] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:07.035 [2024-07-26 05:52:21.715042] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:07.035 [2024-07-26 05:52:21.715069] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:24:07.035 pt3 00:24:07.035 05:52:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:24:07.035 05:52:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:24:07.035 05:52:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc4 00:24:07.035 05:52:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt4 00:24:07.035 05:52:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000004 00:24:07.035 05:52:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:24:07.035 
05:52:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:24:07.035 05:52:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:24:07.035 05:52:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc4 00:24:07.292 malloc4 00:24:07.292 05:52:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:24:07.292 [2024-07-26 05:52:22.188525] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:24:07.292 [2024-07-26 05:52:22.188573] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:07.292 [2024-07-26 05:52:22.188594] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2529c60 00:24:07.292 [2024-07-26 05:52:22.188608] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:07.292 [2024-07-26 05:52:22.190175] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:07.292 [2024-07-26 05:52:22.190203] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:24:07.292 pt4 00:24:07.550 05:52:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:24:07.550 05:52:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:24:07.550 05:52:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2 pt3 pt4' -n raid_bdev1 -s 00:24:07.550 [2024-07-26 05:52:22.433200] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 
00:24:07.550 [2024-07-26 05:52:22.434565] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:24:07.550 [2024-07-26 05:52:22.434622] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:24:07.550 [2024-07-26 05:52:22.434675] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:24:07.550 [2024-07-26 05:52:22.434855] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x2387530 00:24:07.550 [2024-07-26 05:52:22.434866] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:24:07.550 [2024-07-26 05:52:22.435071] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2385770 00:24:07.550 [2024-07-26 05:52:22.435227] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2387530 00:24:07.550 [2024-07-26 05:52:22.435237] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2387530 00:24:07.550 [2024-07-26 05:52:22.435350] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:07.550 05:52:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:24:07.550 05:52:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:07.550 05:52:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:07.550 05:52:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:07.550 05:52:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:07.550 05:52:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:24:07.550 05:52:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:07.550 05:52:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local 
num_base_bdevs 00:24:07.550 05:52:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:07.550 05:52:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:07.809 05:52:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:07.809 05:52:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:07.809 05:52:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:07.809 "name": "raid_bdev1", 00:24:07.809 "uuid": "129e1f87-ddb0-400d-b52b-d18c7dcbfd1f", 00:24:07.809 "strip_size_kb": 0, 00:24:07.809 "state": "online", 00:24:07.809 "raid_level": "raid1", 00:24:07.809 "superblock": true, 00:24:07.809 "num_base_bdevs": 4, 00:24:07.809 "num_base_bdevs_discovered": 4, 00:24:07.809 "num_base_bdevs_operational": 4, 00:24:07.809 "base_bdevs_list": [ 00:24:07.809 { 00:24:07.809 "name": "pt1", 00:24:07.809 "uuid": "00000000-0000-0000-0000-000000000001", 00:24:07.809 "is_configured": true, 00:24:07.809 "data_offset": 2048, 00:24:07.809 "data_size": 63488 00:24:07.809 }, 00:24:07.809 { 00:24:07.809 "name": "pt2", 00:24:07.809 "uuid": "00000000-0000-0000-0000-000000000002", 00:24:07.809 "is_configured": true, 00:24:07.809 "data_offset": 2048, 00:24:07.809 "data_size": 63488 00:24:07.809 }, 00:24:07.809 { 00:24:07.809 "name": "pt3", 00:24:07.809 "uuid": "00000000-0000-0000-0000-000000000003", 00:24:07.809 "is_configured": true, 00:24:07.809 "data_offset": 2048, 00:24:07.809 "data_size": 63488 00:24:07.809 }, 00:24:07.809 { 00:24:07.809 "name": "pt4", 00:24:07.809 "uuid": "00000000-0000-0000-0000-000000000004", 00:24:07.809 "is_configured": true, 00:24:07.809 "data_offset": 2048, 00:24:07.809 "data_size": 63488 00:24:07.809 } 00:24:07.809 ] 00:24:07.809 }' 00:24:07.809 05:52:22 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:07.809 05:52:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:24:08.742 05:52:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:24:08.742 05:52:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:24:08.742 05:52:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:24:08.742 05:52:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:24:08.742 05:52:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:24:08.742 05:52:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:24:08.742 05:52:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:24:08.742 05:52:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:24:08.742 [2024-07-26 05:52:23.532385] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:24:08.742 05:52:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:24:08.742 "name": "raid_bdev1", 00:24:08.742 "aliases": [ 00:24:08.742 "129e1f87-ddb0-400d-b52b-d18c7dcbfd1f" 00:24:08.742 ], 00:24:08.742 "product_name": "Raid Volume", 00:24:08.742 "block_size": 512, 00:24:08.742 "num_blocks": 63488, 00:24:08.742 "uuid": "129e1f87-ddb0-400d-b52b-d18c7dcbfd1f", 00:24:08.742 "assigned_rate_limits": { 00:24:08.742 "rw_ios_per_sec": 0, 00:24:08.742 "rw_mbytes_per_sec": 0, 00:24:08.742 "r_mbytes_per_sec": 0, 00:24:08.742 "w_mbytes_per_sec": 0 00:24:08.742 }, 00:24:08.742 "claimed": false, 00:24:08.742 "zoned": false, 00:24:08.742 "supported_io_types": { 00:24:08.742 "read": true, 00:24:08.742 "write": true, 00:24:08.742 
"unmap": false, 00:24:08.742 "flush": false, 00:24:08.742 "reset": true, 00:24:08.742 "nvme_admin": false, 00:24:08.742 "nvme_io": false, 00:24:08.742 "nvme_io_md": false, 00:24:08.742 "write_zeroes": true, 00:24:08.742 "zcopy": false, 00:24:08.742 "get_zone_info": false, 00:24:08.742 "zone_management": false, 00:24:08.742 "zone_append": false, 00:24:08.742 "compare": false, 00:24:08.742 "compare_and_write": false, 00:24:08.742 "abort": false, 00:24:08.742 "seek_hole": false, 00:24:08.742 "seek_data": false, 00:24:08.742 "copy": false, 00:24:08.742 "nvme_iov_md": false 00:24:08.742 }, 00:24:08.742 "memory_domains": [ 00:24:08.742 { 00:24:08.742 "dma_device_id": "system", 00:24:08.742 "dma_device_type": 1 00:24:08.742 }, 00:24:08.742 { 00:24:08.742 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:08.742 "dma_device_type": 2 00:24:08.742 }, 00:24:08.742 { 00:24:08.742 "dma_device_id": "system", 00:24:08.742 "dma_device_type": 1 00:24:08.742 }, 00:24:08.742 { 00:24:08.742 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:08.742 "dma_device_type": 2 00:24:08.742 }, 00:24:08.742 { 00:24:08.742 "dma_device_id": "system", 00:24:08.742 "dma_device_type": 1 00:24:08.742 }, 00:24:08.742 { 00:24:08.742 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:08.742 "dma_device_type": 2 00:24:08.742 }, 00:24:08.742 { 00:24:08.742 "dma_device_id": "system", 00:24:08.742 "dma_device_type": 1 00:24:08.742 }, 00:24:08.742 { 00:24:08.742 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:08.742 "dma_device_type": 2 00:24:08.742 } 00:24:08.742 ], 00:24:08.742 "driver_specific": { 00:24:08.742 "raid": { 00:24:08.742 "uuid": "129e1f87-ddb0-400d-b52b-d18c7dcbfd1f", 00:24:08.742 "strip_size_kb": 0, 00:24:08.742 "state": "online", 00:24:08.742 "raid_level": "raid1", 00:24:08.742 "superblock": true, 00:24:08.742 "num_base_bdevs": 4, 00:24:08.742 "num_base_bdevs_discovered": 4, 00:24:08.742 "num_base_bdevs_operational": 4, 00:24:08.742 "base_bdevs_list": [ 00:24:08.742 { 00:24:08.742 "name": "pt1", 
00:24:08.742 "uuid": "00000000-0000-0000-0000-000000000001", 00:24:08.742 "is_configured": true, 00:24:08.742 "data_offset": 2048, 00:24:08.742 "data_size": 63488 00:24:08.742 }, 00:24:08.742 { 00:24:08.742 "name": "pt2", 00:24:08.742 "uuid": "00000000-0000-0000-0000-000000000002", 00:24:08.742 "is_configured": true, 00:24:08.742 "data_offset": 2048, 00:24:08.742 "data_size": 63488 00:24:08.742 }, 00:24:08.742 { 00:24:08.742 "name": "pt3", 00:24:08.742 "uuid": "00000000-0000-0000-0000-000000000003", 00:24:08.742 "is_configured": true, 00:24:08.742 "data_offset": 2048, 00:24:08.742 "data_size": 63488 00:24:08.742 }, 00:24:08.742 { 00:24:08.742 "name": "pt4", 00:24:08.742 "uuid": "00000000-0000-0000-0000-000000000004", 00:24:08.742 "is_configured": true, 00:24:08.742 "data_offset": 2048, 00:24:08.742 "data_size": 63488 00:24:08.742 } 00:24:08.742 ] 00:24:08.742 } 00:24:08.742 } 00:24:08.742 }' 00:24:08.742 05:52:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:24:08.742 05:52:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:24:08.742 pt2 00:24:08.742 pt3 00:24:08.742 pt4' 00:24:08.742 05:52:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:24:08.742 05:52:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:24:08.742 05:52:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:24:09.000 05:52:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:24:09.000 "name": "pt1", 00:24:09.000 "aliases": [ 00:24:09.000 "00000000-0000-0000-0000-000000000001" 00:24:09.000 ], 00:24:09.000 "product_name": "passthru", 00:24:09.000 "block_size": 512, 00:24:09.000 "num_blocks": 65536, 00:24:09.000 "uuid": 
"00000000-0000-0000-0000-000000000001", 00:24:09.000 "assigned_rate_limits": { 00:24:09.000 "rw_ios_per_sec": 0, 00:24:09.000 "rw_mbytes_per_sec": 0, 00:24:09.000 "r_mbytes_per_sec": 0, 00:24:09.000 "w_mbytes_per_sec": 0 00:24:09.000 }, 00:24:09.000 "claimed": true, 00:24:09.000 "claim_type": "exclusive_write", 00:24:09.000 "zoned": false, 00:24:09.000 "supported_io_types": { 00:24:09.000 "read": true, 00:24:09.000 "write": true, 00:24:09.000 "unmap": true, 00:24:09.000 "flush": true, 00:24:09.000 "reset": true, 00:24:09.000 "nvme_admin": false, 00:24:09.000 "nvme_io": false, 00:24:09.000 "nvme_io_md": false, 00:24:09.000 "write_zeroes": true, 00:24:09.000 "zcopy": true, 00:24:09.000 "get_zone_info": false, 00:24:09.000 "zone_management": false, 00:24:09.000 "zone_append": false, 00:24:09.000 "compare": false, 00:24:09.000 "compare_and_write": false, 00:24:09.000 "abort": true, 00:24:09.000 "seek_hole": false, 00:24:09.000 "seek_data": false, 00:24:09.000 "copy": true, 00:24:09.001 "nvme_iov_md": false 00:24:09.001 }, 00:24:09.001 "memory_domains": [ 00:24:09.001 { 00:24:09.001 "dma_device_id": "system", 00:24:09.001 "dma_device_type": 1 00:24:09.001 }, 00:24:09.001 { 00:24:09.001 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:09.001 "dma_device_type": 2 00:24:09.001 } 00:24:09.001 ], 00:24:09.001 "driver_specific": { 00:24:09.001 "passthru": { 00:24:09.001 "name": "pt1", 00:24:09.001 "base_bdev_name": "malloc1" 00:24:09.001 } 00:24:09.001 } 00:24:09.001 }' 00:24:09.001 05:52:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:24:09.001 05:52:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:24:09.258 05:52:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:24:09.258 05:52:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:24:09.258 05:52:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:24:09.258 05:52:23 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:24:09.258 05:52:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:24:09.258 05:52:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:24:09.258 05:52:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:24:09.258 05:52:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:24:09.258 05:52:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:24:09.516 05:52:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:24:09.517 05:52:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:24:09.517 05:52:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:24:09.517 05:52:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:24:09.774 05:52:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:24:09.774 "name": "pt2", 00:24:09.774 "aliases": [ 00:24:09.774 "00000000-0000-0000-0000-000000000002" 00:24:09.774 ], 00:24:09.774 "product_name": "passthru", 00:24:09.774 "block_size": 512, 00:24:09.774 "num_blocks": 65536, 00:24:09.774 "uuid": "00000000-0000-0000-0000-000000000002", 00:24:09.774 "assigned_rate_limits": { 00:24:09.774 "rw_ios_per_sec": 0, 00:24:09.774 "rw_mbytes_per_sec": 0, 00:24:09.774 "r_mbytes_per_sec": 0, 00:24:09.774 "w_mbytes_per_sec": 0 00:24:09.774 }, 00:24:09.774 "claimed": true, 00:24:09.774 "claim_type": "exclusive_write", 00:24:09.774 "zoned": false, 00:24:09.774 "supported_io_types": { 00:24:09.774 "read": true, 00:24:09.774 "write": true, 00:24:09.774 "unmap": true, 00:24:09.774 "flush": true, 00:24:09.774 "reset": true, 00:24:09.774 "nvme_admin": false, 00:24:09.774 
"nvme_io": false, 00:24:09.774 "nvme_io_md": false, 00:24:09.774 "write_zeroes": true, 00:24:09.774 "zcopy": true, 00:24:09.774 "get_zone_info": false, 00:24:09.774 "zone_management": false, 00:24:09.774 "zone_append": false, 00:24:09.774 "compare": false, 00:24:09.774 "compare_and_write": false, 00:24:09.774 "abort": true, 00:24:09.774 "seek_hole": false, 00:24:09.774 "seek_data": false, 00:24:09.774 "copy": true, 00:24:09.774 "nvme_iov_md": false 00:24:09.774 }, 00:24:09.774 "memory_domains": [ 00:24:09.774 { 00:24:09.774 "dma_device_id": "system", 00:24:09.774 "dma_device_type": 1 00:24:09.774 }, 00:24:09.774 { 00:24:09.774 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:09.774 "dma_device_type": 2 00:24:09.774 } 00:24:09.774 ], 00:24:09.774 "driver_specific": { 00:24:09.774 "passthru": { 00:24:09.774 "name": "pt2", 00:24:09.774 "base_bdev_name": "malloc2" 00:24:09.774 } 00:24:09.774 } 00:24:09.774 }' 00:24:09.774 05:52:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:24:09.774 05:52:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:24:09.774 05:52:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:24:09.774 05:52:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:24:09.774 05:52:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:24:09.774 05:52:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:24:09.774 05:52:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:24:09.774 05:52:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:24:09.774 05:52:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:24:09.774 05:52:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:24:10.032 05:52:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq 
.dif_type 00:24:10.032 05:52:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:24:10.032 05:52:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:24:10.032 05:52:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:24:10.032 05:52:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:24:10.291 05:52:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:24:10.291 "name": "pt3", 00:24:10.291 "aliases": [ 00:24:10.291 "00000000-0000-0000-0000-000000000003" 00:24:10.291 ], 00:24:10.291 "product_name": "passthru", 00:24:10.291 "block_size": 512, 00:24:10.291 "num_blocks": 65536, 00:24:10.291 "uuid": "00000000-0000-0000-0000-000000000003", 00:24:10.291 "assigned_rate_limits": { 00:24:10.291 "rw_ios_per_sec": 0, 00:24:10.291 "rw_mbytes_per_sec": 0, 00:24:10.291 "r_mbytes_per_sec": 0, 00:24:10.291 "w_mbytes_per_sec": 0 00:24:10.291 }, 00:24:10.291 "claimed": true, 00:24:10.291 "claim_type": "exclusive_write", 00:24:10.291 "zoned": false, 00:24:10.291 "supported_io_types": { 00:24:10.291 "read": true, 00:24:10.291 "write": true, 00:24:10.291 "unmap": true, 00:24:10.291 "flush": true, 00:24:10.291 "reset": true, 00:24:10.291 "nvme_admin": false, 00:24:10.291 "nvme_io": false, 00:24:10.291 "nvme_io_md": false, 00:24:10.291 "write_zeroes": true, 00:24:10.291 "zcopy": true, 00:24:10.291 "get_zone_info": false, 00:24:10.291 "zone_management": false, 00:24:10.291 "zone_append": false, 00:24:10.291 "compare": false, 00:24:10.291 "compare_and_write": false, 00:24:10.291 "abort": true, 00:24:10.291 "seek_hole": false, 00:24:10.291 "seek_data": false, 00:24:10.291 "copy": true, 00:24:10.291 "nvme_iov_md": false 00:24:10.291 }, 00:24:10.291 "memory_domains": [ 00:24:10.291 { 00:24:10.291 "dma_device_id": "system", 00:24:10.291 
"dma_device_type": 1 00:24:10.291 }, 00:24:10.291 { 00:24:10.291 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:10.291 "dma_device_type": 2 00:24:10.291 } 00:24:10.291 ], 00:24:10.291 "driver_specific": { 00:24:10.291 "passthru": { 00:24:10.291 "name": "pt3", 00:24:10.291 "base_bdev_name": "malloc3" 00:24:10.291 } 00:24:10.291 } 00:24:10.291 }' 00:24:10.291 05:52:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:24:10.291 05:52:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:24:10.291 05:52:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:24:10.291 05:52:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:24:10.291 05:52:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:24:10.291 05:52:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:24:10.291 05:52:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:24:10.549 05:52:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:24:10.549 05:52:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:24:10.549 05:52:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:24:10.549 05:52:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:24:10.549 05:52:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:24:10.549 05:52:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:24:10.549 05:52:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:24:10.549 05:52:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:24:10.807 05:52:25 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:24:10.807 "name": "pt4", 00:24:10.807 "aliases": [ 00:24:10.807 "00000000-0000-0000-0000-000000000004" 00:24:10.807 ], 00:24:10.807 "product_name": "passthru", 00:24:10.807 "block_size": 512, 00:24:10.807 "num_blocks": 65536, 00:24:10.807 "uuid": "00000000-0000-0000-0000-000000000004", 00:24:10.807 "assigned_rate_limits": { 00:24:10.807 "rw_ios_per_sec": 0, 00:24:10.807 "rw_mbytes_per_sec": 0, 00:24:10.807 "r_mbytes_per_sec": 0, 00:24:10.807 "w_mbytes_per_sec": 0 00:24:10.807 }, 00:24:10.807 "claimed": true, 00:24:10.807 "claim_type": "exclusive_write", 00:24:10.807 "zoned": false, 00:24:10.807 "supported_io_types": { 00:24:10.807 "read": true, 00:24:10.807 "write": true, 00:24:10.807 "unmap": true, 00:24:10.807 "flush": true, 00:24:10.807 "reset": true, 00:24:10.807 "nvme_admin": false, 00:24:10.807 "nvme_io": false, 00:24:10.807 "nvme_io_md": false, 00:24:10.807 "write_zeroes": true, 00:24:10.807 "zcopy": true, 00:24:10.807 "get_zone_info": false, 00:24:10.807 "zone_management": false, 00:24:10.807 "zone_append": false, 00:24:10.807 "compare": false, 00:24:10.807 "compare_and_write": false, 00:24:10.807 "abort": true, 00:24:10.807 "seek_hole": false, 00:24:10.807 "seek_data": false, 00:24:10.807 "copy": true, 00:24:10.807 "nvme_iov_md": false 00:24:10.807 }, 00:24:10.807 "memory_domains": [ 00:24:10.807 { 00:24:10.807 "dma_device_id": "system", 00:24:10.807 "dma_device_type": 1 00:24:10.807 }, 00:24:10.807 { 00:24:10.807 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:10.807 "dma_device_type": 2 00:24:10.807 } 00:24:10.807 ], 00:24:10.807 "driver_specific": { 00:24:10.807 "passthru": { 00:24:10.807 "name": "pt4", 00:24:10.807 "base_bdev_name": "malloc4" 00:24:10.807 } 00:24:10.807 } 00:24:10.807 }' 00:24:10.807 05:52:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:24:10.807 05:52:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:24:10.807 05:52:25 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:24:10.807 05:52:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:24:10.807 05:52:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:24:11.065 05:52:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:24:11.065 05:52:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:24:11.065 05:52:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:24:11.065 05:52:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:24:11.065 05:52:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:24:11.065 05:52:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:24:11.065 05:52:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:24:11.065 05:52:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:24:11.065 05:52:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:24:11.322 [2024-07-26 05:52:26.127233] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:24:11.322 05:52:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=129e1f87-ddb0-400d-b52b-d18c7dcbfd1f 00:24:11.322 05:52:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z 129e1f87-ddb0-400d-b52b-d18c7dcbfd1f ']' 00:24:11.322 05:52:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:24:11.579 [2024-07-26 05:52:26.371589] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:24:11.579 
[2024-07-26 05:52:26.371612] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:24:11.579 [2024-07-26 05:52:26.371670] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:24:11.579 [2024-07-26 05:52:26.371754] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:24:11.579 [2024-07-26 05:52:26.371766] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2387530 name raid_bdev1, state offline 00:24:11.579 05:52:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:24:11.579 05:52:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:11.836 05:52:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:24:11.836 05:52:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:24:11.836 05:52:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:24:11.836 05:52:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:24:12.094 05:52:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:24:12.094 05:52:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:24:12.352 05:52:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:24:12.352 05:52:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:24:12.610 05:52:27 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:24:12.610 05:52:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4 00:24:12.868 05:52:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:24:12.868 05:52:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:24:13.434 05:52:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:24:13.434 05:52:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:24:13.434 05:52:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:24:13.434 05:52:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:24:13.434 05:52:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:24:13.434 05:52:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:24:13.434 05:52:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:24:13.434 05:52:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:24:13.434 05:52:28 bdev_raid.raid_superblock_test -- 
common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:24:13.434 05:52:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:24:13.434 05:52:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:24:13.434 05:52:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:24:13.434 05:52:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:24:13.692 [2024-07-26 05:52:28.356737] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:24:13.692 [2024-07-26 05:52:28.358072] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:24:13.693 [2024-07-26 05:52:28.358115] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:24:13.693 [2024-07-26 05:52:28.358148] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc4 is claimed 00:24:13.693 [2024-07-26 05:52:28.358192] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:24:13.693 [2024-07-26 05:52:28.358230] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:24:13.693 [2024-07-26 05:52:28.358252] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:24:13.693 [2024-07-26 05:52:28.358274] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc4 00:24:13.693 [2024-07-26 
05:52:28.358292] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:24:13.693 [2024-07-26 05:52:28.358303] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2532ff0 name raid_bdev1, state configuring 00:24:13.693 request: 00:24:13.693 { 00:24:13.693 "name": "raid_bdev1", 00:24:13.693 "raid_level": "raid1", 00:24:13.693 "base_bdevs": [ 00:24:13.693 "malloc1", 00:24:13.693 "malloc2", 00:24:13.693 "malloc3", 00:24:13.693 "malloc4" 00:24:13.693 ], 00:24:13.693 "superblock": false, 00:24:13.693 "method": "bdev_raid_create", 00:24:13.693 "req_id": 1 00:24:13.693 } 00:24:13.693 Got JSON-RPC error response 00:24:13.693 response: 00:24:13.693 { 00:24:13.693 "code": -17, 00:24:13.693 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:24:13.693 } 00:24:13.693 05:52:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:24:13.693 05:52:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:24:13.693 05:52:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:24:13.693 05:52:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:24:13.693 05:52:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:13.693 05:52:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:24:13.951 05:52:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:24:13.951 05:52:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:24:13.951 05:52:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:24:13.951 [2024-07-26 05:52:28.849976] 
vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:24:13.951 [2024-07-26 05:52:28.850015] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:13.951 [2024-07-26 05:52:28.850035] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x238f7a0 00:24:13.951 [2024-07-26 05:52:28.850048] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:13.951 [2024-07-26 05:52:28.851669] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:13.951 [2024-07-26 05:52:28.851698] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:24:13.951 [2024-07-26 05:52:28.851761] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:24:13.951 [2024-07-26 05:52:28.851787] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:24:13.951 pt1 00:24:14.210 05:52:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 4 00:24:14.210 05:52:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:14.210 05:52:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:24:14.210 05:52:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:14.210 05:52:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:14.210 05:52:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:24:14.210 05:52:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:14.210 05:52:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:14.210 05:52:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:14.210 05:52:28 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:14.210 05:52:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:14.210 05:52:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:14.468 05:52:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:14.468 "name": "raid_bdev1", 00:24:14.468 "uuid": "129e1f87-ddb0-400d-b52b-d18c7dcbfd1f", 00:24:14.468 "strip_size_kb": 0, 00:24:14.468 "state": "configuring", 00:24:14.468 "raid_level": "raid1", 00:24:14.468 "superblock": true, 00:24:14.468 "num_base_bdevs": 4, 00:24:14.468 "num_base_bdevs_discovered": 1, 00:24:14.468 "num_base_bdevs_operational": 4, 00:24:14.468 "base_bdevs_list": [ 00:24:14.468 { 00:24:14.468 "name": "pt1", 00:24:14.468 "uuid": "00000000-0000-0000-0000-000000000001", 00:24:14.468 "is_configured": true, 00:24:14.468 "data_offset": 2048, 00:24:14.468 "data_size": 63488 00:24:14.468 }, 00:24:14.468 { 00:24:14.468 "name": null, 00:24:14.468 "uuid": "00000000-0000-0000-0000-000000000002", 00:24:14.468 "is_configured": false, 00:24:14.468 "data_offset": 2048, 00:24:14.468 "data_size": 63488 00:24:14.468 }, 00:24:14.468 { 00:24:14.468 "name": null, 00:24:14.468 "uuid": "00000000-0000-0000-0000-000000000003", 00:24:14.468 "is_configured": false, 00:24:14.468 "data_offset": 2048, 00:24:14.468 "data_size": 63488 00:24:14.468 }, 00:24:14.468 { 00:24:14.468 "name": null, 00:24:14.468 "uuid": "00000000-0000-0000-0000-000000000004", 00:24:14.468 "is_configured": false, 00:24:14.468 "data_offset": 2048, 00:24:14.468 "data_size": 63488 00:24:14.468 } 00:24:14.468 ] 00:24:14.468 }' 00:24:14.468 05:52:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:14.468 05:52:29 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set 
+x 00:24:15.034 05:52:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 4 -gt 2 ']' 00:24:15.034 05:52:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:24:15.034 [2024-07-26 05:52:29.868692] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:24:15.034 [2024-07-26 05:52:29.868740] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:15.034 [2024-07-26 05:52:29.868760] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2528940 00:24:15.034 [2024-07-26 05:52:29.868773] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:15.034 [2024-07-26 05:52:29.869110] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:15.034 [2024-07-26 05:52:29.869128] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:24:15.034 [2024-07-26 05:52:29.869189] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:24:15.034 [2024-07-26 05:52:29.869208] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:24:15.034 pt2 00:24:15.034 05:52:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:24:15.292 [2024-07-26 05:52:30.049171] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:24:15.292 05:52:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 4 00:24:15.292 05:52:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:15.292 05:52:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local 
expected_state=configuring 00:24:15.292 05:52:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:15.292 05:52:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:15.292 05:52:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:24:15.292 05:52:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:15.292 05:52:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:15.292 05:52:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:15.292 05:52:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:15.292 05:52:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:15.292 05:52:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:15.550 05:52:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:15.550 "name": "raid_bdev1", 00:24:15.550 "uuid": "129e1f87-ddb0-400d-b52b-d18c7dcbfd1f", 00:24:15.550 "strip_size_kb": 0, 00:24:15.550 "state": "configuring", 00:24:15.550 "raid_level": "raid1", 00:24:15.550 "superblock": true, 00:24:15.550 "num_base_bdevs": 4, 00:24:15.550 "num_base_bdevs_discovered": 1, 00:24:15.550 "num_base_bdevs_operational": 4, 00:24:15.550 "base_bdevs_list": [ 00:24:15.550 { 00:24:15.550 "name": "pt1", 00:24:15.550 "uuid": "00000000-0000-0000-0000-000000000001", 00:24:15.550 "is_configured": true, 00:24:15.550 "data_offset": 2048, 00:24:15.550 "data_size": 63488 00:24:15.550 }, 00:24:15.550 { 00:24:15.550 "name": null, 00:24:15.550 "uuid": "00000000-0000-0000-0000-000000000002", 00:24:15.550 "is_configured": false, 00:24:15.550 "data_offset": 2048, 00:24:15.550 
"data_size": 63488 00:24:15.550 }, 00:24:15.550 { 00:24:15.550 "name": null, 00:24:15.550 "uuid": "00000000-0000-0000-0000-000000000003", 00:24:15.550 "is_configured": false, 00:24:15.550 "data_offset": 2048, 00:24:15.550 "data_size": 63488 00:24:15.550 }, 00:24:15.550 { 00:24:15.550 "name": null, 00:24:15.550 "uuid": "00000000-0000-0000-0000-000000000004", 00:24:15.550 "is_configured": false, 00:24:15.550 "data_offset": 2048, 00:24:15.550 "data_size": 63488 00:24:15.550 } 00:24:15.550 ] 00:24:15.550 }' 00:24:15.550 05:52:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:15.550 05:52:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:24:16.115 05:52:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:24:16.115 05:52:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:24:16.115 05:52:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:24:16.374 [2024-07-26 05:52:31.059849] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:24:16.374 [2024-07-26 05:52:31.059897] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:16.374 [2024-07-26 05:52:31.059915] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2386060 00:24:16.374 [2024-07-26 05:52:31.059928] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:16.374 [2024-07-26 05:52:31.060259] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:16.374 [2024-07-26 05:52:31.060276] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:24:16.374 [2024-07-26 05:52:31.060338] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev 
pt2 00:24:16.374 [2024-07-26 05:52:31.060357] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:24:16.374 pt2 00:24:16.374 05:52:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:24:16.374 05:52:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:24:16.374 05:52:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:24:16.374 [2024-07-26 05:52:31.240330] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:24:16.374 [2024-07-26 05:52:31.240359] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:16.374 [2024-07-26 05:52:31.240376] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x23888d0 00:24:16.374 [2024-07-26 05:52:31.240387] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:16.374 [2024-07-26 05:52:31.240674] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:16.374 [2024-07-26 05:52:31.240693] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:24:16.374 [2024-07-26 05:52:31.240742] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:24:16.374 [2024-07-26 05:52:31.240759] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:24:16.374 pt3 00:24:16.374 05:52:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:24:16.374 05:52:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:24:16.374 05:52:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 
00000000-0000-0000-0000-000000000004 00:24:16.632 [2024-07-26 05:52:31.412788] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:24:16.632 [2024-07-26 05:52:31.412823] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:16.632 [2024-07-26 05:52:31.412838] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2389b80 00:24:16.632 [2024-07-26 05:52:31.412850] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:16.632 [2024-07-26 05:52:31.413124] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:16.632 [2024-07-26 05:52:31.413141] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:24:16.632 [2024-07-26 05:52:31.413188] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt4 00:24:16.632 [2024-07-26 05:52:31.413205] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:24:16.632 [2024-07-26 05:52:31.413322] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x2386780 00:24:16.632 [2024-07-26 05:52:31.413332] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:24:16.632 [2024-07-26 05:52:31.413501] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x238bfa0 00:24:16.632 [2024-07-26 05:52:31.413634] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2386780 00:24:16.632 [2024-07-26 05:52:31.413653] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2386780 00:24:16.632 [2024-07-26 05:52:31.413751] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:16.632 pt4 00:24:16.632 05:52:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:24:16.632 05:52:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:24:16.632 05:52:31 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:24:16.632 05:52:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:16.632 05:52:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:16.632 05:52:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:16.632 05:52:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:16.632 05:52:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:24:16.632 05:52:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:16.632 05:52:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:16.632 05:52:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:16.632 05:52:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:16.632 05:52:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:16.632 05:52:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:16.890 05:52:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:16.890 "name": "raid_bdev1", 00:24:16.890 "uuid": "129e1f87-ddb0-400d-b52b-d18c7dcbfd1f", 00:24:16.890 "strip_size_kb": 0, 00:24:16.890 "state": "online", 00:24:16.890 "raid_level": "raid1", 00:24:16.890 "superblock": true, 00:24:16.890 "num_base_bdevs": 4, 00:24:16.890 "num_base_bdevs_discovered": 4, 00:24:16.890 "num_base_bdevs_operational": 4, 00:24:16.890 "base_bdevs_list": [ 00:24:16.890 { 00:24:16.890 "name": "pt1", 00:24:16.890 "uuid": "00000000-0000-0000-0000-000000000001", 
00:24:16.890 "is_configured": true, 00:24:16.890 "data_offset": 2048, 00:24:16.890 "data_size": 63488 00:24:16.890 }, 00:24:16.890 { 00:24:16.890 "name": "pt2", 00:24:16.890 "uuid": "00000000-0000-0000-0000-000000000002", 00:24:16.890 "is_configured": true, 00:24:16.890 "data_offset": 2048, 00:24:16.890 "data_size": 63488 00:24:16.890 }, 00:24:16.890 { 00:24:16.890 "name": "pt3", 00:24:16.890 "uuid": "00000000-0000-0000-0000-000000000003", 00:24:16.890 "is_configured": true, 00:24:16.890 "data_offset": 2048, 00:24:16.890 "data_size": 63488 00:24:16.890 }, 00:24:16.890 { 00:24:16.890 "name": "pt4", 00:24:16.890 "uuid": "00000000-0000-0000-0000-000000000004", 00:24:16.890 "is_configured": true, 00:24:16.890 "data_offset": 2048, 00:24:16.890 "data_size": 63488 00:24:16.890 } 00:24:16.890 ] 00:24:16.890 }' 00:24:16.891 05:52:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:16.891 05:52:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:24:17.457 05:52:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:24:17.457 05:52:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:24:17.457 05:52:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:24:17.457 05:52:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:24:17.457 05:52:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:24:17.457 05:52:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:24:17.457 05:52:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:24:17.457 05:52:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:24:17.715 [2024-07-26 05:52:32.375650] 
bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:24:17.715 05:52:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:24:17.715 "name": "raid_bdev1", 00:24:17.715 "aliases": [ 00:24:17.715 "129e1f87-ddb0-400d-b52b-d18c7dcbfd1f" 00:24:17.715 ], 00:24:17.715 "product_name": "Raid Volume", 00:24:17.715 "block_size": 512, 00:24:17.715 "num_blocks": 63488, 00:24:17.715 "uuid": "129e1f87-ddb0-400d-b52b-d18c7dcbfd1f", 00:24:17.715 "assigned_rate_limits": { 00:24:17.715 "rw_ios_per_sec": 0, 00:24:17.715 "rw_mbytes_per_sec": 0, 00:24:17.715 "r_mbytes_per_sec": 0, 00:24:17.715 "w_mbytes_per_sec": 0 00:24:17.715 }, 00:24:17.715 "claimed": false, 00:24:17.715 "zoned": false, 00:24:17.715 "supported_io_types": { 00:24:17.715 "read": true, 00:24:17.715 "write": true, 00:24:17.715 "unmap": false, 00:24:17.715 "flush": false, 00:24:17.715 "reset": true, 00:24:17.715 "nvme_admin": false, 00:24:17.715 "nvme_io": false, 00:24:17.715 "nvme_io_md": false, 00:24:17.715 "write_zeroes": true, 00:24:17.715 "zcopy": false, 00:24:17.715 "get_zone_info": false, 00:24:17.715 "zone_management": false, 00:24:17.715 "zone_append": false, 00:24:17.715 "compare": false, 00:24:17.715 "compare_and_write": false, 00:24:17.715 "abort": false, 00:24:17.715 "seek_hole": false, 00:24:17.715 "seek_data": false, 00:24:17.715 "copy": false, 00:24:17.715 "nvme_iov_md": false 00:24:17.715 }, 00:24:17.715 "memory_domains": [ 00:24:17.715 { 00:24:17.715 "dma_device_id": "system", 00:24:17.715 "dma_device_type": 1 00:24:17.715 }, 00:24:17.715 { 00:24:17.715 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:17.715 "dma_device_type": 2 00:24:17.715 }, 00:24:17.715 { 00:24:17.715 "dma_device_id": "system", 00:24:17.715 "dma_device_type": 1 00:24:17.715 }, 00:24:17.715 { 00:24:17.715 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:17.715 "dma_device_type": 2 00:24:17.715 }, 00:24:17.715 { 00:24:17.715 "dma_device_id": "system", 00:24:17.715 
"dma_device_type": 1 00:24:17.715 }, 00:24:17.715 { 00:24:17.715 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:17.715 "dma_device_type": 2 00:24:17.715 }, 00:24:17.715 { 00:24:17.715 "dma_device_id": "system", 00:24:17.715 "dma_device_type": 1 00:24:17.715 }, 00:24:17.715 { 00:24:17.715 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:17.715 "dma_device_type": 2 00:24:17.715 } 00:24:17.715 ], 00:24:17.715 "driver_specific": { 00:24:17.715 "raid": { 00:24:17.715 "uuid": "129e1f87-ddb0-400d-b52b-d18c7dcbfd1f", 00:24:17.715 "strip_size_kb": 0, 00:24:17.715 "state": "online", 00:24:17.715 "raid_level": "raid1", 00:24:17.715 "superblock": true, 00:24:17.715 "num_base_bdevs": 4, 00:24:17.715 "num_base_bdevs_discovered": 4, 00:24:17.715 "num_base_bdevs_operational": 4, 00:24:17.715 "base_bdevs_list": [ 00:24:17.715 { 00:24:17.715 "name": "pt1", 00:24:17.715 "uuid": "00000000-0000-0000-0000-000000000001", 00:24:17.715 "is_configured": true, 00:24:17.715 "data_offset": 2048, 00:24:17.715 "data_size": 63488 00:24:17.715 }, 00:24:17.715 { 00:24:17.715 "name": "pt2", 00:24:17.715 "uuid": "00000000-0000-0000-0000-000000000002", 00:24:17.715 "is_configured": true, 00:24:17.715 "data_offset": 2048, 00:24:17.715 "data_size": 63488 00:24:17.715 }, 00:24:17.715 { 00:24:17.715 "name": "pt3", 00:24:17.715 "uuid": "00000000-0000-0000-0000-000000000003", 00:24:17.715 "is_configured": true, 00:24:17.715 "data_offset": 2048, 00:24:17.715 "data_size": 63488 00:24:17.715 }, 00:24:17.715 { 00:24:17.715 "name": "pt4", 00:24:17.715 "uuid": "00000000-0000-0000-0000-000000000004", 00:24:17.715 "is_configured": true, 00:24:17.715 "data_offset": 2048, 00:24:17.715 "data_size": 63488 00:24:17.715 } 00:24:17.715 ] 00:24:17.715 } 00:24:17.715 } 00:24:17.715 }' 00:24:17.716 05:52:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:24:17.716 05:52:32 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:24:17.716 pt2 00:24:17.716 pt3 00:24:17.716 pt4' 00:24:17.716 05:52:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:24:17.716 05:52:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:24:17.716 05:52:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:24:17.974 05:52:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:24:17.974 "name": "pt1", 00:24:17.974 "aliases": [ 00:24:17.974 "00000000-0000-0000-0000-000000000001" 00:24:17.974 ], 00:24:17.974 "product_name": "passthru", 00:24:17.974 "block_size": 512, 00:24:17.974 "num_blocks": 65536, 00:24:17.974 "uuid": "00000000-0000-0000-0000-000000000001", 00:24:17.974 "assigned_rate_limits": { 00:24:17.974 "rw_ios_per_sec": 0, 00:24:17.974 "rw_mbytes_per_sec": 0, 00:24:17.974 "r_mbytes_per_sec": 0, 00:24:17.974 "w_mbytes_per_sec": 0 00:24:17.974 }, 00:24:17.974 "claimed": true, 00:24:17.974 "claim_type": "exclusive_write", 00:24:17.974 "zoned": false, 00:24:17.974 "supported_io_types": { 00:24:17.974 "read": true, 00:24:17.974 "write": true, 00:24:17.974 "unmap": true, 00:24:17.974 "flush": true, 00:24:17.974 "reset": true, 00:24:17.974 "nvme_admin": false, 00:24:17.974 "nvme_io": false, 00:24:17.974 "nvme_io_md": false, 00:24:17.974 "write_zeroes": true, 00:24:17.974 "zcopy": true, 00:24:17.974 "get_zone_info": false, 00:24:17.974 "zone_management": false, 00:24:17.974 "zone_append": false, 00:24:17.974 "compare": false, 00:24:17.974 "compare_and_write": false, 00:24:17.974 "abort": true, 00:24:17.974 "seek_hole": false, 00:24:17.974 "seek_data": false, 00:24:17.974 "copy": true, 00:24:17.974 "nvme_iov_md": false 00:24:17.974 }, 00:24:17.974 "memory_domains": [ 00:24:17.974 { 00:24:17.974 "dma_device_id": "system", 00:24:17.974 
"dma_device_type": 1 00:24:17.974 }, 00:24:17.974 { 00:24:17.974 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:17.974 "dma_device_type": 2 00:24:17.974 } 00:24:17.974 ], 00:24:17.974 "driver_specific": { 00:24:17.974 "passthru": { 00:24:17.974 "name": "pt1", 00:24:17.974 "base_bdev_name": "malloc1" 00:24:17.974 } 00:24:17.974 } 00:24:17.974 }' 00:24:17.974 05:52:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:24:17.974 05:52:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:24:17.974 05:52:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:24:17.974 05:52:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:24:17.974 05:52:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:24:17.974 05:52:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:24:17.974 05:52:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:24:18.232 05:52:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:24:18.232 05:52:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:24:18.232 05:52:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:24:18.232 05:52:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:24:18.232 05:52:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:24:18.232 05:52:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:24:18.232 05:52:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:24:18.232 05:52:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:24:18.490 05:52:33 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:24:18.490 "name": "pt2", 00:24:18.490 "aliases": [ 00:24:18.490 "00000000-0000-0000-0000-000000000002" 00:24:18.490 ], 00:24:18.490 "product_name": "passthru", 00:24:18.490 "block_size": 512, 00:24:18.490 "num_blocks": 65536, 00:24:18.490 "uuid": "00000000-0000-0000-0000-000000000002", 00:24:18.490 "assigned_rate_limits": { 00:24:18.490 "rw_ios_per_sec": 0, 00:24:18.490 "rw_mbytes_per_sec": 0, 00:24:18.490 "r_mbytes_per_sec": 0, 00:24:18.490 "w_mbytes_per_sec": 0 00:24:18.490 }, 00:24:18.490 "claimed": true, 00:24:18.490 "claim_type": "exclusive_write", 00:24:18.490 "zoned": false, 00:24:18.490 "supported_io_types": { 00:24:18.490 "read": true, 00:24:18.490 "write": true, 00:24:18.490 "unmap": true, 00:24:18.490 "flush": true, 00:24:18.490 "reset": true, 00:24:18.490 "nvme_admin": false, 00:24:18.490 "nvme_io": false, 00:24:18.490 "nvme_io_md": false, 00:24:18.490 "write_zeroes": true, 00:24:18.490 "zcopy": true, 00:24:18.490 "get_zone_info": false, 00:24:18.490 "zone_management": false, 00:24:18.490 "zone_append": false, 00:24:18.490 "compare": false, 00:24:18.490 "compare_and_write": false, 00:24:18.490 "abort": true, 00:24:18.490 "seek_hole": false, 00:24:18.490 "seek_data": false, 00:24:18.490 "copy": true, 00:24:18.490 "nvme_iov_md": false 00:24:18.490 }, 00:24:18.490 "memory_domains": [ 00:24:18.490 { 00:24:18.490 "dma_device_id": "system", 00:24:18.490 "dma_device_type": 1 00:24:18.490 }, 00:24:18.490 { 00:24:18.490 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:18.490 "dma_device_type": 2 00:24:18.490 } 00:24:18.490 ], 00:24:18.490 "driver_specific": { 00:24:18.490 "passthru": { 00:24:18.490 "name": "pt2", 00:24:18.490 "base_bdev_name": "malloc2" 00:24:18.490 } 00:24:18.490 } 00:24:18.490 }' 00:24:18.490 05:52:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:24:18.490 05:52:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:24:18.775 05:52:33 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:24:18.775 05:52:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:24:18.775 05:52:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:24:18.775 05:52:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:24:18.775 05:52:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:24:18.775 05:52:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:24:18.775 05:52:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:24:18.775 05:52:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:24:18.775 05:52:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:24:18.775 05:52:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:24:18.775 05:52:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:24:18.775 05:52:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:24:18.775 05:52:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:24:19.032 05:52:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:24:19.032 "name": "pt3", 00:24:19.032 "aliases": [ 00:24:19.032 "00000000-0000-0000-0000-000000000003" 00:24:19.032 ], 00:24:19.032 "product_name": "passthru", 00:24:19.032 "block_size": 512, 00:24:19.032 "num_blocks": 65536, 00:24:19.032 "uuid": "00000000-0000-0000-0000-000000000003", 00:24:19.032 "assigned_rate_limits": { 00:24:19.032 "rw_ios_per_sec": 0, 00:24:19.032 "rw_mbytes_per_sec": 0, 00:24:19.032 "r_mbytes_per_sec": 0, 00:24:19.032 "w_mbytes_per_sec": 0 00:24:19.032 }, 00:24:19.032 "claimed": true, 00:24:19.032 
"claim_type": "exclusive_write", 00:24:19.032 "zoned": false, 00:24:19.032 "supported_io_types": { 00:24:19.032 "read": true, 00:24:19.032 "write": true, 00:24:19.032 "unmap": true, 00:24:19.032 "flush": true, 00:24:19.032 "reset": true, 00:24:19.032 "nvme_admin": false, 00:24:19.032 "nvme_io": false, 00:24:19.032 "nvme_io_md": false, 00:24:19.032 "write_zeroes": true, 00:24:19.032 "zcopy": true, 00:24:19.032 "get_zone_info": false, 00:24:19.032 "zone_management": false, 00:24:19.032 "zone_append": false, 00:24:19.032 "compare": false, 00:24:19.032 "compare_and_write": false, 00:24:19.032 "abort": true, 00:24:19.032 "seek_hole": false, 00:24:19.032 "seek_data": false, 00:24:19.032 "copy": true, 00:24:19.032 "nvme_iov_md": false 00:24:19.032 }, 00:24:19.032 "memory_domains": [ 00:24:19.032 { 00:24:19.032 "dma_device_id": "system", 00:24:19.032 "dma_device_type": 1 00:24:19.032 }, 00:24:19.032 { 00:24:19.032 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:19.032 "dma_device_type": 2 00:24:19.032 } 00:24:19.032 ], 00:24:19.032 "driver_specific": { 00:24:19.032 "passthru": { 00:24:19.032 "name": "pt3", 00:24:19.032 "base_bdev_name": "malloc3" 00:24:19.032 } 00:24:19.032 } 00:24:19.032 }' 00:24:19.032 05:52:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:24:19.290 05:52:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:24:19.290 05:52:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:24:19.290 05:52:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:24:19.290 05:52:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:24:19.290 05:52:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:24:19.290 05:52:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:24:19.290 05:52:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 
00:24:19.290 05:52:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:24:19.290 05:52:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:24:19.548 05:52:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:24:19.548 05:52:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:24:19.548 05:52:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:24:19.548 05:52:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:24:19.548 05:52:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:24:19.805 05:52:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:24:19.805 "name": "pt4", 00:24:19.805 "aliases": [ 00:24:19.806 "00000000-0000-0000-0000-000000000004" 00:24:19.806 ], 00:24:19.806 "product_name": "passthru", 00:24:19.806 "block_size": 512, 00:24:19.806 "num_blocks": 65536, 00:24:19.806 "uuid": "00000000-0000-0000-0000-000000000004", 00:24:19.806 "assigned_rate_limits": { 00:24:19.806 "rw_ios_per_sec": 0, 00:24:19.806 "rw_mbytes_per_sec": 0, 00:24:19.806 "r_mbytes_per_sec": 0, 00:24:19.806 "w_mbytes_per_sec": 0 00:24:19.806 }, 00:24:19.806 "claimed": true, 00:24:19.806 "claim_type": "exclusive_write", 00:24:19.806 "zoned": false, 00:24:19.806 "supported_io_types": { 00:24:19.806 "read": true, 00:24:19.806 "write": true, 00:24:19.806 "unmap": true, 00:24:19.806 "flush": true, 00:24:19.806 "reset": true, 00:24:19.806 "nvme_admin": false, 00:24:19.806 "nvme_io": false, 00:24:19.806 "nvme_io_md": false, 00:24:19.806 "write_zeroes": true, 00:24:19.806 "zcopy": true, 00:24:19.806 "get_zone_info": false, 00:24:19.806 "zone_management": false, 00:24:19.806 "zone_append": false, 00:24:19.806 "compare": false, 00:24:19.806 
"compare_and_write": false, 00:24:19.806 "abort": true, 00:24:19.806 "seek_hole": false, 00:24:19.806 "seek_data": false, 00:24:19.806 "copy": true, 00:24:19.806 "nvme_iov_md": false 00:24:19.806 }, 00:24:19.806 "memory_domains": [ 00:24:19.806 { 00:24:19.806 "dma_device_id": "system", 00:24:19.806 "dma_device_type": 1 00:24:19.806 }, 00:24:19.806 { 00:24:19.806 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:24:19.806 "dma_device_type": 2 00:24:19.806 } 00:24:19.806 ], 00:24:19.806 "driver_specific": { 00:24:19.806 "passthru": { 00:24:19.806 "name": "pt4", 00:24:19.806 "base_bdev_name": "malloc4" 00:24:19.806 } 00:24:19.806 } 00:24:19.806 }' 00:24:19.806 05:52:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:24:19.806 05:52:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:24:19.806 05:52:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:24:19.806 05:52:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:24:19.806 05:52:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:24:19.806 05:52:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:24:19.806 05:52:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:24:20.063 05:52:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:24:20.063 05:52:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:24:20.063 05:52:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:24:20.063 05:52:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:24:20.063 05:52:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:24:20.063 05:52:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:24:20.063 05:52:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:24:20.321 [2024-07-26 05:52:35.090870] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:24:20.321 05:52:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' 129e1f87-ddb0-400d-b52b-d18c7dcbfd1f '!=' 129e1f87-ddb0-400d-b52b-d18c7dcbfd1f ']' 00:24:20.321 05:52:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy raid1 00:24:20.321 05:52:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:24:20.321 05:52:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@214 -- # return 0 00:24:20.321 05:52:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@492 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:24:20.580 [2024-07-26 05:52:35.339255] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:24:20.580 05:52:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@495 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:24:20.580 05:52:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:20.580 05:52:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:20.580 05:52:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:20.580 05:52:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:20.580 05:52:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:24:20.580 05:52:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:20.580 05:52:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:20.580 05:52:35 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:20.580 05:52:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:20.580 05:52:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:20.580 05:52:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:20.838 05:52:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:20.838 "name": "raid_bdev1", 00:24:20.838 "uuid": "129e1f87-ddb0-400d-b52b-d18c7dcbfd1f", 00:24:20.838 "strip_size_kb": 0, 00:24:20.838 "state": "online", 00:24:20.838 "raid_level": "raid1", 00:24:20.838 "superblock": true, 00:24:20.838 "num_base_bdevs": 4, 00:24:20.838 "num_base_bdevs_discovered": 3, 00:24:20.838 "num_base_bdevs_operational": 3, 00:24:20.838 "base_bdevs_list": [ 00:24:20.838 { 00:24:20.838 "name": null, 00:24:20.838 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:20.838 "is_configured": false, 00:24:20.838 "data_offset": 2048, 00:24:20.838 "data_size": 63488 00:24:20.838 }, 00:24:20.838 { 00:24:20.838 "name": "pt2", 00:24:20.838 "uuid": "00000000-0000-0000-0000-000000000002", 00:24:20.838 "is_configured": true, 00:24:20.838 "data_offset": 2048, 00:24:20.838 "data_size": 63488 00:24:20.838 }, 00:24:20.838 { 00:24:20.838 "name": "pt3", 00:24:20.838 "uuid": "00000000-0000-0000-0000-000000000003", 00:24:20.838 "is_configured": true, 00:24:20.838 "data_offset": 2048, 00:24:20.838 "data_size": 63488 00:24:20.838 }, 00:24:20.838 { 00:24:20.838 "name": "pt4", 00:24:20.838 "uuid": "00000000-0000-0000-0000-000000000004", 00:24:20.838 "is_configured": true, 00:24:20.838 "data_offset": 2048, 00:24:20.838 "data_size": 63488 00:24:20.838 } 00:24:20.838 ] 00:24:20.838 }' 00:24:20.838 05:52:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:20.838 
05:52:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:24:21.404 05:52:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@498 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:24:21.662 [2024-07-26 05:52:36.430134] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:24:21.662 [2024-07-26 05:52:36.430157] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:24:21.662 [2024-07-26 05:52:36.430216] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:24:21.662 [2024-07-26 05:52:36.430279] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:24:21.662 [2024-07-26 05:52:36.430290] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2386780 name raid_bdev1, state offline 00:24:21.662 05:52:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:21.662 05:52:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # jq -r '.[]' 00:24:21.920 05:52:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # raid_bdev= 00:24:21.920 05:52:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@500 -- # '[' -n '' ']' 00:24:21.920 05:52:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i = 1 )) 00:24:21.920 05:52:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:24:21.920 05:52:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:24:22.178 05:52:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:24:22.178 05:52:36 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:24:22.178 05:52:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:24:22.437 05:52:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:24:22.437 05:52:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:24:22.437 05:52:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4 00:24:22.695 05:52:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:24:22.695 05:52:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:24:22.695 05:52:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i = 1 )) 00:24:22.695 05:52:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:24:22.695 05:52:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@511 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:24:22.953 [2024-07-26 05:52:37.653287] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:24:22.953 [2024-07-26 05:52:37.653334] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:22.953 [2024-07-26 05:52:37.653353] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2529700 00:24:22.953 [2024-07-26 05:52:37.653366] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:22.953 [2024-07-26 05:52:37.654957] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:22.953 [2024-07-26 05:52:37.654984] vbdev_passthru.c: 710:vbdev_passthru_register: 
*NOTICE*: created pt_bdev for: pt2 00:24:22.953 [2024-07-26 05:52:37.655049] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:24:22.953 [2024-07-26 05:52:37.655075] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:24:22.953 pt2 00:24:22.953 05:52:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@514 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 3 00:24:22.953 05:52:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:22.953 05:52:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:24:22.953 05:52:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:22.953 05:52:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:22.953 05:52:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:24:22.953 05:52:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:22.953 05:52:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:22.953 05:52:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:22.953 05:52:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:22.953 05:52:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:22.953 05:52:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:23.212 05:52:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:23.212 "name": "raid_bdev1", 00:24:23.212 "uuid": "129e1f87-ddb0-400d-b52b-d18c7dcbfd1f", 00:24:23.212 "strip_size_kb": 0, 00:24:23.212 "state": "configuring", 
00:24:23.212 "raid_level": "raid1", 00:24:23.212 "superblock": true, 00:24:23.212 "num_base_bdevs": 4, 00:24:23.212 "num_base_bdevs_discovered": 1, 00:24:23.212 "num_base_bdevs_operational": 3, 00:24:23.212 "base_bdevs_list": [ 00:24:23.212 { 00:24:23.212 "name": null, 00:24:23.212 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:23.212 "is_configured": false, 00:24:23.212 "data_offset": 2048, 00:24:23.212 "data_size": 63488 00:24:23.212 }, 00:24:23.212 { 00:24:23.212 "name": "pt2", 00:24:23.212 "uuid": "00000000-0000-0000-0000-000000000002", 00:24:23.212 "is_configured": true, 00:24:23.212 "data_offset": 2048, 00:24:23.212 "data_size": 63488 00:24:23.212 }, 00:24:23.212 { 00:24:23.212 "name": null, 00:24:23.212 "uuid": "00000000-0000-0000-0000-000000000003", 00:24:23.212 "is_configured": false, 00:24:23.212 "data_offset": 2048, 00:24:23.212 "data_size": 63488 00:24:23.212 }, 00:24:23.212 { 00:24:23.212 "name": null, 00:24:23.212 "uuid": "00000000-0000-0000-0000-000000000004", 00:24:23.212 "is_configured": false, 00:24:23.212 "data_offset": 2048, 00:24:23.212 "data_size": 63488 00:24:23.212 } 00:24:23.212 ] 00:24:23.212 }' 00:24:23.212 05:52:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:23.212 05:52:37 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:24:23.778 05:52:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i++ )) 00:24:23.778 05:52:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:24:23.779 05:52:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@511 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:24:24.036 [2024-07-26 05:52:38.724136] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:24:24.036 [2024-07-26 05:52:38.724178] vbdev_passthru.c: 
635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:24.036 [2024-07-26 05:52:38.724198] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x238fa10 00:24:24.036 [2024-07-26 05:52:38.724216] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:24.036 [2024-07-26 05:52:38.724543] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:24.036 [2024-07-26 05:52:38.724559] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:24:24.036 [2024-07-26 05:52:38.724616] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:24:24.036 [2024-07-26 05:52:38.724633] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:24:24.036 pt3 00:24:24.036 05:52:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@514 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 3 00:24:24.036 05:52:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:24.036 05:52:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:24:24.036 05:52:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:24.036 05:52:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:24.036 05:52:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:24:24.036 05:52:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:24.036 05:52:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:24.036 05:52:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:24.036 05:52:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:24.036 05:52:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:24.036 05:52:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:24.294 05:52:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:24.294 "name": "raid_bdev1", 00:24:24.294 "uuid": "129e1f87-ddb0-400d-b52b-d18c7dcbfd1f", 00:24:24.294 "strip_size_kb": 0, 00:24:24.294 "state": "configuring", 00:24:24.294 "raid_level": "raid1", 00:24:24.294 "superblock": true, 00:24:24.294 "num_base_bdevs": 4, 00:24:24.294 "num_base_bdevs_discovered": 2, 00:24:24.294 "num_base_bdevs_operational": 3, 00:24:24.294 "base_bdevs_list": [ 00:24:24.294 { 00:24:24.294 "name": null, 00:24:24.294 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:24.294 "is_configured": false, 00:24:24.294 "data_offset": 2048, 00:24:24.294 "data_size": 63488 00:24:24.294 }, 00:24:24.294 { 00:24:24.294 "name": "pt2", 00:24:24.294 "uuid": "00000000-0000-0000-0000-000000000002", 00:24:24.294 "is_configured": true, 00:24:24.294 "data_offset": 2048, 00:24:24.294 "data_size": 63488 00:24:24.294 }, 00:24:24.294 { 00:24:24.294 "name": "pt3", 00:24:24.294 "uuid": "00000000-0000-0000-0000-000000000003", 00:24:24.294 "is_configured": true, 00:24:24.294 "data_offset": 2048, 00:24:24.294 "data_size": 63488 00:24:24.294 }, 00:24:24.294 { 00:24:24.294 "name": null, 00:24:24.294 "uuid": "00000000-0000-0000-0000-000000000004", 00:24:24.294 "is_configured": false, 00:24:24.294 "data_offset": 2048, 00:24:24.294 "data_size": 63488 00:24:24.294 } 00:24:24.295 ] 00:24:24.295 }' 00:24:24.295 05:52:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:24.295 05:52:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:24:24.861 05:52:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i++ )) 00:24:24.861 05:52:39 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:24:24.861 05:52:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@518 -- # i=3 00:24:24.861 05:52:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@519 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:24:25.119 [2024-07-26 05:52:39.811153] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:24:25.119 [2024-07-26 05:52:39.811196] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:25.119 [2024-07-26 05:52:39.811213] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2532520 00:24:25.119 [2024-07-26 05:52:39.811226] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:25.119 [2024-07-26 05:52:39.811552] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:25.119 [2024-07-26 05:52:39.811568] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:24:25.119 [2024-07-26 05:52:39.811629] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt4 00:24:25.119 [2024-07-26 05:52:39.811656] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:24:25.119 [2024-07-26 05:52:39.811766] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x2386ea0 00:24:25.119 [2024-07-26 05:52:39.811776] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:24:25.119 [2024-07-26 05:52:39.811945] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x238b600 00:24:25.119 [2024-07-26 05:52:39.812075] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2386ea0 00:24:25.119 [2024-07-26 05:52:39.812085] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 
0x2386ea0 00:24:25.119 [2024-07-26 05:52:39.812177] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:25.120 pt4 00:24:25.120 05:52:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@522 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:24:25.120 05:52:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:25.120 05:52:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:25.120 05:52:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:25.120 05:52:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:25.120 05:52:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:24:25.120 05:52:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:25.120 05:52:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:25.120 05:52:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:25.120 05:52:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:25.120 05:52:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:25.120 05:52:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:25.378 05:52:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:25.378 "name": "raid_bdev1", 00:24:25.378 "uuid": "129e1f87-ddb0-400d-b52b-d18c7dcbfd1f", 00:24:25.378 "strip_size_kb": 0, 00:24:25.378 "state": "online", 00:24:25.378 "raid_level": "raid1", 00:24:25.378 "superblock": true, 00:24:25.378 "num_base_bdevs": 4, 00:24:25.378 "num_base_bdevs_discovered": 3, 00:24:25.378 
"num_base_bdevs_operational": 3, 00:24:25.378 "base_bdevs_list": [ 00:24:25.378 { 00:24:25.378 "name": null, 00:24:25.378 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:25.378 "is_configured": false, 00:24:25.378 "data_offset": 2048, 00:24:25.378 "data_size": 63488 00:24:25.378 }, 00:24:25.378 { 00:24:25.378 "name": "pt2", 00:24:25.378 "uuid": "00000000-0000-0000-0000-000000000002", 00:24:25.378 "is_configured": true, 00:24:25.378 "data_offset": 2048, 00:24:25.378 "data_size": 63488 00:24:25.378 }, 00:24:25.378 { 00:24:25.378 "name": "pt3", 00:24:25.378 "uuid": "00000000-0000-0000-0000-000000000003", 00:24:25.379 "is_configured": true, 00:24:25.379 "data_offset": 2048, 00:24:25.379 "data_size": 63488 00:24:25.379 }, 00:24:25.379 { 00:24:25.379 "name": "pt4", 00:24:25.379 "uuid": "00000000-0000-0000-0000-000000000004", 00:24:25.379 "is_configured": true, 00:24:25.379 "data_offset": 2048, 00:24:25.379 "data_size": 63488 00:24:25.379 } 00:24:25.379 ] 00:24:25.379 }' 00:24:25.379 05:52:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:25.379 05:52:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:24:25.946 05:52:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@525 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:24:26.204 [2024-07-26 05:52:40.894189] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:24:26.204 [2024-07-26 05:52:40.894212] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:24:26.204 [2024-07-26 05:52:40.894261] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:24:26.204 [2024-07-26 05:52:40.894329] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:24:26.204 [2024-07-26 05:52:40.894346] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: 
raid_bdev_cleanup, 0x2386ea0 name raid_bdev1, state offline 00:24:26.204 05:52:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:26.204 05:52:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # jq -r '.[]' 00:24:26.463 05:52:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # raid_bdev= 00:24:26.463 05:52:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@527 -- # '[' -n '' ']' 00:24:26.463 05:52:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@531 -- # '[' 4 -gt 2 ']' 00:24:26.463 05:52:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@533 -- # i=3 00:24:26.463 05:52:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@534 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4 00:24:26.722 05:52:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@539 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:24:26.722 [2024-07-26 05:52:41.559921] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:24:26.722 [2024-07-26 05:52:41.559965] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:26.722 [2024-07-26 05:52:41.559983] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2532520 00:24:26.722 [2024-07-26 05:52:41.559995] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:26.722 [2024-07-26 05:52:41.561592] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:26.722 [2024-07-26 05:52:41.561619] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:24:26.722 [2024-07-26 05:52:41.561693] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid 
superblock found on bdev pt1 00:24:26.722 [2024-07-26 05:52:41.561720] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:24:26.722 [2024-07-26 05:52:41.561823] bdev_raid.c:3639:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:24:26.722 [2024-07-26 05:52:41.561835] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:24:26.722 [2024-07-26 05:52:41.561850] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2386060 name raid_bdev1, state configuring 00:24:26.722 [2024-07-26 05:52:41.561874] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:24:26.722 [2024-07-26 05:52:41.561950] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:24:26.722 pt1 00:24:26.722 05:52:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@541 -- # '[' 4 -gt 2 ']' 00:24:26.722 05:52:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@544 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 3 00:24:26.722 05:52:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:26.722 05:52:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:24:26.722 05:52:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:26.722 05:52:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:26.722 05:52:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:24:26.722 05:52:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:26.722 05:52:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:26.722 05:52:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:26.722 05:52:41 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:26.722 05:52:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:26.722 05:52:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:26.981 05:52:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:26.981 "name": "raid_bdev1", 00:24:26.981 "uuid": "129e1f87-ddb0-400d-b52b-d18c7dcbfd1f", 00:24:26.981 "strip_size_kb": 0, 00:24:26.981 "state": "configuring", 00:24:26.981 "raid_level": "raid1", 00:24:26.981 "superblock": true, 00:24:26.981 "num_base_bdevs": 4, 00:24:26.981 "num_base_bdevs_discovered": 2, 00:24:26.981 "num_base_bdevs_operational": 3, 00:24:26.981 "base_bdevs_list": [ 00:24:26.981 { 00:24:26.981 "name": null, 00:24:26.981 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:26.981 "is_configured": false, 00:24:26.981 "data_offset": 2048, 00:24:26.981 "data_size": 63488 00:24:26.981 }, 00:24:26.981 { 00:24:26.981 "name": "pt2", 00:24:26.981 "uuid": "00000000-0000-0000-0000-000000000002", 00:24:26.981 "is_configured": true, 00:24:26.981 "data_offset": 2048, 00:24:26.981 "data_size": 63488 00:24:26.981 }, 00:24:26.981 { 00:24:26.981 "name": "pt3", 00:24:26.981 "uuid": "00000000-0000-0000-0000-000000000003", 00:24:26.981 "is_configured": true, 00:24:26.981 "data_offset": 2048, 00:24:26.981 "data_size": 63488 00:24:26.981 }, 00:24:26.981 { 00:24:26.981 "name": null, 00:24:26.981 "uuid": "00000000-0000-0000-0000-000000000004", 00:24:26.981 "is_configured": false, 00:24:26.981 "data_offset": 2048, 00:24:26.981 "data_size": 63488 00:24:26.981 } 00:24:26.981 ] 00:24:26.981 }' 00:24:26.981 05:52:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:26.981 05:52:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set 
+x 00:24:27.549 05:52:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@545 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs configuring 00:24:27.549 05:52:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@545 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:24:27.808 05:52:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@545 -- # [[ false == \f\a\l\s\e ]] 00:24:27.808 05:52:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@548 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:24:28.067 [2024-07-26 05:52:42.915495] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:24:28.067 [2024-07-26 05:52:42.915541] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:28.067 [2024-07-26 05:52:42.915559] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2386310 00:24:28.067 [2024-07-26 05:52:42.915572] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:28.067 [2024-07-26 05:52:42.915920] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:28.067 [2024-07-26 05:52:42.915938] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:24:28.067 [2024-07-26 05:52:42.916000] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt4 00:24:28.067 [2024-07-26 05:52:42.916019] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:24:28.067 [2024-07-26 05:52:42.916129] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x2389b40 00:24:28.067 [2024-07-26 05:52:42.916139] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:24:28.067 [2024-07-26 05:52:42.916309] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: 
raid_bdev_create_cb, 0x2529990 00:24:28.067 [2024-07-26 05:52:42.916436] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2389b40 00:24:28.067 [2024-07-26 05:52:42.916446] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2389b40 00:24:28.067 [2024-07-26 05:52:42.916540] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:28.067 pt4 00:24:28.067 05:52:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@553 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:24:28.067 05:52:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:28.067 05:52:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:28.067 05:52:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:28.067 05:52:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:28.067 05:52:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:24:28.067 05:52:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:28.067 05:52:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:28.067 05:52:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:28.067 05:52:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:28.067 05:52:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:28.067 05:52:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:28.326 05:52:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:28.326 "name": "raid_bdev1", 
00:24:28.326 "uuid": "129e1f87-ddb0-400d-b52b-d18c7dcbfd1f", 00:24:28.326 "strip_size_kb": 0, 00:24:28.326 "state": "online", 00:24:28.326 "raid_level": "raid1", 00:24:28.326 "superblock": true, 00:24:28.326 "num_base_bdevs": 4, 00:24:28.326 "num_base_bdevs_discovered": 3, 00:24:28.326 "num_base_bdevs_operational": 3, 00:24:28.326 "base_bdevs_list": [ 00:24:28.326 { 00:24:28.326 "name": null, 00:24:28.326 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:28.326 "is_configured": false, 00:24:28.326 "data_offset": 2048, 00:24:28.326 "data_size": 63488 00:24:28.326 }, 00:24:28.326 { 00:24:28.326 "name": "pt2", 00:24:28.326 "uuid": "00000000-0000-0000-0000-000000000002", 00:24:28.326 "is_configured": true, 00:24:28.326 "data_offset": 2048, 00:24:28.326 "data_size": 63488 00:24:28.326 }, 00:24:28.326 { 00:24:28.326 "name": "pt3", 00:24:28.326 "uuid": "00000000-0000-0000-0000-000000000003", 00:24:28.326 "is_configured": true, 00:24:28.326 "data_offset": 2048, 00:24:28.326 "data_size": 63488 00:24:28.326 }, 00:24:28.326 { 00:24:28.326 "name": "pt4", 00:24:28.326 "uuid": "00000000-0000-0000-0000-000000000004", 00:24:28.326 "is_configured": true, 00:24:28.326 "data_offset": 2048, 00:24:28.326 "data_size": 63488 00:24:28.326 } 00:24:28.326 ] 00:24:28.326 }' 00:24:28.326 05:52:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:28.326 05:52:43 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:24:28.893 05:52:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:24:28.893 05:52:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:24:29.152 05:52:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # [[ false == \f\a\l\s\e ]] 00:24:29.152 05:52:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:24:29.152 05:52:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # jq -r '.[] | .uuid' 00:24:29.410 [2024-07-26 05:52:44.255324] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:24:29.410 05:52:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # '[' 129e1f87-ddb0-400d-b52b-d18c7dcbfd1f '!=' 129e1f87-ddb0-400d-b52b-d18c7dcbfd1f ']' 00:24:29.410 05:52:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 1227689 00:24:29.410 05:52:44 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 1227689 ']' 00:24:29.410 05:52:44 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 1227689 00:24:29.410 05:52:44 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:24:29.410 05:52:44 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:24:29.410 05:52:44 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1227689 00:24:29.668 05:52:44 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:24:29.668 05:52:44 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:24:29.668 05:52:44 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1227689' 00:24:29.668 killing process with pid 1227689 00:24:29.668 05:52:44 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 1227689 00:24:29.668 [2024-07-26 05:52:44.326223] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:24:29.668 [2024-07-26 05:52:44.326282] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:24:29.668 [2024-07-26 05:52:44.326353] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: 
raid bdev base bdevs is 0, going to free all in destruct 00:24:29.668 [2024-07-26 05:52:44.326373] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2389b40 name raid_bdev1, state offline 00:24:29.668 05:52:44 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 1227689 00:24:29.668 [2024-07-26 05:52:44.368572] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:24:29.926 05:52:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:24:29.926 00:24:29.926 real 0m25.334s 00:24:29.926 user 0m46.402s 00:24:29.926 sys 0m4.502s 00:24:29.926 05:52:44 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:24:29.926 05:52:44 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:24:29.926 ************************************ 00:24:29.926 END TEST raid_superblock_test 00:24:29.926 ************************************ 00:24:29.926 05:52:44 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:24:29.926 05:52:44 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test raid1 4 read 00:24:29.926 05:52:44 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:24:29.926 05:52:44 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:24:29.926 05:52:44 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:24:29.926 ************************************ 00:24:29.926 START TEST raid_read_error_test 00:24:29.926 ************************************ 00:24:29.926 05:52:44 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid1 4 read 00:24:29.926 05:52:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid1 00:24:29.926 05:52:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=4 00:24:29.926 05:52:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:24:29.926 
05:52:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:24:29.926 05:52:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:24:29.926 05:52:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:24:29.926 05:52:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:24:29.926 05:52:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:24:29.926 05:52:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:24:29.926 05:52:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:24:29.926 05:52:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:24:29.926 05:52:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:24:29.926 05:52:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:24:29.926 05:52:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:24:29.926 05:52:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev4 00:24:29.926 05:52:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:24:29.926 05:52:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:24:29.926 05:52:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:24:29.926 05:52:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:24:29.926 05:52:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:24:29.926 05:52:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:24:29.926 05:52:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:24:29.926 05:52:44 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:24:29.926 05:52:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:24:29.926 05:52:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid1 '!=' raid1 ']' 00:24:29.926 05:52:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@802 -- # strip_size=0 00:24:29.926 05:52:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:24:29.926 05:52:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.2Xz4vrRTy1 00:24:29.926 05:52:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=1231506 00:24:29.926 05:52:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 1231506 /var/tmp/spdk-raid.sock 00:24:29.926 05:52:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:24:29.927 05:52:44 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 1231506 ']' 00:24:29.927 05:52:44 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:24:29.927 05:52:44 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:24:29.927 05:52:44 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:24:29.927 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
00:24:29.927 05:52:44 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:24:29.927 05:52:44 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:24:29.927 [2024-07-26 05:52:44.759492] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 00:24:29.927 [2024-07-26 05:52:44.759562] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1231506 ] 00:24:30.185 [2024-07-26 05:52:44.891396] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:24:30.185 [2024-07-26 05:52:44.988059] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:24:30.185 [2024-07-26 05:52:45.046451] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:24:30.185 [2024-07-26 05:52:45.046490] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:24:31.119 05:52:45 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:24:31.119 05:52:45 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:24:31.119 05:52:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:24:31.119 05:52:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:24:31.119 BaseBdev1_malloc 00:24:31.119 05:52:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:24:31.377 true 00:24:31.377 05:52:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
-s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:24:31.636 [2024-07-26 05:52:46.422674] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:24:31.636 [2024-07-26 05:52:46.422719] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:31.636 [2024-07-26 05:52:46.422739] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x23b20d0 00:24:31.636 [2024-07-26 05:52:46.422751] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:31.636 [2024-07-26 05:52:46.424540] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:31.636 [2024-07-26 05:52:46.424569] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:24:31.636 BaseBdev1 00:24:31.636 05:52:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:24:31.636 05:52:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:24:31.895 BaseBdev2_malloc 00:24:31.895 05:52:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:24:32.153 true 00:24:32.153 05:52:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:24:32.411 [2024-07-26 05:52:47.165313] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:24:32.411 [2024-07-26 05:52:47.165358] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:32.411 [2024-07-26 05:52:47.165379] vbdev_passthru.c: 681:vbdev_passthru_register: 
*NOTICE*: io_device created at: 0x0x23b6910 00:24:32.411 [2024-07-26 05:52:47.165391] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:32.412 [2024-07-26 05:52:47.166974] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:32.412 [2024-07-26 05:52:47.167003] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:24:32.412 BaseBdev2 00:24:32.412 05:52:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:24:32.412 05:52:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:24:32.670 BaseBdev3_malloc 00:24:32.670 05:52:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:24:32.947 true 00:24:32.947 05:52:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:24:32.947 [2024-07-26 05:52:47.839761] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:24:32.947 [2024-07-26 05:52:47.839805] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:32.947 [2024-07-26 05:52:47.839824] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x23b8bd0 00:24:32.947 [2024-07-26 05:52:47.839836] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:32.947 [2024-07-26 05:52:47.841281] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:32.947 [2024-07-26 05:52:47.841308] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:24:32.948 
BaseBdev3 00:24:33.222 05:52:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:24:33.222 05:52:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:24:33.222 BaseBdev4_malloc 00:24:33.222 05:52:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:24:33.479 true 00:24:33.479 05:52:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:24:33.736 [2024-07-26 05:52:48.582295] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:24:33.736 [2024-07-26 05:52:48.582338] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:33.736 [2024-07-26 05:52:48.582358] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x23b9aa0 00:24:33.736 [2024-07-26 05:52:48.582370] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:33.736 [2024-07-26 05:52:48.583874] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:33.736 [2024-07-26 05:52:48.583902] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:24:33.736 BaseBdev4 00:24:33.736 05:52:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 00:24:33.993 [2024-07-26 05:52:48.830977] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:24:33.993 [2024-07-26 
05:52:48.832128] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:24:33.993 [2024-07-26 05:52:48.832193] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:24:33.993 [2024-07-26 05:52:48.832260] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:24:33.993 [2024-07-26 05:52:48.832487] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x23b3c20 00:24:33.993 [2024-07-26 05:52:48.832498] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:24:33.993 [2024-07-26 05:52:48.832684] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2208260 00:24:33.993 [2024-07-26 05:52:48.832834] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x23b3c20 00:24:33.993 [2024-07-26 05:52:48.832844] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x23b3c20 00:24:33.993 [2024-07-26 05:52:48.832942] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:33.994 05:52:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:24:33.994 05:52:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:33.994 05:52:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:33.994 05:52:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:33.994 05:52:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:33.994 05:52:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:24:33.994 05:52:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:33.994 05:52:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 
00:24:33.994 05:52:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:33.994 05:52:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:33.994 05:52:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:33.994 05:52:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:34.251 05:52:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:34.251 "name": "raid_bdev1", 00:24:34.251 "uuid": "0352c71d-acbd-41ed-81c5-e5dbfb8d3b3a", 00:24:34.251 "strip_size_kb": 0, 00:24:34.251 "state": "online", 00:24:34.251 "raid_level": "raid1", 00:24:34.251 "superblock": true, 00:24:34.251 "num_base_bdevs": 4, 00:24:34.251 "num_base_bdevs_discovered": 4, 00:24:34.251 "num_base_bdevs_operational": 4, 00:24:34.251 "base_bdevs_list": [ 00:24:34.252 { 00:24:34.252 "name": "BaseBdev1", 00:24:34.252 "uuid": "27881945-a70b-578c-9b2f-8e2d58bfc718", 00:24:34.252 "is_configured": true, 00:24:34.252 "data_offset": 2048, 00:24:34.252 "data_size": 63488 00:24:34.252 }, 00:24:34.252 { 00:24:34.252 "name": "BaseBdev2", 00:24:34.252 "uuid": "d950600d-3579-5ed9-b867-1d59ff2be99e", 00:24:34.252 "is_configured": true, 00:24:34.252 "data_offset": 2048, 00:24:34.252 "data_size": 63488 00:24:34.252 }, 00:24:34.252 { 00:24:34.252 "name": "BaseBdev3", 00:24:34.252 "uuid": "e0d917e8-5e00-5fb5-b843-174e1f9b3956", 00:24:34.252 "is_configured": true, 00:24:34.252 "data_offset": 2048, 00:24:34.252 "data_size": 63488 00:24:34.252 }, 00:24:34.252 { 00:24:34.252 "name": "BaseBdev4", 00:24:34.252 "uuid": "cdcff995-8882-5bcd-a526-6ef7bb4f871c", 00:24:34.252 "is_configured": true, 00:24:34.252 "data_offset": 2048, 00:24:34.252 "data_size": 63488 00:24:34.252 } 00:24:34.252 ] 00:24:34.252 }' 00:24:34.252 05:52:49 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:34.252 05:52:49 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:24:34.817 05:52:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:24:34.817 05:52:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:24:35.074 [2024-07-26 05:52:49.785793] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2207c60 00:24:36.007 05:52:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:24:36.265 05:52:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:24:36.265 05:52:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid1 = \r\a\i\d\1 ]] 00:24:36.265 05:52:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ read = \w\r\i\t\e ]] 00:24:36.265 05:52:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=4 00:24:36.265 05:52:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:24:36.265 05:52:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:36.265 05:52:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:36.265 05:52:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:36.265 05:52:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:36.265 05:52:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:24:36.265 05:52:50 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:36.265 05:52:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:36.265 05:52:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:36.265 05:52:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:36.265 05:52:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:36.265 05:52:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:36.523 05:52:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:36.523 "name": "raid_bdev1", 00:24:36.523 "uuid": "0352c71d-acbd-41ed-81c5-e5dbfb8d3b3a", 00:24:36.523 "strip_size_kb": 0, 00:24:36.523 "state": "online", 00:24:36.523 "raid_level": "raid1", 00:24:36.523 "superblock": true, 00:24:36.523 "num_base_bdevs": 4, 00:24:36.523 "num_base_bdevs_discovered": 4, 00:24:36.523 "num_base_bdevs_operational": 4, 00:24:36.523 "base_bdevs_list": [ 00:24:36.523 { 00:24:36.523 "name": "BaseBdev1", 00:24:36.523 "uuid": "27881945-a70b-578c-9b2f-8e2d58bfc718", 00:24:36.523 "is_configured": true, 00:24:36.523 "data_offset": 2048, 00:24:36.523 "data_size": 63488 00:24:36.523 }, 00:24:36.523 { 00:24:36.523 "name": "BaseBdev2", 00:24:36.523 "uuid": "d950600d-3579-5ed9-b867-1d59ff2be99e", 00:24:36.523 "is_configured": true, 00:24:36.523 "data_offset": 2048, 00:24:36.523 "data_size": 63488 00:24:36.523 }, 00:24:36.523 { 00:24:36.523 "name": "BaseBdev3", 00:24:36.523 "uuid": "e0d917e8-5e00-5fb5-b843-174e1f9b3956", 00:24:36.523 "is_configured": true, 00:24:36.523 "data_offset": 2048, 00:24:36.523 "data_size": 63488 00:24:36.523 }, 00:24:36.523 { 00:24:36.523 "name": "BaseBdev4", 00:24:36.523 "uuid": "cdcff995-8882-5bcd-a526-6ef7bb4f871c", 00:24:36.523 "is_configured": 
true, 00:24:36.523 "data_offset": 2048, 00:24:36.523 "data_size": 63488 00:24:36.523 } 00:24:36.523 ] 00:24:36.523 }' 00:24:36.523 05:52:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:36.523 05:52:51 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:24:37.088 05:52:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:24:37.347 [2024-07-26 05:52:51.998432] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:24:37.347 [2024-07-26 05:52:51.998466] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:24:37.347 [2024-07-26 05:52:52.001631] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:24:37.347 [2024-07-26 05:52:52.001676] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:37.347 [2024-07-26 05:52:52.001795] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:24:37.347 [2024-07-26 05:52:52.001807] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x23b3c20 name raid_bdev1, state offline 00:24:37.347 0 00:24:37.347 05:52:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 1231506 00:24:37.347 05:52:52 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 1231506 ']' 00:24:37.347 05:52:52 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 1231506 00:24:37.347 05:52:52 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:24:37.347 05:52:52 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:24:37.347 05:52:52 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1231506 00:24:37.347 05:52:52 
bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:24:37.347 05:52:52 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:24:37.347 05:52:52 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1231506' 00:24:37.347 killing process with pid 1231506 00:24:37.347 05:52:52 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 1231506 00:24:37.347 [2024-07-26 05:52:52.065079] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:24:37.347 05:52:52 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 1231506 00:24:37.347 [2024-07-26 05:52:52.096968] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:24:37.605 05:52:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.2Xz4vrRTy1 00:24:37.605 05:52:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:24:37.605 05:52:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:24:37.605 05:52:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.00 00:24:37.605 05:52:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid1 00:24:37.605 05:52:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:24:37.605 05:52:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:24:37.605 05:52:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@845 -- # [[ 0.00 = \0\.\0\0 ]] 00:24:37.605 00:24:37.605 real 0m7.660s 00:24:37.605 user 0m12.285s 00:24:37.605 sys 0m1.364s 00:24:37.605 05:52:52 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:24:37.605 05:52:52 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:24:37.605 ************************************ 00:24:37.605 END TEST 
raid_read_error_test 00:24:37.605 ************************************ 00:24:37.605 05:52:52 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:24:37.605 05:52:52 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test raid1 4 write 00:24:37.605 05:52:52 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:24:37.605 05:52:52 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:24:37.605 05:52:52 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:24:37.605 ************************************ 00:24:37.605 START TEST raid_write_error_test 00:24:37.605 ************************************ 00:24:37.605 05:52:52 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid1 4 write 00:24:37.605 05:52:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid1 00:24:37.605 05:52:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=4 00:24:37.605 05:52:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:24:37.605 05:52:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:24:37.605 05:52:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:24:37.605 05:52:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:24:37.605 05:52:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:24:37.605 05:52:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:24:37.605 05:52:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:24:37.605 05:52:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:24:37.605 05:52:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:24:37.605 05:52:52 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:24:37.605 05:52:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:24:37.605 05:52:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:24:37.605 05:52:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev4 00:24:37.605 05:52:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:24:37.605 05:52:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:24:37.605 05:52:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:24:37.605 05:52:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:24:37.606 05:52:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:24:37.606 05:52:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:24:37.606 05:52:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:24:37.606 05:52:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:24:37.606 05:52:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:24:37.606 05:52:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid1 '!=' raid1 ']' 00:24:37.606 05:52:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@802 -- # strip_size=0 00:24:37.606 05:52:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:24:37.606 05:52:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.mzMBNdfJoK 00:24:37.606 05:52:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 
00:24:37.606 05:52:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=1232540 00:24:37.606 05:52:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 1232540 /var/tmp/spdk-raid.sock 00:24:37.606 05:52:52 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 1232540 ']' 00:24:37.606 05:52:52 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:24:37.606 05:52:52 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:24:37.606 05:52:52 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:24:37.606 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:24:37.606 05:52:52 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:24:37.606 05:52:52 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:24:37.606 [2024-07-26 05:52:52.488196] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 
00:24:37.606 [2024-07-26 05:52:52.488264] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1232540 ] 00:24:37.862 [2024-07-26 05:52:52.616657] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:24:37.862 [2024-07-26 05:52:52.719932] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:24:38.119 [2024-07-26 05:52:52.781088] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:24:38.119 [2024-07-26 05:52:52.781118] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:24:38.682 05:52:53 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:24:38.682 05:52:53 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:24:38.682 05:52:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:24:38.682 05:52:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:24:38.939 BaseBdev1_malloc 00:24:38.939 05:52:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:24:38.939 true 00:24:39.196 05:52:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:24:39.196 [2024-07-26 05:52:54.079744] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:24:39.196 [2024-07-26 05:52:54.079793] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base 
bdev opened 00:24:39.196 [2024-07-26 05:52:54.079814] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1e0d0d0 00:24:39.196 [2024-07-26 05:52:54.079827] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:39.196 [2024-07-26 05:52:54.081681] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:39.196 [2024-07-26 05:52:54.081710] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:24:39.196 BaseBdev1 00:24:39.196 05:52:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:24:39.196 05:52:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:24:39.452 BaseBdev2_malloc 00:24:39.452 05:52:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:24:39.710 true 00:24:39.710 05:52:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:24:39.967 [2024-07-26 05:52:54.806216] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:24:39.967 [2024-07-26 05:52:54.806259] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:39.967 [2024-07-26 05:52:54.806281] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1e11910 00:24:39.968 [2024-07-26 05:52:54.806293] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:39.968 [2024-07-26 05:52:54.807864] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:39.968 [2024-07-26 05:52:54.807892] 
vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:24:39.968 BaseBdev2 00:24:39.968 05:52:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:24:39.968 05:52:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:24:40.225 BaseBdev3_malloc 00:24:40.225 05:52:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:24:40.483 true 00:24:40.483 05:52:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:24:40.740 [2024-07-26 05:52:55.546147] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:24:40.740 [2024-07-26 05:52:55.546193] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:40.740 [2024-07-26 05:52:55.546214] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1e13bd0 00:24:40.740 [2024-07-26 05:52:55.546226] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:40.740 [2024-07-26 05:52:55.547819] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:40.740 [2024-07-26 05:52:55.547848] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:24:40.740 BaseBdev3 00:24:40.740 05:52:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:24:40.740 05:52:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:24:40.997 BaseBdev4_malloc 00:24:40.998 05:52:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:24:41.254 true 00:24:41.254 05:52:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:24:41.512 [2024-07-26 05:52:56.285734] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:24:41.512 [2024-07-26 05:52:56.285778] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:41.512 [2024-07-26 05:52:56.285799] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1e14aa0 00:24:41.512 [2024-07-26 05:52:56.285811] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:41.512 [2024-07-26 05:52:56.287365] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:41.512 [2024-07-26 05:52:56.287392] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:24:41.512 BaseBdev4 00:24:41.512 05:52:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 00:24:41.770 [2024-07-26 05:52:56.530414] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:24:41.770 [2024-07-26 05:52:56.531755] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:24:41.770 [2024-07-26 05:52:56.531824] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:24:41.770 [2024-07-26 05:52:56.531884] 
bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:24:41.770 [2024-07-26 05:52:56.532119] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1e0ec20 00:24:41.770 [2024-07-26 05:52:56.532130] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:24:41.770 [2024-07-26 05:52:56.532326] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1c63260 00:24:41.770 [2024-07-26 05:52:56.532482] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1e0ec20 00:24:41.770 [2024-07-26 05:52:56.532492] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1e0ec20 00:24:41.770 [2024-07-26 05:52:56.532598] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:41.770 05:52:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:24:41.770 05:52:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:41.770 05:52:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:41.770 05:52:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:41.770 05:52:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:41.770 05:52:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:24:41.770 05:52:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:41.770 05:52:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:41.770 05:52:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:41.770 05:52:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:41.770 05:52:56 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:41.770 05:52:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:42.028 05:52:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:42.028 "name": "raid_bdev1", 00:24:42.028 "uuid": "c7d9a627-930f-41b7-8888-7bad6b5c7211", 00:24:42.028 "strip_size_kb": 0, 00:24:42.028 "state": "online", 00:24:42.028 "raid_level": "raid1", 00:24:42.028 "superblock": true, 00:24:42.028 "num_base_bdevs": 4, 00:24:42.028 "num_base_bdevs_discovered": 4, 00:24:42.028 "num_base_bdevs_operational": 4, 00:24:42.028 "base_bdevs_list": [ 00:24:42.028 { 00:24:42.028 "name": "BaseBdev1", 00:24:42.028 "uuid": "1d000034-222f-5f81-a05f-2c9a1ea78229", 00:24:42.028 "is_configured": true, 00:24:42.028 "data_offset": 2048, 00:24:42.028 "data_size": 63488 00:24:42.028 }, 00:24:42.028 { 00:24:42.028 "name": "BaseBdev2", 00:24:42.028 "uuid": "52d45ce2-759d-5f60-a926-d0d568307e3f", 00:24:42.028 "is_configured": true, 00:24:42.028 "data_offset": 2048, 00:24:42.028 "data_size": 63488 00:24:42.028 }, 00:24:42.028 { 00:24:42.028 "name": "BaseBdev3", 00:24:42.028 "uuid": "1a60a36e-cbec-5eb9-95fd-639c331b21c0", 00:24:42.028 "is_configured": true, 00:24:42.028 "data_offset": 2048, 00:24:42.028 "data_size": 63488 00:24:42.028 }, 00:24:42.028 { 00:24:42.028 "name": "BaseBdev4", 00:24:42.028 "uuid": "1bd387e6-7216-54af-a546-efdaec9b8115", 00:24:42.028 "is_configured": true, 00:24:42.028 "data_offset": 2048, 00:24:42.028 "data_size": 63488 00:24:42.028 } 00:24:42.028 ] 00:24:42.028 }' 00:24:42.028 05:52:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:42.028 05:52:56 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:24:42.592 05:52:57 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@824 -- # sleep 1 00:24:42.592 05:52:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:24:42.849 [2024-07-26 05:52:57.541374] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1c62c60 00:24:43.781 05:52:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:24:43.781 [2024-07-26 05:52:58.665852] bdev_raid.c:2247:_raid_bdev_fail_base_bdev: *NOTICE*: Failing base bdev in slot 0 ('BaseBdev1') of raid bdev 'raid_bdev1' 00:24:43.781 [2024-07-26 05:52:58.665912] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:24:43.781 [2024-07-26 05:52:58.666130] bdev_raid.c:1945:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x1c62c60 00:24:43.781 05:52:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:24:43.781 05:52:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid1 = \r\a\i\d\1 ]] 00:24:43.781 05:52:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ write = \w\r\i\t\e ]] 00:24:43.781 05:52:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # expected_num_base_bdevs=3 00:24:43.781 05:52:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:24:43.781 05:52:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:43.781 05:52:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:44.038 05:52:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:44.038 05:52:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local 
strip_size=0 00:24:44.038 05:52:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:24:44.038 05:52:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:44.038 05:52:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:44.038 05:52:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:44.038 05:52:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:44.038 05:52:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:44.038 05:52:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:44.038 05:52:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:44.038 "name": "raid_bdev1", 00:24:44.038 "uuid": "c7d9a627-930f-41b7-8888-7bad6b5c7211", 00:24:44.038 "strip_size_kb": 0, 00:24:44.038 "state": "online", 00:24:44.038 "raid_level": "raid1", 00:24:44.038 "superblock": true, 00:24:44.038 "num_base_bdevs": 4, 00:24:44.038 "num_base_bdevs_discovered": 3, 00:24:44.038 "num_base_bdevs_operational": 3, 00:24:44.038 "base_bdevs_list": [ 00:24:44.038 { 00:24:44.038 "name": null, 00:24:44.038 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:44.038 "is_configured": false, 00:24:44.038 "data_offset": 2048, 00:24:44.038 "data_size": 63488 00:24:44.038 }, 00:24:44.038 { 00:24:44.038 "name": "BaseBdev2", 00:24:44.038 "uuid": "52d45ce2-759d-5f60-a926-d0d568307e3f", 00:24:44.038 "is_configured": true, 00:24:44.038 "data_offset": 2048, 00:24:44.038 "data_size": 63488 00:24:44.038 }, 00:24:44.038 { 00:24:44.038 "name": "BaseBdev3", 00:24:44.038 "uuid": "1a60a36e-cbec-5eb9-95fd-639c331b21c0", 00:24:44.038 "is_configured": true, 00:24:44.038 "data_offset": 2048, 
00:24:44.038 "data_size": 63488 00:24:44.038 }, 00:24:44.038 { 00:24:44.038 "name": "BaseBdev4", 00:24:44.038 "uuid": "1bd387e6-7216-54af-a546-efdaec9b8115", 00:24:44.038 "is_configured": true, 00:24:44.038 "data_offset": 2048, 00:24:44.038 "data_size": 63488 00:24:44.038 } 00:24:44.038 ] 00:24:44.038 }' 00:24:44.038 05:52:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:44.038 05:52:58 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:24:44.970 05:52:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:24:44.970 [2024-07-26 05:52:59.776060] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:24:44.970 [2024-07-26 05:52:59.776096] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:24:44.970 [2024-07-26 05:52:59.779238] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:24:44.970 [2024-07-26 05:52:59.779274] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:44.970 [2024-07-26 05:52:59.779372] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:24:44.970 [2024-07-26 05:52:59.779384] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1e0ec20 name raid_bdev1, state offline 00:24:44.970 0 00:24:44.970 05:52:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 1232540 00:24:44.970 05:52:59 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 1232540 ']' 00:24:44.970 05:52:59 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 1232540 00:24:44.970 05:52:59 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:24:44.970 05:52:59 bdev_raid.raid_write_error_test -- 
common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:24:44.970 05:52:59 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1232540 00:24:44.970 05:52:59 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:24:44.970 05:52:59 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:24:44.970 05:52:59 bdev_raid.raid_write_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1232540' 00:24:44.970 killing process with pid 1232540 00:24:44.970 05:52:59 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 1232540 00:24:44.970 [2024-07-26 05:52:59.842717] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:24:44.970 05:52:59 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 1232540 00:24:44.970 [2024-07-26 05:52:59.874453] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:24:45.228 05:53:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.mzMBNdfJoK 00:24:45.228 05:53:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:24:45.228 05:53:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:24:45.228 05:53:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.00 00:24:45.228 05:53:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid1 00:24:45.228 05:53:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:24:45.228 05:53:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:24:45.228 05:53:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@845 -- # [[ 0.00 = \0\.\0\0 ]] 00:24:45.228 00:24:45.228 real 0m7.683s 00:24:45.228 user 0m12.290s 00:24:45.228 sys 0m1.375s 00:24:45.228 05:53:00 bdev_raid.raid_write_error_test -- 
common/autotest_common.sh@1124 -- # xtrace_disable 00:24:45.228 05:53:00 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:24:45.228 ************************************ 00:24:45.228 END TEST raid_write_error_test 00:24:45.228 ************************************ 00:24:45.486 05:53:00 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:24:45.486 05:53:00 bdev_raid -- bdev/bdev_raid.sh@875 -- # '[' true = true ']' 00:24:45.486 05:53:00 bdev_raid -- bdev/bdev_raid.sh@876 -- # for n in 2 4 00:24:45.486 05:53:00 bdev_raid -- bdev/bdev_raid.sh@877 -- # run_test raid_rebuild_test raid_rebuild_test raid1 2 false false true 00:24:45.486 05:53:00 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:24:45.486 05:53:00 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:24:45.486 05:53:00 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:24:45.486 ************************************ 00:24:45.486 START TEST raid_rebuild_test 00:24:45.486 ************************************ 00:24:45.486 05:53:00 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 2 false false true 00:24:45.486 05:53:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:24:45.486 05:53:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:24:45.486 05:53:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@570 -- # local superblock=false 00:24:45.486 05:53:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:24:45.486 05:53:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@572 -- # local verify=true 00:24:45.486 05:53:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:24:45.486 05:53:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:24:45.486 05:53:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 
00:24:45.486 05:53:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:24:45.486 05:53:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:24:45.486 05:53:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:24:45.486 05:53:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:24:45.486 05:53:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:24:45.486 05:53:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:24:45.486 05:53:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:24:45.486 05:53:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:24:45.486 05:53:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # local strip_size 00:24:45.486 05:53:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@576 -- # local create_arg 00:24:45.486 05:53:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:24:45.486 05:53:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@578 -- # local data_offset 00:24:45.486 05:53:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:24:45.486 05:53:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:24:45.486 05:53:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@591 -- # '[' false = true ']' 00:24:45.486 05:53:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@596 -- # raid_pid=1233681 00:24:45.486 05:53:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@597 -- # waitforlisten 1233681 /var/tmp/spdk-raid.sock 00:24:45.486 05:53:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:24:45.486 05:53:00 
bdev_raid.raid_rebuild_test -- common/autotest_common.sh@829 -- # '[' -z 1233681 ']' 00:24:45.486 05:53:00 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:24:45.486 05:53:00 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:24:45.486 05:53:00 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:24:45.486 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:24:45.486 05:53:00 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:24:45.486 05:53:00 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:24:45.486 [2024-07-26 05:53:00.266409] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 00:24:45.486 [2024-07-26 05:53:00.266478] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1233681 ] 00:24:45.486 I/O size of 3145728 is greater than zero copy threshold (65536). 00:24:45.486 Zero copy mechanism will not be used. 
00:24:45.744 [2024-07-26 05:53:00.395584] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:24:45.744 [2024-07-26 05:53:00.492882] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:24:45.744 [2024-07-26 05:53:00.554763] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:24:45.744 [2024-07-26 05:53:00.554804] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:24:46.309 05:53:01 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:24:46.309 05:53:01 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@862 -- # return 0 00:24:46.309 05:53:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:24:46.309 05:53:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:24:46.566 BaseBdev1_malloc 00:24:46.566 05:53:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:24:46.824 [2024-07-26 05:53:01.683291] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:24:46.824 [2024-07-26 05:53:01.683336] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:46.824 [2024-07-26 05:53:01.683359] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x24f6d40 00:24:46.824 [2024-07-26 05:53:01.683372] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:46.824 [2024-07-26 05:53:01.684982] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:46.824 [2024-07-26 05:53:01.685012] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:24:46.824 BaseBdev1 00:24:46.824 05:53:01 
bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:24:46.824 05:53:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:24:47.081 BaseBdev2_malloc 00:24:47.081 05:53:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:24:47.348 [2024-07-26 05:53:02.173388] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:24:47.348 [2024-07-26 05:53:02.173435] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:47.348 [2024-07-26 05:53:02.173458] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x24f7860 00:24:47.348 [2024-07-26 05:53:02.173471] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:47.348 [2024-07-26 05:53:02.174871] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:47.348 [2024-07-26 05:53:02.174900] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:24:47.348 BaseBdev2 00:24:47.348 05:53:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:24:47.623 spare_malloc 00:24:47.623 05:53:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:24:47.880 spare_delay 00:24:47.880 05:53:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_passthru_create -b spare_delay -p spare 00:24:48.137 [2024-07-26 05:53:02.919976] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:24:48.137 [2024-07-26 05:53:02.920019] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:48.137 [2024-07-26 05:53:02.920039] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x26a5ec0 00:24:48.137 [2024-07-26 05:53:02.920051] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:48.137 [2024-07-26 05:53:02.921494] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:48.137 [2024-07-26 05:53:02.921528] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:24:48.137 spare 00:24:48.137 05:53:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:24:48.394 [2024-07-26 05:53:03.164658] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:24:48.394 [2024-07-26 05:53:03.165850] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:24:48.394 [2024-07-26 05:53:03.165924] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x26a7070 00:24:48.394 [2024-07-26 05:53:03.165935] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:24:48.394 [2024-07-26 05:53:03.166135] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x26a0490 00:24:48.394 [2024-07-26 05:53:03.166270] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x26a7070 00:24:48.394 [2024-07-26 05:53:03.166280] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x26a7070 00:24:48.394 [2024-07-26 05:53:03.166384] bdev_raid.c: 343:raid_bdev_destroy_cb: 
*DEBUG*: raid_bdev_destroy_cb 00:24:48.394 05:53:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:24:48.394 05:53:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:48.394 05:53:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:48.394 05:53:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:48.394 05:53:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:48.394 05:53:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:24:48.394 05:53:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:48.394 05:53:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:48.394 05:53:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:48.394 05:53:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:48.394 05:53:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:48.394 05:53:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:48.657 05:53:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:48.657 "name": "raid_bdev1", 00:24:48.657 "uuid": "1bf8cb7e-584a-4a56-b9d0-be431f2f1109", 00:24:48.657 "strip_size_kb": 0, 00:24:48.657 "state": "online", 00:24:48.657 "raid_level": "raid1", 00:24:48.657 "superblock": false, 00:24:48.657 "num_base_bdevs": 2, 00:24:48.657 "num_base_bdevs_discovered": 2, 00:24:48.657 "num_base_bdevs_operational": 2, 00:24:48.657 "base_bdevs_list": [ 00:24:48.657 { 00:24:48.657 "name": "BaseBdev1", 00:24:48.657 "uuid": 
"ce356fc4-d6e3-5577-b0e6-9a9f17678ce8", 00:24:48.657 "is_configured": true, 00:24:48.657 "data_offset": 0, 00:24:48.657 "data_size": 65536 00:24:48.657 }, 00:24:48.657 { 00:24:48.657 "name": "BaseBdev2", 00:24:48.657 "uuid": "a622846a-6959-5352-898b-dfccbdc3870b", 00:24:48.657 "is_configured": true, 00:24:48.657 "data_offset": 0, 00:24:48.657 "data_size": 65536 00:24:48.657 } 00:24:48.657 ] 00:24:48.657 }' 00:24:48.657 05:53:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:48.657 05:53:03 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:24:49.219 05:53:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:24:49.219 05:53:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:24:49.476 [2024-07-26 05:53:04.255918] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:24:49.476 05:53:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=65536 00:24:49.476 05:53:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:49.476 05:53:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:24:49.733 05:53:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # data_offset=0 00:24:49.733 05:53:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:24:49.733 05:53:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@623 -- # '[' true = true ']' 00:24:49.733 05:53:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@624 -- # local write_unit_size 00:24:49.733 05:53:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@627 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:24:49.733 05:53:04 
bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:49.733 05:53:04 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:24:49.733 05:53:04 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # local bdev_list 00:24:49.733 05:53:04 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:24:49.733 05:53:04 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # local nbd_list 00:24:49.733 05:53:04 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@12 -- # local i 00:24:49.733 05:53:04 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:24:49.733 05:53:04 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:24:49.733 05:53:04 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:24:49.990 [2024-07-26 05:53:04.749008] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x26a0490 00:24:49.990 /dev/nbd0 00:24:49.990 05:53:04 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:24:49.990 05:53:04 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:24:49.991 05:53:04 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:24:49.991 05:53:04 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # local i 00:24:49.991 05:53:04 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:24:49.991 05:53:04 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:24:49.991 05:53:04 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:24:49.991 05:53:04 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # break 00:24:49.991 05:53:04 bdev_raid.raid_rebuild_test -- 
common/autotest_common.sh@882 -- # (( i = 1 )) 00:24:49.991 05:53:04 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:24:49.991 05:53:04 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:24:49.991 1+0 records in 00:24:49.991 1+0 records out 00:24:49.991 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000288338 s, 14.2 MB/s 00:24:49.991 05:53:04 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:49.991 05:53:04 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # size=4096 00:24:49.991 05:53:04 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:49.991 05:53:04 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:24:49.991 05:53:04 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # return 0 00:24:49.991 05:53:04 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:24:49.991 05:53:04 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:24:49.991 05:53:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@628 -- # '[' raid1 = raid5f ']' 00:24:49.991 05:53:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@632 -- # write_unit_size=1 00:24:49.991 05:53:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@634 -- # dd if=/dev/urandom of=/dev/nbd0 bs=512 count=65536 oflag=direct 00:24:55.246 65536+0 records in 00:24:55.246 65536+0 records out 00:24:55.246 33554432 bytes (34 MB, 32 MiB) copied, 5.09768 s, 6.6 MB/s 00:24:55.246 05:53:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@635 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:24:55.246 05:53:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@49 -- # local 
rpc_server=/var/tmp/spdk-raid.sock 00:24:55.246 05:53:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:24:55.246 05:53:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # local nbd_list 00:24:55.247 05:53:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@51 -- # local i 00:24:55.247 05:53:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:24:55.247 05:53:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:24:55.247 05:53:10 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:24:55.247 [2024-07-26 05:53:10.105163] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:55.247 05:53:10 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:24:55.247 05:53:10 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:24:55.247 05:53:10 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:24:55.247 05:53:10 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:24:55.247 05:53:10 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:24:55.247 05:53:10 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:24:55.247 05:53:10 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:24:55.247 05:53:10 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:24:55.504 [2024-07-26 05:53:10.333785] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:24:55.504 05:53:10 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:55.504 05:53:10 
bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:55.504 05:53:10 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:55.504 05:53:10 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:55.504 05:53:10 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:55.504 05:53:10 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:24:55.504 05:53:10 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:55.504 05:53:10 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:55.504 05:53:10 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:55.504 05:53:10 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:55.504 05:53:10 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:55.504 05:53:10 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:55.761 05:53:10 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:55.761 "name": "raid_bdev1", 00:24:55.761 "uuid": "1bf8cb7e-584a-4a56-b9d0-be431f2f1109", 00:24:55.761 "strip_size_kb": 0, 00:24:55.761 "state": "online", 00:24:55.761 "raid_level": "raid1", 00:24:55.761 "superblock": false, 00:24:55.761 "num_base_bdevs": 2, 00:24:55.761 "num_base_bdevs_discovered": 1, 00:24:55.761 "num_base_bdevs_operational": 1, 00:24:55.761 "base_bdevs_list": [ 00:24:55.761 { 00:24:55.761 "name": null, 00:24:55.761 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:55.761 "is_configured": false, 00:24:55.761 "data_offset": 0, 00:24:55.761 "data_size": 65536 00:24:55.761 }, 00:24:55.761 { 00:24:55.761 "name": "BaseBdev2", 
00:24:55.761 "uuid": "a622846a-6959-5352-898b-dfccbdc3870b", 00:24:55.761 "is_configured": true, 00:24:55.761 "data_offset": 0, 00:24:55.761 "data_size": 65536 00:24:55.761 } 00:24:55.761 ] 00:24:55.761 }' 00:24:55.761 05:53:10 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:55.761 05:53:10 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:24:56.325 05:53:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:24:56.582 [2024-07-26 05:53:11.436709] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:56.582 [2024-07-26 05:53:11.441706] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x26a0490 00:24:56.582 [2024-07-26 05:53:11.443930] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:24:56.582 05:53:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@646 -- # sleep 1 00:24:57.953 05:53:12 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:57.953 05:53:12 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:57.953 05:53:12 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:57.953 05:53:12 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:57.953 05:53:12 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:57.953 05:53:12 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:57.953 05:53:12 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:57.953 05:53:12 bdev_raid.raid_rebuild_test -- 
bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:57.953 "name": "raid_bdev1", 00:24:57.953 "uuid": "1bf8cb7e-584a-4a56-b9d0-be431f2f1109", 00:24:57.953 "strip_size_kb": 0, 00:24:57.953 "state": "online", 00:24:57.953 "raid_level": "raid1", 00:24:57.953 "superblock": false, 00:24:57.953 "num_base_bdevs": 2, 00:24:57.953 "num_base_bdevs_discovered": 2, 00:24:57.953 "num_base_bdevs_operational": 2, 00:24:57.953 "process": { 00:24:57.953 "type": "rebuild", 00:24:57.953 "target": "spare", 00:24:57.953 "progress": { 00:24:57.953 "blocks": 24576, 00:24:57.953 "percent": 37 00:24:57.953 } 00:24:57.953 }, 00:24:57.953 "base_bdevs_list": [ 00:24:57.953 { 00:24:57.953 "name": "spare", 00:24:57.953 "uuid": "353516be-796c-52ac-b1a6-2fa1f4017d58", 00:24:57.953 "is_configured": true, 00:24:57.953 "data_offset": 0, 00:24:57.953 "data_size": 65536 00:24:57.953 }, 00:24:57.953 { 00:24:57.953 "name": "BaseBdev2", 00:24:57.953 "uuid": "a622846a-6959-5352-898b-dfccbdc3870b", 00:24:57.953 "is_configured": true, 00:24:57.953 "data_offset": 0, 00:24:57.953 "data_size": 65536 00:24:57.953 } 00:24:57.953 ] 00:24:57.953 }' 00:24:57.953 05:53:12 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:57.953 05:53:12 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:57.953 05:53:12 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:57.953 05:53:12 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:57.953 05:53:12 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:24:58.211 [2024-07-26 05:53:13.025790] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:58.211 [2024-07-26 05:53:13.056709] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev 
raid_bdev1: No such device 00:24:58.211 [2024-07-26 05:53:13.056758] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:58.211 [2024-07-26 05:53:13.056774] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:58.211 [2024-07-26 05:53:13.056782] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:24:58.211 05:53:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:58.211 05:53:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:58.211 05:53:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:58.211 05:53:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:58.211 05:53:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:58.211 05:53:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:24:58.211 05:53:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:58.211 05:53:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:58.211 05:53:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:58.211 05:53:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:58.211 05:53:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:58.211 05:53:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:58.468 05:53:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:58.468 "name": "raid_bdev1", 00:24:58.468 "uuid": "1bf8cb7e-584a-4a56-b9d0-be431f2f1109", 00:24:58.468 
"strip_size_kb": 0, 00:24:58.468 "state": "online", 00:24:58.468 "raid_level": "raid1", 00:24:58.468 "superblock": false, 00:24:58.468 "num_base_bdevs": 2, 00:24:58.468 "num_base_bdevs_discovered": 1, 00:24:58.468 "num_base_bdevs_operational": 1, 00:24:58.468 "base_bdevs_list": [ 00:24:58.468 { 00:24:58.468 "name": null, 00:24:58.468 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:58.468 "is_configured": false, 00:24:58.468 "data_offset": 0, 00:24:58.469 "data_size": 65536 00:24:58.469 }, 00:24:58.469 { 00:24:58.469 "name": "BaseBdev2", 00:24:58.469 "uuid": "a622846a-6959-5352-898b-dfccbdc3870b", 00:24:58.469 "is_configured": true, 00:24:58.469 "data_offset": 0, 00:24:58.469 "data_size": 65536 00:24:58.469 } 00:24:58.469 ] 00:24:58.469 }' 00:24:58.469 05:53:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:58.469 05:53:13 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:24:59.031 05:53:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:59.031 05:53:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:59.031 05:53:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:59.031 05:53:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:59.031 05:53:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:59.031 05:53:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:59.031 05:53:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:59.289 05:53:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:59.289 "name": "raid_bdev1", 00:24:59.289 "uuid": "1bf8cb7e-584a-4a56-b9d0-be431f2f1109", 
00:24:59.289 "strip_size_kb": 0, 00:24:59.289 "state": "online", 00:24:59.289 "raid_level": "raid1", 00:24:59.289 "superblock": false, 00:24:59.289 "num_base_bdevs": 2, 00:24:59.289 "num_base_bdevs_discovered": 1, 00:24:59.289 "num_base_bdevs_operational": 1, 00:24:59.289 "base_bdevs_list": [ 00:24:59.289 { 00:24:59.289 "name": null, 00:24:59.289 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:59.289 "is_configured": false, 00:24:59.289 "data_offset": 0, 00:24:59.289 "data_size": 65536 00:24:59.289 }, 00:24:59.289 { 00:24:59.289 "name": "BaseBdev2", 00:24:59.289 "uuid": "a622846a-6959-5352-898b-dfccbdc3870b", 00:24:59.289 "is_configured": true, 00:24:59.289 "data_offset": 0, 00:24:59.289 "data_size": 65536 00:24:59.289 } 00:24:59.289 ] 00:24:59.289 }' 00:24:59.289 05:53:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:59.546 05:53:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:59.546 05:53:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:59.546 05:53:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:59.546 05:53:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:24:59.803 [2024-07-26 05:53:14.473619] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:59.803 [2024-07-26 05:53:14.478606] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x26a0490 00:24:59.803 [2024-07-26 05:53:14.480073] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:24:59.803 05:53:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@662 -- # sleep 1 00:25:00.735 05:53:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 
rebuild spare 00:25:00.735 05:53:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:00.736 05:53:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:00.736 05:53:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:00.736 05:53:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:00.736 05:53:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:00.736 05:53:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:00.993 05:53:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:00.993 "name": "raid_bdev1", 00:25:00.993 "uuid": "1bf8cb7e-584a-4a56-b9d0-be431f2f1109", 00:25:00.993 "strip_size_kb": 0, 00:25:00.993 "state": "online", 00:25:00.993 "raid_level": "raid1", 00:25:00.993 "superblock": false, 00:25:00.993 "num_base_bdevs": 2, 00:25:00.993 "num_base_bdevs_discovered": 2, 00:25:00.993 "num_base_bdevs_operational": 2, 00:25:00.993 "process": { 00:25:00.993 "type": "rebuild", 00:25:00.993 "target": "spare", 00:25:00.993 "progress": { 00:25:00.993 "blocks": 24576, 00:25:00.993 "percent": 37 00:25:00.993 } 00:25:00.993 }, 00:25:00.993 "base_bdevs_list": [ 00:25:00.993 { 00:25:00.993 "name": "spare", 00:25:00.993 "uuid": "353516be-796c-52ac-b1a6-2fa1f4017d58", 00:25:00.993 "is_configured": true, 00:25:00.993 "data_offset": 0, 00:25:00.993 "data_size": 65536 00:25:00.993 }, 00:25:00.993 { 00:25:00.993 "name": "BaseBdev2", 00:25:00.993 "uuid": "a622846a-6959-5352-898b-dfccbdc3870b", 00:25:00.993 "is_configured": true, 00:25:00.993 "data_offset": 0, 00:25:00.993 "data_size": 65536 00:25:00.993 } 00:25:00.993 ] 00:25:00.993 }' 00:25:00.993 05:53:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # 
jq -r '.process.type // "none"' 00:25:00.993 05:53:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:00.993 05:53:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:00.993 05:53:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:00.993 05:53:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@665 -- # '[' false = true ']' 00:25:00.993 05:53:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:25:00.993 05:53:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:25:00.993 05:53:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']' 00:25:00.993 05:53:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@705 -- # local timeout=765 00:25:00.993 05:53:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:25:00.993 05:53:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:00.993 05:53:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:00.993 05:53:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:00.993 05:53:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:00.993 05:53:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:00.993 05:53:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:00.993 05:53:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:01.250 05:53:16 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:01.250 "name": "raid_bdev1", 00:25:01.250 
"uuid": "1bf8cb7e-584a-4a56-b9d0-be431f2f1109", 00:25:01.250 "strip_size_kb": 0, 00:25:01.250 "state": "online", 00:25:01.250 "raid_level": "raid1", 00:25:01.250 "superblock": false, 00:25:01.250 "num_base_bdevs": 2, 00:25:01.250 "num_base_bdevs_discovered": 2, 00:25:01.250 "num_base_bdevs_operational": 2, 00:25:01.250 "process": { 00:25:01.250 "type": "rebuild", 00:25:01.250 "target": "spare", 00:25:01.250 "progress": { 00:25:01.250 "blocks": 30720, 00:25:01.250 "percent": 46 00:25:01.250 } 00:25:01.251 }, 00:25:01.251 "base_bdevs_list": [ 00:25:01.251 { 00:25:01.251 "name": "spare", 00:25:01.251 "uuid": "353516be-796c-52ac-b1a6-2fa1f4017d58", 00:25:01.251 "is_configured": true, 00:25:01.251 "data_offset": 0, 00:25:01.251 "data_size": 65536 00:25:01.251 }, 00:25:01.251 { 00:25:01.251 "name": "BaseBdev2", 00:25:01.251 "uuid": "a622846a-6959-5352-898b-dfccbdc3870b", 00:25:01.251 "is_configured": true, 00:25:01.251 "data_offset": 0, 00:25:01.251 "data_size": 65536 00:25:01.251 } 00:25:01.251 ] 00:25:01.251 }' 00:25:01.251 05:53:16 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:01.251 05:53:16 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:01.251 05:53:16 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:01.506 05:53:16 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:01.506 05:53:16 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@710 -- # sleep 1 00:25:02.436 05:53:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:25:02.436 05:53:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:02.436 05:53:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:02.436 05:53:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local 
process_type=rebuild 00:25:02.436 05:53:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:02.436 05:53:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:02.436 05:53:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:02.437 05:53:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:02.694 05:53:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:02.694 "name": "raid_bdev1", 00:25:02.694 "uuid": "1bf8cb7e-584a-4a56-b9d0-be431f2f1109", 00:25:02.694 "strip_size_kb": 0, 00:25:02.694 "state": "online", 00:25:02.694 "raid_level": "raid1", 00:25:02.694 "superblock": false, 00:25:02.694 "num_base_bdevs": 2, 00:25:02.694 "num_base_bdevs_discovered": 2, 00:25:02.694 "num_base_bdevs_operational": 2, 00:25:02.694 "process": { 00:25:02.694 "type": "rebuild", 00:25:02.694 "target": "spare", 00:25:02.694 "progress": { 00:25:02.694 "blocks": 59392, 00:25:02.694 "percent": 90 00:25:02.694 } 00:25:02.694 }, 00:25:02.694 "base_bdevs_list": [ 00:25:02.694 { 00:25:02.694 "name": "spare", 00:25:02.694 "uuid": "353516be-796c-52ac-b1a6-2fa1f4017d58", 00:25:02.694 "is_configured": true, 00:25:02.694 "data_offset": 0, 00:25:02.694 "data_size": 65536 00:25:02.694 }, 00:25:02.694 { 00:25:02.694 "name": "BaseBdev2", 00:25:02.694 "uuid": "a622846a-6959-5352-898b-dfccbdc3870b", 00:25:02.694 "is_configured": true, 00:25:02.694 "data_offset": 0, 00:25:02.694 "data_size": 65536 00:25:02.694 } 00:25:02.694 ] 00:25:02.694 }' 00:25:02.694 05:53:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:02.694 05:53:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:02.694 05:53:17 bdev_raid.raid_rebuild_test -- 
bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:02.694 05:53:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:02.694 05:53:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@710 -- # sleep 1 00:25:02.951 [2024-07-26 05:53:17.705494] bdev_raid.c:2870:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:25:02.951 [2024-07-26 05:53:17.705555] bdev_raid.c:2532:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:25:02.951 [2024-07-26 05:53:17.705591] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:03.883 05:53:18 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:25:03.883 05:53:18 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:03.883 05:53:18 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:03.883 05:53:18 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:03.883 05:53:18 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:03.883 05:53:18 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:03.883 05:53:18 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:03.883 05:53:18 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:03.883 05:53:18 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:03.883 "name": "raid_bdev1", 00:25:03.883 "uuid": "1bf8cb7e-584a-4a56-b9d0-be431f2f1109", 00:25:03.883 "strip_size_kb": 0, 00:25:03.883 "state": "online", 00:25:03.883 "raid_level": "raid1", 00:25:03.883 "superblock": false, 00:25:03.883 "num_base_bdevs": 2, 00:25:03.883 
"num_base_bdevs_discovered": 2, 00:25:03.883 "num_base_bdevs_operational": 2, 00:25:03.883 "base_bdevs_list": [ 00:25:03.883 { 00:25:03.883 "name": "spare", 00:25:03.883 "uuid": "353516be-796c-52ac-b1a6-2fa1f4017d58", 00:25:03.883 "is_configured": true, 00:25:03.883 "data_offset": 0, 00:25:03.883 "data_size": 65536 00:25:03.883 }, 00:25:03.883 { 00:25:03.883 "name": "BaseBdev2", 00:25:03.883 "uuid": "a622846a-6959-5352-898b-dfccbdc3870b", 00:25:03.883 "is_configured": true, 00:25:03.883 "data_offset": 0, 00:25:03.883 "data_size": 65536 00:25:03.883 } 00:25:03.883 ] 00:25:03.883 }' 00:25:03.883 05:53:18 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:04.140 05:53:18 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:25:04.140 05:53:18 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:04.140 05:53:18 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:25:04.140 05:53:18 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@708 -- # break 00:25:04.140 05:53:18 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:04.140 05:53:18 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:04.140 05:53:18 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:04.140 05:53:18 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:04.140 05:53:18 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:04.140 05:53:18 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:04.140 05:53:18 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:04.397 05:53:19 
bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:04.397 "name": "raid_bdev1", 00:25:04.397 "uuid": "1bf8cb7e-584a-4a56-b9d0-be431f2f1109", 00:25:04.397 "strip_size_kb": 0, 00:25:04.397 "state": "online", 00:25:04.397 "raid_level": "raid1", 00:25:04.397 "superblock": false, 00:25:04.397 "num_base_bdevs": 2, 00:25:04.397 "num_base_bdevs_discovered": 2, 00:25:04.397 "num_base_bdevs_operational": 2, 00:25:04.397 "base_bdevs_list": [ 00:25:04.397 { 00:25:04.397 "name": "spare", 00:25:04.397 "uuid": "353516be-796c-52ac-b1a6-2fa1f4017d58", 00:25:04.397 "is_configured": true, 00:25:04.397 "data_offset": 0, 00:25:04.397 "data_size": 65536 00:25:04.397 }, 00:25:04.397 { 00:25:04.397 "name": "BaseBdev2", 00:25:04.397 "uuid": "a622846a-6959-5352-898b-dfccbdc3870b", 00:25:04.397 "is_configured": true, 00:25:04.397 "data_offset": 0, 00:25:04.397 "data_size": 65536 00:25:04.397 } 00:25:04.397 ] 00:25:04.397 }' 00:25:04.397 05:53:19 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:04.397 05:53:19 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:04.397 05:53:19 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:04.397 05:53:19 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:04.397 05:53:19 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:25:04.397 05:53:19 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:04.397 05:53:19 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:04.397 05:53:19 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:04.397 05:53:19 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:04.397 05:53:19 bdev_raid.raid_rebuild_test -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:25:04.397 05:53:19 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:04.397 05:53:19 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:04.397 05:53:19 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:04.397 05:53:19 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:04.397 05:53:19 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:04.397 05:53:19 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:04.654 05:53:19 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:04.654 "name": "raid_bdev1", 00:25:04.654 "uuid": "1bf8cb7e-584a-4a56-b9d0-be431f2f1109", 00:25:04.654 "strip_size_kb": 0, 00:25:04.654 "state": "online", 00:25:04.654 "raid_level": "raid1", 00:25:04.654 "superblock": false, 00:25:04.654 "num_base_bdevs": 2, 00:25:04.654 "num_base_bdevs_discovered": 2, 00:25:04.654 "num_base_bdevs_operational": 2, 00:25:04.654 "base_bdevs_list": [ 00:25:04.654 { 00:25:04.654 "name": "spare", 00:25:04.654 "uuid": "353516be-796c-52ac-b1a6-2fa1f4017d58", 00:25:04.654 "is_configured": true, 00:25:04.654 "data_offset": 0, 00:25:04.654 "data_size": 65536 00:25:04.654 }, 00:25:04.654 { 00:25:04.654 "name": "BaseBdev2", 00:25:04.654 "uuid": "a622846a-6959-5352-898b-dfccbdc3870b", 00:25:04.654 "is_configured": true, 00:25:04.654 "data_offset": 0, 00:25:04.654 "data_size": 65536 00:25:04.654 } 00:25:04.654 ] 00:25:04.654 }' 00:25:04.654 05:53:19 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:04.654 05:53:19 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:25:05.253 05:53:20 bdev_raid.raid_rebuild_test -- 
bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:25:05.510 [2024-07-26 05:53:20.261480] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:25:05.510 [2024-07-26 05:53:20.261510] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:25:05.510 [2024-07-26 05:53:20.261574] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:25:05.510 [2024-07-26 05:53:20.261634] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:25:05.510 [2024-07-26 05:53:20.261655] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x26a7070 name raid_bdev1, state offline 00:25:05.510 05:53:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:05.510 05:53:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@719 -- # jq length 00:25:05.767 05:53:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:25:05.767 05:53:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:25:05.767 05:53:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@722 -- # '[' false = true ']' 00:25:05.767 05:53:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@736 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:25:05.767 05:53:20 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:05.767 05:53:20 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:25:05.767 05:53:20 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # local bdev_list 00:25:05.767 05:53:20 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 
00:25:05.767 05:53:20 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # local nbd_list 00:25:05.767 05:53:20 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@12 -- # local i 00:25:05.767 05:53:20 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:25:05.767 05:53:20 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:25:05.767 05:53:20 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:25:06.024 /dev/nbd0 00:25:06.024 05:53:20 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:25:06.024 05:53:20 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:25:06.024 05:53:20 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:25:06.024 05:53:20 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # local i 00:25:06.024 05:53:20 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:25:06.024 05:53:20 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:25:06.025 05:53:20 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:25:06.025 05:53:20 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # break 00:25:06.025 05:53:20 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:25:06.025 05:53:20 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:25:06.025 05:53:20 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:25:06.025 1+0 records in 00:25:06.025 1+0 records out 00:25:06.025 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000263177 s, 15.6 MB/s 00:25:06.025 05:53:20 bdev_raid.raid_rebuild_test -- 
common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:06.025 05:53:20 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # size=4096 00:25:06.025 05:53:20 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:06.025 05:53:20 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:25:06.025 05:53:20 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # return 0 00:25:06.025 05:53:20 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:25:06.025 05:53:20 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:25:06.025 05:53:20 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:25:06.282 /dev/nbd1 00:25:06.282 05:53:21 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:25:06.282 05:53:21 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:25:06.282 05:53:21 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:25:06.282 05:53:21 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # local i 00:25:06.282 05:53:21 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:25:06.282 05:53:21 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:25:06.282 05:53:21 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:25:06.282 05:53:21 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # break 00:25:06.282 05:53:21 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:25:06.282 05:53:21 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i <= 20 )) 
00:25:06.282 05:53:21 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:25:06.282 1+0 records in 00:25:06.282 1+0 records out 00:25:06.282 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000333679 s, 12.3 MB/s 00:25:06.282 05:53:21 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:06.282 05:53:21 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # size=4096 00:25:06.282 05:53:21 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:06.282 05:53:21 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:25:06.282 05:53:21 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # return 0 00:25:06.282 05:53:21 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:25:06.282 05:53:21 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:25:06.282 05:53:21 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@737 -- # cmp -i 0 /dev/nbd0 /dev/nbd1 00:25:06.282 05:53:21 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@738 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:25:06.282 05:53:21 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:06.282 05:53:21 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:25:06.282 05:53:21 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # local nbd_list 00:25:06.282 05:53:21 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@51 -- # local i 00:25:06.282 05:53:21 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:25:06.282 05:53:21 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:25:06.540 05:53:21 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:25:06.798 05:53:21 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:25:06.798 05:53:21 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:25:06.798 05:53:21 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:25:06.798 05:53:21 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:25:06.798 05:53:21 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:25:06.798 05:53:21 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:25:06.798 05:53:21 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:25:06.798 05:53:21 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:25:06.798 05:53:21 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:25:06.798 05:53:21 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:25:06.798 05:53:21 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:25:07.057 05:53:21 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:25:07.057 05:53:21 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:25:07.057 05:53:21 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:25:07.057 05:53:21 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:25:07.057 05:53:21 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:25:07.057 05:53:21 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:25:07.057 05:53:21 
bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@742 -- # '[' false = true ']' 00:25:07.057 05:53:21 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@782 -- # killprocess 1233681 00:25:07.057 05:53:21 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@948 -- # '[' -z 1233681 ']' 00:25:07.057 05:53:21 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@952 -- # kill -0 1233681 00:25:07.057 05:53:21 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@953 -- # uname 00:25:07.057 05:53:21 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:25:07.058 05:53:21 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1233681 00:25:07.058 05:53:21 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:25:07.058 05:53:21 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:25:07.058 05:53:21 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1233681' 00:25:07.058 killing process with pid 1233681 00:25:07.058 05:53:21 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@967 -- # kill 1233681 00:25:07.058 Received shutdown signal, test time was about 60.000000 seconds 00:25:07.058 00:25:07.058 Latency(us) 00:25:07.058 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:25:07.058 =================================================================================================================== 00:25:07.058 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:25:07.058 [2024-07-26 05:53:21.761332] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:25:07.058 05:53:21 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@972 -- # wait 1233681 00:25:07.058 [2024-07-26 05:53:21.787742] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:25:07.316 05:53:21 bdev_raid.raid_rebuild_test -- 
bdev/bdev_raid.sh@784 -- # return 0 00:25:07.316 00:25:07.316 real 0m21.793s 00:25:07.316 user 0m29.559s 00:25:07.316 sys 0m4.697s 00:25:07.316 05:53:21 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:25:07.316 05:53:21 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:25:07.316 ************************************ 00:25:07.316 END TEST raid_rebuild_test 00:25:07.316 ************************************ 00:25:07.316 05:53:22 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:25:07.316 05:53:22 bdev_raid -- bdev/bdev_raid.sh@878 -- # run_test raid_rebuild_test_sb raid_rebuild_test raid1 2 true false true 00:25:07.316 05:53:22 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:25:07.316 05:53:22 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:25:07.316 05:53:22 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:25:07.316 ************************************ 00:25:07.316 START TEST raid_rebuild_test_sb 00:25:07.316 ************************************ 00:25:07.316 05:53:22 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 2 true false true 00:25:07.316 05:53:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:25:07.316 05:53:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:25:07.316 05:53:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:25:07.316 05:53:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:25:07.316 05:53:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@572 -- # local verify=true 00:25:07.316 05:53:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:25:07.316 05:53:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:25:07.316 05:53:22 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:25:07.316 05:53:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:25:07.316 05:53:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:25:07.316 05:53:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:25:07.316 05:53:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:25:07.316 05:53:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:25:07.316 05:53:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:25:07.316 05:53:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:25:07.316 05:53:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:25:07.316 05:53:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # local strip_size 00:25:07.316 05:53:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@576 -- # local create_arg 00:25:07.316 05:53:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:25:07.316 05:53:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@578 -- # local data_offset 00:25:07.316 05:53:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:25:07.316 05:53:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:25:07.316 05:53:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:25:07.316 05:53:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:25:07.316 05:53:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@596 -- # raid_pid=1236730 00:25:07.316 05:53:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@597 -- # waitforlisten 1236730 /var/tmp/spdk-raid.sock 00:25:07.316 05:53:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@595 
-- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:25:07.316 05:53:22 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@829 -- # '[' -z 1236730 ']' 00:25:07.316 05:53:22 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:25:07.316 05:53:22 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:25:07.316 05:53:22 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:25:07.316 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:25:07.316 05:53:22 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:25:07.316 05:53:22 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:25:07.316 [2024-07-26 05:53:22.146450] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 00:25:07.316 [2024-07-26 05:53:22.146516] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1236730 ] 00:25:07.316 I/O size of 3145728 is greater than zero copy threshold (65536). 00:25:07.316 Zero copy mechanism will not be used. 
00:25:07.575 [2024-07-26 05:53:22.264499] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:25:07.575 [2024-07-26 05:53:22.371176] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:25:07.575 [2024-07-26 05:53:22.435969] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:25:07.575 [2024-07-26 05:53:22.436012] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:25:08.509 05:53:23 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:25:08.509 05:53:23 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@862 -- # return 0 00:25:08.509 05:53:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:25:08.509 05:53:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:25:08.509 BaseBdev1_malloc 00:25:08.509 05:53:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:25:08.783 [2024-07-26 05:53:23.567542] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:25:08.783 [2024-07-26 05:53:23.567591] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:08.783 [2024-07-26 05:53:23.567616] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1eb4d40 00:25:08.783 [2024-07-26 05:53:23.567629] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:08.783 [2024-07-26 05:53:23.569350] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:08.783 [2024-07-26 05:53:23.569376] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:25:08.783 BaseBdev1 
00:25:08.783 05:53:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:25:08.783 05:53:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:25:09.044 BaseBdev2_malloc 00:25:09.044 05:53:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:25:09.300 [2024-07-26 05:53:24.053780] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:25:09.300 [2024-07-26 05:53:24.053827] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:09.300 [2024-07-26 05:53:24.053850] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1eb5860 00:25:09.300 [2024-07-26 05:53:24.053863] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:09.300 [2024-07-26 05:53:24.055374] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:09.300 [2024-07-26 05:53:24.055402] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:25:09.300 BaseBdev2 00:25:09.301 05:53:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:25:09.558 spare_malloc 00:25:09.558 05:53:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:25:09.815 spare_delay 00:25:09.815 05:53:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@608 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:25:10.075 [2024-07-26 05:53:24.792316] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:25:10.075 [2024-07-26 05:53:24.792361] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:10.075 [2024-07-26 05:53:24.792381] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2063ec0 00:25:10.075 [2024-07-26 05:53:24.792394] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:10.075 [2024-07-26 05:53:24.793998] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:10.075 [2024-07-26 05:53:24.794025] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:25:10.075 spare 00:25:10.075 05:53:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:25:10.335 [2024-07-26 05:53:25.032982] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:25:10.335 [2024-07-26 05:53:25.034324] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:25:10.335 [2024-07-26 05:53:25.034497] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x2065070 00:25:10.335 [2024-07-26 05:53:25.034510] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:25:10.335 [2024-07-26 05:53:25.034718] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x205e490 00:25:10.335 [2024-07-26 05:53:25.034860] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2065070 00:25:10.335 [2024-07-26 05:53:25.034870] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, 
raid_bdev 0x2065070 00:25:10.335 [2024-07-26 05:53:25.034972] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:10.335 05:53:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:25:10.335 05:53:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:10.335 05:53:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:10.335 05:53:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:10.335 05:53:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:10.335 05:53:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:25:10.335 05:53:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:10.335 05:53:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:10.335 05:53:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:10.335 05:53:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:10.335 05:53:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:10.335 05:53:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:10.595 05:53:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:10.595 "name": "raid_bdev1", 00:25:10.595 "uuid": "fef35efb-54fb-40f6-b333-aa19a4c3db5a", 00:25:10.595 "strip_size_kb": 0, 00:25:10.595 "state": "online", 00:25:10.595 "raid_level": "raid1", 00:25:10.595 "superblock": true, 00:25:10.595 "num_base_bdevs": 2, 00:25:10.595 "num_base_bdevs_discovered": 2, 00:25:10.595 
"num_base_bdevs_operational": 2, 00:25:10.595 "base_bdevs_list": [ 00:25:10.595 { 00:25:10.595 "name": "BaseBdev1", 00:25:10.595 "uuid": "e234881b-86fa-5127-814b-6f86647b0df7", 00:25:10.595 "is_configured": true, 00:25:10.595 "data_offset": 2048, 00:25:10.595 "data_size": 63488 00:25:10.595 }, 00:25:10.595 { 00:25:10.595 "name": "BaseBdev2", 00:25:10.595 "uuid": "adee0bb0-299e-5243-b2ad-3bda19b28d09", 00:25:10.595 "is_configured": true, 00:25:10.595 "data_offset": 2048, 00:25:10.595 "data_size": 63488 00:25:10.595 } 00:25:10.595 ] 00:25:10.595 }' 00:25:10.595 05:53:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:10.595 05:53:25 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:25:11.160 05:53:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:25:11.160 05:53:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:25:11.418 [2024-07-26 05:53:26.144141] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:25:11.418 05:53:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=63488 00:25:11.418 05:53:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:11.418 05:53:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:25:11.675 05:53:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # data_offset=2048 00:25:11.675 05:53:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:25:11.675 05:53:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@623 -- # '[' true = true ']' 00:25:11.675 05:53:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@624 -- # local 
write_unit_size 00:25:11.675 05:53:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@627 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:25:11.675 05:53:26 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:11.675 05:53:26 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:25:11.675 05:53:26 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # local bdev_list 00:25:11.675 05:53:26 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:25:11.675 05:53:26 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # local nbd_list 00:25:11.675 05:53:26 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@12 -- # local i 00:25:11.675 05:53:26 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:25:11.675 05:53:26 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:25:11.675 05:53:26 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:25:11.932 [2024-07-26 05:53:26.641246] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x205e490 00:25:11.932 /dev/nbd0 00:25:11.932 05:53:26 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:25:11.932 05:53:26 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:25:11.932 05:53:26 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:25:11.932 05:53:26 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # local i 00:25:11.933 05:53:26 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:25:11.933 05:53:26 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:25:11.933 05:53:26 bdev_raid.raid_rebuild_test_sb 
-- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:25:11.933 05:53:26 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # break 00:25:11.933 05:53:26 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:25:11.933 05:53:26 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:25:11.933 05:53:26 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:25:11.933 1+0 records in 00:25:11.933 1+0 records out 00:25:11.933 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000250112 s, 16.4 MB/s 00:25:11.933 05:53:26 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:11.933 05:53:26 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # size=4096 00:25:11.933 05:53:26 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:11.933 05:53:26 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:25:11.933 05:53:26 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # return 0 00:25:11.933 05:53:26 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:25:11.933 05:53:26 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:25:11.933 05:53:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@628 -- # '[' raid1 = raid5f ']' 00:25:11.933 05:53:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@632 -- # write_unit_size=1 00:25:11.933 05:53:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@634 -- # dd if=/dev/urandom of=/dev/nbd0 bs=512 count=63488 oflag=direct 00:25:17.209 63488+0 records in 00:25:17.209 63488+0 records out 00:25:17.209 32505856 bytes (33 MB, 
31 MiB) copied, 5.16747 s, 6.3 MB/s 00:25:17.209 05:53:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@635 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:25:17.209 05:53:31 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:17.209 05:53:31 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:25:17.209 05:53:31 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # local nbd_list 00:25:17.209 05:53:31 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@51 -- # local i 00:25:17.209 05:53:31 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:25:17.209 05:53:31 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:25:17.467 05:53:32 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:25:17.467 [2024-07-26 05:53:32.125468] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:17.467 05:53:32 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:25:17.467 05:53:32 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:25:17.467 05:53:32 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:25:17.467 05:53:32 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:25:17.467 05:53:32 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:25:17.467 05:53:32 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:25:17.467 05:53:32 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:25:17.467 05:53:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_remove_base_bdev BaseBdev1 00:25:17.467 [2024-07-26 05:53:32.358134] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:25:17.725 05:53:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:25:17.725 05:53:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:17.725 05:53:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:17.725 05:53:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:17.725 05:53:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:17.725 05:53:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:25:17.725 05:53:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:17.725 05:53:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:17.725 05:53:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:17.725 05:53:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:17.725 05:53:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:17.725 05:53:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:17.983 05:53:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:17.983 "name": "raid_bdev1", 00:25:17.983 "uuid": "fef35efb-54fb-40f6-b333-aa19a4c3db5a", 00:25:17.983 "strip_size_kb": 0, 00:25:17.983 "state": "online", 00:25:17.983 "raid_level": "raid1", 00:25:17.983 "superblock": true, 00:25:17.983 "num_base_bdevs": 2, 00:25:17.983 "num_base_bdevs_discovered": 1, 00:25:17.983 
"num_base_bdevs_operational": 1, 00:25:17.983 "base_bdevs_list": [ 00:25:17.983 { 00:25:17.983 "name": null, 00:25:17.983 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:17.983 "is_configured": false, 00:25:17.983 "data_offset": 2048, 00:25:17.983 "data_size": 63488 00:25:17.983 }, 00:25:17.983 { 00:25:17.983 "name": "BaseBdev2", 00:25:17.983 "uuid": "adee0bb0-299e-5243-b2ad-3bda19b28d09", 00:25:17.983 "is_configured": true, 00:25:17.983 "data_offset": 2048, 00:25:17.983 "data_size": 63488 00:25:17.983 } 00:25:17.983 ] 00:25:17.983 }' 00:25:17.983 05:53:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:17.983 05:53:32 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:25:18.549 05:53:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:25:18.549 [2024-07-26 05:53:33.449019] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:18.549 [2024-07-26 05:53:33.453954] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x205e490 00:25:18.549 [2024-07-26 05:53:33.456147] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:25:18.805 05:53:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@646 -- # sleep 1 00:25:19.739 05:53:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:19.739 05:53:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:19.739 05:53:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:19.739 05:53:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:19.740 05:53:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 
00:25:19.740 05:53:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:19.740 05:53:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:19.998 05:53:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:19.998 "name": "raid_bdev1", 00:25:19.998 "uuid": "fef35efb-54fb-40f6-b333-aa19a4c3db5a", 00:25:19.998 "strip_size_kb": 0, 00:25:19.998 "state": "online", 00:25:19.998 "raid_level": "raid1", 00:25:19.998 "superblock": true, 00:25:19.998 "num_base_bdevs": 2, 00:25:19.998 "num_base_bdevs_discovered": 2, 00:25:19.998 "num_base_bdevs_operational": 2, 00:25:19.998 "process": { 00:25:19.998 "type": "rebuild", 00:25:19.998 "target": "spare", 00:25:19.998 "progress": { 00:25:19.998 "blocks": 24576, 00:25:19.998 "percent": 38 00:25:19.998 } 00:25:19.998 }, 00:25:19.998 "base_bdevs_list": [ 00:25:19.998 { 00:25:19.998 "name": "spare", 00:25:19.998 "uuid": "4e9e156f-1745-54fd-bcae-211b7c68285c", 00:25:19.998 "is_configured": true, 00:25:19.998 "data_offset": 2048, 00:25:19.998 "data_size": 63488 00:25:19.998 }, 00:25:19.998 { 00:25:19.998 "name": "BaseBdev2", 00:25:19.998 "uuid": "adee0bb0-299e-5243-b2ad-3bda19b28d09", 00:25:19.998 "is_configured": true, 00:25:19.998 "data_offset": 2048, 00:25:19.998 "data_size": 63488 00:25:19.998 } 00:25:19.998 ] 00:25:19.998 }' 00:25:19.998 05:53:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:19.998 05:53:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:19.998 05:53:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:19.998 05:53:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:19.998 05:53:34 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:25:20.256 [2024-07-26 05:53:35.026838] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:20.256 [2024-07-26 05:53:35.068391] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:25:20.256 [2024-07-26 05:53:35.068438] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:20.256 [2024-07-26 05:53:35.068453] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:20.256 [2024-07-26 05:53:35.068462] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:25:20.256 05:53:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:25:20.256 05:53:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:20.256 05:53:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:20.256 05:53:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:20.256 05:53:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:20.256 05:53:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:25:20.256 05:53:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:20.256 05:53:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:20.256 05:53:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:20.256 05:53:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:20.256 05:53:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:20.256 05:53:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:20.514 05:53:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:20.514 "name": "raid_bdev1", 00:25:20.514 "uuid": "fef35efb-54fb-40f6-b333-aa19a4c3db5a", 00:25:20.514 "strip_size_kb": 0, 00:25:20.514 "state": "online", 00:25:20.514 "raid_level": "raid1", 00:25:20.514 "superblock": true, 00:25:20.514 "num_base_bdevs": 2, 00:25:20.514 "num_base_bdevs_discovered": 1, 00:25:20.514 "num_base_bdevs_operational": 1, 00:25:20.514 "base_bdevs_list": [ 00:25:20.514 { 00:25:20.514 "name": null, 00:25:20.514 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:20.514 "is_configured": false, 00:25:20.514 "data_offset": 2048, 00:25:20.514 "data_size": 63488 00:25:20.514 }, 00:25:20.514 { 00:25:20.514 "name": "BaseBdev2", 00:25:20.514 "uuid": "adee0bb0-299e-5243-b2ad-3bda19b28d09", 00:25:20.514 "is_configured": true, 00:25:20.514 "data_offset": 2048, 00:25:20.514 "data_size": 63488 00:25:20.514 } 00:25:20.514 ] 00:25:20.514 }' 00:25:20.514 05:53:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:20.514 05:53:35 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:25:21.080 05:53:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:21.081 05:53:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:21.081 05:53:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:21.081 05:53:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:21.081 05:53:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:21.081 05:53:35 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:21.081 05:53:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:21.339 05:53:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:21.339 "name": "raid_bdev1", 00:25:21.339 "uuid": "fef35efb-54fb-40f6-b333-aa19a4c3db5a", 00:25:21.339 "strip_size_kb": 0, 00:25:21.339 "state": "online", 00:25:21.339 "raid_level": "raid1", 00:25:21.339 "superblock": true, 00:25:21.339 "num_base_bdevs": 2, 00:25:21.339 "num_base_bdevs_discovered": 1, 00:25:21.339 "num_base_bdevs_operational": 1, 00:25:21.339 "base_bdevs_list": [ 00:25:21.339 { 00:25:21.339 "name": null, 00:25:21.339 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:21.339 "is_configured": false, 00:25:21.339 "data_offset": 2048, 00:25:21.339 "data_size": 63488 00:25:21.339 }, 00:25:21.339 { 00:25:21.339 "name": "BaseBdev2", 00:25:21.339 "uuid": "adee0bb0-299e-5243-b2ad-3bda19b28d09", 00:25:21.339 "is_configured": true, 00:25:21.339 "data_offset": 2048, 00:25:21.339 "data_size": 63488 00:25:21.339 } 00:25:21.339 ] 00:25:21.339 }' 00:25:21.339 05:53:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:21.597 05:53:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:21.597 05:53:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:21.597 05:53:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:21.597 05:53:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:25:21.875 [2024-07-26 05:53:36.517343] 
bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:21.875 [2024-07-26 05:53:36.522257] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x205e490 00:25:21.875 [2024-07-26 05:53:36.523721] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:25:21.875 05:53:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@662 -- # sleep 1 00:25:22.830 05:53:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:22.830 05:53:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:22.830 05:53:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:22.830 05:53:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:22.830 05:53:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:22.830 05:53:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:22.830 05:53:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:23.090 05:53:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:23.090 "name": "raid_bdev1", 00:25:23.090 "uuid": "fef35efb-54fb-40f6-b333-aa19a4c3db5a", 00:25:23.090 "strip_size_kb": 0, 00:25:23.090 "state": "online", 00:25:23.090 "raid_level": "raid1", 00:25:23.090 "superblock": true, 00:25:23.090 "num_base_bdevs": 2, 00:25:23.090 "num_base_bdevs_discovered": 2, 00:25:23.090 "num_base_bdevs_operational": 2, 00:25:23.090 "process": { 00:25:23.090 "type": "rebuild", 00:25:23.090 "target": "spare", 00:25:23.090 "progress": { 00:25:23.090 "blocks": 24576, 00:25:23.090 "percent": 38 00:25:23.090 } 00:25:23.090 }, 00:25:23.090 
"base_bdevs_list": [ 00:25:23.090 { 00:25:23.090 "name": "spare", 00:25:23.090 "uuid": "4e9e156f-1745-54fd-bcae-211b7c68285c", 00:25:23.090 "is_configured": true, 00:25:23.090 "data_offset": 2048, 00:25:23.090 "data_size": 63488 00:25:23.090 }, 00:25:23.090 { 00:25:23.090 "name": "BaseBdev2", 00:25:23.090 "uuid": "adee0bb0-299e-5243-b2ad-3bda19b28d09", 00:25:23.090 "is_configured": true, 00:25:23.090 "data_offset": 2048, 00:25:23.090 "data_size": 63488 00:25:23.090 } 00:25:23.090 ] 00:25:23.090 }' 00:25:23.090 05:53:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:23.090 05:53:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:23.090 05:53:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:23.090 05:53:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:23.090 05:53:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:25:23.090 05:53:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:25:23.090 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:25:23.090 05:53:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:25:23.090 05:53:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:25:23.090 05:53:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']' 00:25:23.090 05:53:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@705 -- # local timeout=787 00:25:23.090 05:53:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:25:23.090 05:53:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:23.090 05:53:37 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:23.090 05:53:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:23.090 05:53:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:23.090 05:53:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:23.090 05:53:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:23.090 05:53:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:23.349 05:53:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:23.349 "name": "raid_bdev1", 00:25:23.349 "uuid": "fef35efb-54fb-40f6-b333-aa19a4c3db5a", 00:25:23.349 "strip_size_kb": 0, 00:25:23.349 "state": "online", 00:25:23.349 "raid_level": "raid1", 00:25:23.349 "superblock": true, 00:25:23.349 "num_base_bdevs": 2, 00:25:23.349 "num_base_bdevs_discovered": 2, 00:25:23.349 "num_base_bdevs_operational": 2, 00:25:23.349 "process": { 00:25:23.349 "type": "rebuild", 00:25:23.349 "target": "spare", 00:25:23.349 "progress": { 00:25:23.349 "blocks": 30720, 00:25:23.349 "percent": 48 00:25:23.349 } 00:25:23.349 }, 00:25:23.349 "base_bdevs_list": [ 00:25:23.349 { 00:25:23.349 "name": "spare", 00:25:23.349 "uuid": "4e9e156f-1745-54fd-bcae-211b7c68285c", 00:25:23.349 "is_configured": true, 00:25:23.349 "data_offset": 2048, 00:25:23.349 "data_size": 63488 00:25:23.349 }, 00:25:23.349 { 00:25:23.349 "name": "BaseBdev2", 00:25:23.349 "uuid": "adee0bb0-299e-5243-b2ad-3bda19b28d09", 00:25:23.349 "is_configured": true, 00:25:23.349 "data_offset": 2048, 00:25:23.349 "data_size": 63488 00:25:23.349 } 00:25:23.349 ] 00:25:23.349 }' 00:25:23.349 05:53:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r 
'.process.type // "none"' 00:25:23.349 05:53:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:23.350 05:53:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:23.350 05:53:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:23.350 05:53:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@710 -- # sleep 1 00:25:24.728 05:53:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:25:24.728 05:53:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:24.728 05:53:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:24.728 05:53:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:24.728 05:53:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:24.728 05:53:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:24.728 05:53:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:24.728 05:53:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:24.728 05:53:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:24.728 "name": "raid_bdev1", 00:25:24.728 "uuid": "fef35efb-54fb-40f6-b333-aa19a4c3db5a", 00:25:24.728 "strip_size_kb": 0, 00:25:24.728 "state": "online", 00:25:24.728 "raid_level": "raid1", 00:25:24.728 "superblock": true, 00:25:24.728 "num_base_bdevs": 2, 00:25:24.728 "num_base_bdevs_discovered": 2, 00:25:24.728 "num_base_bdevs_operational": 2, 00:25:24.728 "process": { 00:25:24.728 "type": "rebuild", 00:25:24.728 "target": "spare", 
00:25:24.728 "progress": { 00:25:24.728 "blocks": 59392, 00:25:24.728 "percent": 93 00:25:24.728 } 00:25:24.728 }, 00:25:24.728 "base_bdevs_list": [ 00:25:24.728 { 00:25:24.728 "name": "spare", 00:25:24.728 "uuid": "4e9e156f-1745-54fd-bcae-211b7c68285c", 00:25:24.728 "is_configured": true, 00:25:24.728 "data_offset": 2048, 00:25:24.728 "data_size": 63488 00:25:24.728 }, 00:25:24.728 { 00:25:24.728 "name": "BaseBdev2", 00:25:24.728 "uuid": "adee0bb0-299e-5243-b2ad-3bda19b28d09", 00:25:24.728 "is_configured": true, 00:25:24.728 "data_offset": 2048, 00:25:24.728 "data_size": 63488 00:25:24.728 } 00:25:24.728 ] 00:25:24.728 }' 00:25:24.728 05:53:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:24.728 05:53:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:24.728 05:53:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:24.728 05:53:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:24.728 05:53:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@710 -- # sleep 1 00:25:24.987 [2024-07-26 05:53:39.648073] bdev_raid.c:2870:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:25:24.987 [2024-07-26 05:53:39.648140] bdev_raid.c:2532:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:25:24.987 [2024-07-26 05:53:39.648222] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:25.925 05:53:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:25:25.925 05:53:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:25.925 05:53:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:25.925 05:53:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # 
local process_type=rebuild 00:25:25.925 05:53:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:25.925 05:53:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:25.925 05:53:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:25.925 05:53:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:25.925 05:53:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:25.925 "name": "raid_bdev1", 00:25:25.925 "uuid": "fef35efb-54fb-40f6-b333-aa19a4c3db5a", 00:25:25.925 "strip_size_kb": 0, 00:25:25.925 "state": "online", 00:25:25.925 "raid_level": "raid1", 00:25:25.925 "superblock": true, 00:25:25.925 "num_base_bdevs": 2, 00:25:25.925 "num_base_bdevs_discovered": 2, 00:25:25.925 "num_base_bdevs_operational": 2, 00:25:25.925 "base_bdevs_list": [ 00:25:25.925 { 00:25:25.925 "name": "spare", 00:25:25.925 "uuid": "4e9e156f-1745-54fd-bcae-211b7c68285c", 00:25:25.925 "is_configured": true, 00:25:25.925 "data_offset": 2048, 00:25:25.925 "data_size": 63488 00:25:25.925 }, 00:25:25.925 { 00:25:25.925 "name": "BaseBdev2", 00:25:25.925 "uuid": "adee0bb0-299e-5243-b2ad-3bda19b28d09", 00:25:25.925 "is_configured": true, 00:25:25.925 "data_offset": 2048, 00:25:25.925 "data_size": 63488 00:25:25.925 } 00:25:25.925 ] 00:25:25.925 }' 00:25:25.925 05:53:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:26.184 05:53:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:25:26.184 05:53:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:26.184 05:53:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:25:26.184 
05:53:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@708 -- # break 00:25:26.184 05:53:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:26.184 05:53:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:26.184 05:53:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:26.184 05:53:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:26.184 05:53:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:26.184 05:53:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:26.184 05:53:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:26.443 05:53:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:26.443 "name": "raid_bdev1", 00:25:26.443 "uuid": "fef35efb-54fb-40f6-b333-aa19a4c3db5a", 00:25:26.443 "strip_size_kb": 0, 00:25:26.443 "state": "online", 00:25:26.443 "raid_level": "raid1", 00:25:26.443 "superblock": true, 00:25:26.443 "num_base_bdevs": 2, 00:25:26.443 "num_base_bdevs_discovered": 2, 00:25:26.443 "num_base_bdevs_operational": 2, 00:25:26.443 "base_bdevs_list": [ 00:25:26.443 { 00:25:26.443 "name": "spare", 00:25:26.443 "uuid": "4e9e156f-1745-54fd-bcae-211b7c68285c", 00:25:26.443 "is_configured": true, 00:25:26.443 "data_offset": 2048, 00:25:26.443 "data_size": 63488 00:25:26.443 }, 00:25:26.443 { 00:25:26.443 "name": "BaseBdev2", 00:25:26.443 "uuid": "adee0bb0-299e-5243-b2ad-3bda19b28d09", 00:25:26.443 "is_configured": true, 00:25:26.443 "data_offset": 2048, 00:25:26.443 "data_size": 63488 00:25:26.443 } 00:25:26.443 ] 00:25:26.443 }' 00:25:26.443 05:53:41 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:26.443 05:53:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:26.443 05:53:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:26.443 05:53:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:26.443 05:53:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:25:26.443 05:53:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:26.443 05:53:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:26.443 05:53:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:26.443 05:53:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:26.443 05:53:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:25:26.443 05:53:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:26.443 05:53:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:26.443 05:53:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:26.443 05:53:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:26.443 05:53:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:26.443 05:53:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:26.703 05:53:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:26.703 "name": "raid_bdev1", 00:25:26.703 "uuid": 
"fef35efb-54fb-40f6-b333-aa19a4c3db5a", 00:25:26.703 "strip_size_kb": 0, 00:25:26.703 "state": "online", 00:25:26.703 "raid_level": "raid1", 00:25:26.703 "superblock": true, 00:25:26.703 "num_base_bdevs": 2, 00:25:26.703 "num_base_bdevs_discovered": 2, 00:25:26.703 "num_base_bdevs_operational": 2, 00:25:26.703 "base_bdevs_list": [ 00:25:26.703 { 00:25:26.703 "name": "spare", 00:25:26.703 "uuid": "4e9e156f-1745-54fd-bcae-211b7c68285c", 00:25:26.703 "is_configured": true, 00:25:26.703 "data_offset": 2048, 00:25:26.703 "data_size": 63488 00:25:26.703 }, 00:25:26.703 { 00:25:26.703 "name": "BaseBdev2", 00:25:26.703 "uuid": "adee0bb0-299e-5243-b2ad-3bda19b28d09", 00:25:26.703 "is_configured": true, 00:25:26.703 "data_offset": 2048, 00:25:26.703 "data_size": 63488 00:25:26.703 } 00:25:26.703 ] 00:25:26.703 }' 00:25:26.703 05:53:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:26.703 05:53:41 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:25:27.270 05:53:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:25:27.529 [2024-07-26 05:53:42.268620] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:25:27.529 [2024-07-26 05:53:42.268653] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:25:27.529 [2024-07-26 05:53:42.268712] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:25:27.529 [2024-07-26 05:53:42.268769] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:25:27.529 [2024-07-26 05:53:42.268781] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2065070 name raid_bdev1, state offline 00:25:27.529 05:53:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@719 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:27.529 05:53:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@719 -- # jq length 00:25:27.788 05:53:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:25:27.788 05:53:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:25:27.788 05:53:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@722 -- # '[' false = true ']' 00:25:27.788 05:53:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@736 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:25:27.788 05:53:42 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:27.788 05:53:42 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:25:27.788 05:53:42 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # local bdev_list 00:25:27.788 05:53:42 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:25:27.788 05:53:42 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # local nbd_list 00:25:27.788 05:53:42 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@12 -- # local i 00:25:27.788 05:53:42 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:25:27.788 05:53:42 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:25:27.788 05:53:42 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:25:28.047 /dev/nbd0 00:25:28.047 05:53:42 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:25:28.047 05:53:42 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:25:28.047 05:53:42 bdev_raid.raid_rebuild_test_sb -- 
common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:25:28.047 05:53:42 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # local i 00:25:28.047 05:53:42 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:25:28.047 05:53:42 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:25:28.047 05:53:42 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:25:28.047 05:53:42 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # break 00:25:28.047 05:53:42 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:25:28.047 05:53:42 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:25:28.047 05:53:42 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:25:28.047 1+0 records in 00:25:28.047 1+0 records out 00:25:28.047 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000250374 s, 16.4 MB/s 00:25:28.047 05:53:42 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:28.047 05:53:42 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # size=4096 00:25:28.048 05:53:42 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:28.048 05:53:42 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:25:28.048 05:53:42 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # return 0 00:25:28.048 05:53:42 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:25:28.048 05:53:42 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:25:28.048 05:53:42 
bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:25:28.307 /dev/nbd1 00:25:28.307 05:53:43 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:25:28.307 05:53:43 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:25:28.307 05:53:43 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:25:28.307 05:53:43 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # local i 00:25:28.307 05:53:43 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:25:28.307 05:53:43 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:25:28.307 05:53:43 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:25:28.307 05:53:43 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # break 00:25:28.307 05:53:43 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:25:28.307 05:53:43 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:25:28.307 05:53:43 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:25:28.307 1+0 records in 00:25:28.307 1+0 records out 00:25:28.307 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000324068 s, 12.6 MB/s 00:25:28.307 05:53:43 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:28.307 05:53:43 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # size=4096 00:25:28.307 05:53:43 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # rm -f 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:28.307 05:53:43 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:25:28.307 05:53:43 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # return 0 00:25:28.307 05:53:43 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:25:28.307 05:53:43 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:25:28.307 05:53:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@737 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:25:28.307 05:53:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@738 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:25:28.307 05:53:43 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:28.307 05:53:43 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:25:28.307 05:53:43 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # local nbd_list 00:25:28.307 05:53:43 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@51 -- # local i 00:25:28.307 05:53:43 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:25:28.307 05:53:43 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:25:28.568 05:53:43 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:25:28.568 05:53:43 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:25:28.568 05:53:43 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:25:28.568 05:53:43 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:25:28.568 05:53:43 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:25:28.568 05:53:43 
bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:25:28.568 05:53:43 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:25:28.568 05:53:43 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:25:28.568 05:53:43 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:25:28.568 05:53:43 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:25:28.833 05:53:43 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:25:28.833 05:53:43 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:25:28.833 05:53:43 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:25:28.833 05:53:43 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:25:28.833 05:53:43 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:25:28.833 05:53:43 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:25:28.833 05:53:43 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:25:28.833 05:53:43 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:25:28.833 05:53:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:25:28.833 05:53:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:25:29.092 05:53:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@745 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:25:29.351 [2024-07-26 05:53:44.083808] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: 
Match on spare_delay 00:25:29.351 [2024-07-26 05:53:44.083853] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:29.351 [2024-07-26 05:53:44.083878] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1fd0b00 00:25:29.351 [2024-07-26 05:53:44.083891] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:29.351 [2024-07-26 05:53:44.085515] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:29.351 [2024-07-26 05:53:44.085544] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:25:29.351 [2024-07-26 05:53:44.085623] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:25:29.351 [2024-07-26 05:53:44.085657] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:29.352 [2024-07-26 05:53:44.085758] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:25:29.352 spare 00:25:29.352 05:53:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:25:29.352 05:53:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:29.352 05:53:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:29.352 05:53:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:29.352 05:53:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:29.352 05:53:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:25:29.352 05:53:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:29.352 05:53:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:29.352 05:53:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # 
local num_base_bdevs_discovered 00:25:29.352 05:53:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:29.352 05:53:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:29.352 05:53:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:29.352 [2024-07-26 05:53:44.186072] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x2062f40 00:25:29.352 [2024-07-26 05:53:44.186088] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:25:29.352 [2024-07-26 05:53:44.186280] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x205e490 00:25:29.352 [2024-07-26 05:53:44.186436] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2062f40 00:25:29.352 [2024-07-26 05:53:44.186449] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2062f40 00:25:29.352 [2024-07-26 05:53:44.186559] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:29.611 05:53:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:29.611 "name": "raid_bdev1", 00:25:29.611 "uuid": "fef35efb-54fb-40f6-b333-aa19a4c3db5a", 00:25:29.611 "strip_size_kb": 0, 00:25:29.611 "state": "online", 00:25:29.611 "raid_level": "raid1", 00:25:29.611 "superblock": true, 00:25:29.611 "num_base_bdevs": 2, 00:25:29.611 "num_base_bdevs_discovered": 2, 00:25:29.611 "num_base_bdevs_operational": 2, 00:25:29.611 "base_bdevs_list": [ 00:25:29.611 { 00:25:29.611 "name": "spare", 00:25:29.611 "uuid": "4e9e156f-1745-54fd-bcae-211b7c68285c", 00:25:29.611 "is_configured": true, 00:25:29.611 "data_offset": 2048, 00:25:29.611 "data_size": 63488 00:25:29.611 }, 00:25:29.611 { 00:25:29.611 "name": "BaseBdev2", 00:25:29.611 "uuid": 
"adee0bb0-299e-5243-b2ad-3bda19b28d09", 00:25:29.611 "is_configured": true, 00:25:29.611 "data_offset": 2048, 00:25:29.611 "data_size": 63488 00:25:29.611 } 00:25:29.611 ] 00:25:29.611 }' 00:25:29.611 05:53:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:29.611 05:53:44 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:25:30.179 05:53:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:30.179 05:53:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:30.179 05:53:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:30.179 05:53:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:30.179 05:53:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:30.179 05:53:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:30.179 05:53:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:30.439 05:53:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:30.439 "name": "raid_bdev1", 00:25:30.439 "uuid": "fef35efb-54fb-40f6-b333-aa19a4c3db5a", 00:25:30.439 "strip_size_kb": 0, 00:25:30.439 "state": "online", 00:25:30.439 "raid_level": "raid1", 00:25:30.439 "superblock": true, 00:25:30.439 "num_base_bdevs": 2, 00:25:30.439 "num_base_bdevs_discovered": 2, 00:25:30.439 "num_base_bdevs_operational": 2, 00:25:30.439 "base_bdevs_list": [ 00:25:30.439 { 00:25:30.439 "name": "spare", 00:25:30.439 "uuid": "4e9e156f-1745-54fd-bcae-211b7c68285c", 00:25:30.439 "is_configured": true, 00:25:30.439 "data_offset": 2048, 00:25:30.439 "data_size": 63488 00:25:30.439 }, 00:25:30.439 { 
00:25:30.439 "name": "BaseBdev2", 00:25:30.439 "uuid": "adee0bb0-299e-5243-b2ad-3bda19b28d09", 00:25:30.439 "is_configured": true, 00:25:30.439 "data_offset": 2048, 00:25:30.439 "data_size": 63488 00:25:30.439 } 00:25:30.439 ] 00:25:30.439 }' 00:25:30.439 05:53:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:30.439 05:53:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:30.439 05:53:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:30.439 05:53:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:30.439 05:53:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:30.439 05:53:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:25:30.698 05:53:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:25:30.698 05:53:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:25:30.957 [2024-07-26 05:53:45.700191] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:30.957 05:53:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:25:30.957 05:53:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:30.957 05:53:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:30.957 05:53:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:30.957 05:53:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 
00:25:30.957 05:53:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:25:30.957 05:53:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:30.957 05:53:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:30.957 05:53:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:30.957 05:53:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:30.957 05:53:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:30.957 05:53:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:31.217 05:53:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:31.217 "name": "raid_bdev1", 00:25:31.217 "uuid": "fef35efb-54fb-40f6-b333-aa19a4c3db5a", 00:25:31.217 "strip_size_kb": 0, 00:25:31.217 "state": "online", 00:25:31.217 "raid_level": "raid1", 00:25:31.217 "superblock": true, 00:25:31.217 "num_base_bdevs": 2, 00:25:31.217 "num_base_bdevs_discovered": 1, 00:25:31.217 "num_base_bdevs_operational": 1, 00:25:31.217 "base_bdevs_list": [ 00:25:31.217 { 00:25:31.217 "name": null, 00:25:31.217 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:31.217 "is_configured": false, 00:25:31.217 "data_offset": 2048, 00:25:31.217 "data_size": 63488 00:25:31.217 }, 00:25:31.217 { 00:25:31.217 "name": "BaseBdev2", 00:25:31.217 "uuid": "adee0bb0-299e-5243-b2ad-3bda19b28d09", 00:25:31.217 "is_configured": true, 00:25:31.217 "data_offset": 2048, 00:25:31.217 "data_size": 63488 00:25:31.217 } 00:25:31.217 ] 00:25:31.217 }' 00:25:31.217 05:53:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:31.217 05:53:45 bdev_raid.raid_rebuild_test_sb -- 
common/autotest_common.sh@10 -- # set +x 00:25:31.783 05:53:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:25:32.043 [2024-07-26 05:53:46.738976] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:32.043 [2024-07-26 05:53:46.739123] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:25:32.043 [2024-07-26 05:53:46.739140] bdev_raid.c:3712:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:25:32.043 [2024-07-26 05:53:46.739167] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:32.043 [2024-07-26 05:53:46.744006] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x205e490 00:25:32.043 [2024-07-26 05:53:46.746315] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:25:32.043 05:53:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@755 -- # sleep 1 00:25:32.979 05:53:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:32.979 05:53:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:32.979 05:53:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:32.979 05:53:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:32.979 05:53:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:32.979 05:53:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:32.979 05:53:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:33.238 05:53:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:33.238 "name": "raid_bdev1", 00:25:33.238 "uuid": "fef35efb-54fb-40f6-b333-aa19a4c3db5a", 00:25:33.238 "strip_size_kb": 0, 00:25:33.238 "state": "online", 00:25:33.238 "raid_level": "raid1", 00:25:33.238 "superblock": true, 00:25:33.238 "num_base_bdevs": 2, 00:25:33.238 "num_base_bdevs_discovered": 2, 00:25:33.238 "num_base_bdevs_operational": 2, 00:25:33.238 "process": { 00:25:33.238 "type": "rebuild", 00:25:33.238 "target": "spare", 00:25:33.238 "progress": { 00:25:33.238 "blocks": 22528, 00:25:33.238 "percent": 35 00:25:33.238 } 00:25:33.238 }, 00:25:33.238 "base_bdevs_list": [ 00:25:33.238 { 00:25:33.238 "name": "spare", 00:25:33.238 "uuid": "4e9e156f-1745-54fd-bcae-211b7c68285c", 00:25:33.238 "is_configured": true, 00:25:33.238 "data_offset": 2048, 00:25:33.238 "data_size": 63488 00:25:33.238 }, 00:25:33.238 { 00:25:33.238 "name": "BaseBdev2", 00:25:33.238 "uuid": "adee0bb0-299e-5243-b2ad-3bda19b28d09", 00:25:33.238 "is_configured": true, 00:25:33.238 "data_offset": 2048, 00:25:33.238 "data_size": 63488 00:25:33.238 } 00:25:33.238 ] 00:25:33.238 }' 00:25:33.238 05:53:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:33.238 05:53:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:33.238 05:53:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:33.238 05:53:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:33.238 05:53:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@759 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:25:33.497 [2024-07-26 05:53:48.264310] 
bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:33.497 [2024-07-26 05:53:48.358523] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:25:33.497 [2024-07-26 05:53:48.358572] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:33.497 [2024-07-26 05:53:48.358588] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:33.497 [2024-07-26 05:53:48.358596] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:25:33.497 05:53:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:25:33.497 05:53:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:33.497 05:53:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:33.497 05:53:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:33.497 05:53:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:33.497 05:53:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:25:33.497 05:53:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:33.497 05:53:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:33.497 05:53:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:33.497 05:53:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:33.497 05:53:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:33.497 05:53:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == 
"raid_bdev1")' 00:25:33.756 05:53:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:33.756 "name": "raid_bdev1", 00:25:33.756 "uuid": "fef35efb-54fb-40f6-b333-aa19a4c3db5a", 00:25:33.756 "strip_size_kb": 0, 00:25:33.756 "state": "online", 00:25:33.756 "raid_level": "raid1", 00:25:33.756 "superblock": true, 00:25:33.756 "num_base_bdevs": 2, 00:25:33.756 "num_base_bdevs_discovered": 1, 00:25:33.756 "num_base_bdevs_operational": 1, 00:25:33.756 "base_bdevs_list": [ 00:25:33.756 { 00:25:33.756 "name": null, 00:25:33.756 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:33.756 "is_configured": false, 00:25:33.756 "data_offset": 2048, 00:25:33.756 "data_size": 63488 00:25:33.756 }, 00:25:33.756 { 00:25:33.756 "name": "BaseBdev2", 00:25:33.756 "uuid": "adee0bb0-299e-5243-b2ad-3bda19b28d09", 00:25:33.756 "is_configured": true, 00:25:33.756 "data_offset": 2048, 00:25:33.756 "data_size": 63488 00:25:33.756 } 00:25:33.756 ] 00:25:33.756 }' 00:25:33.756 05:53:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:33.756 05:53:48 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:25:34.693 05:53:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:25:34.693 [2024-07-26 05:53:49.458423] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:25:34.693 [2024-07-26 05:53:49.458473] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:34.693 [2024-07-26 05:53:49.458498] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x20632c0 00:25:34.693 [2024-07-26 05:53:49.458510] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:34.693 [2024-07-26 05:53:49.458880] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 
00:25:34.693 [2024-07-26 05:53:49.458899] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:25:34.693 [2024-07-26 05:53:49.458978] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:25:34.693 [2024-07-26 05:53:49.458990] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:25:34.693 [2024-07-26 05:53:49.459001] bdev_raid.c:3712:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:25:34.693 [2024-07-26 05:53:49.459020] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:34.693 [2024-07-26 05:53:49.463839] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x20673d0 00:25:34.693 spare 00:25:34.693 [2024-07-26 05:53:49.465289] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:25:34.693 05:53:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@762 -- # sleep 1 00:25:35.629 05:53:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:35.629 05:53:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:35.629 05:53:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:35.629 05:53:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:35.629 05:53:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:35.629 05:53:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:35.629 05:53:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:35.887 05:53:50 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:35.887 "name": "raid_bdev1", 00:25:35.887 "uuid": "fef35efb-54fb-40f6-b333-aa19a4c3db5a", 00:25:35.887 "strip_size_kb": 0, 00:25:35.887 "state": "online", 00:25:35.887 "raid_level": "raid1", 00:25:35.887 "superblock": true, 00:25:35.887 "num_base_bdevs": 2, 00:25:35.887 "num_base_bdevs_discovered": 2, 00:25:35.887 "num_base_bdevs_operational": 2, 00:25:35.887 "process": { 00:25:35.887 "type": "rebuild", 00:25:35.887 "target": "spare", 00:25:35.887 "progress": { 00:25:35.887 "blocks": 24576, 00:25:35.887 "percent": 38 00:25:35.887 } 00:25:35.887 }, 00:25:35.887 "base_bdevs_list": [ 00:25:35.887 { 00:25:35.887 "name": "spare", 00:25:35.887 "uuid": "4e9e156f-1745-54fd-bcae-211b7c68285c", 00:25:35.887 "is_configured": true, 00:25:35.887 "data_offset": 2048, 00:25:35.888 "data_size": 63488 00:25:35.888 }, 00:25:35.888 { 00:25:35.888 "name": "BaseBdev2", 00:25:35.888 "uuid": "adee0bb0-299e-5243-b2ad-3bda19b28d09", 00:25:35.888 "is_configured": true, 00:25:35.888 "data_offset": 2048, 00:25:35.888 "data_size": 63488 00:25:35.888 } 00:25:35.888 ] 00:25:35.888 }' 00:25:35.888 05:53:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:35.888 05:53:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:35.888 05:53:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:36.146 05:53:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:36.146 05:53:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:25:36.405 [2024-07-26 05:53:51.060309] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:36.405 [2024-07-26 05:53:51.077621] 
bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:25:36.405 [2024-07-26 05:53:51.077670] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:36.405 [2024-07-26 05:53:51.077686] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:36.405 [2024-07-26 05:53:51.077694] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:25:36.405 05:53:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:25:36.405 05:53:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:36.405 05:53:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:36.405 05:53:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:36.405 05:53:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:36.405 05:53:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:25:36.405 05:53:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:36.405 05:53:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:36.405 05:53:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:36.405 05:53:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:36.405 05:53:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:36.405 05:53:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:36.664 05:53:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # 
raid_bdev_info='{ 00:25:36.664 "name": "raid_bdev1", 00:25:36.664 "uuid": "fef35efb-54fb-40f6-b333-aa19a4c3db5a", 00:25:36.664 "strip_size_kb": 0, 00:25:36.664 "state": "online", 00:25:36.664 "raid_level": "raid1", 00:25:36.664 "superblock": true, 00:25:36.664 "num_base_bdevs": 2, 00:25:36.664 "num_base_bdevs_discovered": 1, 00:25:36.664 "num_base_bdevs_operational": 1, 00:25:36.664 "base_bdevs_list": [ 00:25:36.664 { 00:25:36.664 "name": null, 00:25:36.664 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:36.664 "is_configured": false, 00:25:36.664 "data_offset": 2048, 00:25:36.664 "data_size": 63488 00:25:36.664 }, 00:25:36.664 { 00:25:36.664 "name": "BaseBdev2", 00:25:36.664 "uuid": "adee0bb0-299e-5243-b2ad-3bda19b28d09", 00:25:36.664 "is_configured": true, 00:25:36.664 "data_offset": 2048, 00:25:36.664 "data_size": 63488 00:25:36.664 } 00:25:36.664 ] 00:25:36.664 }' 00:25:36.664 05:53:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:36.664 05:53:51 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:25:37.252 05:53:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:37.252 05:53:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:37.252 05:53:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:37.252 05:53:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:37.252 05:53:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:37.252 05:53:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:37.252 05:53:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:37.512 05:53:52 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:37.512 "name": "raid_bdev1", 00:25:37.512 "uuid": "fef35efb-54fb-40f6-b333-aa19a4c3db5a", 00:25:37.512 "strip_size_kb": 0, 00:25:37.512 "state": "online", 00:25:37.512 "raid_level": "raid1", 00:25:37.512 "superblock": true, 00:25:37.512 "num_base_bdevs": 2, 00:25:37.512 "num_base_bdevs_discovered": 1, 00:25:37.512 "num_base_bdevs_operational": 1, 00:25:37.512 "base_bdevs_list": [ 00:25:37.512 { 00:25:37.512 "name": null, 00:25:37.512 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:37.512 "is_configured": false, 00:25:37.512 "data_offset": 2048, 00:25:37.512 "data_size": 63488 00:25:37.512 }, 00:25:37.512 { 00:25:37.512 "name": "BaseBdev2", 00:25:37.512 "uuid": "adee0bb0-299e-5243-b2ad-3bda19b28d09", 00:25:37.512 "is_configured": true, 00:25:37.512 "data_offset": 2048, 00:25:37.512 "data_size": 63488 00:25:37.513 } 00:25:37.513 ] 00:25:37.513 }' 00:25:37.513 05:53:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:37.513 05:53:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:37.513 05:53:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:37.513 05:53:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:37.513 05:53:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:25:37.771 05:53:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:25:38.029 [2024-07-26 05:53:52.750518] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:25:38.029 [2024-07-26 05:53:52.750560] 
vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:38.029 [2024-07-26 05:53:52.750580] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x205f860 00:25:38.029 [2024-07-26 05:53:52.750592] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:38.029 [2024-07-26 05:53:52.750933] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:38.029 [2024-07-26 05:53:52.750951] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:25:38.029 [2024-07-26 05:53:52.751013] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:25:38.029 [2024-07-26 05:53:52.751025] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:25:38.029 [2024-07-26 05:53:52.751035] bdev_raid.c:3673:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:25:38.029 BaseBdev1 00:25:38.029 05:53:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@773 -- # sleep 1 00:25:38.964 05:53:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:25:38.964 05:53:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:38.964 05:53:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:38.964 05:53:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:38.964 05:53:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:38.964 05:53:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:25:38.964 05:53:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:38.964 05:53:53 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:38.964 05:53:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:38.964 05:53:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:38.964 05:53:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:38.964 05:53:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:39.222 05:53:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:39.222 "name": "raid_bdev1", 00:25:39.222 "uuid": "fef35efb-54fb-40f6-b333-aa19a4c3db5a", 00:25:39.222 "strip_size_kb": 0, 00:25:39.222 "state": "online", 00:25:39.222 "raid_level": "raid1", 00:25:39.222 "superblock": true, 00:25:39.222 "num_base_bdevs": 2, 00:25:39.222 "num_base_bdevs_discovered": 1, 00:25:39.222 "num_base_bdevs_operational": 1, 00:25:39.222 "base_bdevs_list": [ 00:25:39.222 { 00:25:39.222 "name": null, 00:25:39.222 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:39.222 "is_configured": false, 00:25:39.222 "data_offset": 2048, 00:25:39.222 "data_size": 63488 00:25:39.222 }, 00:25:39.222 { 00:25:39.222 "name": "BaseBdev2", 00:25:39.222 "uuid": "adee0bb0-299e-5243-b2ad-3bda19b28d09", 00:25:39.222 "is_configured": true, 00:25:39.222 "data_offset": 2048, 00:25:39.222 "data_size": 63488 00:25:39.222 } 00:25:39.222 ] 00:25:39.222 }' 00:25:39.222 05:53:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:39.222 05:53:54 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:25:39.788 05:53:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:39.788 05:53:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 
00:25:39.788 05:53:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:39.788 05:53:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:39.788 05:53:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:39.788 05:53:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:39.788 05:53:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:40.047 05:53:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:40.047 "name": "raid_bdev1", 00:25:40.047 "uuid": "fef35efb-54fb-40f6-b333-aa19a4c3db5a", 00:25:40.047 "strip_size_kb": 0, 00:25:40.047 "state": "online", 00:25:40.047 "raid_level": "raid1", 00:25:40.047 "superblock": true, 00:25:40.047 "num_base_bdevs": 2, 00:25:40.047 "num_base_bdevs_discovered": 1, 00:25:40.047 "num_base_bdevs_operational": 1, 00:25:40.047 "base_bdevs_list": [ 00:25:40.047 { 00:25:40.047 "name": null, 00:25:40.047 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:40.047 "is_configured": false, 00:25:40.047 "data_offset": 2048, 00:25:40.047 "data_size": 63488 00:25:40.047 }, 00:25:40.047 { 00:25:40.047 "name": "BaseBdev2", 00:25:40.047 "uuid": "adee0bb0-299e-5243-b2ad-3bda19b28d09", 00:25:40.047 "is_configured": true, 00:25:40.047 "data_offset": 2048, 00:25:40.047 "data_size": 63488 00:25:40.047 } 00:25:40.047 ] 00:25:40.047 }' 00:25:40.047 05:53:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:40.047 05:53:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:40.047 05:53:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:40.308 05:53:54 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:40.308 05:53:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:25:40.308 05:53:54 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@648 -- # local es=0 00:25:40.308 05:53:54 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:25:40.308 05:53:54 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:25:40.308 05:53:54 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:25:40.308 05:53:54 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:25:40.308 05:53:54 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:25:40.308 05:53:54 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:25:40.308 05:53:54 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:25:40.308 05:53:54 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:25:40.308 05:53:54 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:25:40.308 05:53:54 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 
BaseBdev1 00:25:40.308 [2024-07-26 05:53:55.205066] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:25:40.308 [2024-07-26 05:53:55.205186] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:25:40.308 [2024-07-26 05:53:55.205201] bdev_raid.c:3673:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:25:40.308 request: 00:25:40.308 { 00:25:40.308 "base_bdev": "BaseBdev1", 00:25:40.308 "raid_bdev": "raid_bdev1", 00:25:40.308 "method": "bdev_raid_add_base_bdev", 00:25:40.308 "req_id": 1 00:25:40.308 } 00:25:40.308 Got JSON-RPC error response 00:25:40.308 response: 00:25:40.308 { 00:25:40.308 "code": -22, 00:25:40.308 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:25:40.308 } 00:25:40.567 05:53:55 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@651 -- # es=1 00:25:40.567 05:53:55 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:25:40.567 05:53:55 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:25:40.567 05:53:55 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:25:40.567 05:53:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@777 -- # sleep 1 00:25:41.504 05:53:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:25:41.504 05:53:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:41.504 05:53:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:41.504 05:53:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:41.504 05:53:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:41.504 05:53:56 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:25:41.504 05:53:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:41.504 05:53:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:41.504 05:53:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:41.504 05:53:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:41.504 05:53:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:41.504 05:53:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:41.763 05:53:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:41.763 "name": "raid_bdev1", 00:25:41.763 "uuid": "fef35efb-54fb-40f6-b333-aa19a4c3db5a", 00:25:41.763 "strip_size_kb": 0, 00:25:41.763 "state": "online", 00:25:41.763 "raid_level": "raid1", 00:25:41.763 "superblock": true, 00:25:41.763 "num_base_bdevs": 2, 00:25:41.763 "num_base_bdevs_discovered": 1, 00:25:41.763 "num_base_bdevs_operational": 1, 00:25:41.763 "base_bdevs_list": [ 00:25:41.763 { 00:25:41.763 "name": null, 00:25:41.763 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:41.763 "is_configured": false, 00:25:41.763 "data_offset": 2048, 00:25:41.763 "data_size": 63488 00:25:41.763 }, 00:25:41.763 { 00:25:41.763 "name": "BaseBdev2", 00:25:41.763 "uuid": "adee0bb0-299e-5243-b2ad-3bda19b28d09", 00:25:41.763 "is_configured": true, 00:25:41.763 "data_offset": 2048, 00:25:41.763 "data_size": 63488 00:25:41.763 } 00:25:41.763 ] 00:25:41.763 }' 00:25:41.763 05:53:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:41.763 05:53:56 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:25:42.330 05:53:57 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:42.330 05:53:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:42.331 05:53:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:42.331 05:53:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:42.331 05:53:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:42.331 05:53:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:42.331 05:53:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:42.590 05:53:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:42.590 "name": "raid_bdev1", 00:25:42.590 "uuid": "fef35efb-54fb-40f6-b333-aa19a4c3db5a", 00:25:42.590 "strip_size_kb": 0, 00:25:42.590 "state": "online", 00:25:42.590 "raid_level": "raid1", 00:25:42.590 "superblock": true, 00:25:42.590 "num_base_bdevs": 2, 00:25:42.590 "num_base_bdevs_discovered": 1, 00:25:42.590 "num_base_bdevs_operational": 1, 00:25:42.590 "base_bdevs_list": [ 00:25:42.590 { 00:25:42.590 "name": null, 00:25:42.590 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:42.590 "is_configured": false, 00:25:42.590 "data_offset": 2048, 00:25:42.590 "data_size": 63488 00:25:42.590 }, 00:25:42.590 { 00:25:42.591 "name": "BaseBdev2", 00:25:42.591 "uuid": "adee0bb0-299e-5243-b2ad-3bda19b28d09", 00:25:42.591 "is_configured": true, 00:25:42.591 "data_offset": 2048, 00:25:42.591 "data_size": 63488 00:25:42.591 } 00:25:42.591 ] 00:25:42.591 }' 00:25:42.591 05:53:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:42.591 05:53:57 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:42.591 05:53:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:42.591 05:53:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:42.591 05:53:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@782 -- # killprocess 1236730 00:25:42.591 05:53:57 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@948 -- # '[' -z 1236730 ']' 00:25:42.591 05:53:57 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@952 -- # kill -0 1236730 00:25:42.591 05:53:57 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@953 -- # uname 00:25:42.591 05:53:57 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:25:42.591 05:53:57 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1236730 00:25:42.591 05:53:57 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:25:42.591 05:53:57 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:25:42.591 05:53:57 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1236730' 00:25:42.591 killing process with pid 1236730 00:25:42.591 05:53:57 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@967 -- # kill 1236730 00:25:42.591 Received shutdown signal, test time was about 60.000000 seconds 00:25:42.591 00:25:42.591 Latency(us) 00:25:42.591 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:25:42.591 =================================================================================================================== 00:25:42.591 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:25:42.591 [2024-07-26 05:53:57.470978] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:25:42.591 [2024-07-26 05:53:57.471074] 
bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:25:42.591 [2024-07-26 05:53:57.471118] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:25:42.591 [2024-07-26 05:53:57.471131] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2062f40 name raid_bdev1, state offline 00:25:42.591 05:53:57 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@972 -- # wait 1236730 00:25:42.850 [2024-07-26 05:53:57.499153] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:25:42.850 05:53:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@784 -- # return 0 00:25:42.850 00:25:42.850 real 0m35.641s 00:25:42.850 user 0m51.617s 00:25:42.850 sys 0m6.786s 00:25:42.850 05:53:57 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:25:42.850 05:53:57 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:25:42.850 ************************************ 00:25:42.850 END TEST raid_rebuild_test_sb 00:25:42.850 ************************************ 00:25:43.110 05:53:57 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:25:43.110 05:53:57 bdev_raid -- bdev/bdev_raid.sh@879 -- # run_test raid_rebuild_test_io raid_rebuild_test raid1 2 false true true 00:25:43.110 05:53:57 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:25:43.110 05:53:57 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:25:43.110 05:53:57 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:25:43.110 ************************************ 00:25:43.110 START TEST raid_rebuild_test_io 00:25:43.110 ************************************ 00:25:43.110 05:53:57 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 2 false true true 00:25:43.110 05:53:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:25:43.110 05:53:57 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:25:43.110 05:53:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@570 -- # local superblock=false 00:25:43.110 05:53:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@571 -- # local background_io=true 00:25:43.110 05:53:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@572 -- # local verify=true 00:25:43.110 05:53:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:25:43.110 05:53:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:25:43.110 05:53:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:25:43.110 05:53:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:25:43.110 05:53:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:25:43.110 05:53:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:25:43.110 05:53:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:25:43.110 05:53:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:25:43.110 05:53:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:25:43.110 05:53:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:25:43.110 05:53:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:25:43.110 05:53:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # local strip_size 00:25:43.110 05:53:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@576 -- # local create_arg 00:25:43.110 05:53:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:25:43.110 05:53:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@578 -- # local data_offset 00:25:43.110 05:53:57 bdev_raid.raid_rebuild_test_io -- 
bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:25:43.110 05:53:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:25:43.110 05:53:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@591 -- # '[' false = true ']' 00:25:43.110 05:53:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@596 -- # raid_pid=1241767 00:25:43.110 05:53:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@597 -- # waitforlisten 1241767 /var/tmp/spdk-raid.sock 00:25:43.110 05:53:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:25:43.110 05:53:57 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@829 -- # '[' -z 1241767 ']' 00:25:43.110 05:53:57 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:25:43.110 05:53:57 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@834 -- # local max_retries=100 00:25:43.110 05:53:57 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:25:43.110 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:25:43.110 05:53:57 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@838 -- # xtrace_disable 00:25:43.110 05:53:57 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:25:43.110 [2024-07-26 05:53:57.876033] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 
00:25:43.110 [2024-07-26 05:53:57.876090] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1241767 ] 00:25:43.110 I/O size of 3145728 is greater than zero copy threshold (65536). 00:25:43.110 Zero copy mechanism will not be used. 00:25:43.110 [2024-07-26 05:53:57.990856] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:25:43.368 [2024-07-26 05:53:58.094075] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:25:43.368 [2024-07-26 05:53:58.160499] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:25:43.368 [2024-07-26 05:53:58.160538] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:25:43.934 05:53:58 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:25:43.934 05:53:58 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@862 -- # return 0 00:25:43.934 05:53:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:25:43.934 05:53:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:25:44.192 BaseBdev1_malloc 00:25:44.192 05:53:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:25:44.451 [2024-07-26 05:53:59.218471] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:25:44.451 [2024-07-26 05:53:59.218522] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:44.451 [2024-07-26 05:53:59.218546] vbdev_passthru.c: 681:vbdev_passthru_register: 
*NOTICE*: io_device created at: 0x0x1814d40 00:25:44.451 [2024-07-26 05:53:59.218559] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:44.451 [2024-07-26 05:53:59.220238] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:44.451 [2024-07-26 05:53:59.220269] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:25:44.451 BaseBdev1 00:25:44.451 05:53:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:25:44.451 05:53:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:25:44.710 BaseBdev2_malloc 00:25:44.710 05:53:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:25:44.968 [2024-07-26 05:53:59.704745] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:25:44.968 [2024-07-26 05:53:59.704794] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:44.968 [2024-07-26 05:53:59.704817] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1815860 00:25:44.968 [2024-07-26 05:53:59.704829] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:44.968 [2024-07-26 05:53:59.706282] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:44.968 [2024-07-26 05:53:59.706311] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:25:44.968 BaseBdev2 00:25:44.968 05:53:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 
00:25:45.227 spare_malloc 00:25:45.227 05:53:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:25:45.486 spare_delay 00:25:45.486 05:54:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:25:45.746 [2024-07-26 05:54:00.447389] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:25:45.746 [2024-07-26 05:54:00.447438] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:45.746 [2024-07-26 05:54:00.447459] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x19c3ec0 00:25:45.746 [2024-07-26 05:54:00.447472] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:45.746 [2024-07-26 05:54:00.448998] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:45.746 [2024-07-26 05:54:00.449027] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:25:45.746 spare 00:25:45.746 05:54:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:25:46.005 [2024-07-26 05:54:00.696059] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:25:46.005 [2024-07-26 05:54:00.697237] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:25:46.005 [2024-07-26 05:54:00.697314] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x19c5070 00:25:46.005 [2024-07-26 05:54:00.697326] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 
00:25:46.005 [2024-07-26 05:54:00.697532] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x19be490 00:25:46.005 [2024-07-26 05:54:00.697679] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x19c5070 00:25:46.005 [2024-07-26 05:54:00.697690] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x19c5070 00:25:46.005 [2024-07-26 05:54:00.697797] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:46.005 05:54:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:25:46.005 05:54:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:46.005 05:54:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:46.005 05:54:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:46.005 05:54:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:46.005 05:54:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:25:46.005 05:54:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:46.005 05:54:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:46.005 05:54:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:46.006 05:54:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:46.006 05:54:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:46.006 05:54:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:46.264 05:54:00 bdev_raid.raid_rebuild_test_io -- 
bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:46.265 "name": "raid_bdev1", 00:25:46.265 "uuid": "e26a8c48-f0f5-4b66-b299-72caa147bd19", 00:25:46.265 "strip_size_kb": 0, 00:25:46.265 "state": "online", 00:25:46.265 "raid_level": "raid1", 00:25:46.265 "superblock": false, 00:25:46.265 "num_base_bdevs": 2, 00:25:46.265 "num_base_bdevs_discovered": 2, 00:25:46.265 "num_base_bdevs_operational": 2, 00:25:46.265 "base_bdevs_list": [ 00:25:46.265 { 00:25:46.265 "name": "BaseBdev1", 00:25:46.265 "uuid": "45587897-68fe-5fcc-afc7-7feb8dbdbca2", 00:25:46.265 "is_configured": true, 00:25:46.265 "data_offset": 0, 00:25:46.265 "data_size": 65536 00:25:46.265 }, 00:25:46.265 { 00:25:46.265 "name": "BaseBdev2", 00:25:46.265 "uuid": "9a2a5020-37ee-5530-8396-7839fe145e60", 00:25:46.265 "is_configured": true, 00:25:46.265 "data_offset": 0, 00:25:46.265 "data_size": 65536 00:25:46.265 } 00:25:46.265 ] 00:25:46.265 }' 00:25:46.265 05:54:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:46.265 05:54:00 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:25:46.831 05:54:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:25:46.831 05:54:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:25:47.089 [2024-07-26 05:54:01.747072] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:25:47.089 05:54:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=65536 00:25:47.089 05:54:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:47.089 05:54:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:25:47.347 05:54:02 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # data_offset=0 00:25:47.347 05:54:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@620 -- # '[' true = true ']' 00:25:47.347 05:54:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:25:47.347 05:54:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@622 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:25:47.347 [2024-07-26 05:54:02.129947] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x19bfbd0 00:25:47.347 I/O size of 3145728 is greater than zero copy threshold (65536). 00:25:47.347 Zero copy mechanism will not be used. 00:25:47.347 Running I/O for 60 seconds... 00:25:47.347 [2024-07-26 05:54:02.247507] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:25:47.347 [2024-07-26 05:54:02.247696] bdev_raid.c:1945:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x19bfbd0 00:25:47.606 05:54:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:25:47.606 05:54:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:47.606 05:54:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:47.606 05:54:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:47.606 05:54:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:47.606 05:54:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:25:47.606 05:54:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:47.606 05:54:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # 
local num_base_bdevs 00:25:47.606 05:54:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:47.606 05:54:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:47.606 05:54:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:47.606 05:54:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:47.864 05:54:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:47.864 "name": "raid_bdev1", 00:25:47.864 "uuid": "e26a8c48-f0f5-4b66-b299-72caa147bd19", 00:25:47.864 "strip_size_kb": 0, 00:25:47.864 "state": "online", 00:25:47.864 "raid_level": "raid1", 00:25:47.864 "superblock": false, 00:25:47.864 "num_base_bdevs": 2, 00:25:47.864 "num_base_bdevs_discovered": 1, 00:25:47.864 "num_base_bdevs_operational": 1, 00:25:47.864 "base_bdevs_list": [ 00:25:47.864 { 00:25:47.864 "name": null, 00:25:47.864 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:47.864 "is_configured": false, 00:25:47.864 "data_offset": 0, 00:25:47.864 "data_size": 65536 00:25:47.864 }, 00:25:47.864 { 00:25:47.864 "name": "BaseBdev2", 00:25:47.864 "uuid": "9a2a5020-37ee-5530-8396-7839fe145e60", 00:25:47.864 "is_configured": true, 00:25:47.865 "data_offset": 0, 00:25:47.865 "data_size": 65536 00:25:47.865 } 00:25:47.865 ] 00:25:47.865 }' 00:25:47.865 05:54:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:47.865 05:54:02 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:25:48.431 05:54:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:25:48.689 [2024-07-26 05:54:03.385589] 
bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:48.689 [2024-07-26 05:54:03.444527] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x19478b0 00:25:48.689 05:54:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@646 -- # sleep 1 00:25:48.689 [2024-07-26 05:54:03.447023] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:25:48.689 [2024-07-26 05:54:03.564961] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:25:48.689 [2024-07-26 05:54:03.565357] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:25:48.946 [2024-07-26 05:54:03.792877] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:25:48.946 [2024-07-26 05:54:03.793018] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:25:49.512 [2024-07-26 05:54:04.312828] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:25:49.771 05:54:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:49.771 05:54:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:49.771 05:54:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:49.771 05:54:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:49.771 05:54:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:49.771 05:54:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:49.771 
05:54:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:49.771 [2024-07-26 05:54:04.652326] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:25:50.029 05:54:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:50.029 "name": "raid_bdev1", 00:25:50.029 "uuid": "e26a8c48-f0f5-4b66-b299-72caa147bd19", 00:25:50.029 "strip_size_kb": 0, 00:25:50.029 "state": "online", 00:25:50.029 "raid_level": "raid1", 00:25:50.029 "superblock": false, 00:25:50.029 "num_base_bdevs": 2, 00:25:50.029 "num_base_bdevs_discovered": 2, 00:25:50.029 "num_base_bdevs_operational": 2, 00:25:50.029 "process": { 00:25:50.029 "type": "rebuild", 00:25:50.029 "target": "spare", 00:25:50.029 "progress": { 00:25:50.029 "blocks": 14336, 00:25:50.029 "percent": 21 00:25:50.029 } 00:25:50.029 }, 00:25:50.029 "base_bdevs_list": [ 00:25:50.029 { 00:25:50.029 "name": "spare", 00:25:50.029 "uuid": "74105e58-0b25-598a-b809-8d17bf7466e5", 00:25:50.029 "is_configured": true, 00:25:50.029 "data_offset": 0, 00:25:50.029 "data_size": 65536 00:25:50.029 }, 00:25:50.029 { 00:25:50.029 "name": "BaseBdev2", 00:25:50.029 "uuid": "9a2a5020-37ee-5530-8396-7839fe145e60", 00:25:50.029 "is_configured": true, 00:25:50.029 "data_offset": 0, 00:25:50.029 "data_size": 65536 00:25:50.029 } 00:25:50.029 ] 00:25:50.029 }' 00:25:50.029 05:54:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:50.029 05:54:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:50.029 05:54:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:50.029 [2024-07-26 05:54:04.780050] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:25:50.029 05:54:04 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:50.029 05:54:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:25:50.288 [2024-07-26 05:54:05.027377] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:50.288 [2024-07-26 05:54:05.162645] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:25:50.288 [2024-07-26 05:54:05.172912] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:50.288 [2024-07-26 05:54:05.172946] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:50.288 [2024-07-26 05:54:05.172963] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:25:50.288 [2024-07-26 05:54:05.195229] bdev_raid.c:1945:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x19bfbd0 00:25:50.546 05:54:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:25:50.546 05:54:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:50.546 05:54:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:50.546 05:54:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:50.546 05:54:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:50.546 05:54:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:25:50.546 05:54:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:50.546 05:54:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:50.546 05:54:05 bdev_raid.raid_rebuild_test_io 
-- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:50.546 05:54:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:50.546 05:54:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:50.546 05:54:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:50.546 05:54:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:50.546 "name": "raid_bdev1", 00:25:50.546 "uuid": "e26a8c48-f0f5-4b66-b299-72caa147bd19", 00:25:50.546 "strip_size_kb": 0, 00:25:50.546 "state": "online", 00:25:50.546 "raid_level": "raid1", 00:25:50.546 "superblock": false, 00:25:50.546 "num_base_bdevs": 2, 00:25:50.546 "num_base_bdevs_discovered": 1, 00:25:50.546 "num_base_bdevs_operational": 1, 00:25:50.546 "base_bdevs_list": [ 00:25:50.546 { 00:25:50.546 "name": null, 00:25:50.546 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:50.546 "is_configured": false, 00:25:50.546 "data_offset": 0, 00:25:50.546 "data_size": 65536 00:25:50.546 }, 00:25:50.546 { 00:25:50.546 "name": "BaseBdev2", 00:25:50.546 "uuid": "9a2a5020-37ee-5530-8396-7839fe145e60", 00:25:50.546 "is_configured": true, 00:25:50.546 "data_offset": 0, 00:25:50.546 "data_size": 65536 00:25:50.546 } 00:25:50.546 ] 00:25:50.546 }' 00:25:50.546 05:54:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:50.546 05:54:05 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:25:51.482 05:54:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:51.482 05:54:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:51.482 05:54:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 
00:25:51.482 05:54:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:51.482 05:54:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:51.482 05:54:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:51.482 05:54:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:51.482 05:54:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:51.482 "name": "raid_bdev1", 00:25:51.482 "uuid": "e26a8c48-f0f5-4b66-b299-72caa147bd19", 00:25:51.482 "strip_size_kb": 0, 00:25:51.482 "state": "online", 00:25:51.482 "raid_level": "raid1", 00:25:51.482 "superblock": false, 00:25:51.482 "num_base_bdevs": 2, 00:25:51.482 "num_base_bdevs_discovered": 1, 00:25:51.482 "num_base_bdevs_operational": 1, 00:25:51.482 "base_bdevs_list": [ 00:25:51.482 { 00:25:51.482 "name": null, 00:25:51.482 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:51.482 "is_configured": false, 00:25:51.482 "data_offset": 0, 00:25:51.482 "data_size": 65536 00:25:51.482 }, 00:25:51.482 { 00:25:51.482 "name": "BaseBdev2", 00:25:51.482 "uuid": "9a2a5020-37ee-5530-8396-7839fe145e60", 00:25:51.482 "is_configured": true, 00:25:51.482 "data_offset": 0, 00:25:51.482 "data_size": 65536 00:25:51.482 } 00:25:51.482 ] 00:25:51.482 }' 00:25:51.482 05:54:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:51.482 05:54:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:51.482 05:54:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:51.482 05:54:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:51.482 05:54:06 bdev_raid.raid_rebuild_test_io -- 
bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:25:51.763 [2024-07-26 05:54:06.568067] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:51.763 05:54:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@662 -- # sleep 1 00:25:51.763 [2024-07-26 05:54:06.644388] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x19482a0 00:25:51.763 [2024-07-26 05:54:06.645967] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:25:52.034 [2024-07-26 05:54:06.765155] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:25:52.034 [2024-07-26 05:54:06.765710] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:25:52.292 [2024-07-26 05:54:07.011916] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:25:52.292 [2024-07-26 05:54:07.012215] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:25:52.550 [2024-07-26 05:54:07.278795] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:25:52.809 [2024-07-26 05:54:07.489575] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:25:52.809 [2024-07-26 05:54:07.489764] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:25:52.809 05:54:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:52.809 05:54:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 
00:25:52.809 05:54:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:52.809 05:54:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:52.809 05:54:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:52.809 05:54:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:52.809 05:54:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:53.068 [2024-07-26 05:54:07.811380] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:25:53.068 [2024-07-26 05:54:07.811931] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:25:53.068 05:54:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:53.068 "name": "raid_bdev1", 00:25:53.068 "uuid": "e26a8c48-f0f5-4b66-b299-72caa147bd19", 00:25:53.068 "strip_size_kb": 0, 00:25:53.068 "state": "online", 00:25:53.068 "raid_level": "raid1", 00:25:53.068 "superblock": false, 00:25:53.068 "num_base_bdevs": 2, 00:25:53.068 "num_base_bdevs_discovered": 2, 00:25:53.068 "num_base_bdevs_operational": 2, 00:25:53.068 "process": { 00:25:53.068 "type": "rebuild", 00:25:53.068 "target": "spare", 00:25:53.068 "progress": { 00:25:53.068 "blocks": 14336, 00:25:53.068 "percent": 21 00:25:53.068 } 00:25:53.068 }, 00:25:53.068 "base_bdevs_list": [ 00:25:53.068 { 00:25:53.068 "name": "spare", 00:25:53.068 "uuid": "74105e58-0b25-598a-b809-8d17bf7466e5", 00:25:53.068 "is_configured": true, 00:25:53.068 "data_offset": 0, 00:25:53.068 "data_size": 65536 00:25:53.068 }, 00:25:53.068 { 00:25:53.068 "name": "BaseBdev2", 00:25:53.068 "uuid": "9a2a5020-37ee-5530-8396-7839fe145e60", 
00:25:53.068 "is_configured": true, 00:25:53.068 "data_offset": 0, 00:25:53.068 "data_size": 65536 00:25:53.068 } 00:25:53.068 ] 00:25:53.068 }' 00:25:53.068 05:54:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:53.068 05:54:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:53.068 05:54:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:53.326 05:54:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:53.326 05:54:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@665 -- # '[' false = true ']' 00:25:53.326 05:54:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:25:53.326 05:54:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:25:53.326 05:54:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']' 00:25:53.326 05:54:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@705 -- # local timeout=817 00:25:53.326 05:54:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:25:53.326 05:54:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:53.326 05:54:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:53.326 05:54:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:53.326 05:54:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:53.326 05:54:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:53.326 05:54:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 
00:25:53.326 05:54:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:53.585 05:54:08 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:53.585 "name": "raid_bdev1", 00:25:53.585 "uuid": "e26a8c48-f0f5-4b66-b299-72caa147bd19", 00:25:53.585 "strip_size_kb": 0, 00:25:53.585 "state": "online", 00:25:53.585 "raid_level": "raid1", 00:25:53.585 "superblock": false, 00:25:53.585 "num_base_bdevs": 2, 00:25:53.585 "num_base_bdevs_discovered": 2, 00:25:53.585 "num_base_bdevs_operational": 2, 00:25:53.585 "process": { 00:25:53.585 "type": "rebuild", 00:25:53.585 "target": "spare", 00:25:53.585 "progress": { 00:25:53.585 "blocks": 20480, 00:25:53.585 "percent": 31 00:25:53.585 } 00:25:53.585 }, 00:25:53.585 "base_bdevs_list": [ 00:25:53.585 { 00:25:53.585 "name": "spare", 00:25:53.585 "uuid": "74105e58-0b25-598a-b809-8d17bf7466e5", 00:25:53.585 "is_configured": true, 00:25:53.585 "data_offset": 0, 00:25:53.585 "data_size": 65536 00:25:53.585 }, 00:25:53.585 { 00:25:53.585 "name": "BaseBdev2", 00:25:53.585 "uuid": "9a2a5020-37ee-5530-8396-7839fe145e60", 00:25:53.585 "is_configured": true, 00:25:53.585 "data_offset": 0, 00:25:53.585 "data_size": 65536 00:25:53.585 } 00:25:53.585 ] 00:25:53.585 }' 00:25:53.585 05:54:08 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:53.585 05:54:08 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:53.585 05:54:08 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:53.585 [2024-07-26 05:54:08.307709] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 22528 offset_begin: 18432 offset_end: 24576 00:25:53.585 [2024-07-26 05:54:08.307895] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 22528 offset_begin: 18432 offset_end: 24576 00:25:53.585 05:54:08 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:53.585 05:54:08 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:25:53.843 [2024-07-26 05:54:08.538389] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 26624 offset_begin: 24576 offset_end: 30720 00:25:53.843 [2024-07-26 05:54:08.658923] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 28672 offset_begin: 24576 offset_end: 30720 00:25:54.102 [2024-07-26 05:54:08.887708] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 32768 offset_begin: 30720 offset_end: 36864 00:25:54.669 05:54:09 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:25:54.669 05:54:09 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:54.669 05:54:09 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:54.669 05:54:09 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:54.669 05:54:09 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:54.669 05:54:09 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:54.669 05:54:09 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:54.669 05:54:09 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:54.669 [2024-07-26 05:54:09.445844] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 40960 offset_begin: 36864 offset_end: 43008 00:25:54.927 05:54:09 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:54.927 "name": "raid_bdev1", 00:25:54.927 "uuid": 
"e26a8c48-f0f5-4b66-b299-72caa147bd19", 00:25:54.927 "strip_size_kb": 0, 00:25:54.927 "state": "online", 00:25:54.927 "raid_level": "raid1", 00:25:54.927 "superblock": false, 00:25:54.927 "num_base_bdevs": 2, 00:25:54.927 "num_base_bdevs_discovered": 2, 00:25:54.927 "num_base_bdevs_operational": 2, 00:25:54.927 "process": { 00:25:54.927 "type": "rebuild", 00:25:54.927 "target": "spare", 00:25:54.927 "progress": { 00:25:54.927 "blocks": 43008, 00:25:54.927 "percent": 65 00:25:54.927 } 00:25:54.927 }, 00:25:54.927 "base_bdevs_list": [ 00:25:54.927 { 00:25:54.927 "name": "spare", 00:25:54.927 "uuid": "74105e58-0b25-598a-b809-8d17bf7466e5", 00:25:54.927 "is_configured": true, 00:25:54.927 "data_offset": 0, 00:25:54.927 "data_size": 65536 00:25:54.927 }, 00:25:54.927 { 00:25:54.927 "name": "BaseBdev2", 00:25:54.927 "uuid": "9a2a5020-37ee-5530-8396-7839fe145e60", 00:25:54.927 "is_configured": true, 00:25:54.927 "data_offset": 0, 00:25:54.927 "data_size": 65536 00:25:54.927 } 00:25:54.927 ] 00:25:54.927 }' 00:25:54.927 05:54:09 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:54.927 05:54:09 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:54.927 05:54:09 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:54.927 05:54:09 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:54.927 05:54:09 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:25:55.862 05:54:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:25:55.862 05:54:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:55.862 05:54:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:55.862 05:54:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # 
local process_type=rebuild 00:25:55.862 05:54:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:55.862 05:54:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:55.862 05:54:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:55.862 05:54:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:55.862 [2024-07-26 05:54:10.764066] bdev_raid.c:2870:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:25:56.120 [2024-07-26 05:54:10.872301] bdev_raid.c:2532:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:25:56.120 [2024-07-26 05:54:10.874403] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:56.120 05:54:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:56.120 "name": "raid_bdev1", 00:25:56.120 "uuid": "e26a8c48-f0f5-4b66-b299-72caa147bd19", 00:25:56.120 "strip_size_kb": 0, 00:25:56.120 "state": "online", 00:25:56.120 "raid_level": "raid1", 00:25:56.120 "superblock": false, 00:25:56.120 "num_base_bdevs": 2, 00:25:56.120 "num_base_bdevs_discovered": 2, 00:25:56.120 "num_base_bdevs_operational": 2, 00:25:56.120 "base_bdevs_list": [ 00:25:56.120 { 00:25:56.120 "name": "spare", 00:25:56.120 "uuid": "74105e58-0b25-598a-b809-8d17bf7466e5", 00:25:56.120 "is_configured": true, 00:25:56.120 "data_offset": 0, 00:25:56.120 "data_size": 65536 00:25:56.120 }, 00:25:56.120 { 00:25:56.120 "name": "BaseBdev2", 00:25:56.120 "uuid": "9a2a5020-37ee-5530-8396-7839fe145e60", 00:25:56.120 "is_configured": true, 00:25:56.120 "data_offset": 0, 00:25:56.120 "data_size": 65536 00:25:56.120 } 00:25:56.120 ] 00:25:56.120 }' 00:25:56.120 05:54:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r 
'.process.type // "none"' 00:25:56.120 05:54:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:25:56.120 05:54:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:56.120 05:54:11 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:25:56.120 05:54:11 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@708 -- # break 00:25:56.120 05:54:11 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:56.120 05:54:11 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:56.120 05:54:11 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:56.120 05:54:11 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:56.120 05:54:11 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:56.120 05:54:11 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:56.120 05:54:11 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:56.378 05:54:11 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:56.378 "name": "raid_bdev1", 00:25:56.378 "uuid": "e26a8c48-f0f5-4b66-b299-72caa147bd19", 00:25:56.378 "strip_size_kb": 0, 00:25:56.378 "state": "online", 00:25:56.378 "raid_level": "raid1", 00:25:56.378 "superblock": false, 00:25:56.378 "num_base_bdevs": 2, 00:25:56.378 "num_base_bdevs_discovered": 2, 00:25:56.378 "num_base_bdevs_operational": 2, 00:25:56.378 "base_bdevs_list": [ 00:25:56.378 { 00:25:56.378 "name": "spare", 00:25:56.378 "uuid": "74105e58-0b25-598a-b809-8d17bf7466e5", 00:25:56.378 "is_configured": true, 00:25:56.378 "data_offset": 0, 
00:25:56.378 "data_size": 65536 00:25:56.378 }, 00:25:56.378 { 00:25:56.378 "name": "BaseBdev2", 00:25:56.378 "uuid": "9a2a5020-37ee-5530-8396-7839fe145e60", 00:25:56.378 "is_configured": true, 00:25:56.378 "data_offset": 0, 00:25:56.378 "data_size": 65536 00:25:56.378 } 00:25:56.378 ] 00:25:56.378 }' 00:25:56.378 05:54:11 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:56.637 05:54:11 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:56.637 05:54:11 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:56.637 05:54:11 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:56.637 05:54:11 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:25:56.637 05:54:11 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:56.637 05:54:11 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:56.637 05:54:11 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:56.637 05:54:11 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:56.637 05:54:11 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:25:56.637 05:54:11 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:56.637 05:54:11 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:56.637 05:54:11 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:56.637 05:54:11 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:56.637 05:54:11 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:56.637 05:54:11 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:56.895 05:54:11 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:56.895 "name": "raid_bdev1", 00:25:56.895 "uuid": "e26a8c48-f0f5-4b66-b299-72caa147bd19", 00:25:56.895 "strip_size_kb": 0, 00:25:56.895 "state": "online", 00:25:56.895 "raid_level": "raid1", 00:25:56.895 "superblock": false, 00:25:56.895 "num_base_bdevs": 2, 00:25:56.895 "num_base_bdevs_discovered": 2, 00:25:56.895 "num_base_bdevs_operational": 2, 00:25:56.895 "base_bdevs_list": [ 00:25:56.895 { 00:25:56.895 "name": "spare", 00:25:56.895 "uuid": "74105e58-0b25-598a-b809-8d17bf7466e5", 00:25:56.895 "is_configured": true, 00:25:56.895 "data_offset": 0, 00:25:56.895 "data_size": 65536 00:25:56.895 }, 00:25:56.895 { 00:25:56.895 "name": "BaseBdev2", 00:25:56.895 "uuid": "9a2a5020-37ee-5530-8396-7839fe145e60", 00:25:56.895 "is_configured": true, 00:25:56.895 "data_offset": 0, 00:25:56.895 "data_size": 65536 00:25:56.895 } 00:25:56.895 ] 00:25:56.895 }' 00:25:56.895 05:54:11 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:56.895 05:54:11 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:25:57.462 05:54:12 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:25:57.721 [2024-07-26 05:54:12.429083] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:25:57.721 [2024-07-26 05:54:12.429116] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:25:57.721
00:25:57.721 Latency(us)
00:25:57.721 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:25:57.721 Job: raid_bdev1 (Core Mask 0x1, workload: randrw, percentage: 50, depth: 2, IO size: 3145728)
00:25:57.721 raid_bdev1 : 10.37 103.59 310.77 0.00 0.00 12760.24 293.84 118534.68
00:25:57.721 ===================================================================================================================
00:25:57.721 Total : 103.59 310.77 0.00 0.00 12760.24 293.84 118534.68
00:25:57.721 [2024-07-26 05:54:12.529337] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:57.721 [2024-07-26 05:54:12.529366] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:25:57.721 [2024-07-26 05:54:12.529441] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:25:57.721 [2024-07-26 05:54:12.529453] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x19c5070 name raid_bdev1, state offline 00:25:57.721 0 00:25:57.721 05:54:12 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:57.721 05:54:12 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@719 -- # jq length 00:25:57.979 05:54:12 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:25:57.979 05:54:12 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:25:57.979 05:54:12 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@722 -- # '[' true = true ']' 00:25:57.979 05:54:12 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@724 -- # nbd_start_disks /var/tmp/spdk-raid.sock spare /dev/nbd0 00:25:57.979 05:54:12 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:57.979 05:54:12 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # bdev_list=('spare') 00:25:57.979 05:54:12 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:25:57.979 05:54:12 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- #
nbd_list=('/dev/nbd0') 00:25:57.979 05:54:12 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:25:57.979 05:54:12 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@12 -- # local i 00:25:57.979 05:54:12 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:25:57.979 05:54:12 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:25:57.979 05:54:12 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd0 00:25:58.238 /dev/nbd0 00:25:58.238 05:54:13 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:25:58.238 05:54:13 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:25:58.238 05:54:13 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:25:58.238 05:54:13 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@867 -- # local i 00:25:58.238 05:54:13 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:25:58.238 05:54:13 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:25:58.238 05:54:13 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:25:58.238 05:54:13 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # break 00:25:58.238 05:54:13 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:25:58.238 05:54:13 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:25:58.238 05:54:13 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:25:58.238 1+0 records in 00:25:58.238 1+0 records out 00:25:58.238 4096 bytes (4.1 kB, 4.0 KiB) copied, 
0.000290592 s, 14.1 MB/s 00:25:58.238 05:54:13 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:58.238 05:54:13 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # size=4096 00:25:58.238 05:54:13 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:58.238 05:54:13 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:25:58.238 05:54:13 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@887 -- # return 0 00:25:58.238 05:54:13 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:25:58.238 05:54:13 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:25:58.238 05:54:13 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:25:58.238 05:54:13 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@726 -- # '[' -z BaseBdev2 ']' 00:25:58.238 05:54:13 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@729 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev2 /dev/nbd1 00:25:58.238 05:54:13 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:58.238 05:54:13 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev2') 00:25:58.238 05:54:13 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:25:58.238 05:54:13 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:25:58.238 05:54:13 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:25:58.238 05:54:13 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@12 -- # local i 00:25:58.238 05:54:13 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:25:58.238 05:54:13 bdev_raid.raid_rebuild_test_io -- 
bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:25:58.238 05:54:13 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev2 /dev/nbd1 00:25:58.497 /dev/nbd1 00:25:58.497 05:54:13 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:25:58.497 05:54:13 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:25:58.497 05:54:13 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:25:58.497 05:54:13 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@867 -- # local i 00:25:58.497 05:54:13 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:25:58.497 05:54:13 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:25:58.497 05:54:13 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:25:58.497 05:54:13 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # break 00:25:58.497 05:54:13 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:25:58.497 05:54:13 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:25:58.497 05:54:13 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:25:58.497 1+0 records in 00:25:58.497 1+0 records out 00:25:58.497 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000253668 s, 16.1 MB/s 00:25:58.497 05:54:13 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:58.497 05:54:13 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # size=4096 00:25:58.497 05:54:13 bdev_raid.raid_rebuild_test_io -- 
common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:58.497 05:54:13 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:25:58.497 05:54:13 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@887 -- # return 0 00:25:58.497 05:54:13 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:25:58.497 05:54:13 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:25:58.497 05:54:13 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@730 -- # cmp -i 0 /dev/nbd0 /dev/nbd1 00:25:58.754 05:54:13 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@731 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:25:58.754 05:54:13 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:58.754 05:54:13 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:25:58.754 05:54:13 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:25:58.754 05:54:13 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@51 -- # local i 00:25:58.754 05:54:13 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:25:58.754 05:54:13 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:25:58.754 05:54:13 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:25:58.754 05:54:13 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:25:58.754 05:54:13 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:25:58.754 05:54:13 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:25:58.754 05:54:13 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:25:58.754 05:54:13 
bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:25:58.754 05:54:13 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@41 -- # break 00:25:58.754 05:54:13 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@45 -- # return 0 00:25:58.754 05:54:13 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@733 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:25:58.754 05:54:13 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:58.754 05:54:13 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:25:58.754 05:54:13 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:25:58.754 05:54:13 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@51 -- # local i 00:25:58.754 05:54:13 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:25:58.754 05:54:13 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:25:59.321 05:54:13 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:25:59.321 05:54:13 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:25:59.321 05:54:13 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:25:59.321 05:54:13 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:25:59.321 05:54:13 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:25:59.321 05:54:13 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:25:59.321 05:54:13 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@41 -- # break 00:25:59.321 05:54:13 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@45 -- # return 0 00:25:59.321 05:54:13 bdev_raid.raid_rebuild_test_io -- 
bdev/bdev_raid.sh@742 -- # '[' false = true ']' 00:25:59.321 05:54:13 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@782 -- # killprocess 1241767 00:25:59.321 05:54:13 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@948 -- # '[' -z 1241767 ']' 00:25:59.321 05:54:13 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@952 -- # kill -0 1241767 00:25:59.321 05:54:13 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@953 -- # uname 00:25:59.321 05:54:13 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:25:59.321 05:54:13 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1241767 00:25:59.321 05:54:13 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:25:59.321 05:54:13 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:25:59.321 05:54:13 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1241767' 00:25:59.321 killing process with pid 1241767 00:25:59.321 05:54:13 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@967 -- # kill 1241767 00:25:59.321 Received shutdown signal, test time was about 11.825553 seconds 00:25:59.321 00:25:59.321 Latency(us) 00:25:59.321 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:25:59.321 =================================================================================================================== 00:25:59.321 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:25:59.321 [2024-07-26 05:54:13.986603] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:25:59.321 05:54:13 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@972 -- # wait 1241767 00:25:59.321 [2024-07-26 05:54:14.009031] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:25:59.580 05:54:14 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@784 -- # 
return 0 00:25:59.580 00:25:59.580 real 0m16.432s 00:25:59.580 user 0m25.270s 00:25:59.580 sys 0m2.755s 00:25:59.580 05:54:14 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@1124 -- # xtrace_disable 00:25:59.580 05:54:14 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:25:59.580 ************************************ 00:25:59.580 END TEST raid_rebuild_test_io 00:25:59.580 ************************************ 00:25:59.580 05:54:14 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:25:59.580 05:54:14 bdev_raid -- bdev/bdev_raid.sh@880 -- # run_test raid_rebuild_test_sb_io raid_rebuild_test raid1 2 true true true 00:25:59.580 05:54:14 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:25:59.580 05:54:14 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:25:59.580 05:54:14 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:25:59.580 ************************************ 00:25:59.580 START TEST raid_rebuild_test_sb_io 00:25:59.580 ************************************ 00:25:59.580 05:54:14 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 2 true true true 00:25:59.580 05:54:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:25:59.580 05:54:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:25:59.580 05:54:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:25:59.580 05:54:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@571 -- # local background_io=true 00:25:59.580 05:54:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@572 -- # local verify=true 00:25:59.580 05:54:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:25:59.580 05:54:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:25:59.580 05:54:14 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:25:59.580 05:54:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:25:59.580 05:54:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:25:59.580 05:54:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:25:59.580 05:54:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:25:59.580 05:54:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:25:59.580 05:54:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:25:59.580 05:54:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:25:59.580 05:54:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:25:59.580 05:54:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # local strip_size 00:25:59.580 05:54:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@576 -- # local create_arg 00:25:59.580 05:54:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:25:59.580 05:54:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@578 -- # local data_offset 00:25:59.580 05:54:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:25:59.580 05:54:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:25:59.580 05:54:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:25:59.580 05:54:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:25:59.580 05:54:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@596 -- # raid_pid=1244609 00:25:59.580 05:54:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@597 -- # waitforlisten 1244609 
/var/tmp/spdk-raid.sock 00:25:59.580 05:54:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:25:59.580 05:54:14 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@829 -- # '[' -z 1244609 ']' 00:25:59.580 05:54:14 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:25:59.580 05:54:14 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@834 -- # local max_retries=100 00:25:59.580 05:54:14 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:25:59.580 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:25:59.580 05:54:14 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@838 -- # xtrace_disable 00:25:59.580 05:54:14 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:25:59.580 [2024-07-26 05:54:14.407287] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 00:25:59.580 [2024-07-26 05:54:14.407361] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1244609 ] 00:25:59.580 I/O size of 3145728 is greater than zero copy threshold (65536). 00:25:59.580 Zero copy mechanism will not be used. 
00:25:59.839 [2024-07-26 05:54:14.538535] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:25:59.839 [2024-07-26 05:54:14.639276] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:25:59.839 [2024-07-26 05:54:14.703485] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:25:59.839 [2024-07-26 05:54:14.703526] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:26:00.774 05:54:15 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:26:00.774 05:54:15 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@862 -- # return 0 00:26:00.774 05:54:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:26:00.774 05:54:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:26:00.774 BaseBdev1_malloc 00:26:00.774 05:54:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:26:01.033 [2024-07-26 05:54:15.821645] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:26:01.033 [2024-07-26 05:54:15.821694] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:01.033 [2024-07-26 05:54:15.821717] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x20ded40 00:26:01.033 [2024-07-26 05:54:15.821730] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:01.033 [2024-07-26 05:54:15.823347] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:01.033 [2024-07-26 05:54:15.823375] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:26:01.033 
BaseBdev1 00:26:01.033 05:54:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:26:01.033 05:54:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:26:01.292 BaseBdev2_malloc 00:26:01.292 05:54:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:26:01.551 [2024-07-26 05:54:16.315762] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:26:01.551 [2024-07-26 05:54:16.315805] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:01.551 [2024-07-26 05:54:16.315829] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x20df860 00:26:01.551 [2024-07-26 05:54:16.315842] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:01.551 [2024-07-26 05:54:16.317243] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:01.551 [2024-07-26 05:54:16.317270] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:26:01.551 BaseBdev2 00:26:01.551 05:54:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:26:01.811 spare_malloc 00:26:01.811 05:54:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:26:02.070 spare_delay 00:26:02.070 05:54:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@608 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:26:02.328 [2024-07-26 05:54:17.054262] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:26:02.328 [2024-07-26 05:54:17.054304] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:02.328 [2024-07-26 05:54:17.054324] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x228dec0 00:26:02.328 [2024-07-26 05:54:17.054336] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:02.328 [2024-07-26 05:54:17.055739] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:02.328 [2024-07-26 05:54:17.055764] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:26:02.328 spare 00:26:02.328 05:54:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:26:02.587 [2024-07-26 05:54:17.298936] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:26:02.587 [2024-07-26 05:54:17.300087] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:26:02.587 [2024-07-26 05:54:17.300246] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x228f070 00:26:02.587 [2024-07-26 05:54:17.300260] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:26:02.588 [2024-07-26 05:54:17.300434] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2288490 00:26:02.588 [2024-07-26 05:54:17.300568] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x228f070 00:26:02.588 [2024-07-26 05:54:17.300578] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, 
raid_bdev 0x228f070 00:26:02.588 [2024-07-26 05:54:17.300677] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:02.588 05:54:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:26:02.588 05:54:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:02.588 05:54:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:02.588 05:54:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:02.588 05:54:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:02.588 05:54:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:02.588 05:54:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:02.588 05:54:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:02.588 05:54:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:02.588 05:54:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:02.588 05:54:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:02.588 05:54:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:02.847 05:54:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:02.847 "name": "raid_bdev1", 00:26:02.847 "uuid": "55d79efc-3e3a-4848-b5e8-f5489258a816", 00:26:02.847 "strip_size_kb": 0, 00:26:02.847 "state": "online", 00:26:02.847 "raid_level": "raid1", 00:26:02.847 "superblock": true, 00:26:02.847 "num_base_bdevs": 2, 00:26:02.847 
"num_base_bdevs_discovered": 2, 00:26:02.847 "num_base_bdevs_operational": 2, 00:26:02.847 "base_bdevs_list": [ 00:26:02.847 { 00:26:02.847 "name": "BaseBdev1", 00:26:02.847 "uuid": "49a5dfa9-756b-5c64-8e4d-478e5d24a84a", 00:26:02.847 "is_configured": true, 00:26:02.847 "data_offset": 2048, 00:26:02.847 "data_size": 63488 00:26:02.847 }, 00:26:02.847 { 00:26:02.847 "name": "BaseBdev2", 00:26:02.847 "uuid": "30c86131-dcdb-5009-8291-30a96f208606", 00:26:02.847 "is_configured": true, 00:26:02.847 "data_offset": 2048, 00:26:02.847 "data_size": 63488 00:26:02.847 } 00:26:02.847 ] 00:26:02.847 }' 00:26:02.847 05:54:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:02.847 05:54:17 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:26:03.414 05:54:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:26:03.415 05:54:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:26:03.673 [2024-07-26 05:54:18.390040] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:26:03.673 05:54:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=63488 00:26:03.673 05:54:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:03.673 05:54:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:26:03.932 05:54:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # data_offset=2048 00:26:03.932 05:54:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@620 -- # '[' true = true ']' 00:26:03.932 05:54:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@639 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:26:03.932 05:54:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@622 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:26:03.932 [2024-07-26 05:54:18.768933] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x228fc50 00:26:03.932 I/O size of 3145728 is greater than zero copy threshold (65536). 00:26:03.932 Zero copy mechanism will not be used. 00:26:03.932 Running I/O for 60 seconds... 00:26:03.932 [2024-07-26 05:54:18.820966] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:26:03.932 [2024-07-26 05:54:18.837159] bdev_raid.c:1945:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x228fc50 00:26:04.190 05:54:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:26:04.190 05:54:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:04.190 05:54:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:04.190 05:54:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:04.190 05:54:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:04.190 05:54:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:26:04.190 05:54:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:04.190 05:54:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:04.190 05:54:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:04.190 05:54:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 
00:26:04.190 05:54:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:04.190 05:54:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:04.449 05:54:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:04.449 "name": "raid_bdev1", 00:26:04.449 "uuid": "55d79efc-3e3a-4848-b5e8-f5489258a816", 00:26:04.449 "strip_size_kb": 0, 00:26:04.449 "state": "online", 00:26:04.449 "raid_level": "raid1", 00:26:04.449 "superblock": true, 00:26:04.449 "num_base_bdevs": 2, 00:26:04.449 "num_base_bdevs_discovered": 1, 00:26:04.449 "num_base_bdevs_operational": 1, 00:26:04.449 "base_bdevs_list": [ 00:26:04.449 { 00:26:04.449 "name": null, 00:26:04.449 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:04.449 "is_configured": false, 00:26:04.449 "data_offset": 2048, 00:26:04.449 "data_size": 63488 00:26:04.449 }, 00:26:04.449 { 00:26:04.449 "name": "BaseBdev2", 00:26:04.449 "uuid": "30c86131-dcdb-5009-8291-30a96f208606", 00:26:04.449 "is_configured": true, 00:26:04.449 "data_offset": 2048, 00:26:04.449 "data_size": 63488 00:26:04.449 } 00:26:04.449 ] 00:26:04.449 }' 00:26:04.449 05:54:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:04.449 05:54:19 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:26:05.017 05:54:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:26:05.275 [2024-07-26 05:54:19.940809] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:26:05.275 05:54:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@646 -- # sleep 1 00:26:05.275 [2024-07-26 05:54:20.009034] bdev_raid.c: 
263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x20f11e0 00:26:05.275 [2024-07-26 05:54:20.011418] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:26:05.275 [2024-07-26 05:54:20.149660] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:26:05.275 [2024-07-26 05:54:20.149996] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:26:05.533 [2024-07-26 05:54:20.277593] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:26:05.533 [2024-07-26 05:54:20.277776] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:26:05.792 [2024-07-26 05:54:20.609831] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:26:05.792 [2024-07-26 05:54:20.610151] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:26:06.069 [2024-07-26 05:54:20.738771] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:26:06.069 [2024-07-26 05:54:20.738993] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:26:06.345 05:54:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:06.345 05:54:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:06.345 05:54:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:06.345 05:54:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:06.345 05:54:21 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:06.345 05:54:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:06.345 05:54:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:06.345 [2024-07-26 05:54:21.097834] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:26:06.345 [2024-07-26 05:54:21.098063] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:26:06.603 05:54:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:06.603 "name": "raid_bdev1", 00:26:06.603 "uuid": "55d79efc-3e3a-4848-b5e8-f5489258a816", 00:26:06.603 "strip_size_kb": 0, 00:26:06.603 "state": "online", 00:26:06.603 "raid_level": "raid1", 00:26:06.603 "superblock": true, 00:26:06.603 "num_base_bdevs": 2, 00:26:06.603 "num_base_bdevs_discovered": 2, 00:26:06.603 "num_base_bdevs_operational": 2, 00:26:06.603 "process": { 00:26:06.603 "type": "rebuild", 00:26:06.603 "target": "spare", 00:26:06.603 "progress": { 00:26:06.603 "blocks": 16384, 00:26:06.603 "percent": 25 00:26:06.603 } 00:26:06.603 }, 00:26:06.603 "base_bdevs_list": [ 00:26:06.604 { 00:26:06.604 "name": "spare", 00:26:06.604 "uuid": "3550a152-c318-54c8-8faf-d96f8eb80676", 00:26:06.604 "is_configured": true, 00:26:06.604 "data_offset": 2048, 00:26:06.604 "data_size": 63488 00:26:06.604 }, 00:26:06.604 { 00:26:06.604 "name": "BaseBdev2", 00:26:06.604 "uuid": "30c86131-dcdb-5009-8291-30a96f208606", 00:26:06.604 "is_configured": true, 00:26:06.604 "data_offset": 2048, 00:26:06.604 "data_size": 63488 00:26:06.604 } 00:26:06.604 ] 00:26:06.604 }' 00:26:06.604 05:54:21 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:06.604 05:54:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:06.604 05:54:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:06.604 05:54:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:06.604 05:54:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:26:06.862 [2024-07-26 05:54:21.582875] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:07.121 [2024-07-26 05:54:21.775118] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:26:07.121 [2024-07-26 05:54:21.776951] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:07.121 [2024-07-26 05:54:21.776979] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:07.121 [2024-07-26 05:54:21.776989] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:26:07.121 [2024-07-26 05:54:21.807422] bdev_raid.c:1945:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x228fc50 00:26:07.121 05:54:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:26:07.121 05:54:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:07.121 05:54:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:07.121 05:54:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:07.121 05:54:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:07.121 05:54:21 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:26:07.121 05:54:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:07.121 05:54:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:07.121 05:54:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:07.121 05:54:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:07.121 05:54:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:07.121 05:54:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:07.379 05:54:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:07.379 "name": "raid_bdev1", 00:26:07.379 "uuid": "55d79efc-3e3a-4848-b5e8-f5489258a816", 00:26:07.379 "strip_size_kb": 0, 00:26:07.379 "state": "online", 00:26:07.379 "raid_level": "raid1", 00:26:07.379 "superblock": true, 00:26:07.379 "num_base_bdevs": 2, 00:26:07.379 "num_base_bdevs_discovered": 1, 00:26:07.379 "num_base_bdevs_operational": 1, 00:26:07.379 "base_bdevs_list": [ 00:26:07.379 { 00:26:07.379 "name": null, 00:26:07.379 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:07.379 "is_configured": false, 00:26:07.379 "data_offset": 2048, 00:26:07.379 "data_size": 63488 00:26:07.379 }, 00:26:07.379 { 00:26:07.379 "name": "BaseBdev2", 00:26:07.379 "uuid": "30c86131-dcdb-5009-8291-30a96f208606", 00:26:07.379 "is_configured": true, 00:26:07.379 "data_offset": 2048, 00:26:07.379 "data_size": 63488 00:26:07.379 } 00:26:07.379 ] 00:26:07.379 }' 00:26:07.379 05:54:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:07.379 05:54:22 bdev_raid.raid_rebuild_test_sb_io -- 
common/autotest_common.sh@10 -- # set +x 00:26:07.946 05:54:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:26:07.946 05:54:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:07.946 05:54:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:26:07.946 05:54:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:26:07.946 05:54:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:07.946 05:54:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:07.946 05:54:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:08.204 05:54:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:08.204 "name": "raid_bdev1", 00:26:08.204 "uuid": "55d79efc-3e3a-4848-b5e8-f5489258a816", 00:26:08.204 "strip_size_kb": 0, 00:26:08.204 "state": "online", 00:26:08.204 "raid_level": "raid1", 00:26:08.204 "superblock": true, 00:26:08.204 "num_base_bdevs": 2, 00:26:08.204 "num_base_bdevs_discovered": 1, 00:26:08.204 "num_base_bdevs_operational": 1, 00:26:08.204 "base_bdevs_list": [ 00:26:08.204 { 00:26:08.204 "name": null, 00:26:08.204 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:08.204 "is_configured": false, 00:26:08.204 "data_offset": 2048, 00:26:08.204 "data_size": 63488 00:26:08.204 }, 00:26:08.204 { 00:26:08.204 "name": "BaseBdev2", 00:26:08.204 "uuid": "30c86131-dcdb-5009-8291-30a96f208606", 00:26:08.204 "is_configured": true, 00:26:08.204 "data_offset": 2048, 00:26:08.204 "data_size": 63488 00:26:08.204 } 00:26:08.204 ] 00:26:08.204 }' 00:26:08.204 05:54:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- 
# jq -r '.process.type // "none"' 00:26:08.204 05:54:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:26:08.205 05:54:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:08.205 05:54:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:26:08.205 05:54:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:26:08.463 [2024-07-26 05:54:23.324242] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:26:08.720 05:54:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@662 -- # sleep 1 00:26:08.720 [2024-07-26 05:54:23.393170] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x228fe60 00:26:08.720 [2024-07-26 05:54:23.394702] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:26:08.720 [2024-07-26 05:54:23.513569] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:26:08.720 [2024-07-26 05:54:23.513899] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:26:08.979 [2024-07-26 05:54:23.741661] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:26:08.979 [2024-07-26 05:54:23.741893] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:26:09.546 [2024-07-26 05:54:24.219527] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:26:09.546 [2024-07-26 05:54:24.219796] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 
6144 offset_end: 12288 00:26:09.546 05:54:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:09.546 05:54:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:09.546 05:54:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:09.546 05:54:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:09.546 05:54:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:09.546 05:54:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:09.546 05:54:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:09.804 05:54:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:09.804 "name": "raid_bdev1", 00:26:09.804 "uuid": "55d79efc-3e3a-4848-b5e8-f5489258a816", 00:26:09.804 "strip_size_kb": 0, 00:26:09.804 "state": "online", 00:26:09.804 "raid_level": "raid1", 00:26:09.804 "superblock": true, 00:26:09.804 "num_base_bdevs": 2, 00:26:09.804 "num_base_bdevs_discovered": 2, 00:26:09.804 "num_base_bdevs_operational": 2, 00:26:09.804 "process": { 00:26:09.804 "type": "rebuild", 00:26:09.804 "target": "spare", 00:26:09.804 "progress": { 00:26:09.804 "blocks": 12288, 00:26:09.804 "percent": 19 00:26:09.804 } 00:26:09.804 }, 00:26:09.804 "base_bdevs_list": [ 00:26:09.804 { 00:26:09.804 "name": "spare", 00:26:09.804 "uuid": "3550a152-c318-54c8-8faf-d96f8eb80676", 00:26:09.804 "is_configured": true, 00:26:09.804 "data_offset": 2048, 00:26:09.804 "data_size": 63488 00:26:09.804 }, 00:26:09.804 { 00:26:09.804 "name": "BaseBdev2", 00:26:09.804 "uuid": "30c86131-dcdb-5009-8291-30a96f208606", 00:26:09.804 "is_configured": true, 
00:26:09.804 "data_offset": 2048, 00:26:09.804 "data_size": 63488 00:26:09.804 } 00:26:09.804 ] 00:26:09.804 }' 00:26:09.804 05:54:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:09.804 05:54:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:09.804 05:54:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:09.804 [2024-07-26 05:54:24.653658] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:26:09.804 [2024-07-26 05:54:24.653906] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:26:09.804 05:54:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:09.804 05:54:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:26:09.804 05:54:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:26:09.804 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:26:09.804 05:54:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:26:09.804 05:54:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:26:09.804 05:54:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']' 00:26:09.804 05:54:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@705 -- # local timeout=834 00:26:09.804 05:54:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:26:09.804 05:54:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:09.804 05:54:24 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:09.804 05:54:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:09.804 05:54:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:09.804 05:54:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:09.804 05:54:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:09.804 05:54:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:10.063 05:54:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:10.063 "name": "raid_bdev1", 00:26:10.063 "uuid": "55d79efc-3e3a-4848-b5e8-f5489258a816", 00:26:10.063 "strip_size_kb": 0, 00:26:10.063 "state": "online", 00:26:10.063 "raid_level": "raid1", 00:26:10.063 "superblock": true, 00:26:10.063 "num_base_bdevs": 2, 00:26:10.063 "num_base_bdevs_discovered": 2, 00:26:10.063 "num_base_bdevs_operational": 2, 00:26:10.063 "process": { 00:26:10.063 "type": "rebuild", 00:26:10.063 "target": "spare", 00:26:10.063 "progress": { 00:26:10.063 "blocks": 16384, 00:26:10.063 "percent": 25 00:26:10.063 } 00:26:10.063 }, 00:26:10.063 "base_bdevs_list": [ 00:26:10.063 { 00:26:10.063 "name": "spare", 00:26:10.063 "uuid": "3550a152-c318-54c8-8faf-d96f8eb80676", 00:26:10.063 "is_configured": true, 00:26:10.063 "data_offset": 2048, 00:26:10.063 "data_size": 63488 00:26:10.063 }, 00:26:10.063 { 00:26:10.063 "name": "BaseBdev2", 00:26:10.063 "uuid": "30c86131-dcdb-5009-8291-30a96f208606", 00:26:10.063 "is_configured": true, 00:26:10.063 "data_offset": 2048, 00:26:10.063 "data_size": 63488 00:26:10.063 } 00:26:10.063 ] 00:26:10.063 }' 00:26:10.063 05:54:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r 
'.process.type // "none"' 00:26:10.063 05:54:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:10.063 05:54:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:10.321 05:54:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:10.321 05:54:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:26:10.321 [2024-07-26 05:54:25.101948] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 22528 offset_begin: 18432 offset_end: 24576 00:26:10.579 [2024-07-26 05:54:25.341084] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 26624 offset_begin: 24576 offset_end: 30720 00:26:10.579 [2024-07-26 05:54:25.341526] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 26624 offset_begin: 24576 offset_end: 30720 00:26:10.837 [2024-07-26 05:54:25.569492] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 28672 offset_begin: 24576 offset_end: 30720 00:26:11.096 [2024-07-26 05:54:25.815257] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 32768 offset_begin: 30720 offset_end: 36864 00:26:11.353 05:54:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:26:11.353 05:54:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:11.353 05:54:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:11.353 05:54:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:11.353 05:54:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:11.353 05:54:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:11.353 
05:54:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:11.353 05:54:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:11.353 [2024-07-26 05:54:26.053350] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 34816 offset_begin: 30720 offset_end: 36864 00:26:11.612 05:54:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:11.612 "name": "raid_bdev1", 00:26:11.612 "uuid": "55d79efc-3e3a-4848-b5e8-f5489258a816", 00:26:11.612 "strip_size_kb": 0, 00:26:11.612 "state": "online", 00:26:11.612 "raid_level": "raid1", 00:26:11.612 "superblock": true, 00:26:11.612 "num_base_bdevs": 2, 00:26:11.612 "num_base_bdevs_discovered": 2, 00:26:11.612 "num_base_bdevs_operational": 2, 00:26:11.612 "process": { 00:26:11.612 "type": "rebuild", 00:26:11.612 "target": "spare", 00:26:11.612 "progress": { 00:26:11.612 "blocks": 34816, 00:26:11.612 "percent": 54 00:26:11.612 } 00:26:11.612 }, 00:26:11.612 "base_bdevs_list": [ 00:26:11.612 { 00:26:11.612 "name": "spare", 00:26:11.612 "uuid": "3550a152-c318-54c8-8faf-d96f8eb80676", 00:26:11.612 "is_configured": true, 00:26:11.612 "data_offset": 2048, 00:26:11.612 "data_size": 63488 00:26:11.612 }, 00:26:11.612 { 00:26:11.612 "name": "BaseBdev2", 00:26:11.612 "uuid": "30c86131-dcdb-5009-8291-30a96f208606", 00:26:11.612 "is_configured": true, 00:26:11.612 "data_offset": 2048, 00:26:11.612 "data_size": 63488 00:26:11.612 } 00:26:11.612 ] 00:26:11.612 }' 00:26:11.612 05:54:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:11.612 05:54:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:11.612 05:54:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r 
'.process.target // "none"' 00:26:11.612 05:54:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:11.612 05:54:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:26:12.179 [2024-07-26 05:54:26.855779] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 47104 offset_begin: 43008 offset_end: 49152 00:26:12.437 [2024-07-26 05:54:27.204998] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 53248 offset_begin: 49152 offset_end: 55296 00:26:12.695 05:54:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:26:12.695 05:54:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:12.695 05:54:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:12.695 05:54:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:12.695 05:54:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:12.695 05:54:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:12.695 05:54:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:12.695 05:54:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:12.954 05:54:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:12.954 "name": "raid_bdev1", 00:26:12.954 "uuid": "55d79efc-3e3a-4848-b5e8-f5489258a816", 00:26:12.954 "strip_size_kb": 0, 00:26:12.954 "state": "online", 00:26:12.954 "raid_level": "raid1", 00:26:12.954 "superblock": true, 00:26:12.954 "num_base_bdevs": 2, 00:26:12.954 "num_base_bdevs_discovered": 2, 
00:26:12.954 "num_base_bdevs_operational": 2, 00:26:12.954 "process": { 00:26:12.954 "type": "rebuild", 00:26:12.954 "target": "spare", 00:26:12.954 "progress": { 00:26:12.954 "blocks": 59392, 00:26:12.954 "percent": 93 00:26:12.954 } 00:26:12.954 }, 00:26:12.954 "base_bdevs_list": [ 00:26:12.954 { 00:26:12.954 "name": "spare", 00:26:12.954 "uuid": "3550a152-c318-54c8-8faf-d96f8eb80676", 00:26:12.954 "is_configured": true, 00:26:12.954 "data_offset": 2048, 00:26:12.954 "data_size": 63488 00:26:12.954 }, 00:26:12.954 { 00:26:12.954 "name": "BaseBdev2", 00:26:12.954 "uuid": "30c86131-dcdb-5009-8291-30a96f208606", 00:26:12.954 "is_configured": true, 00:26:12.954 "data_offset": 2048, 00:26:12.954 "data_size": 63488 00:26:12.954 } 00:26:12.954 ] 00:26:12.954 }' 00:26:12.954 05:54:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:12.954 05:54:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:12.954 05:54:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:12.954 05:54:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:12.954 05:54:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:26:12.954 [2024-07-26 05:54:27.764738] bdev_raid.c:2870:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:26:13.213 [2024-07-26 05:54:27.865049] bdev_raid.c:2532:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:26:13.213 [2024-07-26 05:54:27.875232] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:14.147 05:54:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:26:14.147 05:54:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:14.147 05:54:28 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:14.147 05:54:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:14.147 05:54:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:14.147 05:54:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:14.147 05:54:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:14.147 05:54:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:14.147 05:54:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:14.147 "name": "raid_bdev1", 00:26:14.147 "uuid": "55d79efc-3e3a-4848-b5e8-f5489258a816", 00:26:14.147 "strip_size_kb": 0, 00:26:14.147 "state": "online", 00:26:14.147 "raid_level": "raid1", 00:26:14.147 "superblock": true, 00:26:14.147 "num_base_bdevs": 2, 00:26:14.147 "num_base_bdevs_discovered": 2, 00:26:14.147 "num_base_bdevs_operational": 2, 00:26:14.147 "base_bdevs_list": [ 00:26:14.147 { 00:26:14.148 "name": "spare", 00:26:14.148 "uuid": "3550a152-c318-54c8-8faf-d96f8eb80676", 00:26:14.148 "is_configured": true, 00:26:14.148 "data_offset": 2048, 00:26:14.148 "data_size": 63488 00:26:14.148 }, 00:26:14.148 { 00:26:14.148 "name": "BaseBdev2", 00:26:14.148 "uuid": "30c86131-dcdb-5009-8291-30a96f208606", 00:26:14.148 "is_configured": true, 00:26:14.148 "data_offset": 2048, 00:26:14.148 "data_size": 63488 00:26:14.148 } 00:26:14.148 ] 00:26:14.148 }' 00:26:14.148 05:54:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:14.148 05:54:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:26:14.148 05:54:29 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:14.148 05:54:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:26:14.148 05:54:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@708 -- # break 00:26:14.148 05:54:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:26:14.406 05:54:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:14.406 05:54:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:26:14.406 05:54:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:26:14.406 05:54:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:14.406 05:54:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:14.406 05:54:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:14.406 05:54:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:14.406 "name": "raid_bdev1", 00:26:14.406 "uuid": "55d79efc-3e3a-4848-b5e8-f5489258a816", 00:26:14.406 "strip_size_kb": 0, 00:26:14.406 "state": "online", 00:26:14.406 "raid_level": "raid1", 00:26:14.406 "superblock": true, 00:26:14.406 "num_base_bdevs": 2, 00:26:14.406 "num_base_bdevs_discovered": 2, 00:26:14.406 "num_base_bdevs_operational": 2, 00:26:14.406 "base_bdevs_list": [ 00:26:14.406 { 00:26:14.406 "name": "spare", 00:26:14.406 "uuid": "3550a152-c318-54c8-8faf-d96f8eb80676", 00:26:14.406 "is_configured": true, 00:26:14.406 "data_offset": 2048, 00:26:14.406 "data_size": 63488 00:26:14.406 }, 00:26:14.406 { 00:26:14.406 "name": "BaseBdev2", 00:26:14.406 "uuid": 
"30c86131-dcdb-5009-8291-30a96f208606", 00:26:14.406 "is_configured": true, 00:26:14.406 "data_offset": 2048, 00:26:14.406 "data_size": 63488 00:26:14.406 } 00:26:14.406 ] 00:26:14.406 }' 00:26:14.406 05:54:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:14.665 05:54:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:26:14.665 05:54:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:14.665 05:54:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:26:14.665 05:54:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:26:14.665 05:54:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:14.665 05:54:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:14.665 05:54:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:14.665 05:54:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:14.665 05:54:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:14.665 05:54:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:14.665 05:54:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:14.665 05:54:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:14.665 05:54:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:14.665 05:54:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:14.665 05:54:29 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:14.924 05:54:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:14.924 "name": "raid_bdev1", 00:26:14.924 "uuid": "55d79efc-3e3a-4848-b5e8-f5489258a816", 00:26:14.924 "strip_size_kb": 0, 00:26:14.924 "state": "online", 00:26:14.924 "raid_level": "raid1", 00:26:14.924 "superblock": true, 00:26:14.924 "num_base_bdevs": 2, 00:26:14.924 "num_base_bdevs_discovered": 2, 00:26:14.924 "num_base_bdevs_operational": 2, 00:26:14.924 "base_bdevs_list": [ 00:26:14.924 { 00:26:14.924 "name": "spare", 00:26:14.924 "uuid": "3550a152-c318-54c8-8faf-d96f8eb80676", 00:26:14.924 "is_configured": true, 00:26:14.924 "data_offset": 2048, 00:26:14.924 "data_size": 63488 00:26:14.924 }, 00:26:14.924 { 00:26:14.924 "name": "BaseBdev2", 00:26:14.924 "uuid": "30c86131-dcdb-5009-8291-30a96f208606", 00:26:14.924 "is_configured": true, 00:26:14.924 "data_offset": 2048, 00:26:14.924 "data_size": 63488 00:26:14.924 } 00:26:14.924 ] 00:26:14.924 }' 00:26:14.924 05:54:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:14.924 05:54:29 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:26:15.489 05:54:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:26:15.489 [2024-07-26 05:54:30.388701] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:26:15.489 [2024-07-26 05:54:30.388736] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:26:15.747 00:26:15.747 Latency(us) 00:26:15.747 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:26:15.747 Job: raid_bdev1 (Core Mask 0x1, workload: randrw, percentage: 50, depth: 2, IO size: 3145728) 00:26:15.747 raid_bdev1 : 
11.61 93.63 280.90 0.00 0.00 14437.22 292.06 120358.29 00:26:15.747 =================================================================================================================== 00:26:15.747 Total : 93.63 280.90 0.00 0.00 14437.22 292.06 120358.29 00:26:15.747 [2024-07-26 05:54:30.412761] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:15.747 [2024-07-26 05:54:30.412789] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:26:15.747 [2024-07-26 05:54:30.412864] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:26:15.747 [2024-07-26 05:54:30.412876] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x228f070 name raid_bdev1, state offline 00:26:15.747 0 00:26:15.747 05:54:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:15.747 05:54:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@719 -- # jq length 00:26:16.005 05:54:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:26:16.005 05:54:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:26:16.005 05:54:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@722 -- # '[' true = true ']' 00:26:16.005 05:54:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@724 -- # nbd_start_disks /var/tmp/spdk-raid.sock spare /dev/nbd0 00:26:16.005 05:54:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:26:16.005 05:54:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # bdev_list=('spare') 00:26:16.005 05:54:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:26:16.005 05:54:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:26:16.005 
05:54:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:26:16.005 05:54:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@12 -- # local i 00:26:16.005 05:54:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:26:16.005 05:54:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:26:16.005 05:54:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd0 00:26:16.005 /dev/nbd0 00:26:16.262 05:54:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:26:16.262 05:54:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:26:16.262 05:54:30 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:26:16.262 05:54:30 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@867 -- # local i 00:26:16.262 05:54:30 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:26:16.262 05:54:30 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:26:16.262 05:54:30 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:26:16.262 05:54:30 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # break 00:26:16.262 05:54:30 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:26:16.262 05:54:30 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:26:16.262 05:54:30 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:26:16.262 1+0 records in 00:26:16.262 1+0 records out 00:26:16.262 4096 bytes (4.1 kB, 4.0 KiB) 
copied, 0.000280567 s, 14.6 MB/s 00:26:16.262 05:54:30 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:16.262 05:54:30 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # size=4096 00:26:16.262 05:54:30 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:16.262 05:54:30 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:26:16.262 05:54:30 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@887 -- # return 0 00:26:16.262 05:54:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:26:16.262 05:54:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:26:16.262 05:54:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:26:16.262 05:54:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@726 -- # '[' -z BaseBdev2 ']' 00:26:16.262 05:54:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@729 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev2 /dev/nbd1 00:26:16.262 05:54:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:26:16.262 05:54:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev2') 00:26:16.262 05:54:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:26:16.262 05:54:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:26:16.262 05:54:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:26:16.262 05:54:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@12 -- # local i 00:26:16.262 05:54:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 
00:26:16.262 05:54:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:26:16.262 05:54:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev2 /dev/nbd1 00:26:16.520 /dev/nbd1 00:26:16.520 05:54:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:26:16.520 05:54:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:26:16.520 05:54:31 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:26:16.520 05:54:31 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@867 -- # local i 00:26:16.520 05:54:31 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:26:16.520 05:54:31 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:26:16.520 05:54:31 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:26:16.520 05:54:31 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # break 00:26:16.520 05:54:31 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:26:16.520 05:54:31 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:26:16.520 05:54:31 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:26:16.520 1+0 records in 00:26:16.520 1+0 records out 00:26:16.520 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00030968 s, 13.2 MB/s 00:26:16.520 05:54:31 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:16.520 05:54:31 bdev_raid.raid_rebuild_test_sb_io -- 
common/autotest_common.sh@884 -- # size=4096 00:26:16.520 05:54:31 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:16.520 05:54:31 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:26:16.520 05:54:31 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@887 -- # return 0 00:26:16.520 05:54:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:26:16.520 05:54:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:26:16.520 05:54:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@730 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:26:16.520 05:54:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@731 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:26:16.520 05:54:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:26:16.520 05:54:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:26:16.520 05:54:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:26:16.520 05:54:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@51 -- # local i 00:26:16.520 05:54:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:26:16.520 05:54:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:26:16.778 05:54:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:26:16.778 05:54:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:26:16.778 05:54:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:26:16.778 05:54:31 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:26:16.778 05:54:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:26:16.778 05:54:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:26:16.778 05:54:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@41 -- # break 00:26:16.778 05:54:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@45 -- # return 0 00:26:16.778 05:54:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@733 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:26:16.778 05:54:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:26:16.778 05:54:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:26:16.778 05:54:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:26:16.778 05:54:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@51 -- # local i 00:26:16.778 05:54:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:26:16.778 05:54:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:26:17.036 05:54:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:26:17.036 05:54:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:26:17.036 05:54:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:26:17.036 05:54:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:26:17.036 05:54:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:26:17.036 05:54:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:26:17.036 05:54:31 
bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@41 -- # break 00:26:17.036 05:54:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@45 -- # return 0 00:26:17.036 05:54:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:26:17.036 05:54:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:26:17.295 05:54:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@745 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:26:17.553 [2024-07-26 05:54:32.334181] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:26:17.553 [2024-07-26 05:54:32.334231] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:17.553 [2024-07-26 05:54:32.334253] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x20de490 00:26:17.553 [2024-07-26 05:54:32.334265] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:17.553 [2024-07-26 05:54:32.336197] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:17.553 [2024-07-26 05:54:32.336226] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:26:17.553 [2024-07-26 05:54:32.336308] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:26:17.553 [2024-07-26 05:54:32.336334] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:26:17.553 [2024-07-26 05:54:32.336434] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:26:17.553 spare 00:26:17.553 05:54:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:26:17.553 05:54:32 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:17.553 05:54:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:17.553 05:54:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:17.553 05:54:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:17.553 05:54:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:17.553 05:54:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:17.553 05:54:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:17.553 05:54:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:17.553 05:54:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:17.553 05:54:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:17.553 05:54:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:17.553 [2024-07-26 05:54:32.436748] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x20f1990 00:26:17.553 [2024-07-26 05:54:32.436767] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:26:17.553 [2024-07-26 05:54:32.436966] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2287f50 00:26:17.553 [2024-07-26 05:54:32.437126] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x20f1990 00:26:17.553 [2024-07-26 05:54:32.437137] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x20f1990 00:26:17.553 [2024-07-26 05:54:32.437247] 
bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:17.811 05:54:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:17.811 "name": "raid_bdev1", 00:26:17.811 "uuid": "55d79efc-3e3a-4848-b5e8-f5489258a816", 00:26:17.811 "strip_size_kb": 0, 00:26:17.811 "state": "online", 00:26:17.811 "raid_level": "raid1", 00:26:17.811 "superblock": true, 00:26:17.811 "num_base_bdevs": 2, 00:26:17.811 "num_base_bdevs_discovered": 2, 00:26:17.811 "num_base_bdevs_operational": 2, 00:26:17.811 "base_bdevs_list": [ 00:26:17.811 { 00:26:17.811 "name": "spare", 00:26:17.811 "uuid": "3550a152-c318-54c8-8faf-d96f8eb80676", 00:26:17.811 "is_configured": true, 00:26:17.811 "data_offset": 2048, 00:26:17.811 "data_size": 63488 00:26:17.811 }, 00:26:17.811 { 00:26:17.811 "name": "BaseBdev2", 00:26:17.811 "uuid": "30c86131-dcdb-5009-8291-30a96f208606", 00:26:17.811 "is_configured": true, 00:26:17.811 "data_offset": 2048, 00:26:17.811 "data_size": 63488 00:26:17.811 } 00:26:17.811 ] 00:26:17.811 }' 00:26:17.811 05:54:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:17.811 05:54:32 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:26:18.377 05:54:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:26:18.377 05:54:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:18.377 05:54:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:26:18.377 05:54:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:26:18.377 05:54:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:18.377 05:54:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:26:18.377 05:54:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:18.635 05:54:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:18.635 "name": "raid_bdev1", 00:26:18.635 "uuid": "55d79efc-3e3a-4848-b5e8-f5489258a816", 00:26:18.635 "strip_size_kb": 0, 00:26:18.635 "state": "online", 00:26:18.635 "raid_level": "raid1", 00:26:18.635 "superblock": true, 00:26:18.635 "num_base_bdevs": 2, 00:26:18.635 "num_base_bdevs_discovered": 2, 00:26:18.635 "num_base_bdevs_operational": 2, 00:26:18.635 "base_bdevs_list": [ 00:26:18.635 { 00:26:18.635 "name": "spare", 00:26:18.635 "uuid": "3550a152-c318-54c8-8faf-d96f8eb80676", 00:26:18.635 "is_configured": true, 00:26:18.635 "data_offset": 2048, 00:26:18.635 "data_size": 63488 00:26:18.635 }, 00:26:18.635 { 00:26:18.635 "name": "BaseBdev2", 00:26:18.635 "uuid": "30c86131-dcdb-5009-8291-30a96f208606", 00:26:18.635 "is_configured": true, 00:26:18.635 "data_offset": 2048, 00:26:18.635 "data_size": 63488 00:26:18.635 } 00:26:18.635 ] 00:26:18.635 }' 00:26:18.635 05:54:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:18.635 05:54:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:26:18.635 05:54:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:18.635 05:54:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:26:18.635 05:54:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:18.635 05:54:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:26:18.892 05:54:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@749 -- # [[ 
spare == \s\p\a\r\e ]] 00:26:18.892 05:54:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:26:19.149 [2024-07-26 05:54:33.990880] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:19.149 05:54:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:26:19.149 05:54:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:19.149 05:54:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:19.149 05:54:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:19.149 05:54:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:19.149 05:54:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:26:19.149 05:54:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:19.149 05:54:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:19.149 05:54:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:19.149 05:54:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:19.149 05:54:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:19.149 05:54:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:19.407 05:54:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:19.407 "name": "raid_bdev1", 00:26:19.407 "uuid": "55d79efc-3e3a-4848-b5e8-f5489258a816", 
00:26:19.407 "strip_size_kb": 0, 00:26:19.407 "state": "online", 00:26:19.407 "raid_level": "raid1", 00:26:19.407 "superblock": true, 00:26:19.407 "num_base_bdevs": 2, 00:26:19.407 "num_base_bdevs_discovered": 1, 00:26:19.407 "num_base_bdevs_operational": 1, 00:26:19.407 "base_bdevs_list": [ 00:26:19.407 { 00:26:19.407 "name": null, 00:26:19.407 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:19.407 "is_configured": false, 00:26:19.407 "data_offset": 2048, 00:26:19.407 "data_size": 63488 00:26:19.407 }, 00:26:19.407 { 00:26:19.407 "name": "BaseBdev2", 00:26:19.407 "uuid": "30c86131-dcdb-5009-8291-30a96f208606", 00:26:19.407 "is_configured": true, 00:26:19.407 "data_offset": 2048, 00:26:19.407 "data_size": 63488 00:26:19.407 } 00:26:19.407 ] 00:26:19.407 }' 00:26:19.407 05:54:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:19.407 05:54:34 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:26:19.973 05:54:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:26:20.254 [2024-07-26 05:54:35.085950] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:26:20.254 [2024-07-26 05:54:35.086105] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:26:20.254 [2024-07-26 05:54:35.086121] bdev_raid.c:3712:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
00:26:20.254 [2024-07-26 05:54:35.086146] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:26:20.254 [2024-07-26 05:54:35.091368] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x22913d0 00:26:20.254 [2024-07-26 05:54:35.093702] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:26:20.254 05:54:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@755 -- # sleep 1 00:26:21.641 05:54:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:21.641 05:54:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:21.641 05:54:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:21.641 05:54:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:21.641 05:54:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:21.641 05:54:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:21.641 05:54:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:21.641 05:54:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:21.641 "name": "raid_bdev1", 00:26:21.641 "uuid": "55d79efc-3e3a-4848-b5e8-f5489258a816", 00:26:21.641 "strip_size_kb": 0, 00:26:21.641 "state": "online", 00:26:21.641 "raid_level": "raid1", 00:26:21.641 "superblock": true, 00:26:21.641 "num_base_bdevs": 2, 00:26:21.641 "num_base_bdevs_discovered": 2, 00:26:21.641 "num_base_bdevs_operational": 2, 00:26:21.641 "process": { 00:26:21.641 "type": "rebuild", 00:26:21.641 "target": "spare", 00:26:21.641 "progress": { 00:26:21.641 "blocks": 24576, 
00:26:21.641 "percent": 38 00:26:21.641 } 00:26:21.641 }, 00:26:21.641 "base_bdevs_list": [ 00:26:21.641 { 00:26:21.641 "name": "spare", 00:26:21.641 "uuid": "3550a152-c318-54c8-8faf-d96f8eb80676", 00:26:21.641 "is_configured": true, 00:26:21.641 "data_offset": 2048, 00:26:21.641 "data_size": 63488 00:26:21.641 }, 00:26:21.641 { 00:26:21.641 "name": "BaseBdev2", 00:26:21.641 "uuid": "30c86131-dcdb-5009-8291-30a96f208606", 00:26:21.641 "is_configured": true, 00:26:21.641 "data_offset": 2048, 00:26:21.641 "data_size": 63488 00:26:21.641 } 00:26:21.641 ] 00:26:21.641 }' 00:26:21.641 05:54:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:21.641 05:54:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:21.641 05:54:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:21.641 05:54:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:21.641 05:54:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@759 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:26:21.900 [2024-07-26 05:54:36.683337] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:21.900 [2024-07-26 05:54:36.706528] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:26:21.900 [2024-07-26 05:54:36.706574] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:21.900 [2024-07-26 05:54:36.706589] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:21.900 [2024-07-26 05:54:36.706597] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:26:21.900 05:54:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 
online raid1 0 1 00:26:21.900 05:54:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:21.900 05:54:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:21.900 05:54:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:21.900 05:54:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:21.900 05:54:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:26:21.900 05:54:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:21.900 05:54:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:21.900 05:54:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:21.900 05:54:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:21.900 05:54:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:21.900 05:54:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:22.159 05:54:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:22.159 "name": "raid_bdev1", 00:26:22.159 "uuid": "55d79efc-3e3a-4848-b5e8-f5489258a816", 00:26:22.159 "strip_size_kb": 0, 00:26:22.159 "state": "online", 00:26:22.159 "raid_level": "raid1", 00:26:22.159 "superblock": true, 00:26:22.159 "num_base_bdevs": 2, 00:26:22.159 "num_base_bdevs_discovered": 1, 00:26:22.159 "num_base_bdevs_operational": 1, 00:26:22.159 "base_bdevs_list": [ 00:26:22.159 { 00:26:22.159 "name": null, 00:26:22.159 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:22.159 "is_configured": false, 00:26:22.159 
"data_offset": 2048, 00:26:22.159 "data_size": 63488 00:26:22.159 }, 00:26:22.159 { 00:26:22.159 "name": "BaseBdev2", 00:26:22.159 "uuid": "30c86131-dcdb-5009-8291-30a96f208606", 00:26:22.159 "is_configured": true, 00:26:22.159 "data_offset": 2048, 00:26:22.159 "data_size": 63488 00:26:22.159 } 00:26:22.159 ] 00:26:22.159 }' 00:26:22.159 05:54:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:22.159 05:54:36 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:26:22.725 05:54:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:26:22.983 [2024-07-26 05:54:37.818287] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:26:22.983 [2024-07-26 05:54:37.818339] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:22.983 [2024-07-26 05:54:37.818361] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x228e2c0 00:26:22.983 [2024-07-26 05:54:37.818374] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:22.983 [2024-07-26 05:54:37.818758] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:22.983 [2024-07-26 05:54:37.818778] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:26:22.983 [2024-07-26 05:54:37.818861] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:26:22.983 [2024-07-26 05:54:37.818873] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:26:22.983 [2024-07-26 05:54:37.818884] bdev_raid.c:3712:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
00:26:22.983 [2024-07-26 05:54:37.818903] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:26:22.983 [2024-07-26 05:54:37.824177] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2287f50 00:26:22.983 spare 00:26:22.983 [2024-07-26 05:54:37.825635] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:26:22.983 05:54:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@762 -- # sleep 1 00:26:24.358 05:54:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:24.358 05:54:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:24.358 05:54:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:24.358 05:54:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:24.358 05:54:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:24.358 05:54:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:24.358 05:54:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:24.358 05:54:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:24.358 "name": "raid_bdev1", 00:26:24.358 "uuid": "55d79efc-3e3a-4848-b5e8-f5489258a816", 00:26:24.358 "strip_size_kb": 0, 00:26:24.358 "state": "online", 00:26:24.358 "raid_level": "raid1", 00:26:24.358 "superblock": true, 00:26:24.358 "num_base_bdevs": 2, 00:26:24.358 "num_base_bdevs_discovered": 2, 00:26:24.358 "num_base_bdevs_operational": 2, 00:26:24.358 "process": { 00:26:24.358 "type": "rebuild", 00:26:24.358 "target": "spare", 00:26:24.358 "progress": { 00:26:24.358 
"blocks": 22528, 00:26:24.358 "percent": 35 00:26:24.358 } 00:26:24.358 }, 00:26:24.358 "base_bdevs_list": [ 00:26:24.358 { 00:26:24.358 "name": "spare", 00:26:24.358 "uuid": "3550a152-c318-54c8-8faf-d96f8eb80676", 00:26:24.358 "is_configured": true, 00:26:24.358 "data_offset": 2048, 00:26:24.358 "data_size": 63488 00:26:24.358 }, 00:26:24.358 { 00:26:24.358 "name": "BaseBdev2", 00:26:24.358 "uuid": "30c86131-dcdb-5009-8291-30a96f208606", 00:26:24.358 "is_configured": true, 00:26:24.358 "data_offset": 2048, 00:26:24.358 "data_size": 63488 00:26:24.358 } 00:26:24.358 ] 00:26:24.358 }' 00:26:24.358 05:54:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:24.358 05:54:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:24.358 05:54:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:24.358 05:54:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:24.358 05:54:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:26:24.616 [2024-07-26 05:54:39.340924] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:24.616 [2024-07-26 05:54:39.438040] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:26:24.616 [2024-07-26 05:54:39.438092] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:24.616 [2024-07-26 05:54:39.438108] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:24.616 [2024-07-26 05:54:39.438116] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:26:24.616 05:54:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state 
raid_bdev1 online raid1 0 1 00:26:24.616 05:54:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:24.616 05:54:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:24.616 05:54:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:24.616 05:54:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:24.616 05:54:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:26:24.616 05:54:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:24.616 05:54:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:24.616 05:54:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:24.616 05:54:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:24.616 05:54:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:24.616 05:54:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:24.875 05:54:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:24.875 "name": "raid_bdev1", 00:26:24.875 "uuid": "55d79efc-3e3a-4848-b5e8-f5489258a816", 00:26:24.875 "strip_size_kb": 0, 00:26:24.875 "state": "online", 00:26:24.875 "raid_level": "raid1", 00:26:24.875 "superblock": true, 00:26:24.875 "num_base_bdevs": 2, 00:26:24.875 "num_base_bdevs_discovered": 1, 00:26:24.875 "num_base_bdevs_operational": 1, 00:26:24.875 "base_bdevs_list": [ 00:26:24.875 { 00:26:24.875 "name": null, 00:26:24.875 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:24.875 "is_configured": false, 00:26:24.875 
"data_offset": 2048, 00:26:24.875 "data_size": 63488 00:26:24.875 }, 00:26:24.875 { 00:26:24.875 "name": "BaseBdev2", 00:26:24.875 "uuid": "30c86131-dcdb-5009-8291-30a96f208606", 00:26:24.875 "is_configured": true, 00:26:24.875 "data_offset": 2048, 00:26:24.875 "data_size": 63488 00:26:24.875 } 00:26:24.875 ] 00:26:24.875 }' 00:26:24.875 05:54:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:24.875 05:54:39 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:26:25.442 05:54:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:26:25.442 05:54:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:25.442 05:54:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:26:25.442 05:54:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:26:25.442 05:54:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:25.442 05:54:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:25.442 05:54:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:25.701 05:54:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:25.701 "name": "raid_bdev1", 00:26:25.701 "uuid": "55d79efc-3e3a-4848-b5e8-f5489258a816", 00:26:25.701 "strip_size_kb": 0, 00:26:25.701 "state": "online", 00:26:25.701 "raid_level": "raid1", 00:26:25.701 "superblock": true, 00:26:25.701 "num_base_bdevs": 2, 00:26:25.701 "num_base_bdevs_discovered": 1, 00:26:25.701 "num_base_bdevs_operational": 1, 00:26:25.701 "base_bdevs_list": [ 00:26:25.701 { 00:26:25.701 "name": null, 00:26:25.701 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:26:25.701 "is_configured": false, 00:26:25.701 "data_offset": 2048, 00:26:25.701 "data_size": 63488 00:26:25.701 }, 00:26:25.701 { 00:26:25.701 "name": "BaseBdev2", 00:26:25.701 "uuid": "30c86131-dcdb-5009-8291-30a96f208606", 00:26:25.701 "is_configured": true, 00:26:25.701 "data_offset": 2048, 00:26:25.701 "data_size": 63488 00:26:25.701 } 00:26:25.701 ] 00:26:25.701 }' 00:26:25.701 05:54:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:25.959 05:54:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:26:25.959 05:54:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:25.959 05:54:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:26:25.959 05:54:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:26:26.218 05:54:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:26:26.476 [2024-07-26 05:54:41.131580] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:26:26.476 [2024-07-26 05:54:41.131627] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:26.476 [2024-07-26 05:54:41.131652] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x228d8c0 00:26:26.476 [2024-07-26 05:54:41.131665] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:26.476 [2024-07-26 05:54:41.132019] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:26.476 [2024-07-26 05:54:41.132037] vbdev_passthru.c: 
710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:26:26.476 [2024-07-26 05:54:41.132101] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:26:26.476 [2024-07-26 05:54:41.132114] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:26:26.476 [2024-07-26 05:54:41.132124] bdev_raid.c:3673:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:26:26.476 BaseBdev1 00:26:26.476 05:54:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@773 -- # sleep 1 00:26:27.411 05:54:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:26:27.411 05:54:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:27.411 05:54:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:27.411 05:54:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:27.411 05:54:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:27.411 05:54:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:26:27.411 05:54:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:27.411 05:54:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:27.411 05:54:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:27.411 05:54:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:27.411 05:54:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:27.411 05:54:42 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:27.669 05:54:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:27.669 "name": "raid_bdev1", 00:26:27.669 "uuid": "55d79efc-3e3a-4848-b5e8-f5489258a816", 00:26:27.669 "strip_size_kb": 0, 00:26:27.669 "state": "online", 00:26:27.669 "raid_level": "raid1", 00:26:27.669 "superblock": true, 00:26:27.669 "num_base_bdevs": 2, 00:26:27.669 "num_base_bdevs_discovered": 1, 00:26:27.669 "num_base_bdevs_operational": 1, 00:26:27.669 "base_bdevs_list": [ 00:26:27.669 { 00:26:27.669 "name": null, 00:26:27.669 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:27.669 "is_configured": false, 00:26:27.669 "data_offset": 2048, 00:26:27.669 "data_size": 63488 00:26:27.669 }, 00:26:27.669 { 00:26:27.669 "name": "BaseBdev2", 00:26:27.669 "uuid": "30c86131-dcdb-5009-8291-30a96f208606", 00:26:27.669 "is_configured": true, 00:26:27.669 "data_offset": 2048, 00:26:27.669 "data_size": 63488 00:26:27.669 } 00:26:27.669 ] 00:26:27.669 }' 00:26:27.669 05:54:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:27.669 05:54:42 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:26:28.237 05:54:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:26:28.237 05:54:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:28.237 05:54:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:26:28.237 05:54:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:26:28.237 05:54:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:28.237 05:54:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:28.237 05:54:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:28.495 05:54:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:28.495 "name": "raid_bdev1", 00:26:28.495 "uuid": "55d79efc-3e3a-4848-b5e8-f5489258a816", 00:26:28.495 "strip_size_kb": 0, 00:26:28.495 "state": "online", 00:26:28.495 "raid_level": "raid1", 00:26:28.495 "superblock": true, 00:26:28.495 "num_base_bdevs": 2, 00:26:28.495 "num_base_bdevs_discovered": 1, 00:26:28.495 "num_base_bdevs_operational": 1, 00:26:28.495 "base_bdevs_list": [ 00:26:28.495 { 00:26:28.495 "name": null, 00:26:28.495 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:28.495 "is_configured": false, 00:26:28.495 "data_offset": 2048, 00:26:28.495 "data_size": 63488 00:26:28.495 }, 00:26:28.495 { 00:26:28.495 "name": "BaseBdev2", 00:26:28.495 "uuid": "30c86131-dcdb-5009-8291-30a96f208606", 00:26:28.495 "is_configured": true, 00:26:28.495 "data_offset": 2048, 00:26:28.495 "data_size": 63488 00:26:28.495 } 00:26:28.495 ] 00:26:28.495 }' 00:26:28.495 05:54:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:28.495 05:54:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:26:28.495 05:54:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:28.495 05:54:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:26:28.495 05:54:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:26:28.495 05:54:43 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@648 -- # local 
es=0 00:26:28.495 05:54:43 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:26:28.495 05:54:43 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:26:28.495 05:54:43 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:26:28.495 05:54:43 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:26:28.495 05:54:43 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:26:28.495 05:54:43 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:26:28.495 05:54:43 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:26:28.495 05:54:43 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:26:28.495 05:54:43 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:26:28.495 05:54:43 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:26:28.753 [2024-07-26 05:54:43.586474] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:26:28.753 [2024-07-26 05:54:43.586612] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:26:28.753 
[2024-07-26 05:54:43.586630] bdev_raid.c:3673:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:26:28.753 request: 00:26:28.753 { 00:26:28.753 "base_bdev": "BaseBdev1", 00:26:28.753 "raid_bdev": "raid_bdev1", 00:26:28.753 "method": "bdev_raid_add_base_bdev", 00:26:28.753 "req_id": 1 00:26:28.753 } 00:26:28.753 Got JSON-RPC error response 00:26:28.753 response: 00:26:28.753 { 00:26:28.753 "code": -22, 00:26:28.753 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:26:28.753 } 00:26:28.753 05:54:43 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@651 -- # es=1 00:26:28.753 05:54:43 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:26:28.753 05:54:43 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:26:28.753 05:54:43 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:26:28.753 05:54:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@777 -- # sleep 1 00:26:30.128 05:54:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:26:30.128 05:54:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:30.128 05:54:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:30.128 05:54:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:30.128 05:54:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:30.128 05:54:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:26:30.128 05:54:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:30.128 05:54:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:30.128 05:54:44 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:30.128 05:54:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:30.128 05:54:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:30.128 05:54:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:30.128 05:54:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:30.128 "name": "raid_bdev1", 00:26:30.128 "uuid": "55d79efc-3e3a-4848-b5e8-f5489258a816", 00:26:30.128 "strip_size_kb": 0, 00:26:30.128 "state": "online", 00:26:30.128 "raid_level": "raid1", 00:26:30.128 "superblock": true, 00:26:30.128 "num_base_bdevs": 2, 00:26:30.128 "num_base_bdevs_discovered": 1, 00:26:30.128 "num_base_bdevs_operational": 1, 00:26:30.128 "base_bdevs_list": [ 00:26:30.128 { 00:26:30.128 "name": null, 00:26:30.128 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:30.128 "is_configured": false, 00:26:30.128 "data_offset": 2048, 00:26:30.128 "data_size": 63488 00:26:30.128 }, 00:26:30.128 { 00:26:30.128 "name": "BaseBdev2", 00:26:30.128 "uuid": "30c86131-dcdb-5009-8291-30a96f208606", 00:26:30.128 "is_configured": true, 00:26:30.128 "data_offset": 2048, 00:26:30.128 "data_size": 63488 00:26:30.128 } 00:26:30.128 ] 00:26:30.128 }' 00:26:30.128 05:54:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:30.128 05:54:44 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:26:30.695 05:54:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:26:30.695 05:54:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:30.695 05:54:45 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:26:30.695 05:54:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:26:30.695 05:54:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:30.695 05:54:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:30.695 05:54:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:30.954 05:54:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:30.954 "name": "raid_bdev1", 00:26:30.954 "uuid": "55d79efc-3e3a-4848-b5e8-f5489258a816", 00:26:30.954 "strip_size_kb": 0, 00:26:30.954 "state": "online", 00:26:30.954 "raid_level": "raid1", 00:26:30.954 "superblock": true, 00:26:30.954 "num_base_bdevs": 2, 00:26:30.954 "num_base_bdevs_discovered": 1, 00:26:30.954 "num_base_bdevs_operational": 1, 00:26:30.954 "base_bdevs_list": [ 00:26:30.954 { 00:26:30.954 "name": null, 00:26:30.954 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:30.954 "is_configured": false, 00:26:30.954 "data_offset": 2048, 00:26:30.954 "data_size": 63488 00:26:30.954 }, 00:26:30.954 { 00:26:30.954 "name": "BaseBdev2", 00:26:30.954 "uuid": "30c86131-dcdb-5009-8291-30a96f208606", 00:26:30.954 "is_configured": true, 00:26:30.954 "data_offset": 2048, 00:26:30.954 "data_size": 63488 00:26:30.954 } 00:26:30.954 ] 00:26:30.954 }' 00:26:30.954 05:54:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:30.954 05:54:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:26:30.954 05:54:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:30.954 05:54:45 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:26:30.954 05:54:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@782 -- # killprocess 1244609 00:26:30.954 05:54:45 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@948 -- # '[' -z 1244609 ']' 00:26:30.954 05:54:45 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@952 -- # kill -0 1244609 00:26:30.954 05:54:45 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@953 -- # uname 00:26:30.954 05:54:45 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:26:30.954 05:54:45 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1244609 00:26:30.954 05:54:45 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:26:30.954 05:54:45 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:26:30.954 05:54:45 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1244609' 00:26:30.954 killing process with pid 1244609 00:26:30.954 05:54:45 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@967 -- # kill 1244609 00:26:30.954 Received shutdown signal, test time was about 26.977201 seconds 00:26:30.954 00:26:30.954 Latency(us) 00:26:30.954 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:26:30.954 =================================================================================================================== 00:26:30.954 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:26:30.954 [2024-07-26 05:54:45.814257] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:26:30.954 [2024-07-26 05:54:45.814359] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:26:30.954 05:54:45 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@972 -- # wait 1244609 
00:26:30.954 [2024-07-26 05:54:45.814412] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:26:30.954 [2024-07-26 05:54:45.814427] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x20f1990 name raid_bdev1, state offline 00:26:30.954 [2024-07-26 05:54:45.839357] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:26:31.213 05:54:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@784 -- # return 0 00:26:31.213 00:26:31.213 real 0m31.739s 00:26:31.213 user 0m49.487s 00:26:31.213 sys 0m4.630s 00:26:31.213 05:54:46 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@1124 -- # xtrace_disable 00:26:31.213 05:54:46 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:26:31.213 ************************************ 00:26:31.213 END TEST raid_rebuild_test_sb_io 00:26:31.213 ************************************ 00:26:31.471 05:54:46 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:26:31.471 05:54:46 bdev_raid -- bdev/bdev_raid.sh@876 -- # for n in 2 4 00:26:31.471 05:54:46 bdev_raid -- bdev/bdev_raid.sh@877 -- # run_test raid_rebuild_test raid_rebuild_test raid1 4 false false true 00:26:31.471 05:54:46 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:26:31.471 05:54:46 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:26:31.471 05:54:46 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:26:31.471 ************************************ 00:26:31.471 START TEST raid_rebuild_test 00:26:31.471 ************************************ 00:26:31.471 05:54:46 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 4 false false true 00:26:31.471 05:54:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:26:31.471 05:54:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=4 00:26:31.471 05:54:46 
bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@570 -- # local superblock=false 00:26:31.471 05:54:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:26:31.471 05:54:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@572 -- # local verify=true 00:26:31.471 05:54:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:26:31.471 05:54:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:26:31.471 05:54:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:26:31.471 05:54:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:26:31.471 05:54:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:26:31.471 05:54:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:26:31.471 05:54:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:26:31.471 05:54:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:26:31.471 05:54:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # echo BaseBdev3 00:26:31.471 05:54:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:26:31.471 05:54:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:26:31.471 05:54:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # echo BaseBdev4 00:26:31.471 05:54:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:26:31.471 05:54:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:26:31.471 05:54:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:26:31.471 05:54:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:26:31.471 05:54:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 
00:26:31.471 05:54:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # local strip_size 00:26:31.471 05:54:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@576 -- # local create_arg 00:26:31.471 05:54:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:26:31.471 05:54:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@578 -- # local data_offset 00:26:31.471 05:54:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:26:31.471 05:54:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:26:31.471 05:54:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@591 -- # '[' false = true ']' 00:26:31.471 05:54:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@596 -- # raid_pid=1249102 00:26:31.471 05:54:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@597 -- # waitforlisten 1249102 /var/tmp/spdk-raid.sock 00:26:31.471 05:54:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:26:31.471 05:54:46 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@829 -- # '[' -z 1249102 ']' 00:26:31.471 05:54:46 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:26:31.471 05:54:46 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:26:31.471 05:54:46 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:26:31.471 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
00:26:31.471 05:54:46 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:26:31.471 05:54:46 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:26:31.471 [2024-07-26 05:54:46.229796] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 00:26:31.471 [2024-07-26 05:54:46.229862] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1249102 ] 00:26:31.471 I/O size of 3145728 is greater than zero copy threshold (65536). 00:26:31.471 Zero copy mechanism will not be used. 00:26:31.471 [2024-07-26 05:54:46.353715] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:26:31.730 [2024-07-26 05:54:46.460504] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:26:31.730 [2024-07-26 05:54:46.528448] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:26:31.730 [2024-07-26 05:54:46.528487] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:26:32.297 05:54:47 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:26:32.297 05:54:47 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@862 -- # return 0 00:26:32.297 05:54:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:26:32.297 05:54:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:26:32.556 BaseBdev1_malloc 00:26:32.556 05:54:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:26:32.815 [2024-07-26 
05:54:47.642875] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:26:32.815 [2024-07-26 05:54:47.642924] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:32.815 [2024-07-26 05:54:47.642950] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xb98d40 00:26:32.815 [2024-07-26 05:54:47.642963] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:32.815 [2024-07-26 05:54:47.644690] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:32.815 [2024-07-26 05:54:47.644718] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:26:32.815 BaseBdev1 00:26:32.815 05:54:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:26:32.815 05:54:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:26:33.074 BaseBdev2_malloc 00:26:33.074 05:54:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:26:33.333 [2024-07-26 05:54:48.126247] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:26:33.333 [2024-07-26 05:54:48.126294] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:33.333 [2024-07-26 05:54:48.126321] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xb99860 00:26:33.333 [2024-07-26 05:54:48.126334] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:33.333 [2024-07-26 05:54:48.127892] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:33.333 [2024-07-26 05:54:48.127920] vbdev_passthru.c: 
710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:26:33.333 BaseBdev2 00:26:33.333 05:54:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:26:33.334 05:54:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:26:33.592 BaseBdev3_malloc 00:26:33.592 05:54:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev3_malloc -p BaseBdev3 00:26:33.852 [2024-07-26 05:54:48.625463] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev3_malloc 00:26:33.852 [2024-07-26 05:54:48.625516] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:33.852 [2024-07-26 05:54:48.625539] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xd468f0 00:26:33.852 [2024-07-26 05:54:48.625553] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:33.852 [2024-07-26 05:54:48.627154] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:33.852 [2024-07-26 05:54:48.627181] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:26:33.852 BaseBdev3 00:26:33.852 05:54:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:26:33.852 05:54:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:26:34.111 BaseBdev4_malloc 00:26:34.111 05:54:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev4_malloc 
-p BaseBdev4 00:26:34.370 [2024-07-26 05:54:49.115718] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev4_malloc 00:26:34.370 [2024-07-26 05:54:49.115763] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:34.370 [2024-07-26 05:54:49.115784] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xd45ad0 00:26:34.370 [2024-07-26 05:54:49.115797] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:34.370 [2024-07-26 05:54:49.117300] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:34.370 [2024-07-26 05:54:49.117327] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:26:34.370 BaseBdev4 00:26:34.370 05:54:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:26:34.629 spare_malloc 00:26:34.629 05:54:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:26:34.888 spare_delay 00:26:34.888 05:54:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:26:35.147 [2024-07-26 05:54:49.842214] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:26:35.147 [2024-07-26 05:54:49.842260] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:35.147 [2024-07-26 05:54:49.842282] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xd4a5b0 00:26:35.147 [2024-07-26 05:54:49.842295] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:35.147 
[2024-07-26 05:54:49.843882] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:35.147 [2024-07-26 05:54:49.843910] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:26:35.147 spare 00:26:35.147 05:54:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 00:26:35.406 [2024-07-26 05:54:50.086898] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:26:35.406 [2024-07-26 05:54:50.088270] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:26:35.406 [2024-07-26 05:54:50.088327] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:26:35.406 [2024-07-26 05:54:50.088374] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:26:35.406 [2024-07-26 05:54:50.088459] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0xcc98a0 00:26:35.406 [2024-07-26 05:54:50.088470] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:26:35.406 [2024-07-26 05:54:50.088701] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xd43e10 00:26:35.406 [2024-07-26 05:54:50.088861] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xcc98a0 00:26:35.406 [2024-07-26 05:54:50.088872] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xcc98a0 00:26:35.406 [2024-07-26 05:54:50.088999] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:35.406 05:54:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:26:35.406 05:54:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 
00:26:35.406 05:54:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:35.406 05:54:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:35.406 05:54:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:35.406 05:54:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:26:35.406 05:54:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:35.406 05:54:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:35.406 05:54:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:35.406 05:54:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:35.407 05:54:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:35.407 05:54:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:35.700 05:54:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:35.700 "name": "raid_bdev1", 00:26:35.700 "uuid": "79fcad42-8b9d-4f38-8337-2de18fc93195", 00:26:35.700 "strip_size_kb": 0, 00:26:35.700 "state": "online", 00:26:35.700 "raid_level": "raid1", 00:26:35.700 "superblock": false, 00:26:35.700 "num_base_bdevs": 4, 00:26:35.700 "num_base_bdevs_discovered": 4, 00:26:35.700 "num_base_bdevs_operational": 4, 00:26:35.700 "base_bdevs_list": [ 00:26:35.700 { 00:26:35.700 "name": "BaseBdev1", 00:26:35.700 "uuid": "5aff4d7e-4045-575e-8370-02d72c0ae4f8", 00:26:35.700 "is_configured": true, 00:26:35.700 "data_offset": 0, 00:26:35.700 "data_size": 65536 00:26:35.700 }, 00:26:35.700 { 00:26:35.700 "name": "BaseBdev2", 00:26:35.700 "uuid": "3fab28e3-e453-5108-9def-a7ec21cfc45a", 00:26:35.700 "is_configured": 
true, 00:26:35.700 "data_offset": 0, 00:26:35.700 "data_size": 65536 00:26:35.700 }, 00:26:35.700 { 00:26:35.700 "name": "BaseBdev3", 00:26:35.700 "uuid": "ff0116b5-0ba8-5bae-ad3e-e91fbb726d9c", 00:26:35.700 "is_configured": true, 00:26:35.700 "data_offset": 0, 00:26:35.700 "data_size": 65536 00:26:35.700 }, 00:26:35.700 { 00:26:35.700 "name": "BaseBdev4", 00:26:35.700 "uuid": "14fe8420-6d79-5021-bc04-ce4d4a1e9227", 00:26:35.700 "is_configured": true, 00:26:35.700 "data_offset": 0, 00:26:35.700 "data_size": 65536 00:26:35.700 } 00:26:35.700 ] 00:26:35.700 }' 00:26:35.700 05:54:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:35.700 05:54:50 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:26:36.268 05:54:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:26:36.268 05:54:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:26:36.527 [2024-07-26 05:54:51.198105] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:26:36.527 05:54:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=65536 00:26:36.527 05:54:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:36.527 05:54:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:26:36.786 05:54:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # data_offset=0 00:26:36.786 05:54:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:26:36.786 05:54:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@623 -- # '[' true = true ']' 00:26:36.786 05:54:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@624 -- # local write_unit_size 
00:26:36.786 05:54:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@627 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:26:36.786 05:54:51 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:26:36.786 05:54:51 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:26:36.787 05:54:51 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # local bdev_list 00:26:36.787 05:54:51 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:26:36.787 05:54:51 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # local nbd_list 00:26:36.787 05:54:51 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@12 -- # local i 00:26:36.787 05:54:51 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:26:36.787 05:54:51 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:26:36.787 05:54:51 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:26:36.787 [2024-07-26 05:54:51.687139] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xd43e10 00:26:37.045 /dev/nbd0 00:26:37.045 05:54:51 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:26:37.045 05:54:51 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:26:37.045 05:54:51 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:26:37.045 05:54:51 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # local i 00:26:37.045 05:54:51 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:26:37.045 05:54:51 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:26:37.045 05:54:51 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # grep -q -w nbd0 
/proc/partitions 00:26:37.045 05:54:51 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # break 00:26:37.045 05:54:51 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:26:37.045 05:54:51 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:26:37.045 05:54:51 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:26:37.045 1+0 records in 00:26:37.045 1+0 records out 00:26:37.045 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000189951 s, 21.6 MB/s 00:26:37.045 05:54:51 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:37.045 05:54:51 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # size=4096 00:26:37.045 05:54:51 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:37.045 05:54:51 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:26:37.046 05:54:51 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # return 0 00:26:37.046 05:54:51 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:26:37.046 05:54:51 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:26:37.046 05:54:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@628 -- # '[' raid1 = raid5f ']' 00:26:37.046 05:54:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@632 -- # write_unit_size=1 00:26:37.046 05:54:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@634 -- # dd if=/dev/urandom of=/dev/nbd0 bs=512 count=65536 oflag=direct 00:26:45.163 65536+0 records in 00:26:45.163 65536+0 records out 00:26:45.163 33554432 bytes (34 MB, 32 MiB) copied, 8.1099 s, 4.1 MB/s 00:26:45.163 05:54:59 bdev_raid.raid_rebuild_test -- 
bdev/bdev_raid.sh@635 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:26:45.163 05:54:59 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:26:45.163 05:54:59 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:26:45.163 05:54:59 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # local nbd_list 00:26:45.163 05:54:59 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@51 -- # local i 00:26:45.163 05:54:59 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:26:45.163 05:54:59 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:26:45.423 [2024-07-26 05:55:00.130663] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:45.423 05:55:00 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:26:45.423 05:55:00 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:26:45.423 05:55:00 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:26:45.423 05:55:00 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:26:45.423 05:55:00 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:26:45.423 05:55:00 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:26:45.423 05:55:00 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:26:45.423 05:55:00 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:26:45.424 05:55:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:26:45.683 [2024-07-26 05:55:00.370090] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:26:45.683 
05:55:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:26:45.683 05:55:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:45.683 05:55:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:45.683 05:55:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:45.683 05:55:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:45.683 05:55:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:26:45.683 05:55:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:45.683 05:55:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:45.683 05:55:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:45.683 05:55:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:45.683 05:55:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:45.683 05:55:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:45.942 05:55:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:45.942 "name": "raid_bdev1", 00:26:45.942 "uuid": "79fcad42-8b9d-4f38-8337-2de18fc93195", 00:26:45.942 "strip_size_kb": 0, 00:26:45.942 "state": "online", 00:26:45.942 "raid_level": "raid1", 00:26:45.942 "superblock": false, 00:26:45.942 "num_base_bdevs": 4, 00:26:45.942 "num_base_bdevs_discovered": 3, 00:26:45.942 "num_base_bdevs_operational": 3, 00:26:45.942 "base_bdevs_list": [ 00:26:45.942 { 00:26:45.942 "name": null, 00:26:45.942 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:45.942 "is_configured": 
false, 00:26:45.942 "data_offset": 0, 00:26:45.942 "data_size": 65536 00:26:45.942 }, 00:26:45.942 { 00:26:45.942 "name": "BaseBdev2", 00:26:45.942 "uuid": "3fab28e3-e453-5108-9def-a7ec21cfc45a", 00:26:45.942 "is_configured": true, 00:26:45.942 "data_offset": 0, 00:26:45.942 "data_size": 65536 00:26:45.942 }, 00:26:45.942 { 00:26:45.942 "name": "BaseBdev3", 00:26:45.942 "uuid": "ff0116b5-0ba8-5bae-ad3e-e91fbb726d9c", 00:26:45.942 "is_configured": true, 00:26:45.942 "data_offset": 0, 00:26:45.942 "data_size": 65536 00:26:45.942 }, 00:26:45.942 { 00:26:45.942 "name": "BaseBdev4", 00:26:45.942 "uuid": "14fe8420-6d79-5021-bc04-ce4d4a1e9227", 00:26:45.942 "is_configured": true, 00:26:45.942 "data_offset": 0, 00:26:45.942 "data_size": 65536 00:26:45.942 } 00:26:45.943 ] 00:26:45.943 }' 00:26:45.943 05:55:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:45.943 05:55:00 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:26:46.510 05:55:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:26:46.769 [2024-07-26 05:55:01.436928] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:26:46.769 [2024-07-26 05:55:01.441026] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xd43e10 00:26:46.769 [2024-07-26 05:55:01.443401] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:26:46.769 05:55:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@646 -- # sleep 1 00:26:47.706 05:55:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:47.706 05:55:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:47.706 05:55:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local 
process_type=rebuild 00:26:47.706 05:55:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:47.706 05:55:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:47.706 05:55:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:47.706 05:55:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:47.965 05:55:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:47.965 "name": "raid_bdev1", 00:26:47.965 "uuid": "79fcad42-8b9d-4f38-8337-2de18fc93195", 00:26:47.965 "strip_size_kb": 0, 00:26:47.965 "state": "online", 00:26:47.965 "raid_level": "raid1", 00:26:47.965 "superblock": false, 00:26:47.965 "num_base_bdevs": 4, 00:26:47.965 "num_base_bdevs_discovered": 4, 00:26:47.965 "num_base_bdevs_operational": 4, 00:26:47.965 "process": { 00:26:47.965 "type": "rebuild", 00:26:47.965 "target": "spare", 00:26:47.965 "progress": { 00:26:47.965 "blocks": 24576, 00:26:47.965 "percent": 37 00:26:47.965 } 00:26:47.965 }, 00:26:47.965 "base_bdevs_list": [ 00:26:47.965 { 00:26:47.965 "name": "spare", 00:26:47.965 "uuid": "9714f3d2-8fc4-5f56-8c4b-a99edddb2d14", 00:26:47.965 "is_configured": true, 00:26:47.965 "data_offset": 0, 00:26:47.965 "data_size": 65536 00:26:47.965 }, 00:26:47.965 { 00:26:47.965 "name": "BaseBdev2", 00:26:47.965 "uuid": "3fab28e3-e453-5108-9def-a7ec21cfc45a", 00:26:47.965 "is_configured": true, 00:26:47.965 "data_offset": 0, 00:26:47.965 "data_size": 65536 00:26:47.965 }, 00:26:47.965 { 00:26:47.965 "name": "BaseBdev3", 00:26:47.965 "uuid": "ff0116b5-0ba8-5bae-ad3e-e91fbb726d9c", 00:26:47.965 "is_configured": true, 00:26:47.965 "data_offset": 0, 00:26:47.965 "data_size": 65536 00:26:47.965 }, 00:26:47.965 { 00:26:47.965 "name": "BaseBdev4", 00:26:47.965 "uuid": 
"14fe8420-6d79-5021-bc04-ce4d4a1e9227", 00:26:47.965 "is_configured": true, 00:26:47.965 "data_offset": 0, 00:26:47.965 "data_size": 65536 00:26:47.965 } 00:26:47.965 ] 00:26:47.965 }' 00:26:47.965 05:55:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:47.965 05:55:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:47.965 05:55:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:47.965 05:55:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:47.965 05:55:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:26:48.224 [2024-07-26 05:55:03.028733] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:48.224 [2024-07-26 05:55:03.055999] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:26:48.224 [2024-07-26 05:55:03.056045] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:48.224 [2024-07-26 05:55:03.056063] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:48.224 [2024-07-26 05:55:03.056071] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:26:48.224 05:55:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:26:48.224 05:55:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:48.224 05:55:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:48.224 05:55:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:48.224 05:55:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # 
local strip_size=0 00:26:48.224 05:55:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:26:48.224 05:55:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:48.224 05:55:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:48.224 05:55:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:48.224 05:55:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:48.224 05:55:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:48.224 05:55:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:48.481 05:55:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:48.481 "name": "raid_bdev1", 00:26:48.481 "uuid": "79fcad42-8b9d-4f38-8337-2de18fc93195", 00:26:48.481 "strip_size_kb": 0, 00:26:48.482 "state": "online", 00:26:48.482 "raid_level": "raid1", 00:26:48.482 "superblock": false, 00:26:48.482 "num_base_bdevs": 4, 00:26:48.482 "num_base_bdevs_discovered": 3, 00:26:48.482 "num_base_bdevs_operational": 3, 00:26:48.482 "base_bdevs_list": [ 00:26:48.482 { 00:26:48.482 "name": null, 00:26:48.482 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:48.482 "is_configured": false, 00:26:48.482 "data_offset": 0, 00:26:48.482 "data_size": 65536 00:26:48.482 }, 00:26:48.482 { 00:26:48.482 "name": "BaseBdev2", 00:26:48.482 "uuid": "3fab28e3-e453-5108-9def-a7ec21cfc45a", 00:26:48.482 "is_configured": true, 00:26:48.482 "data_offset": 0, 00:26:48.482 "data_size": 65536 00:26:48.482 }, 00:26:48.482 { 00:26:48.482 "name": "BaseBdev3", 00:26:48.482 "uuid": "ff0116b5-0ba8-5bae-ad3e-e91fbb726d9c", 00:26:48.482 "is_configured": true, 00:26:48.482 "data_offset": 0, 00:26:48.482 "data_size": 65536 
00:26:48.482 }, 00:26:48.482 { 00:26:48.482 "name": "BaseBdev4", 00:26:48.482 "uuid": "14fe8420-6d79-5021-bc04-ce4d4a1e9227", 00:26:48.482 "is_configured": true, 00:26:48.482 "data_offset": 0, 00:26:48.482 "data_size": 65536 00:26:48.482 } 00:26:48.482 ] 00:26:48.482 }' 00:26:48.482 05:55:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:48.482 05:55:03 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:26:49.046 05:55:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:26:49.046 05:55:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:49.046 05:55:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:26:49.046 05:55:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=none 00:26:49.046 05:55:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:49.046 05:55:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:49.046 05:55:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:49.305 05:55:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:49.305 "name": "raid_bdev1", 00:26:49.305 "uuid": "79fcad42-8b9d-4f38-8337-2de18fc93195", 00:26:49.305 "strip_size_kb": 0, 00:26:49.305 "state": "online", 00:26:49.305 "raid_level": "raid1", 00:26:49.305 "superblock": false, 00:26:49.305 "num_base_bdevs": 4, 00:26:49.305 "num_base_bdevs_discovered": 3, 00:26:49.305 "num_base_bdevs_operational": 3, 00:26:49.305 "base_bdevs_list": [ 00:26:49.305 { 00:26:49.305 "name": null, 00:26:49.305 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:49.305 "is_configured": false, 00:26:49.305 "data_offset": 0, 00:26:49.305 
"data_size": 65536 00:26:49.305 }, 00:26:49.305 { 00:26:49.305 "name": "BaseBdev2", 00:26:49.305 "uuid": "3fab28e3-e453-5108-9def-a7ec21cfc45a", 00:26:49.305 "is_configured": true, 00:26:49.305 "data_offset": 0, 00:26:49.305 "data_size": 65536 00:26:49.305 }, 00:26:49.305 { 00:26:49.305 "name": "BaseBdev3", 00:26:49.305 "uuid": "ff0116b5-0ba8-5bae-ad3e-e91fbb726d9c", 00:26:49.305 "is_configured": true, 00:26:49.305 "data_offset": 0, 00:26:49.305 "data_size": 65536 00:26:49.305 }, 00:26:49.305 { 00:26:49.305 "name": "BaseBdev4", 00:26:49.305 "uuid": "14fe8420-6d79-5021-bc04-ce4d4a1e9227", 00:26:49.305 "is_configured": true, 00:26:49.305 "data_offset": 0, 00:26:49.305 "data_size": 65536 00:26:49.305 } 00:26:49.305 ] 00:26:49.305 }' 00:26:49.305 05:55:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:49.305 05:55:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:26:49.564 05:55:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:49.564 05:55:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:26:49.564 05:55:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:26:49.822 [2024-07-26 05:55:04.479922] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:26:49.822 [2024-07-26 05:55:04.483985] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xccf6b0 00:26:49.822 [2024-07-26 05:55:04.485484] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:26:49.822 05:55:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@662 -- # sleep 1 00:26:50.756 05:55:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:50.756 
05:55:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:50.756 05:55:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:50.756 05:55:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:50.756 05:55:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:50.756 05:55:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:50.756 05:55:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:51.015 05:55:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:51.015 "name": "raid_bdev1", 00:26:51.015 "uuid": "79fcad42-8b9d-4f38-8337-2de18fc93195", 00:26:51.015 "strip_size_kb": 0, 00:26:51.015 "state": "online", 00:26:51.015 "raid_level": "raid1", 00:26:51.015 "superblock": false, 00:26:51.015 "num_base_bdevs": 4, 00:26:51.015 "num_base_bdevs_discovered": 4, 00:26:51.015 "num_base_bdevs_operational": 4, 00:26:51.015 "process": { 00:26:51.015 "type": "rebuild", 00:26:51.015 "target": "spare", 00:26:51.015 "progress": { 00:26:51.015 "blocks": 24576, 00:26:51.015 "percent": 37 00:26:51.015 } 00:26:51.015 }, 00:26:51.015 "base_bdevs_list": [ 00:26:51.015 { 00:26:51.015 "name": "spare", 00:26:51.015 "uuid": "9714f3d2-8fc4-5f56-8c4b-a99edddb2d14", 00:26:51.015 "is_configured": true, 00:26:51.015 "data_offset": 0, 00:26:51.015 "data_size": 65536 00:26:51.015 }, 00:26:51.015 { 00:26:51.015 "name": "BaseBdev2", 00:26:51.015 "uuid": "3fab28e3-e453-5108-9def-a7ec21cfc45a", 00:26:51.015 "is_configured": true, 00:26:51.015 "data_offset": 0, 00:26:51.015 "data_size": 65536 00:26:51.015 }, 00:26:51.015 { 00:26:51.015 "name": "BaseBdev3", 00:26:51.015 "uuid": "ff0116b5-0ba8-5bae-ad3e-e91fbb726d9c", 00:26:51.015 
"is_configured": true, 00:26:51.015 "data_offset": 0, 00:26:51.015 "data_size": 65536 00:26:51.015 }, 00:26:51.015 { 00:26:51.015 "name": "BaseBdev4", 00:26:51.015 "uuid": "14fe8420-6d79-5021-bc04-ce4d4a1e9227", 00:26:51.015 "is_configured": true, 00:26:51.015 "data_offset": 0, 00:26:51.015 "data_size": 65536 00:26:51.015 } 00:26:51.015 ] 00:26:51.015 }' 00:26:51.015 05:55:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:51.015 05:55:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:51.015 05:55:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:51.015 05:55:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:51.015 05:55:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@665 -- # '[' false = true ']' 00:26:51.015 05:55:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=4 00:26:51.016 05:55:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:26:51.016 05:55:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@692 -- # '[' 4 -gt 2 ']' 00:26:51.016 05:55:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@694 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:26:51.274 [2024-07-26 05:55:06.057984] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:26:51.274 [2024-07-26 05:55:06.098047] bdev_raid.c:1945:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0xccf6b0 00:26:51.274 05:55:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@697 -- # base_bdevs[1]= 00:26:51.274 05:55:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@698 -- # (( num_base_bdevs_operational-- )) 00:26:51.274 05:55:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@701 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 
00:26:51.274 05:55:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:51.274 05:55:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:51.274 05:55:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:51.274 05:55:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:51.274 05:55:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:51.274 05:55:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:51.533 05:55:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:51.533 "name": "raid_bdev1", 00:26:51.533 "uuid": "79fcad42-8b9d-4f38-8337-2de18fc93195", 00:26:51.533 "strip_size_kb": 0, 00:26:51.533 "state": "online", 00:26:51.533 "raid_level": "raid1", 00:26:51.533 "superblock": false, 00:26:51.533 "num_base_bdevs": 4, 00:26:51.533 "num_base_bdevs_discovered": 3, 00:26:51.533 "num_base_bdevs_operational": 3, 00:26:51.533 "process": { 00:26:51.533 "type": "rebuild", 00:26:51.533 "target": "spare", 00:26:51.533 "progress": { 00:26:51.533 "blocks": 36864, 00:26:51.533 "percent": 56 00:26:51.533 } 00:26:51.533 }, 00:26:51.533 "base_bdevs_list": [ 00:26:51.533 { 00:26:51.533 "name": "spare", 00:26:51.533 "uuid": "9714f3d2-8fc4-5f56-8c4b-a99edddb2d14", 00:26:51.533 "is_configured": true, 00:26:51.533 "data_offset": 0, 00:26:51.533 "data_size": 65536 00:26:51.533 }, 00:26:51.533 { 00:26:51.533 "name": null, 00:26:51.533 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:51.533 "is_configured": false, 00:26:51.533 "data_offset": 0, 00:26:51.533 "data_size": 65536 00:26:51.533 }, 00:26:51.533 { 00:26:51.533 "name": "BaseBdev3", 00:26:51.533 "uuid": "ff0116b5-0ba8-5bae-ad3e-e91fbb726d9c", 00:26:51.533 
"is_configured": true, 00:26:51.533 "data_offset": 0, 00:26:51.533 "data_size": 65536 00:26:51.533 }, 00:26:51.533 { 00:26:51.533 "name": "BaseBdev4", 00:26:51.533 "uuid": "14fe8420-6d79-5021-bc04-ce4d4a1e9227", 00:26:51.533 "is_configured": true, 00:26:51.533 "data_offset": 0, 00:26:51.533 "data_size": 65536 00:26:51.533 } 00:26:51.533 ] 00:26:51.533 }' 00:26:51.533 05:55:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:51.533 05:55:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:51.533 05:55:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:51.793 05:55:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:51.793 05:55:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@705 -- # local timeout=876 00:26:51.793 05:55:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:26:51.793 05:55:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:51.793 05:55:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:51.793 05:55:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:51.793 05:55:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:51.793 05:55:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:51.793 05:55:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:51.793 05:55:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:52.051 05:55:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:52.051 "name": 
"raid_bdev1", 00:26:52.051 "uuid": "79fcad42-8b9d-4f38-8337-2de18fc93195", 00:26:52.051 "strip_size_kb": 0, 00:26:52.051 "state": "online", 00:26:52.051 "raid_level": "raid1", 00:26:52.051 "superblock": false, 00:26:52.051 "num_base_bdevs": 4, 00:26:52.051 "num_base_bdevs_discovered": 3, 00:26:52.051 "num_base_bdevs_operational": 3, 00:26:52.051 "process": { 00:26:52.051 "type": "rebuild", 00:26:52.051 "target": "spare", 00:26:52.051 "progress": { 00:26:52.051 "blocks": 45056, 00:26:52.051 "percent": 68 00:26:52.051 } 00:26:52.051 }, 00:26:52.051 "base_bdevs_list": [ 00:26:52.051 { 00:26:52.051 "name": "spare", 00:26:52.051 "uuid": "9714f3d2-8fc4-5f56-8c4b-a99edddb2d14", 00:26:52.051 "is_configured": true, 00:26:52.051 "data_offset": 0, 00:26:52.051 "data_size": 65536 00:26:52.051 }, 00:26:52.051 { 00:26:52.051 "name": null, 00:26:52.051 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:52.051 "is_configured": false, 00:26:52.051 "data_offset": 0, 00:26:52.051 "data_size": 65536 00:26:52.051 }, 00:26:52.051 { 00:26:52.051 "name": "BaseBdev3", 00:26:52.051 "uuid": "ff0116b5-0ba8-5bae-ad3e-e91fbb726d9c", 00:26:52.052 "is_configured": true, 00:26:52.052 "data_offset": 0, 00:26:52.052 "data_size": 65536 00:26:52.052 }, 00:26:52.052 { 00:26:52.052 "name": "BaseBdev4", 00:26:52.052 "uuid": "14fe8420-6d79-5021-bc04-ce4d4a1e9227", 00:26:52.052 "is_configured": true, 00:26:52.052 "data_offset": 0, 00:26:52.052 "data_size": 65536 00:26:52.052 } 00:26:52.052 ] 00:26:52.052 }' 00:26:52.052 05:55:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:52.052 05:55:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:52.052 05:55:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:52.052 05:55:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:52.052 05:55:06 bdev_raid.raid_rebuild_test -- 
bdev/bdev_raid.sh@710 -- # sleep 1 00:26:52.986 [2024-07-26 05:55:07.710600] bdev_raid.c:2870:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:26:52.986 [2024-07-26 05:55:07.710664] bdev_raid.c:2532:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:26:52.986 [2024-07-26 05:55:07.710702] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:52.986 05:55:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:26:52.986 05:55:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:52.986 05:55:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:52.986 05:55:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:52.986 05:55:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:52.986 05:55:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:52.986 05:55:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:52.986 05:55:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:53.244 05:55:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:53.244 "name": "raid_bdev1", 00:26:53.244 "uuid": "79fcad42-8b9d-4f38-8337-2de18fc93195", 00:26:53.244 "strip_size_kb": 0, 00:26:53.244 "state": "online", 00:26:53.244 "raid_level": "raid1", 00:26:53.244 "superblock": false, 00:26:53.244 "num_base_bdevs": 4, 00:26:53.244 "num_base_bdevs_discovered": 3, 00:26:53.244 "num_base_bdevs_operational": 3, 00:26:53.244 "base_bdevs_list": [ 00:26:53.244 { 00:26:53.244 "name": "spare", 00:26:53.244 "uuid": "9714f3d2-8fc4-5f56-8c4b-a99edddb2d14", 00:26:53.244 
"is_configured": true, 00:26:53.244 "data_offset": 0, 00:26:53.244 "data_size": 65536 00:26:53.244 }, 00:26:53.244 { 00:26:53.244 "name": null, 00:26:53.244 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:53.244 "is_configured": false, 00:26:53.244 "data_offset": 0, 00:26:53.244 "data_size": 65536 00:26:53.244 }, 00:26:53.244 { 00:26:53.244 "name": "BaseBdev3", 00:26:53.244 "uuid": "ff0116b5-0ba8-5bae-ad3e-e91fbb726d9c", 00:26:53.244 "is_configured": true, 00:26:53.244 "data_offset": 0, 00:26:53.244 "data_size": 65536 00:26:53.244 }, 00:26:53.244 { 00:26:53.244 "name": "BaseBdev4", 00:26:53.244 "uuid": "14fe8420-6d79-5021-bc04-ce4d4a1e9227", 00:26:53.244 "is_configured": true, 00:26:53.244 "data_offset": 0, 00:26:53.244 "data_size": 65536 00:26:53.244 } 00:26:53.244 ] 00:26:53.244 }' 00:26:53.244 05:55:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:53.244 05:55:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:26:53.244 05:55:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:53.244 05:55:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:26:53.244 05:55:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@708 -- # break 00:26:53.244 05:55:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:26:53.244 05:55:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:53.244 05:55:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:26:53.244 05:55:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=none 00:26:53.244 05:55:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:53.244 05:55:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:53.244 05:55:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:53.810 05:55:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:53.810 "name": "raid_bdev1", 00:26:53.810 "uuid": "79fcad42-8b9d-4f38-8337-2de18fc93195", 00:26:53.810 "strip_size_kb": 0, 00:26:53.810 "state": "online", 00:26:53.811 "raid_level": "raid1", 00:26:53.811 "superblock": false, 00:26:53.811 "num_base_bdevs": 4, 00:26:53.811 "num_base_bdevs_discovered": 3, 00:26:53.811 "num_base_bdevs_operational": 3, 00:26:53.811 "base_bdevs_list": [ 00:26:53.811 { 00:26:53.811 "name": "spare", 00:26:53.811 "uuid": "9714f3d2-8fc4-5f56-8c4b-a99edddb2d14", 00:26:53.811 "is_configured": true, 00:26:53.811 "data_offset": 0, 00:26:53.811 "data_size": 65536 00:26:53.811 }, 00:26:53.811 { 00:26:53.811 "name": null, 00:26:53.811 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:53.811 "is_configured": false, 00:26:53.811 "data_offset": 0, 00:26:53.811 "data_size": 65536 00:26:53.811 }, 00:26:53.811 { 00:26:53.811 "name": "BaseBdev3", 00:26:53.811 "uuid": "ff0116b5-0ba8-5bae-ad3e-e91fbb726d9c", 00:26:53.811 "is_configured": true, 00:26:53.811 "data_offset": 0, 00:26:53.811 "data_size": 65536 00:26:53.811 }, 00:26:53.811 { 00:26:53.811 "name": "BaseBdev4", 00:26:53.811 "uuid": "14fe8420-6d79-5021-bc04-ce4d4a1e9227", 00:26:53.811 "is_configured": true, 00:26:53.811 "data_offset": 0, 00:26:53.811 "data_size": 65536 00:26:53.811 } 00:26:53.811 ] 00:26:53.811 }' 00:26:53.811 05:55:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:53.811 05:55:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:26:53.811 05:55:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:53.811 05:55:08 
bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:26:53.811 05:55:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:26:53.811 05:55:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:53.811 05:55:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:53.811 05:55:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:53.811 05:55:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:53.811 05:55:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:26:53.811 05:55:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:53.811 05:55:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:53.811 05:55:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:53.811 05:55:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:54.069 05:55:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:54.069 05:55:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:54.069 05:55:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:54.069 "name": "raid_bdev1", 00:26:54.069 "uuid": "79fcad42-8b9d-4f38-8337-2de18fc93195", 00:26:54.069 "strip_size_kb": 0, 00:26:54.069 "state": "online", 00:26:54.069 "raid_level": "raid1", 00:26:54.069 "superblock": false, 00:26:54.069 "num_base_bdevs": 4, 00:26:54.069 "num_base_bdevs_discovered": 3, 00:26:54.069 "num_base_bdevs_operational": 3, 00:26:54.069 "base_bdevs_list": [ 00:26:54.069 { 00:26:54.069 "name": 
"spare", 00:26:54.069 "uuid": "9714f3d2-8fc4-5f56-8c4b-a99edddb2d14", 00:26:54.069 "is_configured": true, 00:26:54.069 "data_offset": 0, 00:26:54.069 "data_size": 65536 00:26:54.069 }, 00:26:54.069 { 00:26:54.069 "name": null, 00:26:54.069 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:54.069 "is_configured": false, 00:26:54.069 "data_offset": 0, 00:26:54.069 "data_size": 65536 00:26:54.069 }, 00:26:54.069 { 00:26:54.069 "name": "BaseBdev3", 00:26:54.069 "uuid": "ff0116b5-0ba8-5bae-ad3e-e91fbb726d9c", 00:26:54.069 "is_configured": true, 00:26:54.069 "data_offset": 0, 00:26:54.069 "data_size": 65536 00:26:54.069 }, 00:26:54.069 { 00:26:54.069 "name": "BaseBdev4", 00:26:54.069 "uuid": "14fe8420-6d79-5021-bc04-ce4d4a1e9227", 00:26:54.069 "is_configured": true, 00:26:54.069 "data_offset": 0, 00:26:54.069 "data_size": 65536 00:26:54.069 } 00:26:54.069 ] 00:26:54.069 }' 00:26:54.069 05:55:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:54.069 05:55:08 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:26:54.636 05:55:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:26:54.895 [2024-07-26 05:55:09.735644] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:26:54.895 [2024-07-26 05:55:09.735671] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:26:54.895 [2024-07-26 05:55:09.735736] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:26:54.895 [2024-07-26 05:55:09.735809] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:26:54.895 [2024-07-26 05:55:09.735821] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xcc98a0 name raid_bdev1, state offline 00:26:54.895 05:55:09 bdev_raid.raid_rebuild_test -- 
bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:54.895 05:55:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@719 -- # jq length 00:26:55.153 05:55:10 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:26:55.153 05:55:10 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:26:55.153 05:55:10 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@722 -- # '[' false = true ']' 00:26:55.153 05:55:10 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@736 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:26:55.153 05:55:10 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:26:55.153 05:55:10 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:26:55.153 05:55:10 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # local bdev_list 00:26:55.153 05:55:10 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:26:55.153 05:55:10 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # local nbd_list 00:26:55.153 05:55:10 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@12 -- # local i 00:26:55.153 05:55:10 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:26:55.153 05:55:10 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:26:55.154 05:55:10 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:26:55.442 /dev/nbd0 00:26:55.442 05:55:10 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:26:55.442 05:55:10 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:26:55.442 05:55:10 bdev_raid.raid_rebuild_test -- 
common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:26:55.442 05:55:10 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # local i 00:26:55.442 05:55:10 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:26:55.442 05:55:10 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:26:55.442 05:55:10 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:26:55.442 05:55:10 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # break 00:26:55.442 05:55:10 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:26:55.442 05:55:10 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:26:55.442 05:55:10 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:26:55.442 1+0 records in 00:26:55.442 1+0 records out 00:26:55.442 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000225927 s, 18.1 MB/s 00:26:55.442 05:55:10 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:55.442 05:55:10 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # size=4096 00:26:55.442 05:55:10 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:55.442 05:55:10 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:26:55.442 05:55:10 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # return 0 00:26:55.442 05:55:10 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:26:55.442 05:55:10 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:26:55.442 05:55:10 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:26:55.709 /dev/nbd1 00:26:55.709 05:55:10 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:26:55.709 05:55:10 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:26:55.709 05:55:10 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:26:55.709 05:55:10 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # local i 00:26:55.709 05:55:10 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:26:55.709 05:55:10 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:26:55.709 05:55:10 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:26:55.709 05:55:10 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # break 00:26:55.709 05:55:10 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:26:55.709 05:55:10 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:26:55.709 05:55:10 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:26:55.709 1+0 records in 00:26:55.709 1+0 records out 00:26:55.709 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000347198 s, 11.8 MB/s 00:26:55.709 05:55:10 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:55.709 05:55:10 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # size=4096 00:26:55.709 05:55:10 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:55.709 05:55:10 bdev_raid.raid_rebuild_test -- 
common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:26:55.710 05:55:10 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # return 0 00:26:55.710 05:55:10 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:26:55.710 05:55:10 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:26:55.710 05:55:10 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@737 -- # cmp -i 0 /dev/nbd0 /dev/nbd1 00:26:55.968 05:55:10 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@738 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:26:55.968 05:55:10 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:26:55.968 05:55:10 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:26:55.968 05:55:10 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # local nbd_list 00:26:55.968 05:55:10 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@51 -- # local i 00:26:55.968 05:55:10 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:26:55.968 05:55:10 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:26:56.227 05:55:10 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:26:56.227 05:55:10 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:26:56.227 05:55:10 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:26:56.227 05:55:10 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:26:56.228 05:55:10 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:26:56.228 05:55:10 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:26:56.228 05:55:10 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 
00:26:56.228 05:55:10 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:26:56.228 05:55:10 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:26:56.228 05:55:10 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:26:56.487 05:55:11 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:26:56.487 05:55:11 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:26:56.487 05:55:11 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:26:56.487 05:55:11 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:26:56.487 05:55:11 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:26:56.487 05:55:11 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:26:56.487 05:55:11 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:26:56.487 05:55:11 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:26:56.487 05:55:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@742 -- # '[' false = true ']' 00:26:56.487 05:55:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@782 -- # killprocess 1249102 00:26:56.487 05:55:11 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@948 -- # '[' -z 1249102 ']' 00:26:56.487 05:55:11 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@952 -- # kill -0 1249102 00:26:56.487 05:55:11 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@953 -- # uname 00:26:56.487 05:55:11 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:26:56.487 05:55:11 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1249102 00:26:56.487 05:55:11 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@954 -- # 
process_name=reactor_0 00:26:56.487 05:55:11 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:26:56.487 05:55:11 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1249102' 00:26:56.487 killing process with pid 1249102 00:26:56.487 05:55:11 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@967 -- # kill 1249102 00:26:56.487 Received shutdown signal, test time was about 60.000000 seconds 00:26:56.487 00:26:56.487 Latency(us) 00:26:56.487 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:26:56.487 =================================================================================================================== 00:26:56.487 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:26:56.487 [2024-07-26 05:55:11.265668] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:26:56.487 05:55:11 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@972 -- # wait 1249102 00:26:56.487 [2024-07-26 05:55:11.315828] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:26:56.746 05:55:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@784 -- # return 0 00:26:56.746 00:26:56.746 real 0m25.383s 00:26:56.746 user 0m33.718s 00:26:56.746 sys 0m5.792s 00:26:56.746 05:55:11 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:26:56.746 05:55:11 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:26:56.746 ************************************ 00:26:56.746 END TEST raid_rebuild_test 00:26:56.746 ************************************ 00:26:56.746 05:55:11 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:26:56.746 05:55:11 bdev_raid -- bdev/bdev_raid.sh@878 -- # run_test raid_rebuild_test_sb raid_rebuild_test raid1 4 true false true 00:26:56.746 05:55:11 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:26:56.746 05:55:11 bdev_raid -- 
common/autotest_common.sh@1105 -- # xtrace_disable 00:26:56.746 05:55:11 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:26:56.746 ************************************ 00:26:56.746 START TEST raid_rebuild_test_sb 00:26:56.746 ************************************ 00:26:56.746 05:55:11 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 4 true false true 00:26:56.746 05:55:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:26:56.746 05:55:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=4 00:26:56.746 05:55:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:26:56.746 05:55:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:26:56.746 05:55:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@572 -- # local verify=true 00:26:56.746 05:55:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:26:56.746 05:55:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:26:56.746 05:55:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:26:56.746 05:55:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:26:56.746 05:55:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:26:56.746 05:55:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:26:56.746 05:55:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:26:56.746 05:55:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:26:56.746 05:55:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # echo BaseBdev3 00:26:56.746 05:55:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:26:56.746 05:55:11 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:26:56.746 05:55:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # echo BaseBdev4 00:26:56.746 05:55:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:26:56.746 05:55:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:26:56.746 05:55:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:26:56.746 05:55:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:26:56.746 05:55:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:26:56.746 05:55:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # local strip_size 00:26:56.746 05:55:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@576 -- # local create_arg 00:26:56.746 05:55:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:26:56.746 05:55:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@578 -- # local data_offset 00:26:56.746 05:55:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:26:56.746 05:55:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:26:56.746 05:55:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:26:56.746 05:55:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:26:56.746 05:55:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@596 -- # raid_pid=1252540 00:26:56.746 05:55:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@597 -- # waitforlisten 1252540 /var/tmp/spdk-raid.sock 00:26:56.746 05:55:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L 
bdev_raid 00:26:56.746 05:55:11 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@829 -- # '[' -z 1252540 ']' 00:26:56.746 05:55:11 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:26:56.746 05:55:11 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:26:56.746 05:55:11 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:26:56.746 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:26:56.746 05:55:11 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:26:56.746 05:55:11 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:26:57.006 [2024-07-26 05:55:11.707070] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 00:26:57.006 [2024-07-26 05:55:11.707142] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1252540 ] 00:26:57.006 I/O size of 3145728 is greater than zero copy threshold (65536). 00:26:57.006 Zero copy mechanism will not be used. 
00:26:57.006 [2024-07-26 05:55:11.836503] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:26:57.264 [2024-07-26 05:55:11.940294] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:26:57.264 [2024-07-26 05:55:11.999540] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:26:57.264 [2024-07-26 05:55:11.999574] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:26:57.831 05:55:12 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:26:57.831 05:55:12 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@862 -- # return 0 00:26:57.831 05:55:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:26:57.831 05:55:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:26:58.089 BaseBdev1_malloc 00:26:58.089 05:55:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:26:58.347 [2024-07-26 05:55:13.115875] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:26:58.347 [2024-07-26 05:55:13.115925] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:58.347 [2024-07-26 05:55:13.115949] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x171cd40 00:26:58.347 [2024-07-26 05:55:13.115961] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:58.347 [2024-07-26 05:55:13.117529] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:58.347 [2024-07-26 05:55:13.117558] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:26:58.347 BaseBdev1 
00:26:58.347 05:55:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:26:58.347 05:55:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:26:58.623 BaseBdev2_malloc 00:26:58.623 05:55:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:26:58.883 [2024-07-26 05:55:13.610028] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:26:58.883 [2024-07-26 05:55:13.610074] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:58.883 [2024-07-26 05:55:13.610097] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x171d860 00:26:58.883 [2024-07-26 05:55:13.610110] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:58.883 [2024-07-26 05:55:13.611491] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:58.883 [2024-07-26 05:55:13.611517] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:26:58.883 BaseBdev2 00:26:58.883 05:55:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:26:58.883 05:55:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:26:59.142 BaseBdev3_malloc 00:26:59.142 05:55:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev3_malloc -p BaseBdev3 00:26:59.400 [2024-07-26 05:55:14.103948] vbdev_passthru.c: 
607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev3_malloc 00:26:59.400 [2024-07-26 05:55:14.103998] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:59.400 [2024-07-26 05:55:14.104020] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x18ca8f0 00:26:59.400 [2024-07-26 05:55:14.104032] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:59.400 [2024-07-26 05:55:14.105467] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:59.400 [2024-07-26 05:55:14.105495] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:26:59.400 BaseBdev3 00:26:59.400 05:55:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:26:59.400 05:55:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:26:59.661 BaseBdev4_malloc 00:26:59.661 05:55:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev4_malloc -p BaseBdev4 00:26:59.919 [2024-07-26 05:55:14.601818] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev4_malloc 00:26:59.919 [2024-07-26 05:55:14.601868] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:59.919 [2024-07-26 05:55:14.601890] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x18c9ad0 00:26:59.919 [2024-07-26 05:55:14.601902] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:59.919 [2024-07-26 05:55:14.603396] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:59.919 [2024-07-26 05:55:14.603425] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created 
pt_bdev for: BaseBdev4 00:26:59.919 BaseBdev4 00:26:59.919 05:55:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:27:00.177 spare_malloc 00:27:00.177 05:55:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:27:00.435 spare_delay 00:27:00.435 05:55:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:27:00.693 [2024-07-26 05:55:15.344344] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:27:00.693 [2024-07-26 05:55:15.344390] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:00.693 [2024-07-26 05:55:15.344410] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x18ce5b0 00:27:00.693 [2024-07-26 05:55:15.344422] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:00.693 [2024-07-26 05:55:15.345841] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:00.693 [2024-07-26 05:55:15.345868] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:27:00.693 spare 00:27:00.693 05:55:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 00:27:00.693 [2024-07-26 05:55:15.589031] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:27:00.693 [2024-07-26 05:55:15.590188] 
bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:27:00.693 [2024-07-26 05:55:15.590244] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:27:00.693 [2024-07-26 05:55:15.590289] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:27:00.693 [2024-07-26 05:55:15.590485] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x184d8a0 00:27:00.693 [2024-07-26 05:55:15.590496] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:27:00.693 [2024-07-26 05:55:15.590690] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x18c7e10 00:27:00.693 [2024-07-26 05:55:15.590837] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x184d8a0 00:27:00.693 [2024-07-26 05:55:15.590848] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x184d8a0 00:27:00.693 [2024-07-26 05:55:15.590936] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:00.951 05:55:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:27:00.951 05:55:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:00.951 05:55:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:00.951 05:55:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:00.951 05:55:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:00.951 05:55:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:27:00.951 05:55:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:00.951 05:55:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:00.951 
05:55:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:00.951 05:55:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:00.951 05:55:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:00.951 05:55:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:01.210 05:55:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:01.210 "name": "raid_bdev1", 00:27:01.210 "uuid": "64a3c3bc-32df-4b2b-8454-6e317dc051db", 00:27:01.210 "strip_size_kb": 0, 00:27:01.210 "state": "online", 00:27:01.210 "raid_level": "raid1", 00:27:01.210 "superblock": true, 00:27:01.210 "num_base_bdevs": 4, 00:27:01.210 "num_base_bdevs_discovered": 4, 00:27:01.210 "num_base_bdevs_operational": 4, 00:27:01.210 "base_bdevs_list": [ 00:27:01.210 { 00:27:01.210 "name": "BaseBdev1", 00:27:01.210 "uuid": "69172d47-989d-538f-8a35-70af7689d5be", 00:27:01.210 "is_configured": true, 00:27:01.210 "data_offset": 2048, 00:27:01.210 "data_size": 63488 00:27:01.210 }, 00:27:01.210 { 00:27:01.210 "name": "BaseBdev2", 00:27:01.210 "uuid": "54eccad7-eed1-5134-8352-ec80d8f0559d", 00:27:01.210 "is_configured": true, 00:27:01.210 "data_offset": 2048, 00:27:01.210 "data_size": 63488 00:27:01.210 }, 00:27:01.210 { 00:27:01.210 "name": "BaseBdev3", 00:27:01.210 "uuid": "bccec0c7-d912-50fd-a60d-a62e621eb4c5", 00:27:01.210 "is_configured": true, 00:27:01.210 "data_offset": 2048, 00:27:01.210 "data_size": 63488 00:27:01.210 }, 00:27:01.210 { 00:27:01.210 "name": "BaseBdev4", 00:27:01.210 "uuid": "a566fd49-6e38-5881-86b4-35fee65f59c4", 00:27:01.210 "is_configured": true, 00:27:01.210 "data_offset": 2048, 00:27:01.210 "data_size": 63488 00:27:01.210 } 00:27:01.210 ] 00:27:01.210 }' 00:27:01.210 05:55:15 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:01.210 05:55:15 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:27:02.146 05:55:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:27:02.146 05:55:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:27:02.146 [2024-07-26 05:55:16.932886] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:27:02.146 05:55:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=63488 00:27:02.146 05:55:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:27:02.146 05:55:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:02.715 05:55:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # data_offset=2048 00:27:02.715 05:55:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:27:02.715 05:55:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@623 -- # '[' true = true ']' 00:27:02.715 05:55:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@624 -- # local write_unit_size 00:27:02.715 05:55:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@627 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:27:02.715 05:55:17 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:27:02.715 05:55:17 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:27:02.715 05:55:17 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # local bdev_list 00:27:02.715 05:55:17 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # 
nbd_list=('/dev/nbd0') 00:27:02.715 05:55:17 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # local nbd_list 00:27:02.715 05:55:17 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@12 -- # local i 00:27:02.715 05:55:17 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:27:02.715 05:55:17 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:27:02.715 05:55:17 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:27:02.973 [2024-07-26 05:55:17.702727] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x18c7e10 00:27:02.973 /dev/nbd0 00:27:02.973 05:55:17 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:27:02.973 05:55:17 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:27:02.973 05:55:17 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:27:02.973 05:55:17 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # local i 00:27:02.973 05:55:17 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:27:02.973 05:55:17 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:27:02.974 05:55:17 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:27:02.974 05:55:17 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # break 00:27:02.974 05:55:17 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:27:02.974 05:55:17 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:27:02.974 05:55:17 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 
iflag=direct 00:27:02.974 1+0 records in 00:27:02.974 1+0 records out 00:27:02.974 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000244613 s, 16.7 MB/s 00:27:02.974 05:55:17 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:02.974 05:55:17 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # size=4096 00:27:02.974 05:55:17 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:02.974 05:55:17 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:27:02.974 05:55:17 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # return 0 00:27:02.974 05:55:17 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:27:02.974 05:55:17 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:27:02.974 05:55:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@628 -- # '[' raid1 = raid5f ']' 00:27:02.974 05:55:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@632 -- # write_unit_size=1 00:27:02.974 05:55:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@634 -- # dd if=/dev/urandom of=/dev/nbd0 bs=512 count=63488 oflag=direct 00:27:09.540 63488+0 records in 00:27:09.540 63488+0 records out 00:27:09.540 32505856 bytes (33 MB, 31 MiB) copied, 6.44081 s, 5.0 MB/s 00:27:09.540 05:55:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@635 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:27:09.540 05:55:24 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:27:09.540 05:55:24 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:27:09.540 05:55:24 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # local nbd_list 00:27:09.540 05:55:24 bdev_raid.raid_rebuild_test_sb -- 
bdev/nbd_common.sh@51 -- # local i 00:27:09.540 05:55:24 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:27:09.540 05:55:24 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:27:09.799 05:55:24 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:27:09.799 [2024-07-26 05:55:24.462382] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:09.799 05:55:24 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:27:09.799 05:55:24 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:27:09.799 05:55:24 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:27:09.799 05:55:24 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:27:09.799 05:55:24 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:27:09.799 05:55:24 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:27:09.799 05:55:24 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:27:09.799 05:55:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:27:09.799 [2024-07-26 05:55:24.691045] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:27:10.058 05:55:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:27:10.058 05:55:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:10.058 05:55:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:10.058 05:55:24 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:10.058 05:55:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:10.058 05:55:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:27:10.058 05:55:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:10.058 05:55:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:10.058 05:55:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:10.058 05:55:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:10.058 05:55:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:10.058 05:55:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:10.058 05:55:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:10.058 "name": "raid_bdev1", 00:27:10.058 "uuid": "64a3c3bc-32df-4b2b-8454-6e317dc051db", 00:27:10.058 "strip_size_kb": 0, 00:27:10.058 "state": "online", 00:27:10.058 "raid_level": "raid1", 00:27:10.058 "superblock": true, 00:27:10.058 "num_base_bdevs": 4, 00:27:10.058 "num_base_bdevs_discovered": 3, 00:27:10.058 "num_base_bdevs_operational": 3, 00:27:10.058 "base_bdevs_list": [ 00:27:10.058 { 00:27:10.058 "name": null, 00:27:10.058 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:10.058 "is_configured": false, 00:27:10.058 "data_offset": 2048, 00:27:10.058 "data_size": 63488 00:27:10.058 }, 00:27:10.058 { 00:27:10.058 "name": "BaseBdev2", 00:27:10.058 "uuid": "54eccad7-eed1-5134-8352-ec80d8f0559d", 00:27:10.058 "is_configured": true, 00:27:10.058 "data_offset": 2048, 00:27:10.058 "data_size": 63488 00:27:10.058 }, 00:27:10.058 { 00:27:10.058 "name": "BaseBdev3", 
00:27:10.058 "uuid": "bccec0c7-d912-50fd-a60d-a62e621eb4c5", 00:27:10.058 "is_configured": true, 00:27:10.058 "data_offset": 2048, 00:27:10.058 "data_size": 63488 00:27:10.058 }, 00:27:10.058 { 00:27:10.058 "name": "BaseBdev4", 00:27:10.058 "uuid": "a566fd49-6e38-5881-86b4-35fee65f59c4", 00:27:10.058 "is_configured": true, 00:27:10.058 "data_offset": 2048, 00:27:10.058 "data_size": 63488 00:27:10.059 } 00:27:10.059 ] 00:27:10.059 }' 00:27:10.059 05:55:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:10.059 05:55:24 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:27:10.996 05:55:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:27:11.255 [2024-07-26 05:55:26.066707] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:27:11.255 [2024-07-26 05:55:26.070871] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x18c7e10 00:27:11.255 [2024-07-26 05:55:26.073273] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:27:11.255 05:55:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@646 -- # sleep 1 00:27:12.193 05:55:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:27:12.193 05:55:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:12.193 05:55:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:27:12.193 05:55:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:27:12.193 05:55:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:12.193 05:55:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:12.193 05:55:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:12.452 05:55:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:12.452 "name": "raid_bdev1", 00:27:12.452 "uuid": "64a3c3bc-32df-4b2b-8454-6e317dc051db", 00:27:12.452 "strip_size_kb": 0, 00:27:12.452 "state": "online", 00:27:12.452 "raid_level": "raid1", 00:27:12.452 "superblock": true, 00:27:12.452 "num_base_bdevs": 4, 00:27:12.452 "num_base_bdevs_discovered": 4, 00:27:12.452 "num_base_bdevs_operational": 4, 00:27:12.452 "process": { 00:27:12.452 "type": "rebuild", 00:27:12.452 "target": "spare", 00:27:12.452 "progress": { 00:27:12.452 "blocks": 24576, 00:27:12.452 "percent": 38 00:27:12.452 } 00:27:12.452 }, 00:27:12.452 "base_bdevs_list": [ 00:27:12.452 { 00:27:12.452 "name": "spare", 00:27:12.452 "uuid": "36f9d266-0303-58ad-99a2-9415ebfd1ee5", 00:27:12.452 "is_configured": true, 00:27:12.452 "data_offset": 2048, 00:27:12.452 "data_size": 63488 00:27:12.452 }, 00:27:12.452 { 00:27:12.452 "name": "BaseBdev2", 00:27:12.452 "uuid": "54eccad7-eed1-5134-8352-ec80d8f0559d", 00:27:12.452 "is_configured": true, 00:27:12.452 "data_offset": 2048, 00:27:12.452 "data_size": 63488 00:27:12.452 }, 00:27:12.452 { 00:27:12.452 "name": "BaseBdev3", 00:27:12.452 "uuid": "bccec0c7-d912-50fd-a60d-a62e621eb4c5", 00:27:12.452 "is_configured": true, 00:27:12.452 "data_offset": 2048, 00:27:12.452 "data_size": 63488 00:27:12.452 }, 00:27:12.452 { 00:27:12.452 "name": "BaseBdev4", 00:27:12.452 "uuid": "a566fd49-6e38-5881-86b4-35fee65f59c4", 00:27:12.452 "is_configured": true, 00:27:12.452 "data_offset": 2048, 00:27:12.452 "data_size": 63488 00:27:12.452 } 00:27:12.452 ] 00:27:12.452 }' 00:27:12.452 05:55:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 
00:27:12.712 05:55:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:27:12.712 05:55:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:12.712 05:55:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:27:12.712 05:55:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:27:13.079 [2024-07-26 05:55:27.668624] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:27:13.079 [2024-07-26 05:55:27.686108] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:27:13.079 [2024-07-26 05:55:27.686151] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:13.079 [2024-07-26 05:55:27.686168] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:27:13.079 [2024-07-26 05:55:27.686176] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:27:13.079 05:55:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:27:13.079 05:55:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:13.079 05:55:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:13.079 05:55:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:13.079 05:55:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:13.079 05:55:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:27:13.079 05:55:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:13.079 05:55:27 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:13.079 05:55:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:13.079 05:55:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:13.079 05:55:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:13.079 05:55:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:13.079 05:55:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:13.080 "name": "raid_bdev1", 00:27:13.080 "uuid": "64a3c3bc-32df-4b2b-8454-6e317dc051db", 00:27:13.080 "strip_size_kb": 0, 00:27:13.080 "state": "online", 00:27:13.080 "raid_level": "raid1", 00:27:13.080 "superblock": true, 00:27:13.080 "num_base_bdevs": 4, 00:27:13.080 "num_base_bdevs_discovered": 3, 00:27:13.080 "num_base_bdevs_operational": 3, 00:27:13.080 "base_bdevs_list": [ 00:27:13.080 { 00:27:13.080 "name": null, 00:27:13.080 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:13.080 "is_configured": false, 00:27:13.080 "data_offset": 2048, 00:27:13.080 "data_size": 63488 00:27:13.080 }, 00:27:13.080 { 00:27:13.080 "name": "BaseBdev2", 00:27:13.080 "uuid": "54eccad7-eed1-5134-8352-ec80d8f0559d", 00:27:13.080 "is_configured": true, 00:27:13.080 "data_offset": 2048, 00:27:13.080 "data_size": 63488 00:27:13.080 }, 00:27:13.080 { 00:27:13.080 "name": "BaseBdev3", 00:27:13.080 "uuid": "bccec0c7-d912-50fd-a60d-a62e621eb4c5", 00:27:13.080 "is_configured": true, 00:27:13.080 "data_offset": 2048, 00:27:13.080 "data_size": 63488 00:27:13.080 }, 00:27:13.080 { 00:27:13.080 "name": "BaseBdev4", 00:27:13.080 "uuid": "a566fd49-6e38-5881-86b4-35fee65f59c4", 00:27:13.080 "is_configured": true, 00:27:13.080 "data_offset": 2048, 00:27:13.080 "data_size": 63488 
00:27:13.080 } 00:27:13.080 ] 00:27:13.080 }' 00:27:13.080 05:55:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:13.080 05:55:27 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:27:13.666 05:55:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:27:13.666 05:55:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:13.666 05:55:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:27:13.666 05:55:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:27:13.666 05:55:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:13.666 05:55:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:13.666 05:55:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:13.925 05:55:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:13.925 "name": "raid_bdev1", 00:27:13.925 "uuid": "64a3c3bc-32df-4b2b-8454-6e317dc051db", 00:27:13.925 "strip_size_kb": 0, 00:27:13.925 "state": "online", 00:27:13.925 "raid_level": "raid1", 00:27:13.925 "superblock": true, 00:27:13.925 "num_base_bdevs": 4, 00:27:13.925 "num_base_bdevs_discovered": 3, 00:27:13.925 "num_base_bdevs_operational": 3, 00:27:13.925 "base_bdevs_list": [ 00:27:13.925 { 00:27:13.925 "name": null, 00:27:13.926 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:13.926 "is_configured": false, 00:27:13.926 "data_offset": 2048, 00:27:13.926 "data_size": 63488 00:27:13.926 }, 00:27:13.926 { 00:27:13.926 "name": "BaseBdev2", 00:27:13.926 "uuid": "54eccad7-eed1-5134-8352-ec80d8f0559d", 00:27:13.926 "is_configured": true, 00:27:13.926 
"data_offset": 2048, 00:27:13.926 "data_size": 63488 00:27:13.926 }, 00:27:13.926 { 00:27:13.926 "name": "BaseBdev3", 00:27:13.926 "uuid": "bccec0c7-d912-50fd-a60d-a62e621eb4c5", 00:27:13.926 "is_configured": true, 00:27:13.926 "data_offset": 2048, 00:27:13.926 "data_size": 63488 00:27:13.926 }, 00:27:13.926 { 00:27:13.926 "name": "BaseBdev4", 00:27:13.926 "uuid": "a566fd49-6e38-5881-86b4-35fee65f59c4", 00:27:13.926 "is_configured": true, 00:27:13.926 "data_offset": 2048, 00:27:13.926 "data_size": 63488 00:27:13.926 } 00:27:13.926 ] 00:27:13.926 }' 00:27:13.926 05:55:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:13.926 05:55:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:27:13.926 05:55:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:14.185 05:55:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:27:14.185 05:55:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:27:14.185 [2024-07-26 05:55:29.082484] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:27:14.185 [2024-07-26 05:55:29.086830] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x18c6f40 00:27:14.185 [2024-07-26 05:55:29.088391] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:27:14.444 05:55:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@662 -- # sleep 1 00:27:15.382 05:55:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:27:15.382 05:55:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:15.382 05:55:30 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:27:15.382 05:55:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:27:15.382 05:55:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:15.382 05:55:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:15.382 05:55:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:15.642 05:55:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:15.642 "name": "raid_bdev1", 00:27:15.642 "uuid": "64a3c3bc-32df-4b2b-8454-6e317dc051db", 00:27:15.642 "strip_size_kb": 0, 00:27:15.642 "state": "online", 00:27:15.642 "raid_level": "raid1", 00:27:15.642 "superblock": true, 00:27:15.642 "num_base_bdevs": 4, 00:27:15.642 "num_base_bdevs_discovered": 4, 00:27:15.642 "num_base_bdevs_operational": 4, 00:27:15.642 "process": { 00:27:15.642 "type": "rebuild", 00:27:15.642 "target": "spare", 00:27:15.642 "progress": { 00:27:15.642 "blocks": 24576, 00:27:15.642 "percent": 38 00:27:15.642 } 00:27:15.642 }, 00:27:15.642 "base_bdevs_list": [ 00:27:15.642 { 00:27:15.642 "name": "spare", 00:27:15.642 "uuid": "36f9d266-0303-58ad-99a2-9415ebfd1ee5", 00:27:15.642 "is_configured": true, 00:27:15.642 "data_offset": 2048, 00:27:15.642 "data_size": 63488 00:27:15.642 }, 00:27:15.642 { 00:27:15.642 "name": "BaseBdev2", 00:27:15.642 "uuid": "54eccad7-eed1-5134-8352-ec80d8f0559d", 00:27:15.642 "is_configured": true, 00:27:15.642 "data_offset": 2048, 00:27:15.642 "data_size": 63488 00:27:15.642 }, 00:27:15.642 { 00:27:15.642 "name": "BaseBdev3", 00:27:15.642 "uuid": "bccec0c7-d912-50fd-a60d-a62e621eb4c5", 00:27:15.642 "is_configured": true, 00:27:15.642 "data_offset": 2048, 00:27:15.642 "data_size": 63488 00:27:15.642 }, 00:27:15.642 { 00:27:15.642 "name": 
"BaseBdev4", 00:27:15.642 "uuid": "a566fd49-6e38-5881-86b4-35fee65f59c4", 00:27:15.642 "is_configured": true, 00:27:15.642 "data_offset": 2048, 00:27:15.642 "data_size": 63488 00:27:15.642 } 00:27:15.642 ] 00:27:15.642 }' 00:27:15.642 05:55:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:15.642 05:55:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:27:15.642 05:55:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:15.642 05:55:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:27:15.642 05:55:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:27:15.642 05:55:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:27:15.642 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:27:15.642 05:55:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=4 00:27:15.642 05:55:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:27:15.642 05:55:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@692 -- # '[' 4 -gt 2 ']' 00:27:15.642 05:55:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@694 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:27:15.902 [2024-07-26 05:55:30.664156] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:27:15.902 [2024-07-26 05:55:30.801428] bdev_raid.c:1945:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x18c6f40 00:27:16.162 05:55:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@697 -- # base_bdevs[1]= 00:27:16.162 05:55:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@698 -- # (( num_base_bdevs_operational-- )) 
00:27:16.162 05:55:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@701 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:27:16.162 05:55:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:16.162 05:55:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:27:16.162 05:55:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:27:16.162 05:55:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:16.162 05:55:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:16.162 05:55:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:16.162 05:55:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:16.162 "name": "raid_bdev1", 00:27:16.162 "uuid": "64a3c3bc-32df-4b2b-8454-6e317dc051db", 00:27:16.162 "strip_size_kb": 0, 00:27:16.162 "state": "online", 00:27:16.162 "raid_level": "raid1", 00:27:16.162 "superblock": true, 00:27:16.162 "num_base_bdevs": 4, 00:27:16.162 "num_base_bdevs_discovered": 3, 00:27:16.162 "num_base_bdevs_operational": 3, 00:27:16.162 "process": { 00:27:16.162 "type": "rebuild", 00:27:16.162 "target": "spare", 00:27:16.162 "progress": { 00:27:16.162 "blocks": 36864, 00:27:16.162 "percent": 58 00:27:16.162 } 00:27:16.162 }, 00:27:16.162 "base_bdevs_list": [ 00:27:16.162 { 00:27:16.162 "name": "spare", 00:27:16.162 "uuid": "36f9d266-0303-58ad-99a2-9415ebfd1ee5", 00:27:16.162 "is_configured": true, 00:27:16.162 "data_offset": 2048, 00:27:16.162 "data_size": 63488 00:27:16.162 }, 00:27:16.162 { 00:27:16.162 "name": null, 00:27:16.162 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:16.162 "is_configured": false, 00:27:16.162 "data_offset": 2048, 00:27:16.162 
"data_size": 63488 00:27:16.162 }, 00:27:16.162 { 00:27:16.162 "name": "BaseBdev3", 00:27:16.162 "uuid": "bccec0c7-d912-50fd-a60d-a62e621eb4c5", 00:27:16.162 "is_configured": true, 00:27:16.162 "data_offset": 2048, 00:27:16.162 "data_size": 63488 00:27:16.162 }, 00:27:16.162 { 00:27:16.162 "name": "BaseBdev4", 00:27:16.162 "uuid": "a566fd49-6e38-5881-86b4-35fee65f59c4", 00:27:16.162 "is_configured": true, 00:27:16.162 "data_offset": 2048, 00:27:16.162 "data_size": 63488 00:27:16.162 } 00:27:16.162 ] 00:27:16.162 }' 00:27:16.163 05:55:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:16.424 05:55:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:27:16.424 05:55:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:16.424 05:55:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:27:16.424 05:55:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@705 -- # local timeout=901 00:27:16.424 05:55:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:27:16.424 05:55:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:27:16.424 05:55:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:16.424 05:55:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:27:16.424 05:55:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:27:16.424 05:55:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:16.424 05:55:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:16.424 05:55:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:16.682 05:55:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:16.682 "name": "raid_bdev1", 00:27:16.682 "uuid": "64a3c3bc-32df-4b2b-8454-6e317dc051db", 00:27:16.682 "strip_size_kb": 0, 00:27:16.682 "state": "online", 00:27:16.682 "raid_level": "raid1", 00:27:16.682 "superblock": true, 00:27:16.682 "num_base_bdevs": 4, 00:27:16.682 "num_base_bdevs_discovered": 3, 00:27:16.682 "num_base_bdevs_operational": 3, 00:27:16.682 "process": { 00:27:16.682 "type": "rebuild", 00:27:16.682 "target": "spare", 00:27:16.682 "progress": { 00:27:16.682 "blocks": 43008, 00:27:16.682 "percent": 67 00:27:16.682 } 00:27:16.682 }, 00:27:16.682 "base_bdevs_list": [ 00:27:16.682 { 00:27:16.682 "name": "spare", 00:27:16.682 "uuid": "36f9d266-0303-58ad-99a2-9415ebfd1ee5", 00:27:16.682 "is_configured": true, 00:27:16.682 "data_offset": 2048, 00:27:16.682 "data_size": 63488 00:27:16.682 }, 00:27:16.682 { 00:27:16.682 "name": null, 00:27:16.682 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:16.682 "is_configured": false, 00:27:16.682 "data_offset": 2048, 00:27:16.682 "data_size": 63488 00:27:16.682 }, 00:27:16.682 { 00:27:16.682 "name": "BaseBdev3", 00:27:16.682 "uuid": "bccec0c7-d912-50fd-a60d-a62e621eb4c5", 00:27:16.682 "is_configured": true, 00:27:16.682 "data_offset": 2048, 00:27:16.682 "data_size": 63488 00:27:16.682 }, 00:27:16.682 { 00:27:16.682 "name": "BaseBdev4", 00:27:16.682 "uuid": "a566fd49-6e38-5881-86b4-35fee65f59c4", 00:27:16.682 "is_configured": true, 00:27:16.682 "data_offset": 2048, 00:27:16.682 "data_size": 63488 00:27:16.682 } 00:27:16.682 ] 00:27:16.682 }' 00:27:16.682 05:55:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:16.682 05:55:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:27:16.682 05:55:31 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:16.682 05:55:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:27:16.682 05:55:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@710 -- # sleep 1 00:27:17.619 [2024-07-26 05:55:32.313466] bdev_raid.c:2870:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:27:17.619 [2024-07-26 05:55:32.313525] bdev_raid.c:2532:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:27:17.619 [2024-07-26 05:55:32.313627] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:17.619 05:55:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:27:17.619 05:55:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:27:17.619 05:55:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:17.619 05:55:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:27:17.619 05:55:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:27:17.619 05:55:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:17.619 05:55:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:17.619 05:55:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:17.877 05:55:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:17.877 "name": "raid_bdev1", 00:27:17.877 "uuid": "64a3c3bc-32df-4b2b-8454-6e317dc051db", 00:27:17.877 "strip_size_kb": 0, 00:27:17.877 "state": "online", 00:27:17.877 "raid_level": "raid1", 00:27:17.877 "superblock": 
true, 00:27:17.877 "num_base_bdevs": 4, 00:27:17.877 "num_base_bdevs_discovered": 3, 00:27:17.877 "num_base_bdevs_operational": 3, 00:27:17.877 "base_bdevs_list": [ 00:27:17.877 { 00:27:17.877 "name": "spare", 00:27:17.877 "uuid": "36f9d266-0303-58ad-99a2-9415ebfd1ee5", 00:27:17.877 "is_configured": true, 00:27:17.877 "data_offset": 2048, 00:27:17.877 "data_size": 63488 00:27:17.877 }, 00:27:17.877 { 00:27:17.877 "name": null, 00:27:17.877 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:17.877 "is_configured": false, 00:27:17.877 "data_offset": 2048, 00:27:17.877 "data_size": 63488 00:27:17.877 }, 00:27:17.877 { 00:27:17.877 "name": "BaseBdev3", 00:27:17.877 "uuid": "bccec0c7-d912-50fd-a60d-a62e621eb4c5", 00:27:17.877 "is_configured": true, 00:27:17.877 "data_offset": 2048, 00:27:17.877 "data_size": 63488 00:27:17.877 }, 00:27:17.877 { 00:27:17.877 "name": "BaseBdev4", 00:27:17.877 "uuid": "a566fd49-6e38-5881-86b4-35fee65f59c4", 00:27:17.877 "is_configured": true, 00:27:17.877 "data_offset": 2048, 00:27:17.877 "data_size": 63488 00:27:17.877 } 00:27:17.877 ] 00:27:17.877 }' 00:27:17.877 05:55:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:17.877 05:55:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:27:17.877 05:55:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:18.136 05:55:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:27:18.136 05:55:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@708 -- # break 00:27:18.136 05:55:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:27:18.136 05:55:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:18.136 05:55:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 
00:27:18.136 05:55:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:27:18.136 05:55:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:18.136 05:55:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:18.136 05:55:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:18.136 05:55:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:18.136 "name": "raid_bdev1", 00:27:18.136 "uuid": "64a3c3bc-32df-4b2b-8454-6e317dc051db", 00:27:18.136 "strip_size_kb": 0, 00:27:18.136 "state": "online", 00:27:18.136 "raid_level": "raid1", 00:27:18.136 "superblock": true, 00:27:18.136 "num_base_bdevs": 4, 00:27:18.136 "num_base_bdevs_discovered": 3, 00:27:18.136 "num_base_bdevs_operational": 3, 00:27:18.136 "base_bdevs_list": [ 00:27:18.136 { 00:27:18.136 "name": "spare", 00:27:18.136 "uuid": "36f9d266-0303-58ad-99a2-9415ebfd1ee5", 00:27:18.136 "is_configured": true, 00:27:18.136 "data_offset": 2048, 00:27:18.136 "data_size": 63488 00:27:18.136 }, 00:27:18.136 { 00:27:18.136 "name": null, 00:27:18.136 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:18.136 "is_configured": false, 00:27:18.136 "data_offset": 2048, 00:27:18.136 "data_size": 63488 00:27:18.136 }, 00:27:18.136 { 00:27:18.136 "name": "BaseBdev3", 00:27:18.136 "uuid": "bccec0c7-d912-50fd-a60d-a62e621eb4c5", 00:27:18.136 "is_configured": true, 00:27:18.136 "data_offset": 2048, 00:27:18.136 "data_size": 63488 00:27:18.136 }, 00:27:18.136 { 00:27:18.136 "name": "BaseBdev4", 00:27:18.136 "uuid": "a566fd49-6e38-5881-86b4-35fee65f59c4", 00:27:18.136 "is_configured": true, 00:27:18.136 "data_offset": 2048, 00:27:18.136 "data_size": 63488 00:27:18.136 } 00:27:18.136 ] 00:27:18.136 }' 00:27:18.395 05:55:33 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:18.395 05:55:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:27:18.395 05:55:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:18.395 05:55:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:27:18.395 05:55:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:27:18.395 05:55:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:18.395 05:55:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:18.395 05:55:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:18.395 05:55:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:18.395 05:55:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:27:18.395 05:55:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:18.395 05:55:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:18.395 05:55:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:18.396 05:55:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:18.396 05:55:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:18.396 05:55:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:18.665 05:55:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:18.665 "name": "raid_bdev1", 00:27:18.665 
"uuid": "64a3c3bc-32df-4b2b-8454-6e317dc051db", 00:27:18.665 "strip_size_kb": 0, 00:27:18.665 "state": "online", 00:27:18.665 "raid_level": "raid1", 00:27:18.665 "superblock": true, 00:27:18.665 "num_base_bdevs": 4, 00:27:18.665 "num_base_bdevs_discovered": 3, 00:27:18.665 "num_base_bdevs_operational": 3, 00:27:18.665 "base_bdevs_list": [ 00:27:18.665 { 00:27:18.665 "name": "spare", 00:27:18.665 "uuid": "36f9d266-0303-58ad-99a2-9415ebfd1ee5", 00:27:18.665 "is_configured": true, 00:27:18.665 "data_offset": 2048, 00:27:18.665 "data_size": 63488 00:27:18.665 }, 00:27:18.665 { 00:27:18.665 "name": null, 00:27:18.665 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:18.665 "is_configured": false, 00:27:18.665 "data_offset": 2048, 00:27:18.665 "data_size": 63488 00:27:18.665 }, 00:27:18.665 { 00:27:18.665 "name": "BaseBdev3", 00:27:18.665 "uuid": "bccec0c7-d912-50fd-a60d-a62e621eb4c5", 00:27:18.665 "is_configured": true, 00:27:18.665 "data_offset": 2048, 00:27:18.665 "data_size": 63488 00:27:18.665 }, 00:27:18.665 { 00:27:18.665 "name": "BaseBdev4", 00:27:18.665 "uuid": "a566fd49-6e38-5881-86b4-35fee65f59c4", 00:27:18.665 "is_configured": true, 00:27:18.665 "data_offset": 2048, 00:27:18.665 "data_size": 63488 00:27:18.665 } 00:27:18.665 ] 00:27:18.665 }' 00:27:18.665 05:55:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:18.665 05:55:33 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:27:19.236 05:55:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:27:19.496 [2024-07-26 05:55:34.234893] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:27:19.496 [2024-07-26 05:55:34.234922] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:27:19.496 [2024-07-26 05:55:34.234994] bdev_raid.c: 
486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:27:19.496 [2024-07-26 05:55:34.235064] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:27:19.496 [2024-07-26 05:55:34.235077] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x184d8a0 name raid_bdev1, state offline 00:27:19.496 05:55:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:19.496 05:55:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@719 -- # jq length 00:27:19.756 05:55:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:27:19.756 05:55:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:27:19.756 05:55:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@722 -- # '[' false = true ']' 00:27:19.756 05:55:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@736 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:27:19.756 05:55:34 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:27:19.756 05:55:34 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:27:19.756 05:55:34 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # local bdev_list 00:27:19.756 05:55:34 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:27:19.756 05:55:34 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # local nbd_list 00:27:19.756 05:55:34 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@12 -- # local i 00:27:19.756 05:55:34 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:27:19.756 05:55:34 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:27:19.756 05:55:34 
bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:27:20.015 /dev/nbd0 00:27:20.015 05:55:34 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:27:20.015 05:55:34 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:27:20.015 05:55:34 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:27:20.015 05:55:34 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # local i 00:27:20.015 05:55:34 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:27:20.015 05:55:34 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:27:20.015 05:55:34 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:27:20.015 05:55:34 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # break 00:27:20.015 05:55:34 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:27:20.015 05:55:34 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:27:20.015 05:55:34 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:27:20.015 1+0 records in 00:27:20.015 1+0 records out 00:27:20.015 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000227429 s, 18.0 MB/s 00:27:20.016 05:55:34 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:20.016 05:55:34 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # size=4096 00:27:20.016 05:55:34 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # rm -f 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:20.016 05:55:34 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:27:20.016 05:55:34 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # return 0 00:27:20.016 05:55:34 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:27:20.016 05:55:34 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:27:20.016 05:55:34 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:27:20.275 /dev/nbd1 00:27:20.275 05:55:35 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:27:20.275 05:55:35 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:27:20.275 05:55:35 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:27:20.275 05:55:35 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # local i 00:27:20.275 05:55:35 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:27:20.275 05:55:35 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:27:20.275 05:55:35 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:27:20.275 05:55:35 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # break 00:27:20.275 05:55:35 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:27:20.275 05:55:35 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:27:20.275 05:55:35 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:27:20.275 1+0 records in 00:27:20.275 1+0 records 
out 00:27:20.275 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000299307 s, 13.7 MB/s 00:27:20.275 05:55:35 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:20.275 05:55:35 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # size=4096 00:27:20.275 05:55:35 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:20.275 05:55:35 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:27:20.275 05:55:35 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # return 0 00:27:20.275 05:55:35 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:27:20.275 05:55:35 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:27:20.275 05:55:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@737 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:27:20.275 05:55:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@738 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:27:20.275 05:55:35 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:27:20.275 05:55:35 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:27:20.275 05:55:35 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # local nbd_list 00:27:20.275 05:55:35 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@51 -- # local i 00:27:20.275 05:55:35 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:27:20.275 05:55:35 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:27:20.535 05:55:35 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # 
basename /dev/nbd0 00:27:20.535 05:55:35 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:27:20.535 05:55:35 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:27:20.535 05:55:35 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:27:20.535 05:55:35 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:27:20.535 05:55:35 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:27:20.535 05:55:35 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:27:20.535 05:55:35 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:27:20.535 05:55:35 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:27:20.535 05:55:35 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:27:20.794 05:55:35 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:27:20.794 05:55:35 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:27:20.794 05:55:35 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:27:20.794 05:55:35 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:27:20.794 05:55:35 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:27:20.794 05:55:35 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:27:20.794 05:55:35 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:27:20.794 05:55:35 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:27:20.794 05:55:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:27:20.794 05:55:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@744 
-- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:27:21.053 05:55:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@745 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:27:21.313 [2024-07-26 05:55:36.099394] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:27:21.313 [2024-07-26 05:55:36.099446] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:21.313 [2024-07-26 05:55:36.099470] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x18c7630 00:27:21.313 [2024-07-26 05:55:36.099482] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:21.313 [2024-07-26 05:55:36.101159] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:21.313 [2024-07-26 05:55:36.101189] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:27:21.313 [2024-07-26 05:55:36.101277] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:27:21.313 [2024-07-26 05:55:36.101305] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:27:21.313 [2024-07-26 05:55:36.101413] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:27:21.313 [2024-07-26 05:55:36.101488] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:27:21.313 spare 00:27:21.313 05:55:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:27:21.313 05:55:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:21.313 05:55:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:21.313 05:55:36 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:21.313 05:55:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:21.313 05:55:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:27:21.313 05:55:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:21.313 05:55:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:21.313 05:55:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:21.313 05:55:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:21.313 05:55:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:21.313 05:55:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:21.313 [2024-07-26 05:55:36.201807] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1713b50 00:27:21.313 [2024-07-26 05:55:36.201825] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:27:21.313 [2024-07-26 05:55:36.202030] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x18cdeb0 00:27:21.313 [2024-07-26 05:55:36.202184] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1713b50 00:27:21.313 [2024-07-26 05:55:36.202195] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1713b50 00:27:21.313 [2024-07-26 05:55:36.202304] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:21.574 05:55:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:21.574 "name": "raid_bdev1", 00:27:21.574 "uuid": "64a3c3bc-32df-4b2b-8454-6e317dc051db", 00:27:21.574 "strip_size_kb": 0, 
00:27:21.574 "state": "online", 00:27:21.574 "raid_level": "raid1", 00:27:21.574 "superblock": true, 00:27:21.574 "num_base_bdevs": 4, 00:27:21.574 "num_base_bdevs_discovered": 3, 00:27:21.574 "num_base_bdevs_operational": 3, 00:27:21.574 "base_bdevs_list": [ 00:27:21.574 { 00:27:21.574 "name": "spare", 00:27:21.574 "uuid": "36f9d266-0303-58ad-99a2-9415ebfd1ee5", 00:27:21.574 "is_configured": true, 00:27:21.574 "data_offset": 2048, 00:27:21.574 "data_size": 63488 00:27:21.574 }, 00:27:21.574 { 00:27:21.574 "name": null, 00:27:21.574 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:21.574 "is_configured": false, 00:27:21.574 "data_offset": 2048, 00:27:21.574 "data_size": 63488 00:27:21.574 }, 00:27:21.574 { 00:27:21.574 "name": "BaseBdev3", 00:27:21.574 "uuid": "bccec0c7-d912-50fd-a60d-a62e621eb4c5", 00:27:21.574 "is_configured": true, 00:27:21.574 "data_offset": 2048, 00:27:21.574 "data_size": 63488 00:27:21.574 }, 00:27:21.574 { 00:27:21.574 "name": "BaseBdev4", 00:27:21.574 "uuid": "a566fd49-6e38-5881-86b4-35fee65f59c4", 00:27:21.574 "is_configured": true, 00:27:21.574 "data_offset": 2048, 00:27:21.574 "data_size": 63488 00:27:21.574 } 00:27:21.574 ] 00:27:21.574 }' 00:27:21.574 05:55:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:21.574 05:55:36 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:27:22.142 05:55:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:27:22.142 05:55:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:22.142 05:55:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:27:22.142 05:55:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:27:22.142 05:55:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:22.142 05:55:36 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:22.142 05:55:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:22.401 05:55:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:22.401 "name": "raid_bdev1", 00:27:22.401 "uuid": "64a3c3bc-32df-4b2b-8454-6e317dc051db", 00:27:22.401 "strip_size_kb": 0, 00:27:22.401 "state": "online", 00:27:22.401 "raid_level": "raid1", 00:27:22.401 "superblock": true, 00:27:22.401 "num_base_bdevs": 4, 00:27:22.401 "num_base_bdevs_discovered": 3, 00:27:22.401 "num_base_bdevs_operational": 3, 00:27:22.401 "base_bdevs_list": [ 00:27:22.401 { 00:27:22.401 "name": "spare", 00:27:22.401 "uuid": "36f9d266-0303-58ad-99a2-9415ebfd1ee5", 00:27:22.401 "is_configured": true, 00:27:22.401 "data_offset": 2048, 00:27:22.401 "data_size": 63488 00:27:22.401 }, 00:27:22.401 { 00:27:22.401 "name": null, 00:27:22.401 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:22.401 "is_configured": false, 00:27:22.401 "data_offset": 2048, 00:27:22.401 "data_size": 63488 00:27:22.401 }, 00:27:22.401 { 00:27:22.402 "name": "BaseBdev3", 00:27:22.402 "uuid": "bccec0c7-d912-50fd-a60d-a62e621eb4c5", 00:27:22.402 "is_configured": true, 00:27:22.402 "data_offset": 2048, 00:27:22.402 "data_size": 63488 00:27:22.402 }, 00:27:22.402 { 00:27:22.402 "name": "BaseBdev4", 00:27:22.402 "uuid": "a566fd49-6e38-5881-86b4-35fee65f59c4", 00:27:22.402 "is_configured": true, 00:27:22.402 "data_offset": 2048, 00:27:22.402 "data_size": 63488 00:27:22.402 } 00:27:22.402 ] 00:27:22.402 }' 00:27:22.402 05:55:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:22.402 05:55:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:27:22.402 05:55:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r 
'.process.target // "none"' 00:27:22.660 05:55:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:27:22.660 05:55:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:22.660 05:55:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:27:22.660 05:55:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:27:22.660 05:55:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:27:22.918 [2024-07-26 05:55:37.787972] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:27:22.919 05:55:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:27:22.919 05:55:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:22.919 05:55:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:22.919 05:55:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:22.919 05:55:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:22.919 05:55:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:22.919 05:55:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:22.919 05:55:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:22.919 05:55:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:22.919 05:55:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:22.919 05:55:37 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:22.919 05:55:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:23.177 05:55:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:23.177 "name": "raid_bdev1", 00:27:23.177 "uuid": "64a3c3bc-32df-4b2b-8454-6e317dc051db", 00:27:23.177 "strip_size_kb": 0, 00:27:23.177 "state": "online", 00:27:23.177 "raid_level": "raid1", 00:27:23.177 "superblock": true, 00:27:23.177 "num_base_bdevs": 4, 00:27:23.177 "num_base_bdevs_discovered": 2, 00:27:23.177 "num_base_bdevs_operational": 2, 00:27:23.177 "base_bdevs_list": [ 00:27:23.177 { 00:27:23.177 "name": null, 00:27:23.177 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:23.177 "is_configured": false, 00:27:23.177 "data_offset": 2048, 00:27:23.178 "data_size": 63488 00:27:23.178 }, 00:27:23.178 { 00:27:23.178 "name": null, 00:27:23.178 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:23.178 "is_configured": false, 00:27:23.178 "data_offset": 2048, 00:27:23.178 "data_size": 63488 00:27:23.178 }, 00:27:23.178 { 00:27:23.178 "name": "BaseBdev3", 00:27:23.178 "uuid": "bccec0c7-d912-50fd-a60d-a62e621eb4c5", 00:27:23.178 "is_configured": true, 00:27:23.178 "data_offset": 2048, 00:27:23.178 "data_size": 63488 00:27:23.178 }, 00:27:23.178 { 00:27:23.178 "name": "BaseBdev4", 00:27:23.178 "uuid": "a566fd49-6e38-5881-86b4-35fee65f59c4", 00:27:23.178 "is_configured": true, 00:27:23.178 "data_offset": 2048, 00:27:23.178 "data_size": 63488 00:27:23.178 } 00:27:23.178 ] 00:27:23.178 }' 00:27:23.178 05:55:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:23.178 05:55:38 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:27:24.114 05:55:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@754 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:27:24.114 [2024-07-26 05:55:38.894910] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:27:24.114 [2024-07-26 05:55:38.895069] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (5) smaller than existing raid bdev raid_bdev1 (6) 00:27:24.114 [2024-07-26 05:55:38.895087] bdev_raid.c:3712:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:27:24.114 [2024-07-26 05:55:38.895114] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:27:24.114 [2024-07-26 05:55:38.899064] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1851930 00:27:24.114 [2024-07-26 05:55:38.901468] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:27:24.115 05:55:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@755 -- # sleep 1 00:27:25.051 05:55:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:27:25.051 05:55:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:25.051 05:55:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:27:25.051 05:55:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:27:25.051 05:55:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:25.051 05:55:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:25.051 05:55:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:25.309 05:55:40 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:25.309 "name": "raid_bdev1", 00:27:25.309 "uuid": "64a3c3bc-32df-4b2b-8454-6e317dc051db", 00:27:25.309 "strip_size_kb": 0, 00:27:25.309 "state": "online", 00:27:25.309 "raid_level": "raid1", 00:27:25.309 "superblock": true, 00:27:25.309 "num_base_bdevs": 4, 00:27:25.309 "num_base_bdevs_discovered": 3, 00:27:25.309 "num_base_bdevs_operational": 3, 00:27:25.309 "process": { 00:27:25.309 "type": "rebuild", 00:27:25.309 "target": "spare", 00:27:25.309 "progress": { 00:27:25.309 "blocks": 24576, 00:27:25.309 "percent": 38 00:27:25.309 } 00:27:25.309 }, 00:27:25.309 "base_bdevs_list": [ 00:27:25.309 { 00:27:25.309 "name": "spare", 00:27:25.309 "uuid": "36f9d266-0303-58ad-99a2-9415ebfd1ee5", 00:27:25.309 "is_configured": true, 00:27:25.309 "data_offset": 2048, 00:27:25.309 "data_size": 63488 00:27:25.309 }, 00:27:25.309 { 00:27:25.309 "name": null, 00:27:25.309 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:25.309 "is_configured": false, 00:27:25.309 "data_offset": 2048, 00:27:25.309 "data_size": 63488 00:27:25.309 }, 00:27:25.309 { 00:27:25.309 "name": "BaseBdev3", 00:27:25.309 "uuid": "bccec0c7-d912-50fd-a60d-a62e621eb4c5", 00:27:25.309 "is_configured": true, 00:27:25.309 "data_offset": 2048, 00:27:25.309 "data_size": 63488 00:27:25.309 }, 00:27:25.309 { 00:27:25.309 "name": "BaseBdev4", 00:27:25.309 "uuid": "a566fd49-6e38-5881-86b4-35fee65f59c4", 00:27:25.309 "is_configured": true, 00:27:25.309 "data_offset": 2048, 00:27:25.309 "data_size": 63488 00:27:25.309 } 00:27:25.309 ] 00:27:25.309 }' 00:27:25.309 05:55:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:25.567 05:55:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:27:25.567 05:55:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:25.567 05:55:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- 
# [[ spare == \s\p\a\r\e ]] 00:27:25.567 05:55:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@759 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:27:25.825 [2024-07-26 05:55:40.480775] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:27:25.825 [2024-07-26 05:55:40.514175] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:27:25.825 [2024-07-26 05:55:40.514219] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:25.825 [2024-07-26 05:55:40.514235] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:27:25.825 [2024-07-26 05:55:40.514243] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:27:25.825 05:55:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:27:25.825 05:55:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:25.825 05:55:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:25.826 05:55:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:25.826 05:55:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:25.826 05:55:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:25.826 05:55:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:25.826 05:55:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:25.826 05:55:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:25.826 05:55:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:25.826 05:55:40 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:25.826 05:55:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:26.084 05:55:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:26.084 "name": "raid_bdev1", 00:27:26.084 "uuid": "64a3c3bc-32df-4b2b-8454-6e317dc051db", 00:27:26.084 "strip_size_kb": 0, 00:27:26.084 "state": "online", 00:27:26.084 "raid_level": "raid1", 00:27:26.084 "superblock": true, 00:27:26.084 "num_base_bdevs": 4, 00:27:26.084 "num_base_bdevs_discovered": 2, 00:27:26.084 "num_base_bdevs_operational": 2, 00:27:26.084 "base_bdevs_list": [ 00:27:26.084 { 00:27:26.084 "name": null, 00:27:26.084 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:26.084 "is_configured": false, 00:27:26.084 "data_offset": 2048, 00:27:26.084 "data_size": 63488 00:27:26.084 }, 00:27:26.084 { 00:27:26.084 "name": null, 00:27:26.084 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:26.084 "is_configured": false, 00:27:26.084 "data_offset": 2048, 00:27:26.084 "data_size": 63488 00:27:26.084 }, 00:27:26.084 { 00:27:26.084 "name": "BaseBdev3", 00:27:26.084 "uuid": "bccec0c7-d912-50fd-a60d-a62e621eb4c5", 00:27:26.084 "is_configured": true, 00:27:26.084 "data_offset": 2048, 00:27:26.084 "data_size": 63488 00:27:26.084 }, 00:27:26.084 { 00:27:26.084 "name": "BaseBdev4", 00:27:26.084 "uuid": "a566fd49-6e38-5881-86b4-35fee65f59c4", 00:27:26.084 "is_configured": true, 00:27:26.084 "data_offset": 2048, 00:27:26.084 "data_size": 63488 00:27:26.084 } 00:27:26.084 ] 00:27:26.084 }' 00:27:26.084 05:55:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:26.084 05:55:40 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:27:26.651 05:55:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@761 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:27:26.651 [2024-07-26 05:55:41.540922] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:27:26.651 [2024-07-26 05:55:41.540975] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:26.651 [2024-07-26 05:55:41.540998] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1850040 00:27:26.651 [2024-07-26 05:55:41.541011] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:26.651 [2024-07-26 05:55:41.541403] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:26.651 [2024-07-26 05:55:41.541421] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:27:26.651 [2024-07-26 05:55:41.541505] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:27:26.651 [2024-07-26 05:55:41.541518] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (5) smaller than existing raid bdev raid_bdev1 (6) 00:27:26.651 [2024-07-26 05:55:41.541529] bdev_raid.c:3712:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
00:27:26.651 [2024-07-26 05:55:41.541549] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:27:26.651 [2024-07-26 05:55:41.545510] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1851e80 00:27:26.651 spare 00:27:26.651 [2024-07-26 05:55:41.546938] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:27:26.909 05:55:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@762 -- # sleep 1 00:27:27.885 05:55:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:27:27.885 05:55:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:27.885 05:55:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:27:27.885 05:55:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:27:27.885 05:55:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:27.885 05:55:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:27.885 05:55:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:27.885 05:55:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:27.885 "name": "raid_bdev1", 00:27:27.885 "uuid": "64a3c3bc-32df-4b2b-8454-6e317dc051db", 00:27:27.885 "strip_size_kb": 0, 00:27:27.885 "state": "online", 00:27:27.885 "raid_level": "raid1", 00:27:27.885 "superblock": true, 00:27:27.885 "num_base_bdevs": 4, 00:27:27.885 "num_base_bdevs_discovered": 3, 00:27:27.885 "num_base_bdevs_operational": 3, 00:27:27.885 "process": { 00:27:27.885 "type": "rebuild", 00:27:27.885 "target": "spare", 00:27:27.885 "progress": { 00:27:27.885 "blocks": 22528, 00:27:27.885 
"percent": 35 00:27:27.885 } 00:27:27.885 }, 00:27:27.885 "base_bdevs_list": [ 00:27:27.885 { 00:27:27.885 "name": "spare", 00:27:27.885 "uuid": "36f9d266-0303-58ad-99a2-9415ebfd1ee5", 00:27:27.885 "is_configured": true, 00:27:27.885 "data_offset": 2048, 00:27:27.885 "data_size": 63488 00:27:27.885 }, 00:27:27.885 { 00:27:27.885 "name": null, 00:27:27.886 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:27.886 "is_configured": false, 00:27:27.886 "data_offset": 2048, 00:27:27.886 "data_size": 63488 00:27:27.886 }, 00:27:27.886 { 00:27:27.886 "name": "BaseBdev3", 00:27:27.886 "uuid": "bccec0c7-d912-50fd-a60d-a62e621eb4c5", 00:27:27.886 "is_configured": true, 00:27:27.886 "data_offset": 2048, 00:27:27.886 "data_size": 63488 00:27:27.886 }, 00:27:27.886 { 00:27:27.886 "name": "BaseBdev4", 00:27:27.886 "uuid": "a566fd49-6e38-5881-86b4-35fee65f59c4", 00:27:27.886 "is_configured": true, 00:27:27.886 "data_offset": 2048, 00:27:27.886 "data_size": 63488 00:27:27.886 } 00:27:27.886 ] 00:27:27.886 }' 00:27:27.886 05:55:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:27.886 05:55:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:27:28.144 05:55:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:28.144 05:55:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:27:28.144 05:55:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:27:28.403 [2024-07-26 05:55:43.062514] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:27:28.403 [2024-07-26 05:55:43.159200] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:27:28.403 [2024-07-26 05:55:43.159247] bdev_raid.c: 
343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:28.403 [2024-07-26 05:55:43.159264] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:27:28.403 [2024-07-26 05:55:43.159272] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:27:28.403 05:55:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:27:28.403 05:55:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:28.403 05:55:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:28.403 05:55:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:28.403 05:55:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:28.403 05:55:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:28.403 05:55:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:28.403 05:55:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:28.403 05:55:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:28.403 05:55:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:28.403 05:55:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:28.403 05:55:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:28.661 05:55:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:28.661 "name": "raid_bdev1", 00:27:28.661 "uuid": "64a3c3bc-32df-4b2b-8454-6e317dc051db", 00:27:28.661 "strip_size_kb": 0, 00:27:28.661 "state": 
"online", 00:27:28.661 "raid_level": "raid1", 00:27:28.661 "superblock": true, 00:27:28.661 "num_base_bdevs": 4, 00:27:28.661 "num_base_bdevs_discovered": 2, 00:27:28.661 "num_base_bdevs_operational": 2, 00:27:28.661 "base_bdevs_list": [ 00:27:28.661 { 00:27:28.661 "name": null, 00:27:28.661 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:28.661 "is_configured": false, 00:27:28.661 "data_offset": 2048, 00:27:28.661 "data_size": 63488 00:27:28.661 }, 00:27:28.661 { 00:27:28.661 "name": null, 00:27:28.661 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:28.661 "is_configured": false, 00:27:28.661 "data_offset": 2048, 00:27:28.661 "data_size": 63488 00:27:28.661 }, 00:27:28.661 { 00:27:28.661 "name": "BaseBdev3", 00:27:28.661 "uuid": "bccec0c7-d912-50fd-a60d-a62e621eb4c5", 00:27:28.661 "is_configured": true, 00:27:28.661 "data_offset": 2048, 00:27:28.661 "data_size": 63488 00:27:28.661 }, 00:27:28.661 { 00:27:28.661 "name": "BaseBdev4", 00:27:28.661 "uuid": "a566fd49-6e38-5881-86b4-35fee65f59c4", 00:27:28.661 "is_configured": true, 00:27:28.661 "data_offset": 2048, 00:27:28.661 "data_size": 63488 00:27:28.661 } 00:27:28.661 ] 00:27:28.661 }' 00:27:28.661 05:55:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:28.661 05:55:43 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:27:29.228 05:55:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:27:29.228 05:55:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:29.228 05:55:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:27:29.228 05:55:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:27:29.228 05:55:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:29.228 05:55:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- 
# /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:29.228 05:55:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:29.487 05:55:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:29.487 "name": "raid_bdev1", 00:27:29.487 "uuid": "64a3c3bc-32df-4b2b-8454-6e317dc051db", 00:27:29.487 "strip_size_kb": 0, 00:27:29.487 "state": "online", 00:27:29.487 "raid_level": "raid1", 00:27:29.487 "superblock": true, 00:27:29.487 "num_base_bdevs": 4, 00:27:29.487 "num_base_bdevs_discovered": 2, 00:27:29.487 "num_base_bdevs_operational": 2, 00:27:29.487 "base_bdevs_list": [ 00:27:29.488 { 00:27:29.488 "name": null, 00:27:29.488 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:29.488 "is_configured": false, 00:27:29.488 "data_offset": 2048, 00:27:29.488 "data_size": 63488 00:27:29.488 }, 00:27:29.488 { 00:27:29.488 "name": null, 00:27:29.488 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:29.488 "is_configured": false, 00:27:29.488 "data_offset": 2048, 00:27:29.488 "data_size": 63488 00:27:29.488 }, 00:27:29.488 { 00:27:29.488 "name": "BaseBdev3", 00:27:29.488 "uuid": "bccec0c7-d912-50fd-a60d-a62e621eb4c5", 00:27:29.488 "is_configured": true, 00:27:29.488 "data_offset": 2048, 00:27:29.488 "data_size": 63488 00:27:29.488 }, 00:27:29.488 { 00:27:29.488 "name": "BaseBdev4", 00:27:29.488 "uuid": "a566fd49-6e38-5881-86b4-35fee65f59c4", 00:27:29.488 "is_configured": true, 00:27:29.488 "data_offset": 2048, 00:27:29.488 "data_size": 63488 00:27:29.488 } 00:27:29.488 ] 00:27:29.488 }' 00:27:29.488 05:55:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:29.488 05:55:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:27:29.488 05:55:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // 
"none"' 00:27:29.488 05:55:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:27:29.488 05:55:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:27:29.747 05:55:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:27:29.747 [2024-07-26 05:55:44.611360] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:27:29.747 [2024-07-26 05:55:44.611410] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:29.747 [2024-07-26 05:55:44.611432] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x17138b0 00:27:29.747 [2024-07-26 05:55:44.611444] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:29.747 [2024-07-26 05:55:44.611814] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:29.747 [2024-07-26 05:55:44.611834] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:27:29.747 [2024-07-26 05:55:44.611902] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:27:29.747 [2024-07-26 05:55:44.611914] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (6) 00:27:29.747 [2024-07-26 05:55:44.611926] bdev_raid.c:3673:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:27:29.747 BaseBdev1 00:27:29.747 05:55:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@773 -- # sleep 1 00:27:31.124 05:55:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:27:31.124 
05:55:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:31.124 05:55:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:31.124 05:55:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:31.124 05:55:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:31.124 05:55:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:31.124 05:55:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:31.124 05:55:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:31.124 05:55:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:31.124 05:55:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:31.124 05:55:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:31.124 05:55:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:31.124 05:55:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:31.124 "name": "raid_bdev1", 00:27:31.124 "uuid": "64a3c3bc-32df-4b2b-8454-6e317dc051db", 00:27:31.124 "strip_size_kb": 0, 00:27:31.124 "state": "online", 00:27:31.124 "raid_level": "raid1", 00:27:31.124 "superblock": true, 00:27:31.124 "num_base_bdevs": 4, 00:27:31.124 "num_base_bdevs_discovered": 2, 00:27:31.124 "num_base_bdevs_operational": 2, 00:27:31.124 "base_bdevs_list": [ 00:27:31.124 { 00:27:31.124 "name": null, 00:27:31.124 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:31.124 "is_configured": false, 00:27:31.124 "data_offset": 2048, 00:27:31.124 "data_size": 63488 00:27:31.124 }, 
00:27:31.124 { 00:27:31.124 "name": null, 00:27:31.124 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:31.124 "is_configured": false, 00:27:31.124 "data_offset": 2048, 00:27:31.124 "data_size": 63488 00:27:31.124 }, 00:27:31.124 { 00:27:31.124 "name": "BaseBdev3", 00:27:31.124 "uuid": "bccec0c7-d912-50fd-a60d-a62e621eb4c5", 00:27:31.124 "is_configured": true, 00:27:31.124 "data_offset": 2048, 00:27:31.124 "data_size": 63488 00:27:31.124 }, 00:27:31.124 { 00:27:31.124 "name": "BaseBdev4", 00:27:31.124 "uuid": "a566fd49-6e38-5881-86b4-35fee65f59c4", 00:27:31.124 "is_configured": true, 00:27:31.124 "data_offset": 2048, 00:27:31.124 "data_size": 63488 00:27:31.124 } 00:27:31.124 ] 00:27:31.124 }' 00:27:31.124 05:55:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:31.124 05:55:45 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:27:31.692 05:55:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:27:31.692 05:55:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:31.692 05:55:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:27:31.692 05:55:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:27:31.692 05:55:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:31.692 05:55:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:31.692 05:55:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:31.692 05:55:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:31.692 "name": "raid_bdev1", 00:27:31.692 "uuid": "64a3c3bc-32df-4b2b-8454-6e317dc051db", 00:27:31.692 
"strip_size_kb": 0, 00:27:31.692 "state": "online", 00:27:31.692 "raid_level": "raid1", 00:27:31.692 "superblock": true, 00:27:31.692 "num_base_bdevs": 4, 00:27:31.692 "num_base_bdevs_discovered": 2, 00:27:31.692 "num_base_bdevs_operational": 2, 00:27:31.692 "base_bdevs_list": [ 00:27:31.692 { 00:27:31.692 "name": null, 00:27:31.692 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:31.692 "is_configured": false, 00:27:31.692 "data_offset": 2048, 00:27:31.692 "data_size": 63488 00:27:31.692 }, 00:27:31.692 { 00:27:31.692 "name": null, 00:27:31.692 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:31.692 "is_configured": false, 00:27:31.692 "data_offset": 2048, 00:27:31.692 "data_size": 63488 00:27:31.692 }, 00:27:31.692 { 00:27:31.692 "name": "BaseBdev3", 00:27:31.692 "uuid": "bccec0c7-d912-50fd-a60d-a62e621eb4c5", 00:27:31.692 "is_configured": true, 00:27:31.692 "data_offset": 2048, 00:27:31.692 "data_size": 63488 00:27:31.692 }, 00:27:31.692 { 00:27:31.692 "name": "BaseBdev4", 00:27:31.692 "uuid": "a566fd49-6e38-5881-86b4-35fee65f59c4", 00:27:31.692 "is_configured": true, 00:27:31.692 "data_offset": 2048, 00:27:31.692 "data_size": 63488 00:27:31.692 } 00:27:31.692 ] 00:27:31.692 }' 00:27:31.692 05:55:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:31.950 05:55:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:27:31.950 05:55:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:31.950 05:55:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:27:31.950 05:55:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:27:31.950 05:55:46 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@648 -- # local es=0 00:27:31.950 05:55:46 
bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:27:31.950 05:55:46 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:27:31.950 05:55:46 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:27:31.951 05:55:46 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:27:31.951 05:55:46 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:27:31.951 05:55:46 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:27:31.951 05:55:46 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:27:31.951 05:55:46 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:27:31.951 05:55:46 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:27:31.951 05:55:46 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:27:32.210 [2024-07-26 05:55:46.901438] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:27:32.210 [2024-07-26 05:55:46.901572] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (6) 00:27:32.210 [2024-07-26 05:55:46.901588] 
bdev_raid.c:3673:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:27:32.210 request: 00:27:32.210 { 00:27:32.210 "base_bdev": "BaseBdev1", 00:27:32.210 "raid_bdev": "raid_bdev1", 00:27:32.210 "method": "bdev_raid_add_base_bdev", 00:27:32.210 "req_id": 1 00:27:32.210 } 00:27:32.210 Got JSON-RPC error response 00:27:32.210 response: 00:27:32.210 { 00:27:32.210 "code": -22, 00:27:32.210 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:27:32.210 } 00:27:32.210 05:55:46 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@651 -- # es=1 00:27:32.210 05:55:46 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:27:32.210 05:55:46 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:27:32.210 05:55:46 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:27:32.210 05:55:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@777 -- # sleep 1 00:27:33.147 05:55:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:27:33.147 05:55:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:33.147 05:55:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:33.147 05:55:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:33.147 05:55:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:33.147 05:55:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:33.147 05:55:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:33.147 05:55:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:33.147 05:55:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local 
num_base_bdevs_discovered 00:27:33.147 05:55:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:33.147 05:55:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:33.147 05:55:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:33.406 05:55:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:33.406 "name": "raid_bdev1", 00:27:33.406 "uuid": "64a3c3bc-32df-4b2b-8454-6e317dc051db", 00:27:33.406 "strip_size_kb": 0, 00:27:33.406 "state": "online", 00:27:33.406 "raid_level": "raid1", 00:27:33.406 "superblock": true, 00:27:33.406 "num_base_bdevs": 4, 00:27:33.406 "num_base_bdevs_discovered": 2, 00:27:33.406 "num_base_bdevs_operational": 2, 00:27:33.406 "base_bdevs_list": [ 00:27:33.406 { 00:27:33.406 "name": null, 00:27:33.406 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:33.406 "is_configured": false, 00:27:33.406 "data_offset": 2048, 00:27:33.406 "data_size": 63488 00:27:33.406 }, 00:27:33.406 { 00:27:33.406 "name": null, 00:27:33.406 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:33.406 "is_configured": false, 00:27:33.406 "data_offset": 2048, 00:27:33.406 "data_size": 63488 00:27:33.406 }, 00:27:33.406 { 00:27:33.406 "name": "BaseBdev3", 00:27:33.406 "uuid": "bccec0c7-d912-50fd-a60d-a62e621eb4c5", 00:27:33.406 "is_configured": true, 00:27:33.406 "data_offset": 2048, 00:27:33.406 "data_size": 63488 00:27:33.406 }, 00:27:33.406 { 00:27:33.406 "name": "BaseBdev4", 00:27:33.406 "uuid": "a566fd49-6e38-5881-86b4-35fee65f59c4", 00:27:33.406 "is_configured": true, 00:27:33.406 "data_offset": 2048, 00:27:33.406 "data_size": 63488 00:27:33.406 } 00:27:33.406 ] 00:27:33.406 }' 00:27:33.406 05:55:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:33.406 05:55:48 
bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:27:33.974 05:55:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:27:33.974 05:55:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:33.974 05:55:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:27:33.975 05:55:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:27:33.975 05:55:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:33.975 05:55:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:33.975 05:55:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:34.234 05:55:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:34.234 "name": "raid_bdev1", 00:27:34.234 "uuid": "64a3c3bc-32df-4b2b-8454-6e317dc051db", 00:27:34.234 "strip_size_kb": 0, 00:27:34.234 "state": "online", 00:27:34.234 "raid_level": "raid1", 00:27:34.234 "superblock": true, 00:27:34.234 "num_base_bdevs": 4, 00:27:34.234 "num_base_bdevs_discovered": 2, 00:27:34.234 "num_base_bdevs_operational": 2, 00:27:34.234 "base_bdevs_list": [ 00:27:34.234 { 00:27:34.234 "name": null, 00:27:34.234 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:34.234 "is_configured": false, 00:27:34.234 "data_offset": 2048, 00:27:34.234 "data_size": 63488 00:27:34.234 }, 00:27:34.234 { 00:27:34.234 "name": null, 00:27:34.234 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:34.234 "is_configured": false, 00:27:34.234 "data_offset": 2048, 00:27:34.234 "data_size": 63488 00:27:34.234 }, 00:27:34.234 { 00:27:34.234 "name": "BaseBdev3", 00:27:34.234 "uuid": "bccec0c7-d912-50fd-a60d-a62e621eb4c5", 
00:27:34.234 "is_configured": true, 00:27:34.234 "data_offset": 2048, 00:27:34.234 "data_size": 63488 00:27:34.234 }, 00:27:34.234 { 00:27:34.234 "name": "BaseBdev4", 00:27:34.234 "uuid": "a566fd49-6e38-5881-86b4-35fee65f59c4", 00:27:34.234 "is_configured": true, 00:27:34.234 "data_offset": 2048, 00:27:34.234 "data_size": 63488 00:27:34.234 } 00:27:34.234 ] 00:27:34.234 }' 00:27:34.234 05:55:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:34.234 05:55:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:27:34.234 05:55:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:34.234 05:55:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:27:34.234 05:55:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@782 -- # killprocess 1252540 00:27:34.234 05:55:49 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@948 -- # '[' -z 1252540 ']' 00:27:34.234 05:55:49 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@952 -- # kill -0 1252540 00:27:34.234 05:55:49 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@953 -- # uname 00:27:34.234 05:55:49 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:27:34.234 05:55:49 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1252540 00:27:34.493 05:55:49 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:27:34.493 05:55:49 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:27:34.493 05:55:49 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1252540' 00:27:34.493 killing process with pid 1252540 00:27:34.493 05:55:49 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@967 -- # kill 1252540 00:27:34.493 
Received shutdown signal, test time was about 60.000000 seconds 00:27:34.493 00:27:34.493 Latency(us) 00:27:34.493 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:27:34.493 =================================================================================================================== 00:27:34.493 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:27:34.493 [2024-07-26 05:55:49.162148] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:27:34.493 [2024-07-26 05:55:49.162244] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:27:34.493 05:55:49 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@972 -- # wait 1252540 00:27:34.493 [2024-07-26 05:55:49.162304] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:27:34.493 [2024-07-26 05:55:49.162317] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1713b50 name raid_bdev1, state offline 00:27:34.493 [2024-07-26 05:55:49.209223] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:27:34.753 05:55:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@784 -- # return 0 00:27:34.753 00:27:34.753 real 0m37.780s 00:27:34.753 user 0m55.587s 00:27:34.753 sys 0m6.705s 00:27:34.753 05:55:49 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:27:34.753 05:55:49 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:27:34.753 ************************************ 00:27:34.753 END TEST raid_rebuild_test_sb 00:27:34.753 ************************************ 00:27:34.753 05:55:49 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:27:34.753 05:55:49 bdev_raid -- bdev/bdev_raid.sh@879 -- # run_test raid_rebuild_test_io raid_rebuild_test raid1 4 false true true 00:27:34.753 05:55:49 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:27:34.753 05:55:49 bdev_raid -- 
common/autotest_common.sh@1105 -- # xtrace_disable 00:27:34.753 05:55:49 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:27:34.753 ************************************ 00:27:34.753 START TEST raid_rebuild_test_io 00:27:34.753 ************************************ 00:27:34.753 05:55:49 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 4 false true true 00:27:34.753 05:55:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:27:34.753 05:55:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=4 00:27:34.753 05:55:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@570 -- # local superblock=false 00:27:34.753 05:55:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@571 -- # local background_io=true 00:27:34.753 05:55:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@572 -- # local verify=true 00:27:34.753 05:55:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:27:34.753 05:55:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:27:34.753 05:55:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:27:34.753 05:55:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:27:34.753 05:55:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:27:34.753 05:55:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:27:34.753 05:55:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:27:34.753 05:55:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:27:34.753 05:55:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev3 00:27:34.753 05:55:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:27:34.753 05:55:49 bdev_raid.raid_rebuild_test_io -- 
bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:27:34.753 05:55:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev4 00:27:34.753 05:55:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:27:34.753 05:55:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:27:34.753 05:55:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:27:34.753 05:55:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:27:34.753 05:55:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:27:34.753 05:55:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # local strip_size 00:27:34.753 05:55:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@576 -- # local create_arg 00:27:34.753 05:55:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:27:34.753 05:55:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@578 -- # local data_offset 00:27:34.753 05:55:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:27:34.753 05:55:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:27:34.753 05:55:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@591 -- # '[' false = true ']' 00:27:34.753 05:55:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@596 -- # raid_pid=1257864 00:27:34.753 05:55:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@597 -- # waitforlisten 1257864 /var/tmp/spdk-raid.sock 00:27:34.753 05:55:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:27:34.753 05:55:49 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@829 -- # '[' -z 
1257864 ']' 00:27:34.753 05:55:49 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:27:34.754 05:55:49 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@834 -- # local max_retries=100 00:27:34.754 05:55:49 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:27:34.754 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:27:34.754 05:55:49 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@838 -- # xtrace_disable 00:27:34.754 05:55:49 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:27:34.754 [2024-07-26 05:55:49.567032] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 00:27:34.754 [2024-07-26 05:55:49.567097] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1257864 ] 00:27:34.754 I/O size of 3145728 is greater than zero copy threshold (65536). 00:27:34.754 Zero copy mechanism will not be used. 
00:27:35.013 [2024-07-26 05:55:49.687178] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:27:35.013 [2024-07-26 05:55:49.793061] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:27:35.013 [2024-07-26 05:55:49.855906] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:27:35.013 [2024-07-26 05:55:49.855936] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:27:35.656 05:55:50 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:27:35.656 05:55:50 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@862 -- # return 0 00:27:35.656 05:55:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:27:35.656 05:55:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:27:35.917 BaseBdev1_malloc 00:27:35.917 05:55:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:27:36.176 [2024-07-26 05:55:50.980590] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:27:36.176 [2024-07-26 05:55:50.980644] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:36.176 [2024-07-26 05:55:50.980670] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x162ad40 00:27:36.176 [2024-07-26 05:55:50.980682] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:36.176 [2024-07-26 05:55:50.982407] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:36.176 [2024-07-26 05:55:50.982435] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:27:36.176 BaseBdev1 
00:27:36.176 05:55:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:27:36.176 05:55:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:27:36.435 BaseBdev2_malloc 00:27:36.435 05:55:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:27:36.695 [2024-07-26 05:55:51.462756] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:27:36.695 [2024-07-26 05:55:51.462803] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:36.695 [2024-07-26 05:55:51.462830] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x162b860 00:27:36.695 [2024-07-26 05:55:51.462844] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:36.695 [2024-07-26 05:55:51.464430] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:36.695 [2024-07-26 05:55:51.464460] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:27:36.695 BaseBdev2 00:27:36.695 05:55:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:27:36.695 05:55:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:27:36.954 BaseBdev3_malloc 00:27:36.954 05:55:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev3_malloc -p BaseBdev3 00:27:37.213 [2024-07-26 05:55:51.949904] vbdev_passthru.c: 
607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev3_malloc 00:27:37.213 [2024-07-26 05:55:51.949951] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:37.213 [2024-07-26 05:55:51.949973] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x17d88f0 00:27:37.213 [2024-07-26 05:55:51.949986] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:37.213 [2024-07-26 05:55:51.951522] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:37.213 [2024-07-26 05:55:51.951549] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:27:37.213 BaseBdev3 00:27:37.213 05:55:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:27:37.213 05:55:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:27:37.473 BaseBdev4_malloc 00:27:37.473 05:55:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev4_malloc -p BaseBdev4 00:27:37.732 [2024-07-26 05:55:52.440993] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev4_malloc 00:27:37.732 [2024-07-26 05:55:52.441039] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:37.732 [2024-07-26 05:55:52.441059] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x17d7ad0 00:27:37.732 [2024-07-26 05:55:52.441072] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:37.732 [2024-07-26 05:55:52.442611] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:37.732 [2024-07-26 05:55:52.442646] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created 
pt_bdev for: BaseBdev4 00:27:37.732 BaseBdev4 00:27:37.732 05:55:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:27:37.990 spare_malloc 00:27:37.990 05:55:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:27:38.248 spare_delay 00:27:38.248 05:55:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:27:38.508 [2024-07-26 05:55:53.167475] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:27:38.508 [2024-07-26 05:55:53.167527] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:38.508 [2024-07-26 05:55:53.167549] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x17dc5b0 00:27:38.508 [2024-07-26 05:55:53.167561] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:38.508 [2024-07-26 05:55:53.169144] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:38.508 [2024-07-26 05:55:53.169172] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:27:38.508 spare 00:27:38.508 05:55:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 00:27:38.508 [2024-07-26 05:55:53.412145] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:27:38.508 [2024-07-26 05:55:53.413497] bdev_raid.c:3288:raid_bdev_configure_base_bdev: 
*DEBUG*: bdev BaseBdev2 is claimed 00:27:38.508 [2024-07-26 05:55:53.413552] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:27:38.508 [2024-07-26 05:55:53.413597] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:27:38.508 [2024-07-26 05:55:53.413685] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x175b8a0 00:27:38.508 [2024-07-26 05:55:53.413696] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:27:38.508 [2024-07-26 05:55:53.413914] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x17d5e10 00:27:38.508 [2024-07-26 05:55:53.414068] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x175b8a0 00:27:38.508 [2024-07-26 05:55:53.414079] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x175b8a0 00:27:38.508 [2024-07-26 05:55:53.414191] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:38.766 05:55:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:27:38.766 05:55:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:38.766 05:55:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:38.766 05:55:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:38.766 05:55:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:38.766 05:55:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:27:38.766 05:55:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:38.766 05:55:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:38.766 05:55:53 bdev_raid.raid_rebuild_test_io -- 
bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:38.766 05:55:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:38.766 05:55:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:38.766 05:55:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:38.766 05:55:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:38.766 "name": "raid_bdev1", 00:27:38.766 "uuid": "292a362c-afa7-40ee-bbbc-faad9a791152", 00:27:38.766 "strip_size_kb": 0, 00:27:38.766 "state": "online", 00:27:38.766 "raid_level": "raid1", 00:27:38.766 "superblock": false, 00:27:38.767 "num_base_bdevs": 4, 00:27:38.767 "num_base_bdevs_discovered": 4, 00:27:38.767 "num_base_bdevs_operational": 4, 00:27:38.767 "base_bdevs_list": [ 00:27:38.767 { 00:27:38.767 "name": "BaseBdev1", 00:27:38.767 "uuid": "05e600b4-719c-552d-9a2d-a543a163efff", 00:27:38.767 "is_configured": true, 00:27:38.767 "data_offset": 0, 00:27:38.767 "data_size": 65536 00:27:38.767 }, 00:27:38.767 { 00:27:38.767 "name": "BaseBdev2", 00:27:38.767 "uuid": "a05600ee-ebec-5da6-8628-e01e72cd0caf", 00:27:38.767 "is_configured": true, 00:27:38.767 "data_offset": 0, 00:27:38.767 "data_size": 65536 00:27:38.767 }, 00:27:38.767 { 00:27:38.767 "name": "BaseBdev3", 00:27:38.767 "uuid": "5ea48b7d-0c94-5696-a97b-016146227315", 00:27:38.767 "is_configured": true, 00:27:38.767 "data_offset": 0, 00:27:38.767 "data_size": 65536 00:27:38.767 }, 00:27:38.767 { 00:27:38.767 "name": "BaseBdev4", 00:27:38.767 "uuid": "6890ddca-9b75-5b54-866b-fbad03651b19", 00:27:38.767 "is_configured": true, 00:27:38.767 "data_offset": 0, 00:27:38.767 "data_size": 65536 00:27:38.767 } 00:27:38.767 ] 00:27:38.767 }' 00:27:38.767 05:55:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 
00:27:38.767 05:55:53 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:27:39.704 05:55:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:27:39.704 05:55:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:27:39.704 [2024-07-26 05:55:54.491284] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:27:39.704 05:55:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=65536 00:27:39.704 05:55:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:39.704 05:55:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:27:39.963 05:55:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # data_offset=0 00:27:39.963 05:55:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@620 -- # '[' true = true ']' 00:27:39.963 05:55:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:27:39.963 05:55:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@622 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:27:40.222 [2024-07-26 05:55:54.878095] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1761970 00:27:40.222 I/O size of 3145728 is greater than zero copy threshold (65536). 00:27:40.222 Zero copy mechanism will not be used. 00:27:40.222 Running I/O for 60 seconds... 
00:27:40.222 [2024-07-26 05:55:54.996657] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:27:40.222 [2024-07-26 05:55:55.012843] bdev_raid.c:1945:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x1761970 00:27:40.222 05:55:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:27:40.222 05:55:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:40.222 05:55:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:40.222 05:55:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:40.222 05:55:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:40.222 05:55:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:27:40.222 05:55:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:40.222 05:55:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:40.222 05:55:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:40.222 05:55:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:40.222 05:55:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:40.222 05:55:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:40.481 05:55:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:40.481 "name": "raid_bdev1", 00:27:40.481 "uuid": "292a362c-afa7-40ee-bbbc-faad9a791152", 00:27:40.481 "strip_size_kb": 0, 00:27:40.481 "state": "online", 00:27:40.481 "raid_level": "raid1", 00:27:40.481 "superblock": false, 
00:27:40.481 "num_base_bdevs": 4, 00:27:40.481 "num_base_bdevs_discovered": 3, 00:27:40.481 "num_base_bdevs_operational": 3, 00:27:40.481 "base_bdevs_list": [ 00:27:40.482 { 00:27:40.482 "name": null, 00:27:40.482 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:40.482 "is_configured": false, 00:27:40.482 "data_offset": 0, 00:27:40.482 "data_size": 65536 00:27:40.482 }, 00:27:40.482 { 00:27:40.482 "name": "BaseBdev2", 00:27:40.482 "uuid": "a05600ee-ebec-5da6-8628-e01e72cd0caf", 00:27:40.482 "is_configured": true, 00:27:40.482 "data_offset": 0, 00:27:40.482 "data_size": 65536 00:27:40.482 }, 00:27:40.482 { 00:27:40.482 "name": "BaseBdev3", 00:27:40.482 "uuid": "5ea48b7d-0c94-5696-a97b-016146227315", 00:27:40.482 "is_configured": true, 00:27:40.482 "data_offset": 0, 00:27:40.482 "data_size": 65536 00:27:40.482 }, 00:27:40.482 { 00:27:40.482 "name": "BaseBdev4", 00:27:40.482 "uuid": "6890ddca-9b75-5b54-866b-fbad03651b19", 00:27:40.482 "is_configured": true, 00:27:40.482 "data_offset": 0, 00:27:40.482 "data_size": 65536 00:27:40.482 } 00:27:40.482 ] 00:27:40.482 }' 00:27:40.482 05:55:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:40.482 05:55:55 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:27:41.050 05:55:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:27:41.309 [2024-07-26 05:55:56.135068] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:27:41.309 [2024-07-26 05:55:56.154431] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1331fa0 00:27:41.309 [2024-07-26 05:55:56.156831] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:27:41.309 05:55:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@646 -- # sleep 1 00:27:41.568 [2024-07-26 
05:55:56.287183] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:27:41.568 [2024-07-26 05:55:56.287499] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:27:41.827 [2024-07-26 05:55:56.510292] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:27:41.827 [2024-07-26 05:55:56.510949] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:27:42.086 [2024-07-26 05:55:56.827075] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:27:42.345 05:55:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:27:42.345 05:55:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:42.345 05:55:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:27:42.345 05:55:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:27:42.345 05:55:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:42.345 05:55:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:42.345 05:55:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:42.605 05:55:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:42.605 "name": "raid_bdev1", 00:27:42.605 "uuid": "292a362c-afa7-40ee-bbbc-faad9a791152", 00:27:42.605 "strip_size_kb": 0, 00:27:42.605 "state": "online", 00:27:42.605 "raid_level": "raid1", 00:27:42.605 "superblock": false, 
00:27:42.605 "num_base_bdevs": 4, 00:27:42.605 "num_base_bdevs_discovered": 4, 00:27:42.605 "num_base_bdevs_operational": 4, 00:27:42.605 "process": { 00:27:42.605 "type": "rebuild", 00:27:42.605 "target": "spare", 00:27:42.605 "progress": { 00:27:42.605 "blocks": 14336, 00:27:42.605 "percent": 21 00:27:42.605 } 00:27:42.605 }, 00:27:42.605 "base_bdevs_list": [ 00:27:42.605 { 00:27:42.605 "name": "spare", 00:27:42.605 "uuid": "08644430-07c6-52ef-a74d-611bc1d2a3a7", 00:27:42.605 "is_configured": true, 00:27:42.605 "data_offset": 0, 00:27:42.605 "data_size": 65536 00:27:42.605 }, 00:27:42.605 { 00:27:42.605 "name": "BaseBdev2", 00:27:42.605 "uuid": "a05600ee-ebec-5da6-8628-e01e72cd0caf", 00:27:42.605 "is_configured": true, 00:27:42.605 "data_offset": 0, 00:27:42.605 "data_size": 65536 00:27:42.605 }, 00:27:42.605 { 00:27:42.605 "name": "BaseBdev3", 00:27:42.605 "uuid": "5ea48b7d-0c94-5696-a97b-016146227315", 00:27:42.605 "is_configured": true, 00:27:42.605 "data_offset": 0, 00:27:42.605 "data_size": 65536 00:27:42.605 }, 00:27:42.605 { 00:27:42.605 "name": "BaseBdev4", 00:27:42.605 "uuid": "6890ddca-9b75-5b54-866b-fbad03651b19", 00:27:42.605 "is_configured": true, 00:27:42.605 "data_offset": 0, 00:27:42.605 "data_size": 65536 00:27:42.605 } 00:27:42.605 ] 00:27:42.605 }' 00:27:42.605 05:55:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:42.605 05:55:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:27:42.605 05:55:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:42.864 05:55:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:27:42.864 05:55:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:27:42.864 [2024-07-26 05:55:57.668803] 
bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:27:42.864 [2024-07-26 05:55:57.748736] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:27:43.123 [2024-07-26 05:55:57.781225] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:27:43.123 [2024-07-26 05:55:57.781538] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:27:43.123 [2024-07-26 05:55:57.798345] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:27:43.123 [2024-07-26 05:55:57.811438] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:43.123 [2024-07-26 05:55:57.811472] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:27:43.123 [2024-07-26 05:55:57.811483] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:27:43.123 [2024-07-26 05:55:57.843621] bdev_raid.c:1945:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x1761970 00:27:43.123 05:55:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:27:43.123 05:55:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:43.123 05:55:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:43.123 05:55:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:43.123 05:55:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:43.123 05:55:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:27:43.123 05:55:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 
00:27:43.123 05:55:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:43.123 05:55:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:43.123 05:55:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:43.123 05:55:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:43.123 05:55:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:43.383 05:55:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:43.383 "name": "raid_bdev1", 00:27:43.383 "uuid": "292a362c-afa7-40ee-bbbc-faad9a791152", 00:27:43.383 "strip_size_kb": 0, 00:27:43.383 "state": "online", 00:27:43.383 "raid_level": "raid1", 00:27:43.383 "superblock": false, 00:27:43.383 "num_base_bdevs": 4, 00:27:43.383 "num_base_bdevs_discovered": 3, 00:27:43.383 "num_base_bdevs_operational": 3, 00:27:43.383 "base_bdevs_list": [ 00:27:43.383 { 00:27:43.383 "name": null, 00:27:43.383 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:43.383 "is_configured": false, 00:27:43.383 "data_offset": 0, 00:27:43.383 "data_size": 65536 00:27:43.383 }, 00:27:43.383 { 00:27:43.383 "name": "BaseBdev2", 00:27:43.383 "uuid": "a05600ee-ebec-5da6-8628-e01e72cd0caf", 00:27:43.383 "is_configured": true, 00:27:43.383 "data_offset": 0, 00:27:43.383 "data_size": 65536 00:27:43.383 }, 00:27:43.383 { 00:27:43.383 "name": "BaseBdev3", 00:27:43.383 "uuid": "5ea48b7d-0c94-5696-a97b-016146227315", 00:27:43.383 "is_configured": true, 00:27:43.383 "data_offset": 0, 00:27:43.383 "data_size": 65536 00:27:43.383 }, 00:27:43.383 { 00:27:43.383 "name": "BaseBdev4", 00:27:43.383 "uuid": "6890ddca-9b75-5b54-866b-fbad03651b19", 00:27:43.383 "is_configured": true, 00:27:43.383 "data_offset": 0, 00:27:43.383 "data_size": 
65536 00:27:43.383 } 00:27:43.383 ] 00:27:43.383 }' 00:27:43.383 05:55:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:43.383 05:55:58 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:27:43.951 05:55:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:27:43.951 05:55:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:43.951 05:55:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:27:43.951 05:55:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:27:43.951 05:55:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:43.951 05:55:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:43.951 05:55:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:44.210 05:55:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:44.210 "name": "raid_bdev1", 00:27:44.210 "uuid": "292a362c-afa7-40ee-bbbc-faad9a791152", 00:27:44.210 "strip_size_kb": 0, 00:27:44.210 "state": "online", 00:27:44.210 "raid_level": "raid1", 00:27:44.210 "superblock": false, 00:27:44.210 "num_base_bdevs": 4, 00:27:44.210 "num_base_bdevs_discovered": 3, 00:27:44.210 "num_base_bdevs_operational": 3, 00:27:44.210 "base_bdevs_list": [ 00:27:44.210 { 00:27:44.210 "name": null, 00:27:44.210 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:44.210 "is_configured": false, 00:27:44.210 "data_offset": 0, 00:27:44.210 "data_size": 65536 00:27:44.210 }, 00:27:44.210 { 00:27:44.210 "name": "BaseBdev2", 00:27:44.210 "uuid": "a05600ee-ebec-5da6-8628-e01e72cd0caf", 00:27:44.210 "is_configured": true, 00:27:44.210 
"data_offset": 0, 00:27:44.210 "data_size": 65536 00:27:44.210 }, 00:27:44.210 { 00:27:44.210 "name": "BaseBdev3", 00:27:44.210 "uuid": "5ea48b7d-0c94-5696-a97b-016146227315", 00:27:44.210 "is_configured": true, 00:27:44.210 "data_offset": 0, 00:27:44.210 "data_size": 65536 00:27:44.210 }, 00:27:44.210 { 00:27:44.210 "name": "BaseBdev4", 00:27:44.210 "uuid": "6890ddca-9b75-5b54-866b-fbad03651b19", 00:27:44.210 "is_configured": true, 00:27:44.210 "data_offset": 0, 00:27:44.210 "data_size": 65536 00:27:44.210 } 00:27:44.210 ] 00:27:44.210 }' 00:27:44.210 05:55:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:44.210 05:55:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:27:44.210 05:55:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:44.210 05:55:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:27:44.210 05:55:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:27:44.469 [2024-07-26 05:55:59.320277] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:27:44.469 05:55:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@662 -- # sleep 1 00:27:44.469 [2024-07-26 05:55:59.367202] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1762150 00:27:44.469 [2024-07-26 05:55:59.368762] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:27:44.728 [2024-07-26 05:55:59.499620] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:27:44.728 [2024-07-26 05:55:59.500910] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 
00:27:44.987 [2024-07-26 05:55:59.712314] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:27:44.987 [2024-07-26 05:55:59.712885] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:27:45.246 [2024-07-26 05:56:00.122551] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:27:45.246 [2024-07-26 05:56:00.123732] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:27:45.505 [2024-07-26 05:56:00.345417] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:27:45.505 [2024-07-26 05:56:00.345668] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:27:45.505 05:56:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:27:45.505 05:56:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:45.505 05:56:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:27:45.505 05:56:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:27:45.505 05:56:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:45.505 05:56:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:45.505 05:56:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:45.764 [2024-07-26 05:56:00.562446] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 
offset_end: 18432 00:27:45.764 [2024-07-26 05:56:00.562761] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:27:45.764 05:56:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:45.764 "name": "raid_bdev1", 00:27:45.764 "uuid": "292a362c-afa7-40ee-bbbc-faad9a791152", 00:27:45.764 "strip_size_kb": 0, 00:27:45.764 "state": "online", 00:27:45.764 "raid_level": "raid1", 00:27:45.764 "superblock": false, 00:27:45.764 "num_base_bdevs": 4, 00:27:45.764 "num_base_bdevs_discovered": 4, 00:27:45.764 "num_base_bdevs_operational": 4, 00:27:45.764 "process": { 00:27:45.764 "type": "rebuild", 00:27:45.764 "target": "spare", 00:27:45.764 "progress": { 00:27:45.764 "blocks": 14336, 00:27:45.764 "percent": 21 00:27:45.764 } 00:27:45.764 }, 00:27:45.764 "base_bdevs_list": [ 00:27:45.764 { 00:27:45.764 "name": "spare", 00:27:45.764 "uuid": "08644430-07c6-52ef-a74d-611bc1d2a3a7", 00:27:45.764 "is_configured": true, 00:27:45.764 "data_offset": 0, 00:27:45.764 "data_size": 65536 00:27:45.764 }, 00:27:45.764 { 00:27:45.764 "name": "BaseBdev2", 00:27:45.764 "uuid": "a05600ee-ebec-5da6-8628-e01e72cd0caf", 00:27:45.764 "is_configured": true, 00:27:45.764 "data_offset": 0, 00:27:45.764 "data_size": 65536 00:27:45.764 }, 00:27:45.764 { 00:27:45.764 "name": "BaseBdev3", 00:27:45.764 "uuid": "5ea48b7d-0c94-5696-a97b-016146227315", 00:27:45.764 "is_configured": true, 00:27:45.764 "data_offset": 0, 00:27:45.764 "data_size": 65536 00:27:45.764 }, 00:27:45.764 { 00:27:45.764 "name": "BaseBdev4", 00:27:45.764 "uuid": "6890ddca-9b75-5b54-866b-fbad03651b19", 00:27:45.764 "is_configured": true, 00:27:45.764 "data_offset": 0, 00:27:45.764 "data_size": 65536 00:27:45.764 } 00:27:45.764 ] 00:27:45.764 }' 00:27:45.764 05:56:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:45.764 05:56:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- 
# [[ rebuild == \r\e\b\u\i\l\d ]] 00:27:45.764 05:56:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:46.023 05:56:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:27:46.023 05:56:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@665 -- # '[' false = true ']' 00:27:46.023 05:56:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=4 00:27:46.023 05:56:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:27:46.023 05:56:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@692 -- # '[' 4 -gt 2 ']' 00:27:46.023 05:56:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@694 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:27:46.282 [2024-07-26 05:56:00.947657] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:27:46.282 [2024-07-26 05:56:00.956968] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:27:46.282 [2024-07-26 05:56:00.958566] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:27:46.282 [2024-07-26 05:56:01.189056] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 22528 offset_begin: 18432 offset_end: 24576 00:27:46.541 [2024-07-26 05:56:01.299863] bdev_raid.c:1945:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x1761970 00:27:46.541 [2024-07-26 05:56:01.299894] bdev_raid.c:1945:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x1762150 00:27:46.541 05:56:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@697 -- # base_bdevs[1]= 00:27:46.541 05:56:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@698 -- # (( num_base_bdevs_operational-- )) 00:27:46.541 05:56:01 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@701 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:27:46.541 05:56:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:46.541 05:56:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:27:46.541 05:56:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:27:46.541 05:56:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:46.541 05:56:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:46.541 05:56:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:46.800 05:56:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:46.800 "name": "raid_bdev1", 00:27:46.800 "uuid": "292a362c-afa7-40ee-bbbc-faad9a791152", 00:27:46.800 "strip_size_kb": 0, 00:27:46.800 "state": "online", 00:27:46.800 "raid_level": "raid1", 00:27:46.800 "superblock": false, 00:27:46.800 "num_base_bdevs": 4, 00:27:46.800 "num_base_bdevs_discovered": 3, 00:27:46.800 "num_base_bdevs_operational": 3, 00:27:46.800 "process": { 00:27:46.800 "type": "rebuild", 00:27:46.800 "target": "spare", 00:27:46.800 "progress": { 00:27:46.800 "blocks": 26624, 00:27:46.800 "percent": 40 00:27:46.800 } 00:27:46.800 }, 00:27:46.800 "base_bdevs_list": [ 00:27:46.800 { 00:27:46.800 "name": "spare", 00:27:46.800 "uuid": "08644430-07c6-52ef-a74d-611bc1d2a3a7", 00:27:46.800 "is_configured": true, 00:27:46.800 "data_offset": 0, 00:27:46.800 "data_size": 65536 00:27:46.800 }, 00:27:46.800 { 00:27:46.800 "name": null, 00:27:46.800 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:46.800 "is_configured": false, 00:27:46.800 "data_offset": 0, 00:27:46.800 "data_size": 65536 00:27:46.800 
}, 00:27:46.800 { 00:27:46.800 "name": "BaseBdev3", 00:27:46.800 "uuid": "5ea48b7d-0c94-5696-a97b-016146227315", 00:27:46.800 "is_configured": true, 00:27:46.800 "data_offset": 0, 00:27:46.800 "data_size": 65536 00:27:46.800 }, 00:27:46.800 { 00:27:46.800 "name": "BaseBdev4", 00:27:46.800 "uuid": "6890ddca-9b75-5b54-866b-fbad03651b19", 00:27:46.800 "is_configured": true, 00:27:46.800 "data_offset": 0, 00:27:46.800 "data_size": 65536 00:27:46.800 } 00:27:46.800 ] 00:27:46.800 }' 00:27:46.800 05:56:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:46.800 05:56:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:27:46.800 05:56:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:46.800 05:56:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:27:46.800 05:56:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@705 -- # local timeout=931 00:27:46.800 05:56:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:27:46.800 05:56:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:27:46.800 05:56:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:46.800 05:56:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:27:46.800 05:56:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:27:46.800 05:56:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:46.800 [2024-07-26 05:56:01.685329] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 28672 offset_begin: 24576 offset_end: 30720 00:27:46.800 05:56:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:46.800 05:56:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:47.058 05:56:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:47.058 "name": "raid_bdev1", 00:27:47.058 "uuid": "292a362c-afa7-40ee-bbbc-faad9a791152", 00:27:47.058 "strip_size_kb": 0, 00:27:47.058 "state": "online", 00:27:47.058 "raid_level": "raid1", 00:27:47.058 "superblock": false, 00:27:47.058 "num_base_bdevs": 4, 00:27:47.058 "num_base_bdevs_discovered": 3, 00:27:47.058 "num_base_bdevs_operational": 3, 00:27:47.058 "process": { 00:27:47.058 "type": "rebuild", 00:27:47.058 "target": "spare", 00:27:47.058 "progress": { 00:27:47.058 "blocks": 30720, 00:27:47.058 "percent": 46 00:27:47.058 } 00:27:47.058 }, 00:27:47.058 "base_bdevs_list": [ 00:27:47.058 { 00:27:47.058 "name": "spare", 00:27:47.058 "uuid": "08644430-07c6-52ef-a74d-611bc1d2a3a7", 00:27:47.058 "is_configured": true, 00:27:47.058 "data_offset": 0, 00:27:47.058 "data_size": 65536 00:27:47.058 }, 00:27:47.058 { 00:27:47.058 "name": null, 00:27:47.058 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:47.058 "is_configured": false, 00:27:47.058 "data_offset": 0, 00:27:47.058 "data_size": 65536 00:27:47.058 }, 00:27:47.058 { 00:27:47.058 "name": "BaseBdev3", 00:27:47.058 "uuid": "5ea48b7d-0c94-5696-a97b-016146227315", 00:27:47.058 "is_configured": true, 00:27:47.058 "data_offset": 0, 00:27:47.058 "data_size": 65536 00:27:47.058 }, 00:27:47.058 { 00:27:47.058 "name": "BaseBdev4", 00:27:47.058 "uuid": "6890ddca-9b75-5b54-866b-fbad03651b19", 00:27:47.058 "is_configured": true, 00:27:47.058 "data_offset": 0, 00:27:47.058 "data_size": 65536 00:27:47.058 } 00:27:47.058 ] 00:27:47.058 }' 00:27:47.058 05:56:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:47.317 05:56:01 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:27:47.317 05:56:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:47.317 05:56:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:27:47.317 05:56:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:27:47.588 [2024-07-26 05:56:02.273319] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 38912 offset_begin: 36864 offset_end: 43008 00:27:48.162 [2024-07-26 05:56:02.796717] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 47104 offset_begin: 43008 offset_end: 49152 00:27:48.162 05:56:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:27:48.162 05:56:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:27:48.162 05:56:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:48.162 05:56:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:27:48.162 05:56:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:27:48.162 05:56:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:48.162 05:56:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:48.162 05:56:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:48.421 [2024-07-26 05:56:03.128983] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 53248 offset_begin: 49152 offset_end: 55296 00:27:48.421 05:56:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 
00:27:48.421 "name": "raid_bdev1", 00:27:48.421 "uuid": "292a362c-afa7-40ee-bbbc-faad9a791152", 00:27:48.421 "strip_size_kb": 0, 00:27:48.421 "state": "online", 00:27:48.421 "raid_level": "raid1", 00:27:48.421 "superblock": false, 00:27:48.421 "num_base_bdevs": 4, 00:27:48.421 "num_base_bdevs_discovered": 3, 00:27:48.421 "num_base_bdevs_operational": 3, 00:27:48.421 "process": { 00:27:48.421 "type": "rebuild", 00:27:48.421 "target": "spare", 00:27:48.421 "progress": { 00:27:48.421 "blocks": 53248, 00:27:48.421 "percent": 81 00:27:48.421 } 00:27:48.421 }, 00:27:48.421 "base_bdevs_list": [ 00:27:48.421 { 00:27:48.421 "name": "spare", 00:27:48.421 "uuid": "08644430-07c6-52ef-a74d-611bc1d2a3a7", 00:27:48.421 "is_configured": true, 00:27:48.421 "data_offset": 0, 00:27:48.421 "data_size": 65536 00:27:48.421 }, 00:27:48.421 { 00:27:48.421 "name": null, 00:27:48.421 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:48.421 "is_configured": false, 00:27:48.421 "data_offset": 0, 00:27:48.421 "data_size": 65536 00:27:48.421 }, 00:27:48.421 { 00:27:48.421 "name": "BaseBdev3", 00:27:48.421 "uuid": "5ea48b7d-0c94-5696-a97b-016146227315", 00:27:48.421 "is_configured": true, 00:27:48.421 "data_offset": 0, 00:27:48.421 "data_size": 65536 00:27:48.421 }, 00:27:48.421 { 00:27:48.421 "name": "BaseBdev4", 00:27:48.421 "uuid": "6890ddca-9b75-5b54-866b-fbad03651b19", 00:27:48.421 "is_configured": true, 00:27:48.421 "data_offset": 0, 00:27:48.421 "data_size": 65536 00:27:48.421 } 00:27:48.421 ] 00:27:48.421 }' 00:27:48.421 05:56:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:48.421 05:56:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:27:48.421 05:56:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:48.679 05:56:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:27:48.679 05:56:03 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:27:49.247 [2024-07-26 05:56:03.912847] bdev_raid.c:2870:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:27:49.247 [2024-07-26 05:56:04.013082] bdev_raid.c:2532:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:27:49.247 [2024-07-26 05:56:04.015460] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:49.505 05:56:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:27:49.505 05:56:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:27:49.505 05:56:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:49.505 05:56:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:27:49.505 05:56:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:27:49.505 05:56:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:49.505 05:56:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:49.505 05:56:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:49.763 05:56:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:49.763 "name": "raid_bdev1", 00:27:49.763 "uuid": "292a362c-afa7-40ee-bbbc-faad9a791152", 00:27:49.763 "strip_size_kb": 0, 00:27:49.763 "state": "online", 00:27:49.763 "raid_level": "raid1", 00:27:49.763 "superblock": false, 00:27:49.763 "num_base_bdevs": 4, 00:27:49.763 "num_base_bdevs_discovered": 3, 00:27:49.763 "num_base_bdevs_operational": 3, 00:27:49.763 "base_bdevs_list": [ 00:27:49.763 { 00:27:49.763 "name": "spare", 00:27:49.763 
"uuid": "08644430-07c6-52ef-a74d-611bc1d2a3a7", 00:27:49.763 "is_configured": true, 00:27:49.763 "data_offset": 0, 00:27:49.763 "data_size": 65536 00:27:49.763 }, 00:27:49.763 { 00:27:49.763 "name": null, 00:27:49.763 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:49.763 "is_configured": false, 00:27:49.763 "data_offset": 0, 00:27:49.763 "data_size": 65536 00:27:49.763 }, 00:27:49.763 { 00:27:49.763 "name": "BaseBdev3", 00:27:49.763 "uuid": "5ea48b7d-0c94-5696-a97b-016146227315", 00:27:49.763 "is_configured": true, 00:27:49.763 "data_offset": 0, 00:27:49.763 "data_size": 65536 00:27:49.763 }, 00:27:49.763 { 00:27:49.763 "name": "BaseBdev4", 00:27:49.763 "uuid": "6890ddca-9b75-5b54-866b-fbad03651b19", 00:27:49.763 "is_configured": true, 00:27:49.763 "data_offset": 0, 00:27:49.763 "data_size": 65536 00:27:49.763 } 00:27:49.763 ] 00:27:49.763 }' 00:27:49.763 05:56:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:50.023 05:56:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:27:50.023 05:56:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:50.023 05:56:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:27:50.023 05:56:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@708 -- # break 00:27:50.023 05:56:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:27:50.023 05:56:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:50.023 05:56:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:27:50.023 05:56:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:27:50.023 05:56:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:50.023 05:56:04 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:50.023 05:56:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:50.342 05:56:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:50.342 "name": "raid_bdev1", 00:27:50.342 "uuid": "292a362c-afa7-40ee-bbbc-faad9a791152", 00:27:50.342 "strip_size_kb": 0, 00:27:50.342 "state": "online", 00:27:50.342 "raid_level": "raid1", 00:27:50.342 "superblock": false, 00:27:50.342 "num_base_bdevs": 4, 00:27:50.342 "num_base_bdevs_discovered": 3, 00:27:50.342 "num_base_bdevs_operational": 3, 00:27:50.342 "base_bdevs_list": [ 00:27:50.342 { 00:27:50.342 "name": "spare", 00:27:50.342 "uuid": "08644430-07c6-52ef-a74d-611bc1d2a3a7", 00:27:50.342 "is_configured": true, 00:27:50.342 "data_offset": 0, 00:27:50.342 "data_size": 65536 00:27:50.342 }, 00:27:50.342 { 00:27:50.342 "name": null, 00:27:50.342 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:50.342 "is_configured": false, 00:27:50.342 "data_offset": 0, 00:27:50.342 "data_size": 65536 00:27:50.342 }, 00:27:50.342 { 00:27:50.342 "name": "BaseBdev3", 00:27:50.342 "uuid": "5ea48b7d-0c94-5696-a97b-016146227315", 00:27:50.342 "is_configured": true, 00:27:50.342 "data_offset": 0, 00:27:50.342 "data_size": 65536 00:27:50.342 }, 00:27:50.342 { 00:27:50.342 "name": "BaseBdev4", 00:27:50.342 "uuid": "6890ddca-9b75-5b54-866b-fbad03651b19", 00:27:50.342 "is_configured": true, 00:27:50.342 "data_offset": 0, 00:27:50.342 "data_size": 65536 00:27:50.342 } 00:27:50.342 ] 00:27:50.342 }' 00:27:50.342 05:56:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:50.342 05:56:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:27:50.342 05:56:05 bdev_raid.raid_rebuild_test_io -- 
bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:50.342 05:56:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:27:50.342 05:56:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:27:50.342 05:56:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:50.342 05:56:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:50.342 05:56:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:50.342 05:56:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:50.342 05:56:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:27:50.342 05:56:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:50.342 05:56:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:50.342 05:56:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:50.342 05:56:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:50.342 05:56:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:50.342 05:56:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:50.601 05:56:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:50.601 "name": "raid_bdev1", 00:27:50.601 "uuid": "292a362c-afa7-40ee-bbbc-faad9a791152", 00:27:50.601 "strip_size_kb": 0, 00:27:50.601 "state": "online", 00:27:50.601 "raid_level": "raid1", 00:27:50.601 "superblock": false, 00:27:50.601 "num_base_bdevs": 4, 00:27:50.601 
"num_base_bdevs_discovered": 3, 00:27:50.601 "num_base_bdevs_operational": 3, 00:27:50.601 "base_bdevs_list": [ 00:27:50.601 { 00:27:50.601 "name": "spare", 00:27:50.601 "uuid": "08644430-07c6-52ef-a74d-611bc1d2a3a7", 00:27:50.601 "is_configured": true, 00:27:50.601 "data_offset": 0, 00:27:50.601 "data_size": 65536 00:27:50.601 }, 00:27:50.601 { 00:27:50.601 "name": null, 00:27:50.601 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:50.601 "is_configured": false, 00:27:50.601 "data_offset": 0, 00:27:50.601 "data_size": 65536 00:27:50.601 }, 00:27:50.601 { 00:27:50.601 "name": "BaseBdev3", 00:27:50.601 "uuid": "5ea48b7d-0c94-5696-a97b-016146227315", 00:27:50.601 "is_configured": true, 00:27:50.601 "data_offset": 0, 00:27:50.601 "data_size": 65536 00:27:50.601 }, 00:27:50.601 { 00:27:50.601 "name": "BaseBdev4", 00:27:50.601 "uuid": "6890ddca-9b75-5b54-866b-fbad03651b19", 00:27:50.601 "is_configured": true, 00:27:50.601 "data_offset": 0, 00:27:50.601 "data_size": 65536 00:27:50.601 } 00:27:50.601 ] 00:27:50.601 }' 00:27:50.601 05:56:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:50.601 05:56:05 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:27:51.168 05:56:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:27:51.427 [2024-07-26 05:56:06.148077] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:27:51.427 [2024-07-26 05:56:06.148108] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:27:51.427 00:27:51.427 Latency(us) 00:27:51.427 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:27:51.427 Job: raid_bdev1 (Core Mask 0x1, workload: randrw, percentage: 50, depth: 2, IO size: 3145728) 00:27:51.427 raid_bdev1 : 11.32 89.10 267.31 0.00 0.00 15251.35 304.53 123093.70 
00:27:51.427 =================================================================================================================== 00:27:51.427 Total : 89.10 267.31 0.00 0.00 15251.35 304.53 123093.70 00:27:51.427 [2024-07-26 05:56:06.236230] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:51.427 [2024-07-26 05:56:06.236259] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:27:51.427 [2024-07-26 05:56:06.236352] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:27:51.427 [2024-07-26 05:56:06.236363] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x175b8a0 name raid_bdev1, state offline 00:27:51.427 0 00:27:51.427 05:56:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:51.427 05:56:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@719 -- # jq length 00:27:51.686 05:56:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:27:51.686 05:56:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:27:51.686 05:56:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@722 -- # '[' true = true ']' 00:27:51.686 05:56:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@724 -- # nbd_start_disks /var/tmp/spdk-raid.sock spare /dev/nbd0 00:27:51.686 05:56:06 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:27:51.686 05:56:06 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # bdev_list=('spare') 00:27:51.686 05:56:06 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:27:51.686 05:56:06 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:27:51.686 05:56:06 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # local nbd_list 
00:27:51.686 05:56:06 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@12 -- # local i 00:27:51.686 05:56:06 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:27:51.686 05:56:06 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:27:51.686 05:56:06 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd0 00:27:51.945 /dev/nbd0 00:27:51.945 05:56:06 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:27:51.945 05:56:06 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:27:51.945 05:56:06 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:27:51.945 05:56:06 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@867 -- # local i 00:27:51.945 05:56:06 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:27:51.945 05:56:06 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:27:51.945 05:56:06 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:27:51.945 05:56:06 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # break 00:27:51.945 05:56:06 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:27:51.945 05:56:06 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:27:51.945 05:56:06 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:27:51.945 1+0 records in 00:27:51.945 1+0 records out 00:27:51.945 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000274948 s, 14.9 MB/s 00:27:51.945 05:56:06 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # stat -c %s 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:51.945 05:56:06 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # size=4096 00:27:51.945 05:56:06 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:51.945 05:56:06 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:27:51.945 05:56:06 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@887 -- # return 0 00:27:51.945 05:56:06 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:27:51.945 05:56:06 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:27:51.945 05:56:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:27:51.945 05:56:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@726 -- # '[' -z '' ']' 00:27:51.945 05:56:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@727 -- # continue 00:27:51.945 05:56:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:27:51.946 05:56:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@726 -- # '[' -z BaseBdev3 ']' 00:27:51.946 05:56:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@729 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev3 /dev/nbd1 00:27:51.946 05:56:06 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:27:51.946 05:56:06 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev3') 00:27:51.946 05:56:06 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:27:51.946 05:56:06 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:27:51.946 05:56:06 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:27:51.946 05:56:06 bdev_raid.raid_rebuild_test_io -- 
bdev/nbd_common.sh@12 -- # local i 00:27:51.946 05:56:06 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:27:51.946 05:56:06 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:27:51.946 05:56:06 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev3 /dev/nbd1 00:27:52.205 /dev/nbd1 00:27:52.205 05:56:07 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:27:52.205 05:56:07 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:27:52.205 05:56:07 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:27:52.205 05:56:07 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@867 -- # local i 00:27:52.205 05:56:07 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:27:52.205 05:56:07 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:27:52.205 05:56:07 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:27:52.205 05:56:07 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # break 00:27:52.205 05:56:07 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:27:52.205 05:56:07 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:27:52.205 05:56:07 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:27:52.205 1+0 records in 00:27:52.205 1+0 records out 00:27:52.205 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000276655 s, 14.8 MB/s 00:27:52.205 05:56:07 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # stat -c %s 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:52.205 05:56:07 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # size=4096 00:27:52.205 05:56:07 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:52.205 05:56:07 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:27:52.205 05:56:07 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@887 -- # return 0 00:27:52.205 05:56:07 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:27:52.205 05:56:07 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:27:52.205 05:56:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@730 -- # cmp -i 0 /dev/nbd0 /dev/nbd1 00:27:52.464 05:56:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@731 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:27:52.464 05:56:07 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:27:52.464 05:56:07 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:27:52.464 05:56:07 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:27:52.464 05:56:07 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@51 -- # local i 00:27:52.464 05:56:07 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:27:52.464 05:56:07 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:27:52.464 05:56:07 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:27:52.464 05:56:07 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:27:52.464 05:56:07 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@35 -- # local 
nbd_name=nbd1 00:27:52.464 05:56:07 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:27:52.464 05:56:07 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:27:52.464 05:56:07 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:27:52.464 05:56:07 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@41 -- # break 00:27:52.464 05:56:07 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@45 -- # return 0 00:27:52.464 05:56:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:27:52.464 05:56:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@726 -- # '[' -z BaseBdev4 ']' 00:27:52.464 05:56:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@729 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev4 /dev/nbd1 00:27:52.464 05:56:07 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:27:52.464 05:56:07 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev4') 00:27:52.464 05:56:07 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:27:52.464 05:56:07 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:27:52.464 05:56:07 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:27:52.464 05:56:07 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@12 -- # local i 00:27:52.464 05:56:07 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:27:52.464 05:56:07 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:27:52.464 05:56:07 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev4 /dev/nbd1 00:27:52.723 /dev/nbd1 00:27:52.982 05:56:07 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # 
basename /dev/nbd1 00:27:52.982 05:56:07 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:27:52.982 05:56:07 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:27:52.982 05:56:07 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@867 -- # local i 00:27:52.982 05:56:07 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:27:52.982 05:56:07 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:27:52.982 05:56:07 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:27:52.982 05:56:07 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # break 00:27:52.982 05:56:07 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:27:52.982 05:56:07 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:27:52.982 05:56:07 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:27:52.982 1+0 records in 00:27:52.982 1+0 records out 00:27:52.982 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000290081 s, 14.1 MB/s 00:27:52.982 05:56:07 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:52.982 05:56:07 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # size=4096 00:27:52.982 05:56:07 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:52.982 05:56:07 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:27:52.982 05:56:07 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@887 -- # return 0 00:27:52.982 05:56:07 bdev_raid.raid_rebuild_test_io 
-- bdev/nbd_common.sh@14 -- # (( i++ )) 00:27:52.982 05:56:07 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:27:52.982 05:56:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@730 -- # cmp -i 0 /dev/nbd0 /dev/nbd1 00:27:52.982 05:56:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@731 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:27:52.982 05:56:07 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:27:52.982 05:56:07 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:27:52.982 05:56:07 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:27:52.982 05:56:07 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@51 -- # local i 00:27:52.982 05:56:07 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:27:52.982 05:56:07 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:27:53.241 05:56:07 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:27:53.241 05:56:07 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:27:53.241 05:56:07 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:27:53.241 05:56:07 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:27:53.241 05:56:07 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:27:53.241 05:56:07 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:27:53.241 05:56:07 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@41 -- # break 00:27:53.241 05:56:07 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@45 -- # return 0 00:27:53.241 05:56:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@733 -- # nbd_stop_disks 
/var/tmp/spdk-raid.sock /dev/nbd0 00:27:53.241 05:56:07 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:27:53.241 05:56:07 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:27:53.241 05:56:07 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:27:53.241 05:56:07 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@51 -- # local i 00:27:53.241 05:56:07 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:27:53.241 05:56:07 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:27:53.501 05:56:08 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:27:53.501 05:56:08 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:27:53.501 05:56:08 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:27:53.501 05:56:08 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:27:53.501 05:56:08 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:27:53.501 05:56:08 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:27:53.501 05:56:08 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@41 -- # break 00:27:53.501 05:56:08 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@45 -- # return 0 00:27:53.501 05:56:08 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@742 -- # '[' false = true ']' 00:27:53.501 05:56:08 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@782 -- # killprocess 1257864 00:27:53.501 05:56:08 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@948 -- # '[' -z 1257864 ']' 00:27:53.501 05:56:08 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@952 -- # kill -0 1257864 00:27:53.501 
05:56:08 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@953 -- # uname 00:27:53.501 05:56:08 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:27:53.501 05:56:08 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1257864 00:27:53.501 05:56:08 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:27:53.501 05:56:08 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:27:53.501 05:56:08 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1257864' 00:27:53.501 killing process with pid 1257864 00:27:53.501 05:56:08 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@967 -- # kill 1257864 00:27:53.501 Received shutdown signal, test time was about 13.390763 seconds 00:27:53.501 00:27:53.501 Latency(us) 00:27:53.501 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:27:53.501 =================================================================================================================== 00:27:53.501 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:27:53.501 [2024-07-26 05:56:08.303991] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:27:53.501 05:56:08 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@972 -- # wait 1257864 00:27:53.501 [2024-07-26 05:56:08.345039] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:27:53.761 05:56:08 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@784 -- # return 0 00:27:53.761 00:27:53.761 real 0m19.066s 00:27:53.761 user 0m29.621s 00:27:53.761 sys 0m3.389s 00:27:53.761 05:56:08 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@1124 -- # xtrace_disable 00:27:53.761 05:56:08 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:27:53.761 ************************************ 00:27:53.761 END TEST 
raid_rebuild_test_io 00:27:53.761 ************************************ 00:27:53.761 05:56:08 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:27:53.761 05:56:08 bdev_raid -- bdev/bdev_raid.sh@880 -- # run_test raid_rebuild_test_sb_io raid_rebuild_test raid1 4 true true true 00:27:53.761 05:56:08 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:27:53.761 05:56:08 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:27:53.761 05:56:08 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:27:53.761 ************************************ 00:27:53.761 START TEST raid_rebuild_test_sb_io 00:27:53.761 ************************************ 00:27:53.761 05:56:08 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 4 true true true 00:27:53.761 05:56:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:27:53.761 05:56:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=4 00:27:53.761 05:56:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:27:53.761 05:56:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@571 -- # local background_io=true 00:27:53.761 05:56:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@572 -- # local verify=true 00:27:53.761 05:56:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:27:53.761 05:56:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:27:53.761 05:56:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:27:53.761 05:56:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:27:53.761 05:56:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:27:53.761 05:56:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:27:53.761 
05:56:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:27:53.761 05:56:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:27:53.761 05:56:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev3 00:27:53.761 05:56:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:27:53.761 05:56:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:27:53.761 05:56:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev4 00:27:53.761 05:56:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:27:53.761 05:56:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:27:53.761 05:56:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:27:53.761 05:56:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:27:53.761 05:56:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:27:53.761 05:56:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # local strip_size 00:27:53.761 05:56:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@576 -- # local create_arg 00:27:53.761 05:56:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:27:53.761 05:56:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@578 -- # local data_offset 00:27:53.761 05:56:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:27:53.761 05:56:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:27:53.761 05:56:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:27:53.761 05:56:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@592 -- # 
create_arg+=' -s' 00:27:53.761 05:56:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@596 -- # raid_pid=1260580 00:27:53.761 05:56:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@597 -- # waitforlisten 1260580 /var/tmp/spdk-raid.sock 00:27:53.761 05:56:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:27:53.761 05:56:08 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@829 -- # '[' -z 1260580 ']' 00:27:53.761 05:56:08 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:27:53.761 05:56:08 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@834 -- # local max_retries=100 00:27:53.761 05:56:08 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:27:53.761 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:27:53.761 05:56:08 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@838 -- # xtrace_disable 00:27:53.761 05:56:08 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:27:54.021 [2024-07-26 05:56:08.720950] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 00:27:54.021 [2024-07-26 05:56:08.721011] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1260580 ] 00:27:54.021 I/O size of 3145728 is greater than zero copy threshold (65536). 00:27:54.021 Zero copy mechanism will not be used. 
00:27:54.021 [2024-07-26 05:56:08.850459] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:27:54.280 [2024-07-26 05:56:08.949238] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:27:54.280 [2024-07-26 05:56:09.018523] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:27:54.280 [2024-07-26 05:56:09.018559] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:27:54.847 05:56:09 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:27:54.847 05:56:09 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@862 -- # return 0 00:27:54.847 05:56:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:27:54.847 05:56:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:27:55.105 BaseBdev1_malloc 00:27:55.105 05:56:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:27:55.365 [2024-07-26 05:56:10.056188] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:27:55.365 [2024-07-26 05:56:10.056242] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:55.365 [2024-07-26 05:56:10.056266] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xd3fd40 00:27:55.365 [2024-07-26 05:56:10.056280] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:55.365 [2024-07-26 05:56:10.058040] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:55.365 [2024-07-26 05:56:10.058069] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:27:55.365 
BaseBdev1
00:27:55.365 05:56:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}"
00:27:55.365 05:56:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc
00:27:55.932 BaseBdev2_malloc
00:27:55.932 05:56:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2
00:27:55.932 [2024-07-26 05:56:10.820294] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc
00:27:55.932 [2024-07-26 05:56:10.820342] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened
00:27:55.932 [2024-07-26 05:56:10.820369] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xd40860
00:27:55.932 [2024-07-26 05:56:10.820381] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed
00:27:55.932 [2024-07-26 05:56:10.821977] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered
00:27:55.932 [2024-07-26 05:56:10.822006] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2
00:27:55.932 BaseBdev2
00:27:55.932 05:56:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}"
00:27:55.932 05:56:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc
00:27:56.189 BaseBdev3_malloc
00:27:56.189 05:56:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev3_malloc -p BaseBdev3
00:27:56.446 [2024-07-26 05:56:11.315451] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev3_malloc
00:27:56.446 [2024-07-26 05:56:11.315500] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened
00:27:56.446 [2024-07-26 05:56:11.315523] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xeed8f0
00:27:56.446 [2024-07-26 05:56:11.315536] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed
00:27:56.446 [2024-07-26 05:56:11.317140] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered
00:27:56.446 [2024-07-26 05:56:11.317168] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3
00:27:56.446 BaseBdev3
00:27:56.446 05:56:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}"
00:27:56.446 05:56:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc
00:27:56.704 BaseBdev4_malloc
00:27:56.704 05:56:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev4_malloc -p BaseBdev4
00:27:56.961 [2024-07-26 05:56:11.821339] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev4_malloc
00:27:56.961 [2024-07-26 05:56:11.821385] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened
00:27:56.961 [2024-07-26 05:56:11.821405] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xeecad0
00:27:56.961 [2024-07-26 05:56:11.821418] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed
00:27:56.961 [2024-07-26 05:56:11.822853] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered
00:27:56.961 [2024-07-26 05:56:11.822879] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4
00:27:56.961 BaseBdev4
00:27:56.962 05:56:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc
00:27:57.220 spare_malloc
00:27:57.220 05:56:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000
00:27:57.479 spare_delay
00:27:57.479 05:56:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare
00:27:57.737 [2024-07-26 05:56:12.591926] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay
00:27:57.737 [2024-07-26 05:56:12.591972] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened
00:27:57.737 [2024-07-26 05:56:12.591993] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xef15b0
00:27:57.737 [2024-07-26 05:56:12.592005] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed
00:27:57.737 [2024-07-26 05:56:12.593620] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered
00:27:57.737 [2024-07-26 05:56:12.593657] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare
00:27:57.737 spare
00:27:57.737 05:56:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1
00:27:57.996 [2024-07-26 05:56:12.836612] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed
00:27:57.996 [2024-07-26 05:56:12.837972] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed
00:27:57.996 [2024-07-26 05:56:12.838029] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed
00:27:57.996 [2024-07-26 05:56:12.838075] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed
00:27:57.996 [2024-07-26 05:56:12.838278] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0xe708a0
00:27:57.996 [2024-07-26 05:56:12.838294] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512
00:27:57.996 [2024-07-26 05:56:12.838503] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xeeae10
00:27:57.996 [2024-07-26 05:56:12.838668] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xe708a0
00:27:57.996 [2024-07-26 05:56:12.838679] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xe708a0
00:27:57.996 [2024-07-26 05:56:12.838781] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb
00:27:57.996 05:56:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4
00:27:57.996 05:56:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1
00:27:57.996 05:56:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online
00:27:57.996 05:56:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1
00:27:57.996 05:56:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0
00:27:57.996 05:56:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4
00:27:57.996 05:56:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:27:57.996 05:56:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:27:57.996 05:56:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:27:57.996 05:56:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp
00:27:57.996 05:56:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:27:57.996 05:56:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:27:58.254 05:56:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:27:58.254 "name": "raid_bdev1",
00:27:58.254 "uuid": "14510be2-7f77-4a86-b908-edb3b7897ce5",
00:27:58.254 "strip_size_kb": 0,
00:27:58.254 "state": "online",
00:27:58.254 "raid_level": "raid1",
00:27:58.254 "superblock": true,
00:27:58.254 "num_base_bdevs": 4,
00:27:58.254 "num_base_bdevs_discovered": 4,
00:27:58.254 "num_base_bdevs_operational": 4,
00:27:58.254 "base_bdevs_list": [
00:27:58.254 {
00:27:58.254 "name": "BaseBdev1",
00:27:58.254 "uuid": "d84d543d-a006-54ee-a93b-72a40149321a",
00:27:58.254 "is_configured": true,
00:27:58.254 "data_offset": 2048,
00:27:58.254 "data_size": 63488
00:27:58.254 },
00:27:58.254 {
00:27:58.254 "name": "BaseBdev2",
00:27:58.254 "uuid": "99e58602-c17f-5db4-8719-81d0d95a6ab3",
00:27:58.254 "is_configured": true,
00:27:58.254 "data_offset": 2048,
00:27:58.254 "data_size": 63488
00:27:58.254 },
00:27:58.254 {
00:27:58.254 "name": "BaseBdev3",
00:27:58.254 "uuid": "22161417-e3db-55eb-927d-f7ebd6d2e5b3",
00:27:58.254 "is_configured": true,
00:27:58.254 "data_offset": 2048,
00:27:58.254 "data_size": 63488
00:27:58.254 },
00:27:58.254 {
00:27:58.254 "name": "BaseBdev4",
00:27:58.254 "uuid": "e065b845-b65f-58aa-abf9-4214e2c3bd4d",
00:27:58.254 "is_configured": true,
00:27:58.254 "data_offset": 2048,
00:27:58.254 "data_size": 63488
00:27:58.254 }
00:27:58.254 ]
00:27:58.254 }'
00:27:58.254 05:56:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:27:58.254 05:56:13 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x
00:27:58.819 05:56:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1
00:27:58.820 05:56:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks'
00:27:59.077 [2024-07-26 05:56:13.903764] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json
00:27:59.077 05:56:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=63488
00:27:59.077 05:56:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:27:59.077 05:56:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset'
00:27:59.335 05:56:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # data_offset=2048
00:27:59.335 05:56:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@620 -- # '[' true = true ']'
00:27:59.335 05:56:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1
00:27:59.335 05:56:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@622 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests
00:27:59.593 [2024-07-26 05:56:14.286548] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xd3f670
00:27:59.593 I/O size of 3145728 is greater than zero copy threshold (65536).
00:27:59.593 Zero copy mechanism will not be used.
00:27:59.593 Running I/O for 60 seconds...
00:27:59.593 [2024-07-26 05:56:14.407549] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1
00:27:59.593 [2024-07-26 05:56:14.415801] bdev_raid.c:1945:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0xd3f670
00:27:59.593 05:56:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3
00:27:59.593 05:56:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1
00:27:59.593 05:56:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online
00:27:59.593 05:56:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1
00:27:59.593 05:56:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0
00:27:59.593 05:56:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3
00:27:59.593 05:56:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:27:59.593 05:56:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:27:59.593 05:56:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:27:59.593 05:56:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp
00:27:59.593 05:56:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:27:59.593 05:56:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:27:59.851 05:56:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:27:59.851 "name": "raid_bdev1",
00:27:59.851 "uuid": "14510be2-7f77-4a86-b908-edb3b7897ce5",
00:27:59.851 "strip_size_kb": 0,
00:27:59.851 "state": "online",
00:27:59.851 "raid_level": "raid1",
00:27:59.851 "superblock": true,
00:27:59.851 "num_base_bdevs": 4,
00:27:59.851 "num_base_bdevs_discovered": 3,
00:27:59.851 "num_base_bdevs_operational": 3,
00:27:59.851 "base_bdevs_list": [
00:27:59.851 {
00:27:59.851 "name": null,
00:27:59.851 "uuid": "00000000-0000-0000-0000-000000000000",
00:27:59.851 "is_configured": false,
00:27:59.851 "data_offset": 2048,
00:27:59.851 "data_size": 63488
00:27:59.851 },
00:27:59.851 {
00:27:59.851 "name": "BaseBdev2",
00:27:59.851 "uuid": "99e58602-c17f-5db4-8719-81d0d95a6ab3",
00:27:59.851 "is_configured": true,
00:27:59.851 "data_offset": 2048,
00:27:59.851 "data_size": 63488
00:27:59.851 },
00:27:59.851 {
00:27:59.851 "name": "BaseBdev3",
00:27:59.851 "uuid": "22161417-e3db-55eb-927d-f7ebd6d2e5b3",
00:27:59.851 "is_configured": true,
00:27:59.851 "data_offset": 2048,
00:27:59.851 "data_size": 63488
00:27:59.851 },
00:27:59.851 {
00:27:59.851 "name": "BaseBdev4",
00:27:59.851 "uuid": "e065b845-b65f-58aa-abf9-4214e2c3bd4d",
00:27:59.851 "is_configured": true,
00:27:59.851 "data_offset": 2048,
00:27:59.851 "data_size": 63488
00:27:59.851 }
00:27:59.851 ]
00:27:59.851 }'
00:27:59.851 05:56:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:27:59.851 05:56:14 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x
00:28:00.785 05:56:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare
00:28:00.785 [2024-07-26 05:56:15.557194] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed
00:28:00.785 05:56:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@646 -- # sleep 1
00:28:00.785 [2024-07-26 05:56:15.606506] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xe72c00
00:28:00.785 [2024-07-26 05:56:15.609015] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1
00:28:01.043 [2024-07-26 05:56:15.738574] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144
00:28:01.043 [2024-07-26 05:56:15.740020] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144
00:28:01.302 [2024-07-26 05:56:15.989738] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144
00:28:01.302 [2024-07-26 05:56:15.990457] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144
00:28:01.560 [2024-07-26 05:56:16.356997] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288
00:28:01.818 [2024-07-26 05:56:16.529416] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288
00:28:01.818 [2024-07-26 05:56:16.529749] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288
00:28:01.818 05:56:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare
00:28:01.818 05:56:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1
00:28:01.818 05:56:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild
00:28:01.818 05:56:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare
00:28:01.818 05:56:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info
00:28:01.818 05:56:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:28:01.818 05:56:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:28:02.076 05:56:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{
00:28:02.076 "name": "raid_bdev1",
00:28:02.076 "uuid": "14510be2-7f77-4a86-b908-edb3b7897ce5",
00:28:02.076 "strip_size_kb": 0,
00:28:02.076 "state": "online",
00:28:02.076 "raid_level": "raid1",
00:28:02.076 "superblock": true,
00:28:02.076 "num_base_bdevs": 4,
00:28:02.076 "num_base_bdevs_discovered": 4,
00:28:02.076 "num_base_bdevs_operational": 4,
00:28:02.076 "process": {
00:28:02.076 "type": "rebuild",
00:28:02.076 "target": "spare",
00:28:02.076 "progress": {
00:28:02.076 "blocks": 14336,
00:28:02.076 "percent": 22
00:28:02.076 }
00:28:02.076 },
00:28:02.076 "base_bdevs_list": [
00:28:02.076 {
00:28:02.076 "name": "spare",
00:28:02.076 "uuid": "643b1f0c-111d-51ee-acc5-d9a0cd956312",
00:28:02.076 "is_configured": true,
00:28:02.076 "data_offset": 2048,
00:28:02.076 "data_size": 63488
00:28:02.076 },
00:28:02.076 {
00:28:02.076 "name": "BaseBdev2",
00:28:02.076 "uuid": "99e58602-c17f-5db4-8719-81d0d95a6ab3",
00:28:02.076 "is_configured": true,
00:28:02.076 "data_offset": 2048,
00:28:02.076 "data_size": 63488
00:28:02.076 },
00:28:02.076 {
00:28:02.076 "name": "BaseBdev3",
00:28:02.076 "uuid": "22161417-e3db-55eb-927d-f7ebd6d2e5b3",
00:28:02.076 "is_configured": true,
00:28:02.076 "data_offset": 2048,
00:28:02.076 "data_size": 63488
00:28:02.076 },
00:28:02.076 {
00:28:02.076 "name": "BaseBdev4",
00:28:02.076 "uuid": "e065b845-b65f-58aa-abf9-4214e2c3bd4d",
00:28:02.076 "is_configured": true,
00:28:02.076 "data_offset": 2048,
00:28:02.076 "data_size": 63488
00:28:02.076 }
00:28:02.076 ]
00:28:02.076 }'
00:28:02.076 05:56:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"'
00:28:02.076 05:56:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]]
00:28:02.076 05:56:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"'
00:28:02.076 05:56:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]]
00:28:02.076 05:56:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare
00:28:02.334 [2024-07-26 05:56:17.196959] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare
00:28:02.335 [2024-07-26 05:56:17.233717] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 22528 offset_begin: 18432 offset_end: 24576
00:28:02.593 [2024-07-26 05:56:17.342019] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device
00:28:02.593 [2024-07-26 05:56:17.354620] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb
00:28:02.593 [2024-07-26 05:56:17.354664] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare
00:28:02.593 [2024-07-26 05:56:17.354675] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device
00:28:02.593 [2024-07-26 05:56:17.386224] bdev_raid.c:1945:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0xd3f670
00:28:02.593 05:56:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3
00:28:02.593 05:56:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1
00:28:02.593 05:56:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online
00:28:02.593 05:56:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1
00:28:02.593 05:56:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0
00:28:02.593 05:56:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3
00:28:02.593 05:56:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:28:02.593 05:56:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:28:02.593 05:56:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:28:02.593 05:56:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp
00:28:02.593 05:56:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:28:02.593 05:56:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:28:02.852 05:56:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:28:02.852 "name": "raid_bdev1",
00:28:02.852 "uuid": "14510be2-7f77-4a86-b908-edb3b7897ce5",
00:28:02.852 "strip_size_kb": 0,
00:28:02.852 "state": "online",
00:28:02.852 "raid_level": "raid1",
00:28:02.852 "superblock": true,
00:28:02.852 "num_base_bdevs": 4,
00:28:02.852 "num_base_bdevs_discovered": 3,
00:28:02.852 "num_base_bdevs_operational": 3,
00:28:02.852 "base_bdevs_list": [
00:28:02.852 {
00:28:02.852 "name": null,
00:28:02.852 "uuid": "00000000-0000-0000-0000-000000000000",
00:28:02.852 "is_configured": false,
00:28:02.852 "data_offset": 2048,
00:28:02.852 "data_size": 63488
00:28:02.852 },
00:28:02.852 {
00:28:02.852 "name": "BaseBdev2",
00:28:02.852 "uuid": "99e58602-c17f-5db4-8719-81d0d95a6ab3",
00:28:02.852 "is_configured": true,
00:28:02.852 "data_offset": 2048,
00:28:02.852 "data_size": 63488
00:28:02.852 },
00:28:02.852 {
00:28:02.852 "name": "BaseBdev3",
00:28:02.852 "uuid": "22161417-e3db-55eb-927d-f7ebd6d2e5b3",
00:28:02.852 "is_configured": true,
00:28:02.852 "data_offset": 2048,
00:28:02.852 "data_size": 63488
00:28:02.852 },
00:28:02.852 {
00:28:02.852 "name": "BaseBdev4",
00:28:02.852 "uuid": "e065b845-b65f-58aa-abf9-4214e2c3bd4d",
00:28:02.852 "is_configured": true,
00:28:02.852 "data_offset": 2048,
00:28:02.852 "data_size": 63488
00:28:02.852 }
00:28:02.852 ]
00:28:02.852 }'
00:28:02.852 05:56:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:28:02.852 05:56:17 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x
00:28:03.786 05:56:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none
00:28:03.786 05:56:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1
00:28:03.786 05:56:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none
00:28:03.786 05:56:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none
00:28:03.786 05:56:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info
00:28:03.786 05:56:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:28:03.786 05:56:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:28:03.786 05:56:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{
00:28:03.786 "name": "raid_bdev1",
00:28:03.786 "uuid": "14510be2-7f77-4a86-b908-edb3b7897ce5",
00:28:03.786 "strip_size_kb": 0,
00:28:03.786 "state": "online",
00:28:03.786 "raid_level": "raid1",
00:28:03.786 "superblock": true,
00:28:03.786 "num_base_bdevs": 4,
00:28:03.786 "num_base_bdevs_discovered": 3,
00:28:03.786 "num_base_bdevs_operational": 3,
00:28:03.786 "base_bdevs_list": [
00:28:03.786 {
00:28:03.786 "name": null,
00:28:03.786 "uuid": "00000000-0000-0000-0000-000000000000",
00:28:03.786 "is_configured": false,
00:28:03.786 "data_offset": 2048,
00:28:03.786 "data_size": 63488
00:28:03.786 },
00:28:03.786 {
00:28:03.786 "name": "BaseBdev2",
00:28:03.786 "uuid": "99e58602-c17f-5db4-8719-81d0d95a6ab3",
00:28:03.786 "is_configured": true,
00:28:03.786 "data_offset": 2048,
00:28:03.786 "data_size": 63488
00:28:03.786 },
00:28:03.786 {
00:28:03.786 "name": "BaseBdev3",
00:28:03.786 "uuid": "22161417-e3db-55eb-927d-f7ebd6d2e5b3",
00:28:03.786 "is_configured": true,
00:28:03.786 "data_offset": 2048,
00:28:03.786 "data_size": 63488
00:28:03.786 },
00:28:03.786 {
00:28:03.786 "name": "BaseBdev4",
00:28:03.786 "uuid": "e065b845-b65f-58aa-abf9-4214e2c3bd4d",
00:28:03.786 "is_configured": true,
00:28:03.786 "data_offset": 2048,
00:28:03.786 "data_size": 63488
00:28:03.786 }
00:28:03.786 ]
00:28:03.786 }'
00:28:03.786 05:56:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"'
00:28:03.786 05:56:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]]
00:28:03.786 05:56:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"'
00:28:03.787 05:56:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]]
00:28:03.787 05:56:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare
00:28:04.045 [2024-07-26 05:56:18.864627] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed
00:28:04.045 05:56:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@662 -- # sleep 1
00:28:04.045 [2024-07-26 05:56:18.939891] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xe749b0
00:28:04.045 [2024-07-26 05:56:18.941458] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1
00:28:04.302 [2024-07-26 05:56:19.089044] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144
00:28:04.302 [2024-07-26 05:56:19.090496] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144
00:28:04.560 [2024-07-26 05:56:19.324543] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144
00:28:04.560 [2024-07-26 05:56:19.324861] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144
00:28:05.126 [2024-07-26 05:56:19.802885] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288
00:28:05.126 05:56:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare
00:28:05.126 05:56:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1
00:28:05.126 05:56:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild
00:28:05.126 05:56:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare
00:28:05.126 05:56:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info
00:28:05.126 05:56:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:28:05.126 05:56:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:28:05.384 [2024-07-26 05:56:20.184875] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432
00:28:05.384 [2024-07-26 05:56:20.185482] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432
00:28:05.384 05:56:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{
00:28:05.384 "name": "raid_bdev1",
00:28:05.384 "uuid": "14510be2-7f77-4a86-b908-edb3b7897ce5",
00:28:05.384 "strip_size_kb": 0,
00:28:05.384 "state": "online",
00:28:05.384 "raid_level": "raid1",
00:28:05.384 "superblock": true,
00:28:05.384 "num_base_bdevs": 4,
00:28:05.384 "num_base_bdevs_discovered": 4,
00:28:05.384 "num_base_bdevs_operational": 4,
00:28:05.384 "process": {
00:28:05.384 "type": "rebuild",
00:28:05.384 "target": "spare",
00:28:05.384 "progress": {
00:28:05.384 "blocks": 14336,
00:28:05.384 "percent": 22
00:28:05.384 }
00:28:05.384 },
00:28:05.384 "base_bdevs_list": [
00:28:05.384 {
00:28:05.384 "name": "spare",
00:28:05.384 "uuid": "643b1f0c-111d-51ee-acc5-d9a0cd956312",
00:28:05.384 "is_configured": true,
00:28:05.384 "data_offset": 2048,
00:28:05.384 "data_size": 63488
00:28:05.384 },
00:28:05.384 {
00:28:05.384 "name": "BaseBdev2",
00:28:05.384 "uuid": "99e58602-c17f-5db4-8719-81d0d95a6ab3",
00:28:05.384 "is_configured": true,
00:28:05.384 "data_offset": 2048,
00:28:05.384 "data_size": 63488
00:28:05.384 },
00:28:05.384 {
00:28:05.384 "name": "BaseBdev3",
00:28:05.384 "uuid": "22161417-e3db-55eb-927d-f7ebd6d2e5b3",
00:28:05.384 "is_configured": true,
00:28:05.384 "data_offset": 2048,
00:28:05.384 "data_size": 63488
00:28:05.384 },
00:28:05.384 {
00:28:05.384 "name": "BaseBdev4",
00:28:05.384 "uuid": "e065b845-b65f-58aa-abf9-4214e2c3bd4d",
00:28:05.384 "is_configured": true,
00:28:05.384 "data_offset": 2048,
00:28:05.384 "data_size": 63488
00:28:05.384 }
00:28:05.384 ]
00:28:05.384 }'
00:28:05.384 05:56:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"'
00:28:05.384 05:56:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]]
00:28:05.384 05:56:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"'
00:28:05.642 05:56:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]]
00:28:05.642 05:56:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@665 -- # '[' true = true ']'
00:28:05.642 05:56:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@665 -- # '[' = false ']'
00:28:05.642 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected
00:28:05.642 05:56:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=4
00:28:05.642 05:56:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']'
00:28:05.642 05:56:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@692 -- # '[' 4 -gt 2 ']'
00:28:05.642 05:56:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@694 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2
00:28:05.900 [2024-07-26 05:56:20.536839] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2
00:28:05.900 [2024-07-26 05:56:20.551105] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576
00:28:05.900 [2024-07-26 05:56:20.751654] bdev_raid.c:1945:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0xd3f670
00:28:05.900 [2024-07-26 05:56:20.751682] bdev_raid.c:1945:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0xe749b0
00:28:05.900 05:56:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@697 -- # base_bdevs[1]=
00:28:05.900 05:56:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@698 -- # (( num_base_bdevs_operational-- ))
00:28:05.900 05:56:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@701 -- # verify_raid_bdev_process raid_bdev1 rebuild spare
00:28:05.900 05:56:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1
00:28:05.900 05:56:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild
00:28:05.900 05:56:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare
00:28:05.900 05:56:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info
00:28:05.900 05:56:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:28:05.900 05:56:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:28:06.158 [2024-07-26 05:56:20.891303] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 22528 offset_begin: 18432 offset_end: 24576
00:28:06.158 05:56:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{
00:28:06.158 "name": "raid_bdev1",
00:28:06.158 "uuid": "14510be2-7f77-4a86-b908-edb3b7897ce5",
00:28:06.158 "strip_size_kb": 0,
00:28:06.158 "state": "online",
00:28:06.158 "raid_level": "raid1",
00:28:06.158 "superblock": true,
00:28:06.158 "num_base_bdevs": 4,
00:28:06.158 "num_base_bdevs_discovered": 3,
00:28:06.158 "num_base_bdevs_operational": 3,
00:28:06.158 "process": {
00:28:06.158 "type": "rebuild",
00:28:06.158 "target": "spare",
00:28:06.158 "progress": {
00:28:06.158 "blocks": 22528,
00:28:06.158 "percent": 35
00:28:06.158 }
00:28:06.158 },
00:28:06.158 "base_bdevs_list": [
00:28:06.158 {
00:28:06.158 "name": "spare",
00:28:06.158 "uuid": "643b1f0c-111d-51ee-acc5-d9a0cd956312",
00:28:06.158 "is_configured": true,
00:28:06.158 "data_offset": 2048,
00:28:06.158 "data_size": 63488
00:28:06.158 },
00:28:06.158 {
00:28:06.158 "name": null,
00:28:06.158 "uuid": "00000000-0000-0000-0000-000000000000",
00:28:06.158 "is_configured": false,
00:28:06.158 "data_offset": 2048,
00:28:06.158 "data_size": 63488
00:28:06.158 },
00:28:06.158 {
00:28:06.158 "name": "BaseBdev3",
00:28:06.158 "uuid": "22161417-e3db-55eb-927d-f7ebd6d2e5b3",
00:28:06.158 "is_configured": true,
00:28:06.158 "data_offset": 2048,
00:28:06.158 "data_size": 63488
00:28:06.158 },
00:28:06.158 {
00:28:06.158 "name": "BaseBdev4",
00:28:06.158 "uuid": "e065b845-b65f-58aa-abf9-4214e2c3bd4d",
00:28:06.158 "is_configured": true,
00:28:06.158 "data_offset": 2048,
00:28:06.158 "data_size": 63488
00:28:06.158 }
00:28:06.158 ]
00:28:06.158 }'
00:28:06.158 05:56:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"'
00:28:06.416 05:56:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]]
00:28:06.416 05:56:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"'
00:28:06.416 05:56:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]]
00:28:06.416 05:56:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@705 -- # local timeout=951
00:28:06.416 05:56:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout ))
00:28:06.416 05:56:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare
00:28:06.416 05:56:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1
00:28:06.416 05:56:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild
00:28:06.416 05:56:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare
00:28:06.416 05:56:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info
00:28:06.416 05:56:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:28:06.417 05:56:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:28:06.417 [2024-07-26 05:56:21.236834] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 26624 offset_begin: 24576 offset_end: 30720
00:28:06.675 [2024-07-26 05:56:21.358540] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 28672 offset_begin: 24576 offset_end: 30720
00:28:06.675 05:56:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{
00:28:06.675 "name": "raid_bdev1",
00:28:06.675 "uuid": "14510be2-7f77-4a86-b908-edb3b7897ce5",
00:28:06.675 "strip_size_kb": 0,
00:28:06.675 "state": "online",
00:28:06.675 "raid_level": "raid1",
00:28:06.675 "superblock": true,
00:28:06.675 "num_base_bdevs": 4,
00:28:06.675 "num_base_bdevs_discovered": 3,
00:28:06.675 "num_base_bdevs_operational": 3,
00:28:06.675 "process": {
00:28:06.675 "type": "rebuild",
00:28:06.675 "target": "spare",
00:28:06.675 "progress": {
00:28:06.675 "blocks": 28672,
00:28:06.675 "percent": 45
00:28:06.675 }
00:28:06.675 },
00:28:06.675 "base_bdevs_list": [
00:28:06.675 {
00:28:06.675 "name": "spare",
00:28:06.675 "uuid": "643b1f0c-111d-51ee-acc5-d9a0cd956312",
00:28:06.675 "is_configured": true,
00:28:06.675 "data_offset": 2048,
00:28:06.675 "data_size": 63488
00:28:06.675 },
00:28:06.675 {
00:28:06.675 "name": null,
00:28:06.675 "uuid": "00000000-0000-0000-0000-000000000000",
00:28:06.675 "is_configured": false,
00:28:06.675 "data_offset": 2048,
00:28:06.675 "data_size": 63488
00:28:06.675 },
00:28:06.675 {
00:28:06.675 "name": "BaseBdev3",
00:28:06.675 "uuid": "22161417-e3db-55eb-927d-f7ebd6d2e5b3",
00:28:06.675 "is_configured": true,
00:28:06.675 "data_offset": 2048,
00:28:06.675 "data_size": 63488
00:28:06.675 },
00:28:06.675 {
00:28:06.675 "name": "BaseBdev4",
00:28:06.675 "uuid": "e065b845-b65f-58aa-abf9-4214e2c3bd4d",
00:28:06.675 "is_configured": true,
00:28:06.675 "data_offset": 2048,
00:28:06.675 "data_size": 63488
00:28:06.675 }
00:28:06.675 ]
00:28:06.675 }'
00:28:06.675 05:56:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"'
00:28:06.675 05:56:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]]
00:28:06.675 05:56:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"'
00:28:06.675 05:56:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]]
00:28:06.675 05:56:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@710 -- # sleep 1
00:28:07.619 [2024-07-26 05:56:22.279450] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 45056 offset_begin: 43008 offset_end: 49152
00:28:07.619 05:56:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout ))
00:28:07.619 05:56:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare
00:28:07.619 05:56:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1
00:28:07.619 05:56:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild
00:28:07.619 05:56:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare
00:28:07.619 05:56:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info
00:28:07.619 05:56:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:28:07.619 05:56:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:28:07.928 [2024-07-26 05:56:22.650501] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 51200 offset_begin: 49152 offset_end: 55296
00:28:07.928 05:56:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{
00:28:07.928 "name": "raid_bdev1", 00:28:07.928 "uuid": "14510be2-7f77-4a86-b908-edb3b7897ce5", 00:28:07.928 "strip_size_kb": 0, 00:28:07.928 "state": "online", 00:28:07.928 "raid_level": "raid1", 00:28:07.928 "superblock": true, 00:28:07.928 "num_base_bdevs": 4, 00:28:07.928 "num_base_bdevs_discovered": 3, 00:28:07.928 "num_base_bdevs_operational": 3, 00:28:07.928 "process": { 00:28:07.928 "type": "rebuild", 00:28:07.928 "target": "spare", 00:28:07.928 "progress": { 00:28:07.928 "blocks": 51200, 00:28:07.928 "percent": 80 00:28:07.928 } 00:28:07.928 }, 00:28:07.928 "base_bdevs_list": [ 00:28:07.928 { 00:28:07.928 "name": "spare", 00:28:07.928 "uuid": "643b1f0c-111d-51ee-acc5-d9a0cd956312", 00:28:07.928 "is_configured": true, 00:28:07.928 "data_offset": 2048, 00:28:07.928 "data_size": 63488 00:28:07.928 }, 00:28:07.928 { 00:28:07.928 "name": null, 00:28:07.928 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:07.928 "is_configured": false, 00:28:07.928 "data_offset": 2048, 00:28:07.928 "data_size": 63488 00:28:07.928 }, 00:28:07.928 { 00:28:07.928 "name": "BaseBdev3", 00:28:07.928 "uuid": "22161417-e3db-55eb-927d-f7ebd6d2e5b3", 00:28:07.928 "is_configured": true, 00:28:07.928 "data_offset": 2048, 00:28:07.928 "data_size": 63488 00:28:07.928 }, 00:28:07.928 { 00:28:07.928 "name": "BaseBdev4", 00:28:07.928 "uuid": "e065b845-b65f-58aa-abf9-4214e2c3bd4d", 00:28:07.928 "is_configured": true, 00:28:07.928 "data_offset": 2048, 00:28:07.928 "data_size": 63488 00:28:07.928 } 00:28:07.928 ] 00:28:07.928 }' 00:28:07.928 05:56:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:07.928 05:56:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:28:07.928 05:56:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:07.928 05:56:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 
00:28:07.928 05:56:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:28:08.186 [2024-07-26 05:56:22.881835] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 53248 offset_begin: 49152 offset_end: 55296 00:28:08.444 [2024-07-26 05:56:23.133208] bdev_raid.c: 851:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 57344 offset_begin: 55296 offset_end: 61440 00:28:08.702 [2024-07-26 05:56:23.476569] bdev_raid.c:2870:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:28:08.702 [2024-07-26 05:56:23.584814] bdev_raid.c:2532:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:28:08.702 [2024-07-26 05:56:23.586993] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:08.961 05:56:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:28:08.961 05:56:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:28:08.961 05:56:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:08.961 05:56:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:28:08.961 05:56:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:28:08.961 05:56:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:08.961 05:56:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:08.961 05:56:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:09.219 05:56:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:09.219 "name": "raid_bdev1", 00:28:09.219 "uuid": 
"14510be2-7f77-4a86-b908-edb3b7897ce5", 00:28:09.219 "strip_size_kb": 0, 00:28:09.219 "state": "online", 00:28:09.219 "raid_level": "raid1", 00:28:09.219 "superblock": true, 00:28:09.219 "num_base_bdevs": 4, 00:28:09.219 "num_base_bdevs_discovered": 3, 00:28:09.219 "num_base_bdevs_operational": 3, 00:28:09.219 "base_bdevs_list": [ 00:28:09.219 { 00:28:09.219 "name": "spare", 00:28:09.219 "uuid": "643b1f0c-111d-51ee-acc5-d9a0cd956312", 00:28:09.219 "is_configured": true, 00:28:09.219 "data_offset": 2048, 00:28:09.219 "data_size": 63488 00:28:09.219 }, 00:28:09.219 { 00:28:09.219 "name": null, 00:28:09.219 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:09.219 "is_configured": false, 00:28:09.219 "data_offset": 2048, 00:28:09.219 "data_size": 63488 00:28:09.219 }, 00:28:09.219 { 00:28:09.219 "name": "BaseBdev3", 00:28:09.219 "uuid": "22161417-e3db-55eb-927d-f7ebd6d2e5b3", 00:28:09.219 "is_configured": true, 00:28:09.219 "data_offset": 2048, 00:28:09.219 "data_size": 63488 00:28:09.219 }, 00:28:09.219 { 00:28:09.219 "name": "BaseBdev4", 00:28:09.219 "uuid": "e065b845-b65f-58aa-abf9-4214e2c3bd4d", 00:28:09.219 "is_configured": true, 00:28:09.219 "data_offset": 2048, 00:28:09.219 "data_size": 63488 00:28:09.219 } 00:28:09.219 ] 00:28:09.219 }' 00:28:09.219 05:56:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:09.478 05:56:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:28:09.478 05:56:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:09.478 05:56:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:28:09.478 05:56:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@708 -- # break 00:28:09.478 05:56:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:28:09.478 05:56:24 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:09.478 05:56:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:28:09.478 05:56:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:28:09.478 05:56:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:09.478 05:56:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:09.478 05:56:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:09.737 05:56:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:09.737 "name": "raid_bdev1", 00:28:09.737 "uuid": "14510be2-7f77-4a86-b908-edb3b7897ce5", 00:28:09.737 "strip_size_kb": 0, 00:28:09.737 "state": "online", 00:28:09.737 "raid_level": "raid1", 00:28:09.737 "superblock": true, 00:28:09.737 "num_base_bdevs": 4, 00:28:09.737 "num_base_bdevs_discovered": 3, 00:28:09.737 "num_base_bdevs_operational": 3, 00:28:09.737 "base_bdevs_list": [ 00:28:09.737 { 00:28:09.737 "name": "spare", 00:28:09.737 "uuid": "643b1f0c-111d-51ee-acc5-d9a0cd956312", 00:28:09.737 "is_configured": true, 00:28:09.737 "data_offset": 2048, 00:28:09.737 "data_size": 63488 00:28:09.737 }, 00:28:09.737 { 00:28:09.737 "name": null, 00:28:09.737 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:09.737 "is_configured": false, 00:28:09.737 "data_offset": 2048, 00:28:09.737 "data_size": 63488 00:28:09.737 }, 00:28:09.737 { 00:28:09.737 "name": "BaseBdev3", 00:28:09.737 "uuid": "22161417-e3db-55eb-927d-f7ebd6d2e5b3", 00:28:09.737 "is_configured": true, 00:28:09.737 "data_offset": 2048, 00:28:09.737 "data_size": 63488 00:28:09.737 }, 00:28:09.737 { 00:28:09.737 "name": "BaseBdev4", 00:28:09.737 "uuid": 
"e065b845-b65f-58aa-abf9-4214e2c3bd4d", 00:28:09.737 "is_configured": true, 00:28:09.737 "data_offset": 2048, 00:28:09.737 "data_size": 63488 00:28:09.737 } 00:28:09.737 ] 00:28:09.737 }' 00:28:09.737 05:56:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:09.737 05:56:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:28:09.737 05:56:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:09.737 05:56:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:28:09.737 05:56:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:28:09.737 05:56:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:09.737 05:56:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:09.737 05:56:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:09.737 05:56:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:09.737 05:56:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:28:09.737 05:56:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:09.737 05:56:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:09.737 05:56:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:09.737 05:56:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:09.737 05:56:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:09.737 05:56:24 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:09.996 05:56:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:09.996 "name": "raid_bdev1", 00:28:09.996 "uuid": "14510be2-7f77-4a86-b908-edb3b7897ce5", 00:28:09.996 "strip_size_kb": 0, 00:28:09.996 "state": "online", 00:28:09.996 "raid_level": "raid1", 00:28:09.996 "superblock": true, 00:28:09.996 "num_base_bdevs": 4, 00:28:09.996 "num_base_bdevs_discovered": 3, 00:28:09.996 "num_base_bdevs_operational": 3, 00:28:09.996 "base_bdevs_list": [ 00:28:09.996 { 00:28:09.996 "name": "spare", 00:28:09.996 "uuid": "643b1f0c-111d-51ee-acc5-d9a0cd956312", 00:28:09.996 "is_configured": true, 00:28:09.996 "data_offset": 2048, 00:28:09.996 "data_size": 63488 00:28:09.996 }, 00:28:09.996 { 00:28:09.996 "name": null, 00:28:09.996 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:09.996 "is_configured": false, 00:28:09.996 "data_offset": 2048, 00:28:09.996 "data_size": 63488 00:28:09.996 }, 00:28:09.996 { 00:28:09.996 "name": "BaseBdev3", 00:28:09.996 "uuid": "22161417-e3db-55eb-927d-f7ebd6d2e5b3", 00:28:09.996 "is_configured": true, 00:28:09.996 "data_offset": 2048, 00:28:09.996 "data_size": 63488 00:28:09.996 }, 00:28:09.996 { 00:28:09.996 "name": "BaseBdev4", 00:28:09.996 "uuid": "e065b845-b65f-58aa-abf9-4214e2c3bd4d", 00:28:09.996 "is_configured": true, 00:28:09.996 "data_offset": 2048, 00:28:09.996 "data_size": 63488 00:28:09.996 } 00:28:09.996 ] 00:28:09.996 }' 00:28:09.996 05:56:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:09.996 05:56:24 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:28:10.564 05:56:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:28:10.823 [2024-07-26 05:56:25.578360] 
bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:28:10.823 [2024-07-26 05:56:25.578395] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:28:10.823 00:28:10.823 Latency(us) 00:28:10.823 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:28:10.823 Job: raid_bdev1 (Core Mask 0x1, workload: randrw, percentage: 50, depth: 2, IO size: 3145728) 00:28:10.823 raid_bdev1 : 11.36 93.42 280.25 0.00 0.00 14497.70 292.06 113063.85 00:28:10.823 =================================================================================================================== 00:28:10.823 Total : 93.42 280.25 0.00 0.00 14497.70 292.06 113063.85 00:28:10.823 [2024-07-26 05:56:25.678540] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:10.823 [2024-07-26 05:56:25.678569] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:28:10.823 [2024-07-26 05:56:25.678676] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:28:10.823 [2024-07-26 05:56:25.678689] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xe708a0 name raid_bdev1, state offline 00:28:10.823 0 00:28:10.823 05:56:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@719 -- # jq length 00:28:10.823 05:56:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:11.082 05:56:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:28:11.082 05:56:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:28:11.082 05:56:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@722 -- # '[' true = true ']' 00:28:11.082 05:56:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@724 -- # nbd_start_disks /var/tmp/spdk-raid.sock spare 
/dev/nbd0 00:28:11.082 05:56:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:28:11.082 05:56:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # bdev_list=('spare') 00:28:11.082 05:56:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:28:11.082 05:56:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:28:11.082 05:56:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:28:11.082 05:56:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@12 -- # local i 00:28:11.082 05:56:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:28:11.082 05:56:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:28:11.082 05:56:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd0 00:28:11.341 /dev/nbd0 00:28:11.341 05:56:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:28:11.341 05:56:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:28:11.341 05:56:26 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:28:11.341 05:56:26 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@867 -- # local i 00:28:11.341 05:56:26 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:28:11.341 05:56:26 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:28:11.341 05:56:26 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:28:11.341 05:56:26 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # break 00:28:11.341 05:56:26 bdev_raid.raid_rebuild_test_sb_io -- 
common/autotest_common.sh@882 -- # (( i = 1 )) 00:28:11.341 05:56:26 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:28:11.341 05:56:26 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:28:11.341 1+0 records in 00:28:11.341 1+0 records out 00:28:11.341 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000268523 s, 15.3 MB/s 00:28:11.341 05:56:26 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:11.341 05:56:26 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # size=4096 00:28:11.341 05:56:26 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:11.341 05:56:26 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:28:11.341 05:56:26 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@887 -- # return 0 00:28:11.341 05:56:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:28:11.341 05:56:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:28:11.341 05:56:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:28:11.341 05:56:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@726 -- # '[' -z '' ']' 00:28:11.341 05:56:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@727 -- # continue 00:28:11.341 05:56:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:28:11.341 05:56:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@726 -- # '[' -z BaseBdev3 ']' 00:28:11.341 05:56:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@729 -- # nbd_start_disks 
/var/tmp/spdk-raid.sock BaseBdev3 /dev/nbd1 00:28:11.341 05:56:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:28:11.341 05:56:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev3') 00:28:11.341 05:56:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:28:11.341 05:56:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:28:11.341 05:56:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:28:11.341 05:56:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@12 -- # local i 00:28:11.341 05:56:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:28:11.341 05:56:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:28:11.341 05:56:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev3 /dev/nbd1 00:28:11.600 /dev/nbd1 00:28:11.600 05:56:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:28:11.600 05:56:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:28:11.600 05:56:26 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:28:11.600 05:56:26 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@867 -- # local i 00:28:11.600 05:56:26 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:28:11.600 05:56:26 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:28:11.600 05:56:26 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:28:11.600 05:56:26 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # break 00:28:11.600 
05:56:26 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:28:11.600 05:56:26 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:28:11.600 05:56:26 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:28:11.600 1+0 records in 00:28:11.600 1+0 records out 00:28:11.600 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000223176 s, 18.4 MB/s 00:28:11.600 05:56:26 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:11.600 05:56:26 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # size=4096 00:28:11.601 05:56:26 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:11.601 05:56:26 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:28:11.601 05:56:26 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@887 -- # return 0 00:28:11.601 05:56:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:28:11.601 05:56:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:28:11.601 05:56:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@730 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:28:11.859 05:56:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@731 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:28:11.859 05:56:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:28:11.859 05:56:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:28:11.859 05:56:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:28:11.859 
05:56:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@51 -- # local i 00:28:11.859 05:56:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:28:11.860 05:56:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:28:12.118 05:56:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:28:12.118 05:56:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:28:12.118 05:56:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:28:12.118 05:56:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:28:12.118 05:56:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:28:12.118 05:56:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:28:12.118 05:56:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@41 -- # break 00:28:12.118 05:56:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@45 -- # return 0 00:28:12.119 05:56:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:28:12.119 05:56:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@726 -- # '[' -z BaseBdev4 ']' 00:28:12.119 05:56:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@729 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev4 /dev/nbd1 00:28:12.119 05:56:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:28:12.119 05:56:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev4') 00:28:12.119 05:56:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:28:12.119 05:56:26 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:28:12.119 05:56:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:28:12.119 05:56:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@12 -- # local i 00:28:12.119 05:56:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:28:12.119 05:56:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:28:12.119 05:56:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev4 /dev/nbd1 00:28:12.119 /dev/nbd1 00:28:12.119 05:56:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:28:12.119 05:56:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:28:12.119 05:56:27 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:28:12.119 05:56:27 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@867 -- # local i 00:28:12.119 05:56:27 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:28:12.119 05:56:27 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:28:12.119 05:56:27 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:28:12.119 05:56:27 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # break 00:28:12.119 05:56:27 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:28:12.119 05:56:27 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:28:12.119 05:56:27 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:28:12.119 1+0 records in 
00:28:12.119 1+0 records out 00:28:12.119 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000222326 s, 18.4 MB/s 00:28:12.119 05:56:27 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:12.378 05:56:27 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # size=4096 00:28:12.378 05:56:27 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:12.378 05:56:27 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:28:12.378 05:56:27 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@887 -- # return 0 00:28:12.378 05:56:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:28:12.378 05:56:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:28:12.378 05:56:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@730 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:28:12.378 05:56:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@731 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:28:12.378 05:56:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:28:12.378 05:56:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:28:12.378 05:56:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:28:12.378 05:56:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@51 -- # local i 00:28:12.378 05:56:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:28:12.378 05:56:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:28:12.378 05:56:27 
bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:28:12.378 05:56:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:28:12.378 05:56:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:28:12.378 05:56:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:28:12.378 05:56:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:28:12.378 05:56:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:28:12.637 05:56:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@41 -- # break 00:28:12.637 05:56:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@45 -- # return 0 00:28:12.637 05:56:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@733 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:28:12.637 05:56:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:28:12.637 05:56:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:28:12.637 05:56:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:28:12.637 05:56:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@51 -- # local i 00:28:12.637 05:56:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:28:12.637 05:56:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:28:12.637 05:56:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:28:12.897 05:56:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:28:12.897 05:56:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 
00:28:12.897 05:56:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:28:12.897 05:56:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:28:12.897 05:56:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:28:12.897 05:56:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@41 -- # break 00:28:12.897 05:56:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@45 -- # return 0 00:28:12.897 05:56:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:28:12.897 05:56:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:28:13.156 05:56:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@745 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:28:13.415 [2024-07-26 05:56:28.065305] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:28:13.415 [2024-07-26 05:56:28.065350] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:13.415 [2024-07-26 05:56:28.065372] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xe70ec0 00:28:13.415 [2024-07-26 05:56:28.065391] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:13.415 [2024-07-26 05:56:28.066999] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:13.415 [2024-07-26 05:56:28.067025] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:28:13.415 [2024-07-26 05:56:28.067102] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:28:13.415 [2024-07-26 05:56:28.067129] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is 
claimed 00:28:13.415 [2024-07-26 05:56:28.067233] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:28:13.415 [2024-07-26 05:56:28.067308] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:28:13.415 spare 00:28:13.415 05:56:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:28:13.415 05:56:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:13.415 05:56:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:13.415 05:56:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:13.415 05:56:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:13.415 05:56:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:28:13.415 05:56:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:13.415 05:56:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:13.415 05:56:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:13.415 05:56:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:13.415 05:56:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:13.415 05:56:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:13.415 [2024-07-26 05:56:28.167625] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0xeea4d0 00:28:13.415 [2024-07-26 05:56:28.167643] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:28:13.415 
[2024-07-26 05:56:28.167828] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xdd7090 00:28:13.415 [2024-07-26 05:56:28.167972] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xeea4d0 00:28:13.415 [2024-07-26 05:56:28.167982] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xeea4d0 00:28:13.415 [2024-07-26 05:56:28.168085] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:13.415 05:56:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:13.415 "name": "raid_bdev1", 00:28:13.415 "uuid": "14510be2-7f77-4a86-b908-edb3b7897ce5", 00:28:13.415 "strip_size_kb": 0, 00:28:13.415 "state": "online", 00:28:13.415 "raid_level": "raid1", 00:28:13.415 "superblock": true, 00:28:13.415 "num_base_bdevs": 4, 00:28:13.415 "num_base_bdevs_discovered": 3, 00:28:13.415 "num_base_bdevs_operational": 3, 00:28:13.415 "base_bdevs_list": [ 00:28:13.415 { 00:28:13.415 "name": "spare", 00:28:13.415 "uuid": "643b1f0c-111d-51ee-acc5-d9a0cd956312", 00:28:13.415 "is_configured": true, 00:28:13.415 "data_offset": 2048, 00:28:13.415 "data_size": 63488 00:28:13.415 }, 00:28:13.415 { 00:28:13.415 "name": null, 00:28:13.415 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:13.415 "is_configured": false, 00:28:13.415 "data_offset": 2048, 00:28:13.415 "data_size": 63488 00:28:13.415 }, 00:28:13.415 { 00:28:13.415 "name": "BaseBdev3", 00:28:13.416 "uuid": "22161417-e3db-55eb-927d-f7ebd6d2e5b3", 00:28:13.416 "is_configured": true, 00:28:13.416 "data_offset": 2048, 00:28:13.416 "data_size": 63488 00:28:13.416 }, 00:28:13.416 { 00:28:13.416 "name": "BaseBdev4", 00:28:13.416 "uuid": "e065b845-b65f-58aa-abf9-4214e2c3bd4d", 00:28:13.416 "is_configured": true, 00:28:13.416 "data_offset": 2048, 00:28:13.416 "data_size": 63488 00:28:13.416 } 00:28:13.416 ] 00:28:13.416 }' 00:28:13.416 05:56:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 
-- # xtrace_disable 00:28:13.416 05:56:28 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:28:13.982 05:56:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:28:13.982 05:56:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:13.982 05:56:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:28:13.982 05:56:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:28:13.982 05:56:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:13.982 05:56:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:13.982 05:56:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:14.241 05:56:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:14.241 "name": "raid_bdev1", 00:28:14.241 "uuid": "14510be2-7f77-4a86-b908-edb3b7897ce5", 00:28:14.241 "strip_size_kb": 0, 00:28:14.241 "state": "online", 00:28:14.241 "raid_level": "raid1", 00:28:14.241 "superblock": true, 00:28:14.241 "num_base_bdevs": 4, 00:28:14.241 "num_base_bdevs_discovered": 3, 00:28:14.241 "num_base_bdevs_operational": 3, 00:28:14.241 "base_bdevs_list": [ 00:28:14.241 { 00:28:14.241 "name": "spare", 00:28:14.241 "uuid": "643b1f0c-111d-51ee-acc5-d9a0cd956312", 00:28:14.241 "is_configured": true, 00:28:14.241 "data_offset": 2048, 00:28:14.241 "data_size": 63488 00:28:14.241 }, 00:28:14.241 { 00:28:14.241 "name": null, 00:28:14.241 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:14.241 "is_configured": false, 00:28:14.241 "data_offset": 2048, 00:28:14.241 "data_size": 63488 00:28:14.241 }, 00:28:14.241 { 00:28:14.241 "name": 
"BaseBdev3", 00:28:14.241 "uuid": "22161417-e3db-55eb-927d-f7ebd6d2e5b3", 00:28:14.241 "is_configured": true, 00:28:14.241 "data_offset": 2048, 00:28:14.241 "data_size": 63488 00:28:14.241 }, 00:28:14.241 { 00:28:14.241 "name": "BaseBdev4", 00:28:14.241 "uuid": "e065b845-b65f-58aa-abf9-4214e2c3bd4d", 00:28:14.241 "is_configured": true, 00:28:14.241 "data_offset": 2048, 00:28:14.241 "data_size": 63488 00:28:14.241 } 00:28:14.241 ] 00:28:14.241 }' 00:28:14.241 05:56:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:14.500 05:56:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:28:14.500 05:56:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:14.500 05:56:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:28:14.500 05:56:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:28:14.500 05:56:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:14.759 05:56:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:28:14.759 05:56:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:28:15.018 [2024-07-26 05:56:29.673904] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:28:15.018 05:56:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:28:15.018 05:56:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:15.018 05:56:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local 
expected_state=online 00:28:15.018 05:56:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:15.018 05:56:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:15.018 05:56:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:15.018 05:56:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:15.018 05:56:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:15.018 05:56:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:15.018 05:56:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:15.018 05:56:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:15.018 05:56:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:15.277 05:56:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:15.277 "name": "raid_bdev1", 00:28:15.277 "uuid": "14510be2-7f77-4a86-b908-edb3b7897ce5", 00:28:15.277 "strip_size_kb": 0, 00:28:15.277 "state": "online", 00:28:15.277 "raid_level": "raid1", 00:28:15.277 "superblock": true, 00:28:15.277 "num_base_bdevs": 4, 00:28:15.277 "num_base_bdevs_discovered": 2, 00:28:15.277 "num_base_bdevs_operational": 2, 00:28:15.277 "base_bdevs_list": [ 00:28:15.277 { 00:28:15.277 "name": null, 00:28:15.277 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:15.277 "is_configured": false, 00:28:15.277 "data_offset": 2048, 00:28:15.277 "data_size": 63488 00:28:15.277 }, 00:28:15.277 { 00:28:15.277 "name": null, 00:28:15.277 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:15.277 "is_configured": false, 00:28:15.277 "data_offset": 
2048, 00:28:15.277 "data_size": 63488 00:28:15.277 }, 00:28:15.277 { 00:28:15.277 "name": "BaseBdev3", 00:28:15.277 "uuid": "22161417-e3db-55eb-927d-f7ebd6d2e5b3", 00:28:15.277 "is_configured": true, 00:28:15.277 "data_offset": 2048, 00:28:15.277 "data_size": 63488 00:28:15.277 }, 00:28:15.277 { 00:28:15.277 "name": "BaseBdev4", 00:28:15.277 "uuid": "e065b845-b65f-58aa-abf9-4214e2c3bd4d", 00:28:15.277 "is_configured": true, 00:28:15.277 "data_offset": 2048, 00:28:15.277 "data_size": 63488 00:28:15.277 } 00:28:15.277 ] 00:28:15.277 }' 00:28:15.277 05:56:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:15.277 05:56:29 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:28:15.844 05:56:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:28:16.103 [2024-07-26 05:56:30.756924] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:28:16.103 [2024-07-26 05:56:30.757075] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (5) smaller than existing raid bdev raid_bdev1 (6) 00:28:16.103 [2024-07-26 05:56:30.757091] bdev_raid.c:3712:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
00:28:16.103 [2024-07-26 05:56:30.757120] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:28:16.103 [2024-07-26 05:56:30.761585] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xa46fa0 00:28:16.103 [2024-07-26 05:56:30.763931] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:28:16.103 05:56:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@755 -- # sleep 1 00:28:17.039 05:56:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:28:17.039 05:56:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:17.039 05:56:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:28:17.039 05:56:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:28:17.039 05:56:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:17.039 05:56:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:17.039 05:56:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:17.298 05:56:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:17.298 "name": "raid_bdev1", 00:28:17.298 "uuid": "14510be2-7f77-4a86-b908-edb3b7897ce5", 00:28:17.298 "strip_size_kb": 0, 00:28:17.298 "state": "online", 00:28:17.298 "raid_level": "raid1", 00:28:17.298 "superblock": true, 00:28:17.298 "num_base_bdevs": 4, 00:28:17.298 "num_base_bdevs_discovered": 3, 00:28:17.298 "num_base_bdevs_operational": 3, 00:28:17.298 "process": { 00:28:17.298 "type": "rebuild", 00:28:17.298 "target": "spare", 00:28:17.298 "progress": { 00:28:17.298 "blocks": 24576, 
00:28:17.298 "percent": 38 00:28:17.298 } 00:28:17.298 }, 00:28:17.298 "base_bdevs_list": [ 00:28:17.298 { 00:28:17.298 "name": "spare", 00:28:17.298 "uuid": "643b1f0c-111d-51ee-acc5-d9a0cd956312", 00:28:17.298 "is_configured": true, 00:28:17.298 "data_offset": 2048, 00:28:17.298 "data_size": 63488 00:28:17.298 }, 00:28:17.298 { 00:28:17.298 "name": null, 00:28:17.298 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:17.298 "is_configured": false, 00:28:17.298 "data_offset": 2048, 00:28:17.298 "data_size": 63488 00:28:17.298 }, 00:28:17.298 { 00:28:17.298 "name": "BaseBdev3", 00:28:17.298 "uuid": "22161417-e3db-55eb-927d-f7ebd6d2e5b3", 00:28:17.298 "is_configured": true, 00:28:17.298 "data_offset": 2048, 00:28:17.298 "data_size": 63488 00:28:17.298 }, 00:28:17.298 { 00:28:17.298 "name": "BaseBdev4", 00:28:17.298 "uuid": "e065b845-b65f-58aa-abf9-4214e2c3bd4d", 00:28:17.298 "is_configured": true, 00:28:17.298 "data_offset": 2048, 00:28:17.298 "data_size": 63488 00:28:17.298 } 00:28:17.298 ] 00:28:17.298 }' 00:28:17.298 05:56:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:17.298 05:56:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:28:17.298 05:56:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:17.298 05:56:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:28:17.298 05:56:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@759 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:28:17.557 [2024-07-26 05:56:32.347417] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:28:17.557 [2024-07-26 05:56:32.376881] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:28:17.557 [2024-07-26 05:56:32.376925] 
bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:17.557 [2024-07-26 05:56:32.376941] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:28:17.557 [2024-07-26 05:56:32.376950] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:28:17.557 05:56:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:28:17.557 05:56:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:17.557 05:56:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:17.557 05:56:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:17.557 05:56:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:17.557 05:56:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:17.557 05:56:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:17.557 05:56:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:17.558 05:56:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:17.558 05:56:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:17.558 05:56:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:17.558 05:56:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:17.816 05:56:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:17.816 "name": "raid_bdev1", 00:28:17.816 "uuid": "14510be2-7f77-4a86-b908-edb3b7897ce5", 
00:28:17.816 "strip_size_kb": 0, 00:28:17.816 "state": "online", 00:28:17.816 "raid_level": "raid1", 00:28:17.816 "superblock": true, 00:28:17.816 "num_base_bdevs": 4, 00:28:17.816 "num_base_bdevs_discovered": 2, 00:28:17.816 "num_base_bdevs_operational": 2, 00:28:17.816 "base_bdevs_list": [ 00:28:17.816 { 00:28:17.816 "name": null, 00:28:17.816 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:17.816 "is_configured": false, 00:28:17.816 "data_offset": 2048, 00:28:17.816 "data_size": 63488 00:28:17.816 }, 00:28:17.816 { 00:28:17.816 "name": null, 00:28:17.816 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:17.817 "is_configured": false, 00:28:17.817 "data_offset": 2048, 00:28:17.817 "data_size": 63488 00:28:17.817 }, 00:28:17.817 { 00:28:17.817 "name": "BaseBdev3", 00:28:17.817 "uuid": "22161417-e3db-55eb-927d-f7ebd6d2e5b3", 00:28:17.817 "is_configured": true, 00:28:17.817 "data_offset": 2048, 00:28:17.817 "data_size": 63488 00:28:17.817 }, 00:28:17.817 { 00:28:17.817 "name": "BaseBdev4", 00:28:17.817 "uuid": "e065b845-b65f-58aa-abf9-4214e2c3bd4d", 00:28:17.817 "is_configured": true, 00:28:17.817 "data_offset": 2048, 00:28:17.817 "data_size": 63488 00:28:17.817 } 00:28:17.817 ] 00:28:17.817 }' 00:28:17.817 05:56:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:17.817 05:56:32 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:28:18.384 05:56:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:28:18.644 [2024-07-26 05:56:33.461050] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:28:18.644 [2024-07-26 05:56:33.461100] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:18.644 [2024-07-26 05:56:33.461123] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created 
at: 0x0xdd4450 00:28:18.644 [2024-07-26 05:56:33.461135] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:18.644 [2024-07-26 05:56:33.461501] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:18.644 [2024-07-26 05:56:33.461519] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:28:18.644 [2024-07-26 05:56:33.461601] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:28:18.644 [2024-07-26 05:56:33.461613] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (5) smaller than existing raid bdev raid_bdev1 (6) 00:28:18.644 [2024-07-26 05:56:33.461623] bdev_raid.c:3712:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:28:18.644 [2024-07-26 05:56:33.461680] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:28:18.644 [2024-07-26 05:56:33.466165] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xdd7090 00:28:18.644 spare 00:28:18.644 [2024-07-26 05:56:33.467647] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:28:18.644 05:56:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@762 -- # sleep 1 00:28:19.579 05:56:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:28:19.579 05:56:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:19.579 05:56:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:28:19.579 05:56:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:28:19.579 05:56:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:19.838 05:56:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:19.838 05:56:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:19.838 05:56:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:19.838 "name": "raid_bdev1", 00:28:19.838 "uuid": "14510be2-7f77-4a86-b908-edb3b7897ce5", 00:28:19.838 "strip_size_kb": 0, 00:28:19.838 "state": "online", 00:28:19.838 "raid_level": "raid1", 00:28:19.838 "superblock": true, 00:28:19.838 "num_base_bdevs": 4, 00:28:19.838 "num_base_bdevs_discovered": 3, 00:28:19.838 "num_base_bdevs_operational": 3, 00:28:19.838 "process": { 00:28:19.838 "type": "rebuild", 00:28:19.838 "target": "spare", 00:28:19.838 "progress": { 00:28:19.838 "blocks": 24576, 00:28:19.838 "percent": 38 00:28:19.838 } 00:28:19.838 }, 00:28:19.838 "base_bdevs_list": [ 00:28:19.838 { 00:28:19.838 "name": "spare", 00:28:19.838 "uuid": "643b1f0c-111d-51ee-acc5-d9a0cd956312", 00:28:19.838 "is_configured": true, 00:28:19.838 "data_offset": 2048, 00:28:19.838 "data_size": 63488 00:28:19.838 }, 00:28:19.838 { 00:28:19.838 "name": null, 00:28:19.838 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:19.838 "is_configured": false, 00:28:19.838 "data_offset": 2048, 00:28:19.838 "data_size": 63488 00:28:19.838 }, 00:28:19.838 { 00:28:19.838 "name": "BaseBdev3", 00:28:19.838 "uuid": "22161417-e3db-55eb-927d-f7ebd6d2e5b3", 00:28:19.838 "is_configured": true, 00:28:19.838 "data_offset": 2048, 00:28:19.838 "data_size": 63488 00:28:19.838 }, 00:28:19.838 { 00:28:19.838 "name": "BaseBdev4", 00:28:19.838 "uuid": "e065b845-b65f-58aa-abf9-4214e2c3bd4d", 00:28:19.838 "is_configured": true, 00:28:19.838 "data_offset": 2048, 00:28:19.838 "data_size": 63488 00:28:19.838 } 00:28:19.838 ] 00:28:19.838 }' 00:28:19.838 05:56:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 
00:28:20.097 05:56:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:28:20.097 05:56:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:20.097 05:56:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:28:20.097 05:56:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:28:20.357 [2024-07-26 05:56:35.047038] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:28:20.357 [2024-07-26 05:56:35.080440] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:28:20.357 [2024-07-26 05:56:35.080484] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:20.357 [2024-07-26 05:56:35.080500] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:28:20.357 [2024-07-26 05:56:35.080508] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:28:20.357 05:56:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:28:20.357 05:56:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:20.357 05:56:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:20.357 05:56:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:20.357 05:56:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:20.357 05:56:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:20.357 05:56:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 
00:28:20.357 05:56:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:20.357 05:56:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:20.357 05:56:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:20.357 05:56:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:20.357 05:56:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:20.616 05:56:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:20.616 "name": "raid_bdev1", 00:28:20.616 "uuid": "14510be2-7f77-4a86-b908-edb3b7897ce5", 00:28:20.616 "strip_size_kb": 0, 00:28:20.616 "state": "online", 00:28:20.616 "raid_level": "raid1", 00:28:20.616 "superblock": true, 00:28:20.616 "num_base_bdevs": 4, 00:28:20.616 "num_base_bdevs_discovered": 2, 00:28:20.616 "num_base_bdevs_operational": 2, 00:28:20.616 "base_bdevs_list": [ 00:28:20.616 { 00:28:20.616 "name": null, 00:28:20.616 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:20.616 "is_configured": false, 00:28:20.616 "data_offset": 2048, 00:28:20.616 "data_size": 63488 00:28:20.616 }, 00:28:20.616 { 00:28:20.616 "name": null, 00:28:20.616 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:20.616 "is_configured": false, 00:28:20.616 "data_offset": 2048, 00:28:20.616 "data_size": 63488 00:28:20.616 }, 00:28:20.616 { 00:28:20.616 "name": "BaseBdev3", 00:28:20.616 "uuid": "22161417-e3db-55eb-927d-f7ebd6d2e5b3", 00:28:20.616 "is_configured": true, 00:28:20.616 "data_offset": 2048, 00:28:20.616 "data_size": 63488 00:28:20.616 }, 00:28:20.616 { 00:28:20.616 "name": "BaseBdev4", 00:28:20.616 "uuid": "e065b845-b65f-58aa-abf9-4214e2c3bd4d", 00:28:20.616 "is_configured": true, 00:28:20.616 "data_offset": 2048, 
00:28:20.616 "data_size": 63488 00:28:20.616 } 00:28:20.616 ] 00:28:20.616 }' 00:28:20.616 05:56:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:20.616 05:56:35 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:28:21.183 05:56:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:28:21.183 05:56:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:21.183 05:56:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:28:21.183 05:56:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:28:21.183 05:56:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:21.183 05:56:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:21.183 05:56:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:21.442 05:56:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:21.442 "name": "raid_bdev1", 00:28:21.442 "uuid": "14510be2-7f77-4a86-b908-edb3b7897ce5", 00:28:21.442 "strip_size_kb": 0, 00:28:21.442 "state": "online", 00:28:21.442 "raid_level": "raid1", 00:28:21.442 "superblock": true, 00:28:21.442 "num_base_bdevs": 4, 00:28:21.442 "num_base_bdevs_discovered": 2, 00:28:21.442 "num_base_bdevs_operational": 2, 00:28:21.442 "base_bdevs_list": [ 00:28:21.442 { 00:28:21.442 "name": null, 00:28:21.442 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:21.442 "is_configured": false, 00:28:21.442 "data_offset": 2048, 00:28:21.442 "data_size": 63488 00:28:21.442 }, 00:28:21.442 { 00:28:21.442 "name": null, 00:28:21.442 "uuid": "00000000-0000-0000-0000-000000000000", 
00:28:21.442 "is_configured": false, 00:28:21.442 "data_offset": 2048, 00:28:21.442 "data_size": 63488 00:28:21.442 }, 00:28:21.442 { 00:28:21.442 "name": "BaseBdev3", 00:28:21.442 "uuid": "22161417-e3db-55eb-927d-f7ebd6d2e5b3", 00:28:21.442 "is_configured": true, 00:28:21.442 "data_offset": 2048, 00:28:21.442 "data_size": 63488 00:28:21.442 }, 00:28:21.442 { 00:28:21.442 "name": "BaseBdev4", 00:28:21.442 "uuid": "e065b845-b65f-58aa-abf9-4214e2c3bd4d", 00:28:21.442 "is_configured": true, 00:28:21.442 "data_offset": 2048, 00:28:21.442 "data_size": 63488 00:28:21.442 } 00:28:21.442 ] 00:28:21.442 }' 00:28:21.442 05:56:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:21.442 05:56:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:28:21.442 05:56:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:21.442 05:56:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:28:21.442 05:56:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:28:21.701 05:56:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:28:21.701 [2024-07-26 05:56:36.584869] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:28:21.701 [2024-07-26 05:56:36.584912] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:21.701 [2024-07-26 05:56:36.584932] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xee86b0 00:28:21.701 [2024-07-26 05:56:36.584944] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:21.701 
[2024-07-26 05:56:36.585283] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:21.701 [2024-07-26 05:56:36.585301] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:28:21.701 [2024-07-26 05:56:36.585361] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:28:21.701 [2024-07-26 05:56:36.585373] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (6) 00:28:21.701 [2024-07-26 05:56:36.585383] bdev_raid.c:3673:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:28:21.701 BaseBdev1 00:28:21.701 05:56:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@773 -- # sleep 1 00:28:23.081 05:56:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:28:23.081 05:56:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:23.081 05:56:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:23.081 05:56:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:23.081 05:56:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:23.081 05:56:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:23.081 05:56:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:23.081 05:56:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:23.081 05:56:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:23.081 05:56:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:23.081 05:56:37 bdev_raid.raid_rebuild_test_sb_io 
-- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:23.082 05:56:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:23.082 05:56:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:23.082 "name": "raid_bdev1", 00:28:23.082 "uuid": "14510be2-7f77-4a86-b908-edb3b7897ce5", 00:28:23.082 "strip_size_kb": 0, 00:28:23.082 "state": "online", 00:28:23.082 "raid_level": "raid1", 00:28:23.082 "superblock": true, 00:28:23.082 "num_base_bdevs": 4, 00:28:23.082 "num_base_bdevs_discovered": 2, 00:28:23.082 "num_base_bdevs_operational": 2, 00:28:23.082 "base_bdevs_list": [ 00:28:23.082 { 00:28:23.082 "name": null, 00:28:23.082 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:23.082 "is_configured": false, 00:28:23.082 "data_offset": 2048, 00:28:23.082 "data_size": 63488 00:28:23.082 }, 00:28:23.082 { 00:28:23.082 "name": null, 00:28:23.082 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:23.082 "is_configured": false, 00:28:23.082 "data_offset": 2048, 00:28:23.082 "data_size": 63488 00:28:23.082 }, 00:28:23.082 { 00:28:23.082 "name": "BaseBdev3", 00:28:23.082 "uuid": "22161417-e3db-55eb-927d-f7ebd6d2e5b3", 00:28:23.082 "is_configured": true, 00:28:23.082 "data_offset": 2048, 00:28:23.082 "data_size": 63488 00:28:23.082 }, 00:28:23.082 { 00:28:23.082 "name": "BaseBdev4", 00:28:23.082 "uuid": "e065b845-b65f-58aa-abf9-4214e2c3bd4d", 00:28:23.082 "is_configured": true, 00:28:23.082 "data_offset": 2048, 00:28:23.082 "data_size": 63488 00:28:23.082 } 00:28:23.082 ] 00:28:23.082 }' 00:28:23.082 05:56:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:23.082 05:56:37 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:28:23.650 05:56:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@775 -- # 
verify_raid_bdev_process raid_bdev1 none none 00:28:23.650 05:56:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:23.650 05:56:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:28:23.650 05:56:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:28:23.650 05:56:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:23.650 05:56:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:23.650 05:56:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:23.908 05:56:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:23.908 "name": "raid_bdev1", 00:28:23.908 "uuid": "14510be2-7f77-4a86-b908-edb3b7897ce5", 00:28:23.908 "strip_size_kb": 0, 00:28:23.908 "state": "online", 00:28:23.908 "raid_level": "raid1", 00:28:23.908 "superblock": true, 00:28:23.908 "num_base_bdevs": 4, 00:28:23.908 "num_base_bdevs_discovered": 2, 00:28:23.908 "num_base_bdevs_operational": 2, 00:28:23.908 "base_bdevs_list": [ 00:28:23.908 { 00:28:23.908 "name": null, 00:28:23.908 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:23.908 "is_configured": false, 00:28:23.908 "data_offset": 2048, 00:28:23.908 "data_size": 63488 00:28:23.908 }, 00:28:23.908 { 00:28:23.908 "name": null, 00:28:23.908 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:23.908 "is_configured": false, 00:28:23.908 "data_offset": 2048, 00:28:23.908 "data_size": 63488 00:28:23.908 }, 00:28:23.908 { 00:28:23.908 "name": "BaseBdev3", 00:28:23.908 "uuid": "22161417-e3db-55eb-927d-f7ebd6d2e5b3", 00:28:23.908 "is_configured": true, 00:28:23.908 "data_offset": 2048, 00:28:23.908 "data_size": 63488 00:28:23.908 }, 00:28:23.908 { 
00:28:23.908 "name": "BaseBdev4", 00:28:23.908 "uuid": "e065b845-b65f-58aa-abf9-4214e2c3bd4d", 00:28:23.908 "is_configured": true, 00:28:23.908 "data_offset": 2048, 00:28:23.908 "data_size": 63488 00:28:23.908 } 00:28:23.908 ] 00:28:23.908 }' 00:28:23.908 05:56:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:23.908 05:56:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:28:23.908 05:56:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:23.908 05:56:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:28:23.908 05:56:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:28:23.908 05:56:38 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@648 -- # local es=0 00:28:23.908 05:56:38 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:28:23.908 05:56:38 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:28:23.908 05:56:38 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:28:23.908 05:56:38 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:28:23.908 05:56:38 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:28:23.908 05:56:38 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # type -P 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:28:23.908 05:56:38 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:28:23.908 05:56:38 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:28:23.909 05:56:38 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:28:23.909 05:56:38 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:28:24.167 [2024-07-26 05:56:38.967595] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:28:24.167 [2024-07-26 05:56:38.967728] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (6) 00:28:24.167 [2024-07-26 05:56:38.967744] bdev_raid.c:3673:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:28:24.167 request: 00:28:24.167 { 00:28:24.167 "base_bdev": "BaseBdev1", 00:28:24.167 "raid_bdev": "raid_bdev1", 00:28:24.167 "method": "bdev_raid_add_base_bdev", 00:28:24.167 "req_id": 1 00:28:24.167 } 00:28:24.167 Got JSON-RPC error response 00:28:24.167 response: 00:28:24.167 { 00:28:24.167 "code": -22, 00:28:24.167 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:28:24.167 } 00:28:24.167 05:56:38 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@651 -- # es=1 00:28:24.167 05:56:38 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:28:24.167 05:56:38 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:28:24.167 05:56:38 bdev_raid.raid_rebuild_test_sb_io -- 
common/autotest_common.sh@675 -- # (( !es == 0 )) 00:28:24.167 05:56:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@777 -- # sleep 1 00:28:25.164 05:56:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:28:25.164 05:56:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:25.164 05:56:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:25.164 05:56:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:25.164 05:56:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:25.164 05:56:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:25.164 05:56:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:25.164 05:56:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:25.164 05:56:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:25.164 05:56:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:25.164 05:56:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:25.164 05:56:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:25.423 05:56:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:25.423 "name": "raid_bdev1", 00:28:25.423 "uuid": "14510be2-7f77-4a86-b908-edb3b7897ce5", 00:28:25.423 "strip_size_kb": 0, 00:28:25.423 "state": "online", 00:28:25.423 "raid_level": "raid1", 00:28:25.423 "superblock": true, 00:28:25.423 "num_base_bdevs": 4, 00:28:25.423 
"num_base_bdevs_discovered": 2, 00:28:25.423 "num_base_bdevs_operational": 2, 00:28:25.423 "base_bdevs_list": [ 00:28:25.423 { 00:28:25.423 "name": null, 00:28:25.423 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:25.424 "is_configured": false, 00:28:25.424 "data_offset": 2048, 00:28:25.424 "data_size": 63488 00:28:25.424 }, 00:28:25.424 { 00:28:25.424 "name": null, 00:28:25.424 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:25.424 "is_configured": false, 00:28:25.424 "data_offset": 2048, 00:28:25.424 "data_size": 63488 00:28:25.424 }, 00:28:25.424 { 00:28:25.424 "name": "BaseBdev3", 00:28:25.424 "uuid": "22161417-e3db-55eb-927d-f7ebd6d2e5b3", 00:28:25.424 "is_configured": true, 00:28:25.424 "data_offset": 2048, 00:28:25.424 "data_size": 63488 00:28:25.424 }, 00:28:25.424 { 00:28:25.424 "name": "BaseBdev4", 00:28:25.424 "uuid": "e065b845-b65f-58aa-abf9-4214e2c3bd4d", 00:28:25.424 "is_configured": true, 00:28:25.424 "data_offset": 2048, 00:28:25.424 "data_size": 63488 00:28:25.424 } 00:28:25.424 ] 00:28:25.424 }' 00:28:25.424 05:56:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:25.424 05:56:40 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:28:25.990 05:56:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:28:25.990 05:56:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:25.990 05:56:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:28:25.990 05:56:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:28:25.990 05:56:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:25.990 05:56:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:25.990 05:56:40 bdev_raid.raid_rebuild_test_sb_io 
-- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:26.249 05:56:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:26.249 "name": "raid_bdev1", 00:28:26.249 "uuid": "14510be2-7f77-4a86-b908-edb3b7897ce5", 00:28:26.249 "strip_size_kb": 0, 00:28:26.249 "state": "online", 00:28:26.249 "raid_level": "raid1", 00:28:26.249 "superblock": true, 00:28:26.249 "num_base_bdevs": 4, 00:28:26.249 "num_base_bdevs_discovered": 2, 00:28:26.249 "num_base_bdevs_operational": 2, 00:28:26.249 "base_bdevs_list": [ 00:28:26.249 { 00:28:26.249 "name": null, 00:28:26.249 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:26.249 "is_configured": false, 00:28:26.249 "data_offset": 2048, 00:28:26.249 "data_size": 63488 00:28:26.249 }, 00:28:26.249 { 00:28:26.249 "name": null, 00:28:26.249 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:26.249 "is_configured": false, 00:28:26.249 "data_offset": 2048, 00:28:26.249 "data_size": 63488 00:28:26.249 }, 00:28:26.249 { 00:28:26.249 "name": "BaseBdev3", 00:28:26.249 "uuid": "22161417-e3db-55eb-927d-f7ebd6d2e5b3", 00:28:26.249 "is_configured": true, 00:28:26.249 "data_offset": 2048, 00:28:26.249 "data_size": 63488 00:28:26.249 }, 00:28:26.249 { 00:28:26.249 "name": "BaseBdev4", 00:28:26.249 "uuid": "e065b845-b65f-58aa-abf9-4214e2c3bd4d", 00:28:26.249 "is_configured": true, 00:28:26.249 "data_offset": 2048, 00:28:26.249 "data_size": 63488 00:28:26.249 } 00:28:26.249 ] 00:28:26.249 }' 00:28:26.249 05:56:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:26.249 05:56:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:28:26.249 05:56:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:26.249 05:56:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # 
[[ none == \n\o\n\e ]] 00:28:26.249 05:56:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@782 -- # killprocess 1260580 00:28:26.249 05:56:41 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@948 -- # '[' -z 1260580 ']' 00:28:26.249 05:56:41 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@952 -- # kill -0 1260580 00:28:26.249 05:56:41 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@953 -- # uname 00:28:26.249 05:56:41 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:28:26.249 05:56:41 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1260580 00:28:26.249 05:56:41 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:28:26.249 05:56:41 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:28:26.249 05:56:41 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1260580' 00:28:26.249 killing process with pid 1260580 00:28:26.249 05:56:41 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@967 -- # kill 1260580 00:28:26.249 Received shutdown signal, test time was about 26.797936 seconds 00:28:26.249 00:28:26.249 Latency(us) 00:28:26.249 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:28:26.249 =================================================================================================================== 00:28:26.249 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:28:26.249 [2024-07-26 05:56:41.152260] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:28:26.249 [2024-07-26 05:56:41.152368] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:28:26.249 05:56:41 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@972 -- # wait 1260580 00:28:26.249 [2024-07-26 05:56:41.152438] bdev_raid.c: 
463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:28:26.249 [2024-07-26 05:56:41.152454] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xeea4d0 name raid_bdev1, state offline 00:28:26.508 [2024-07-26 05:56:41.201710] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:28:26.767 05:56:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@784 -- # return 0 00:28:26.767 00:28:26.767 real 0m32.781s 00:28:26.767 user 0m51.686s 00:28:26.767 sys 0m5.156s 00:28:26.767 05:56:41 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@1124 -- # xtrace_disable 00:28:26.767 05:56:41 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:28:26.767 ************************************ 00:28:26.767 END TEST raid_rebuild_test_sb_io 00:28:26.767 ************************************ 00:28:26.767 05:56:41 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:28:26.767 05:56:41 bdev_raid -- bdev/bdev_raid.sh@884 -- # '[' n == y ']' 00:28:26.767 05:56:41 bdev_raid -- bdev/bdev_raid.sh@896 -- # base_blocklen=4096 00:28:26.767 05:56:41 bdev_raid -- bdev/bdev_raid.sh@898 -- # run_test raid_state_function_test_sb_4k raid_state_function_test raid1 2 true 00:28:26.767 05:56:41 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:28:26.767 05:56:41 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:28:26.767 05:56:41 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:28:26.767 ************************************ 00:28:26.767 START TEST raid_state_function_test_sb_4k 00:28:26.767 ************************************ 00:28:26.767 05:56:41 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@1123 -- # raid_state_function_test raid1 2 true 00:28:26.767 05:56:41 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:28:26.767 05:56:41 bdev_raid.raid_state_function_test_sb_4k -- 
bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:28:26.767 05:56:41 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:28:26.767 05:56:41 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:28:26.767 05:56:41 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:28:26.767 05:56:41 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:28:26.767 05:56:41 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:28:26.767 05:56:41 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:28:26.767 05:56:41 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:28:26.767 05:56:41 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:28:26.767 05:56:41 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:28:26.767 05:56:41 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:28:26.767 05:56:41 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:28:26.767 05:56:41 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:28:26.767 05:56:41 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:28:26.767 05:56:41 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@226 -- # local strip_size 00:28:26.767 05:56:41 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:28:26.767 05:56:41 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:28:26.767 05:56:41 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 
00:28:26.767 05:56:41 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:28:26.767 05:56:41 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:28:26.767 05:56:41 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:28:26.767 05:56:41 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:28:26.767 05:56:41 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@244 -- # raid_pid=1265267 00:28:26.767 05:56:41 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1265267' 00:28:26.767 Process raid pid: 1265267 00:28:26.767 05:56:41 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@246 -- # waitforlisten 1265267 /var/tmp/spdk-raid.sock 00:28:26.767 05:56:41 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@829 -- # '[' -z 1265267 ']' 00:28:26.767 05:56:41 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:28:26.767 05:56:41 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@834 -- # local max_retries=100 00:28:26.767 05:56:41 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:28:26.767 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:28:26.767 05:56:41 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@838 -- # xtrace_disable 00:28:26.767 05:56:41 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:28:26.767 [2024-07-26 05:56:41.578989] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 
00:28:26.767 [2024-07-26 05:56:41.579051] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:28:27.026 [2024-07-26 05:56:41.710060] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:27.026 [2024-07-26 05:56:41.816148] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:28:27.026 [2024-07-26 05:56:41.875752] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:28:27.026 [2024-07-26 05:56:41.875786] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:28:27.961 05:56:42 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:28:27.961 05:56:42 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@862 -- # return 0 00:28:27.961 05:56:42 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:28:27.961 [2024-07-26 05:56:42.752406] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:28:27.961 [2024-07-26 05:56:42.752446] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:28:27.961 [2024-07-26 05:56:42.752456] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:28:27.961 [2024-07-26 05:56:42.752468] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:28:27.961 05:56:42 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:28:27.961 05:56:42 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 
00:28:27.961 05:56:42 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:28:27.961 05:56:42 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:27.961 05:56:42 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:27.961 05:56:42 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:27.961 05:56:42 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:27.961 05:56:42 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:27.961 05:56:42 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:27.961 05:56:42 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:27.961 05:56:42 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:27.961 05:56:42 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:28:28.219 05:56:43 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:28.219 "name": "Existed_Raid", 00:28:28.219 "uuid": "008ea37b-f568-4b12-aa99-abb5ba32d4f9", 00:28:28.219 "strip_size_kb": 0, 00:28:28.219 "state": "configuring", 00:28:28.219 "raid_level": "raid1", 00:28:28.219 "superblock": true, 00:28:28.219 "num_base_bdevs": 2, 00:28:28.219 "num_base_bdevs_discovered": 0, 00:28:28.219 "num_base_bdevs_operational": 2, 00:28:28.219 "base_bdevs_list": [ 00:28:28.219 { 00:28:28.219 "name": "BaseBdev1", 00:28:28.219 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:28.219 "is_configured": false, 00:28:28.219 "data_offset": 0, 00:28:28.219 "data_size": 0 
00:28:28.219 }, 00:28:28.219 { 00:28:28.220 "name": "BaseBdev2", 00:28:28.220 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:28.220 "is_configured": false, 00:28:28.220 "data_offset": 0, 00:28:28.220 "data_size": 0 00:28:28.220 } 00:28:28.220 ] 00:28:28.220 }' 00:28:28.220 05:56:43 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:28.220 05:56:43 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:28:28.784 05:56:43 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:28:29.042 [2024-07-26 05:56:43.827123] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:28:29.042 [2024-07-26 05:56:43.827152] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x241ba80 name Existed_Raid, state configuring 00:28:29.042 05:56:43 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:28:29.299 [2024-07-26 05:56:44.071786] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:28:29.299 [2024-07-26 05:56:44.071817] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:28:29.299 [2024-07-26 05:56:44.071827] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:28:29.299 [2024-07-26 05:56:44.071838] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:28:29.299 05:56:44 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b BaseBdev1 00:28:29.556 [2024-07-26 
05:56:44.322431] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:28:29.556 BaseBdev1 00:28:29.556 05:56:44 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:28:29.556 05:56:44 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:28:29.556 05:56:44 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:28:29.556 05:56:44 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@899 -- # local i 00:28:29.556 05:56:44 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:28:29.556 05:56:44 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:28:29.556 05:56:44 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:28:29.814 05:56:44 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:28:30.073 [ 00:28:30.073 { 00:28:30.073 "name": "BaseBdev1", 00:28:30.073 "aliases": [ 00:28:30.073 "ab7dae6f-c9f7-42f7-bcec-aecee00cf7be" 00:28:30.073 ], 00:28:30.073 "product_name": "Malloc disk", 00:28:30.073 "block_size": 4096, 00:28:30.073 "num_blocks": 8192, 00:28:30.073 "uuid": "ab7dae6f-c9f7-42f7-bcec-aecee00cf7be", 00:28:30.073 "assigned_rate_limits": { 00:28:30.073 "rw_ios_per_sec": 0, 00:28:30.073 "rw_mbytes_per_sec": 0, 00:28:30.073 "r_mbytes_per_sec": 0, 00:28:30.073 "w_mbytes_per_sec": 0 00:28:30.073 }, 00:28:30.073 "claimed": true, 00:28:30.073 "claim_type": "exclusive_write", 00:28:30.073 "zoned": false, 00:28:30.073 "supported_io_types": { 00:28:30.073 "read": true, 00:28:30.073 "write": true, 
00:28:30.073 "unmap": true, 00:28:30.073 "flush": true, 00:28:30.073 "reset": true, 00:28:30.073 "nvme_admin": false, 00:28:30.073 "nvme_io": false, 00:28:30.073 "nvme_io_md": false, 00:28:30.073 "write_zeroes": true, 00:28:30.073 "zcopy": true, 00:28:30.073 "get_zone_info": false, 00:28:30.073 "zone_management": false, 00:28:30.073 "zone_append": false, 00:28:30.073 "compare": false, 00:28:30.073 "compare_and_write": false, 00:28:30.073 "abort": true, 00:28:30.073 "seek_hole": false, 00:28:30.073 "seek_data": false, 00:28:30.073 "copy": true, 00:28:30.073 "nvme_iov_md": false 00:28:30.073 }, 00:28:30.073 "memory_domains": [ 00:28:30.073 { 00:28:30.073 "dma_device_id": "system", 00:28:30.073 "dma_device_type": 1 00:28:30.073 }, 00:28:30.073 { 00:28:30.073 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:30.073 "dma_device_type": 2 00:28:30.073 } 00:28:30.073 ], 00:28:30.073 "driver_specific": {} 00:28:30.073 } 00:28:30.073 ] 00:28:30.073 05:56:44 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@905 -- # return 0 00:28:30.073 05:56:44 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:28:30.073 05:56:44 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:28:30.073 05:56:44 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:28:30.073 05:56:44 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:30.073 05:56:44 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:30.073 05:56:44 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:30.073 05:56:44 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:30.073 05:56:44 bdev_raid.raid_state_function_test_sb_4k -- 
bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:30.073 05:56:44 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:30.073 05:56:44 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:30.073 05:56:44 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:30.073 05:56:44 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:28:30.331 05:56:45 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:30.331 "name": "Existed_Raid", 00:28:30.331 "uuid": "3826aa47-5d73-473f-a956-3280b1c3773b", 00:28:30.331 "strip_size_kb": 0, 00:28:30.331 "state": "configuring", 00:28:30.331 "raid_level": "raid1", 00:28:30.331 "superblock": true, 00:28:30.331 "num_base_bdevs": 2, 00:28:30.331 "num_base_bdevs_discovered": 1, 00:28:30.331 "num_base_bdevs_operational": 2, 00:28:30.331 "base_bdevs_list": [ 00:28:30.331 { 00:28:30.331 "name": "BaseBdev1", 00:28:30.331 "uuid": "ab7dae6f-c9f7-42f7-bcec-aecee00cf7be", 00:28:30.331 "is_configured": true, 00:28:30.331 "data_offset": 256, 00:28:30.331 "data_size": 7936 00:28:30.331 }, 00:28:30.331 { 00:28:30.331 "name": "BaseBdev2", 00:28:30.331 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:30.331 "is_configured": false, 00:28:30.331 "data_offset": 0, 00:28:30.331 "data_size": 0 00:28:30.331 } 00:28:30.331 ] 00:28:30.331 }' 00:28:30.331 05:56:45 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:30.331 05:56:45 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:28:30.897 05:56:45 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:28:31.155 [2024-07-26 05:56:45.934872] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:28:31.155 [2024-07-26 05:56:45.934914] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x241b350 name Existed_Raid, state configuring 00:28:31.155 05:56:45 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:28:31.413 [2024-07-26 05:56:46.175538] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:28:31.413 [2024-07-26 05:56:46.177048] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:28:31.413 [2024-07-26 05:56:46.177081] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:28:31.413 05:56:46 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:28:31.413 05:56:46 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:28:31.413 05:56:46 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:28:31.413 05:56:46 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:28:31.413 05:56:46 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:28:31.413 05:56:46 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:31.413 05:56:46 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:31.413 05:56:46 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:31.413 05:56:46 
bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:31.413 05:56:46 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:31.413 05:56:46 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:31.413 05:56:46 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:31.413 05:56:46 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:31.413 05:56:46 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:28:31.671 05:56:46 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:31.671 "name": "Existed_Raid", 00:28:31.671 "uuid": "ed81174e-dca0-43be-b17b-13e48aee3f8e", 00:28:31.671 "strip_size_kb": 0, 00:28:31.671 "state": "configuring", 00:28:31.671 "raid_level": "raid1", 00:28:31.671 "superblock": true, 00:28:31.671 "num_base_bdevs": 2, 00:28:31.671 "num_base_bdevs_discovered": 1, 00:28:31.671 "num_base_bdevs_operational": 2, 00:28:31.671 "base_bdevs_list": [ 00:28:31.671 { 00:28:31.671 "name": "BaseBdev1", 00:28:31.671 "uuid": "ab7dae6f-c9f7-42f7-bcec-aecee00cf7be", 00:28:31.671 "is_configured": true, 00:28:31.671 "data_offset": 256, 00:28:31.671 "data_size": 7936 00:28:31.671 }, 00:28:31.671 { 00:28:31.671 "name": "BaseBdev2", 00:28:31.671 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:31.671 "is_configured": false, 00:28:31.671 "data_offset": 0, 00:28:31.671 "data_size": 0 00:28:31.671 } 00:28:31.671 ] 00:28:31.671 }' 00:28:31.671 05:56:46 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:31.671 05:56:46 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:28:32.237 
05:56:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b BaseBdev2 00:28:32.495 [2024-07-26 05:56:47.289977] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:28:32.495 [2024-07-26 05:56:47.290129] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x241c000 00:28:32.495 [2024-07-26 05:56:47.290143] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:28:32.495 [2024-07-26 05:56:47.290314] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x23360c0 00:28:32.495 [2024-07-26 05:56:47.290435] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x241c000 00:28:32.495 [2024-07-26 05:56:47.290445] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x241c000 00:28:32.495 [2024-07-26 05:56:47.290534] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:32.495 BaseBdev2 00:28:32.495 05:56:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:28:32.495 05:56:47 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:28:32.495 05:56:47 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:28:32.495 05:56:47 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@899 -- # local i 00:28:32.495 05:56:47 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:28:32.495 05:56:47 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:28:32.495 05:56:47 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_wait_for_examine 00:28:32.753 05:56:47 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:28:33.011 [ 00:28:33.011 { 00:28:33.011 "name": "BaseBdev2", 00:28:33.011 "aliases": [ 00:28:33.011 "19c5414e-9c74-4540-9a2c-81a767546169" 00:28:33.011 ], 00:28:33.011 "product_name": "Malloc disk", 00:28:33.011 "block_size": 4096, 00:28:33.011 "num_blocks": 8192, 00:28:33.011 "uuid": "19c5414e-9c74-4540-9a2c-81a767546169", 00:28:33.011 "assigned_rate_limits": { 00:28:33.011 "rw_ios_per_sec": 0, 00:28:33.011 "rw_mbytes_per_sec": 0, 00:28:33.011 "r_mbytes_per_sec": 0, 00:28:33.011 "w_mbytes_per_sec": 0 00:28:33.011 }, 00:28:33.011 "claimed": true, 00:28:33.011 "claim_type": "exclusive_write", 00:28:33.011 "zoned": false, 00:28:33.012 "supported_io_types": { 00:28:33.012 "read": true, 00:28:33.012 "write": true, 00:28:33.012 "unmap": true, 00:28:33.012 "flush": true, 00:28:33.012 "reset": true, 00:28:33.012 "nvme_admin": false, 00:28:33.012 "nvme_io": false, 00:28:33.012 "nvme_io_md": false, 00:28:33.012 "write_zeroes": true, 00:28:33.012 "zcopy": true, 00:28:33.012 "get_zone_info": false, 00:28:33.012 "zone_management": false, 00:28:33.012 "zone_append": false, 00:28:33.012 "compare": false, 00:28:33.012 "compare_and_write": false, 00:28:33.012 "abort": true, 00:28:33.012 "seek_hole": false, 00:28:33.012 "seek_data": false, 00:28:33.012 "copy": true, 00:28:33.012 "nvme_iov_md": false 00:28:33.012 }, 00:28:33.012 "memory_domains": [ 00:28:33.012 { 00:28:33.012 "dma_device_id": "system", 00:28:33.012 "dma_device_type": 1 00:28:33.012 }, 00:28:33.012 { 00:28:33.012 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:33.012 "dma_device_type": 2 00:28:33.012 } 00:28:33.012 ], 00:28:33.012 "driver_specific": {} 00:28:33.012 } 00:28:33.012 ] 00:28:33.012 05:56:47 bdev_raid.raid_state_function_test_sb_4k -- 
common/autotest_common.sh@905 -- # return 0 00:28:33.012 05:56:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:28:33.012 05:56:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:28:33.012 05:56:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:28:33.012 05:56:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:28:33.012 05:56:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:33.012 05:56:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:33.012 05:56:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:33.012 05:56:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:33.012 05:56:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:33.012 05:56:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:33.012 05:56:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:33.012 05:56:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:33.012 05:56:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:33.012 05:56:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:28:33.270 05:56:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:33.270 "name": "Existed_Raid", 00:28:33.270 "uuid": 
"ed81174e-dca0-43be-b17b-13e48aee3f8e", 00:28:33.270 "strip_size_kb": 0, 00:28:33.270 "state": "online", 00:28:33.270 "raid_level": "raid1", 00:28:33.270 "superblock": true, 00:28:33.270 "num_base_bdevs": 2, 00:28:33.270 "num_base_bdevs_discovered": 2, 00:28:33.270 "num_base_bdevs_operational": 2, 00:28:33.270 "base_bdevs_list": [ 00:28:33.270 { 00:28:33.270 "name": "BaseBdev1", 00:28:33.270 "uuid": "ab7dae6f-c9f7-42f7-bcec-aecee00cf7be", 00:28:33.270 "is_configured": true, 00:28:33.270 "data_offset": 256, 00:28:33.270 "data_size": 7936 00:28:33.270 }, 00:28:33.270 { 00:28:33.270 "name": "BaseBdev2", 00:28:33.270 "uuid": "19c5414e-9c74-4540-9a2c-81a767546169", 00:28:33.270 "is_configured": true, 00:28:33.270 "data_offset": 256, 00:28:33.270 "data_size": 7936 00:28:33.270 } 00:28:33.270 ] 00:28:33.270 }' 00:28:33.270 05:56:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:33.270 05:56:48 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:28:33.836 05:56:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:28:33.836 05:56:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:28:33.836 05:56:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:28:33.836 05:56:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:28:33.836 05:56:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:28:33.836 05:56:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@198 -- # local name 00:28:33.836 05:56:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:28:33.836 05:56:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:28:34.094 [2024-07-26 05:56:48.862418] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:28:34.094 05:56:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:28:34.094 "name": "Existed_Raid", 00:28:34.094 "aliases": [ 00:28:34.094 "ed81174e-dca0-43be-b17b-13e48aee3f8e" 00:28:34.094 ], 00:28:34.094 "product_name": "Raid Volume", 00:28:34.094 "block_size": 4096, 00:28:34.094 "num_blocks": 7936, 00:28:34.094 "uuid": "ed81174e-dca0-43be-b17b-13e48aee3f8e", 00:28:34.094 "assigned_rate_limits": { 00:28:34.094 "rw_ios_per_sec": 0, 00:28:34.094 "rw_mbytes_per_sec": 0, 00:28:34.094 "r_mbytes_per_sec": 0, 00:28:34.094 "w_mbytes_per_sec": 0 00:28:34.094 }, 00:28:34.094 "claimed": false, 00:28:34.094 "zoned": false, 00:28:34.094 "supported_io_types": { 00:28:34.094 "read": true, 00:28:34.094 "write": true, 00:28:34.094 "unmap": false, 00:28:34.094 "flush": false, 00:28:34.094 "reset": true, 00:28:34.094 "nvme_admin": false, 00:28:34.094 "nvme_io": false, 00:28:34.094 "nvme_io_md": false, 00:28:34.094 "write_zeroes": true, 00:28:34.094 "zcopy": false, 00:28:34.094 "get_zone_info": false, 00:28:34.094 "zone_management": false, 00:28:34.094 "zone_append": false, 00:28:34.094 "compare": false, 00:28:34.094 "compare_and_write": false, 00:28:34.094 "abort": false, 00:28:34.094 "seek_hole": false, 00:28:34.094 "seek_data": false, 00:28:34.094 "copy": false, 00:28:34.094 "nvme_iov_md": false 00:28:34.094 }, 00:28:34.094 "memory_domains": [ 00:28:34.094 { 00:28:34.094 "dma_device_id": "system", 00:28:34.094 "dma_device_type": 1 00:28:34.094 }, 00:28:34.094 { 00:28:34.094 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:34.094 "dma_device_type": 2 00:28:34.094 }, 00:28:34.094 { 00:28:34.094 "dma_device_id": "system", 00:28:34.094 "dma_device_type": 1 00:28:34.094 }, 00:28:34.094 { 00:28:34.094 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:34.094 
"dma_device_type": 2 00:28:34.094 } 00:28:34.094 ], 00:28:34.094 "driver_specific": { 00:28:34.094 "raid": { 00:28:34.094 "uuid": "ed81174e-dca0-43be-b17b-13e48aee3f8e", 00:28:34.094 "strip_size_kb": 0, 00:28:34.094 "state": "online", 00:28:34.094 "raid_level": "raid1", 00:28:34.094 "superblock": true, 00:28:34.094 "num_base_bdevs": 2, 00:28:34.094 "num_base_bdevs_discovered": 2, 00:28:34.094 "num_base_bdevs_operational": 2, 00:28:34.094 "base_bdevs_list": [ 00:28:34.094 { 00:28:34.094 "name": "BaseBdev1", 00:28:34.094 "uuid": "ab7dae6f-c9f7-42f7-bcec-aecee00cf7be", 00:28:34.094 "is_configured": true, 00:28:34.094 "data_offset": 256, 00:28:34.094 "data_size": 7936 00:28:34.094 }, 00:28:34.094 { 00:28:34.094 "name": "BaseBdev2", 00:28:34.094 "uuid": "19c5414e-9c74-4540-9a2c-81a767546169", 00:28:34.094 "is_configured": true, 00:28:34.094 "data_offset": 256, 00:28:34.094 "data_size": 7936 00:28:34.094 } 00:28:34.094 ] 00:28:34.094 } 00:28:34.094 } 00:28:34.094 }' 00:28:34.094 05:56:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:28:34.094 05:56:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:28:34.094 BaseBdev2' 00:28:34.094 05:56:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:28:34.094 05:56:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:28:34.094 05:56:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:28:34.353 05:56:49 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:28:34.353 "name": "BaseBdev1", 00:28:34.353 "aliases": [ 00:28:34.353 "ab7dae6f-c9f7-42f7-bcec-aecee00cf7be" 00:28:34.353 ], 00:28:34.353 
"product_name": "Malloc disk", 00:28:34.353 "block_size": 4096, 00:28:34.353 "num_blocks": 8192, 00:28:34.353 "uuid": "ab7dae6f-c9f7-42f7-bcec-aecee00cf7be", 00:28:34.353 "assigned_rate_limits": { 00:28:34.353 "rw_ios_per_sec": 0, 00:28:34.353 "rw_mbytes_per_sec": 0, 00:28:34.353 "r_mbytes_per_sec": 0, 00:28:34.353 "w_mbytes_per_sec": 0 00:28:34.353 }, 00:28:34.353 "claimed": true, 00:28:34.353 "claim_type": "exclusive_write", 00:28:34.353 "zoned": false, 00:28:34.353 "supported_io_types": { 00:28:34.353 "read": true, 00:28:34.353 "write": true, 00:28:34.353 "unmap": true, 00:28:34.353 "flush": true, 00:28:34.353 "reset": true, 00:28:34.353 "nvme_admin": false, 00:28:34.353 "nvme_io": false, 00:28:34.353 "nvme_io_md": false, 00:28:34.353 "write_zeroes": true, 00:28:34.353 "zcopy": true, 00:28:34.353 "get_zone_info": false, 00:28:34.353 "zone_management": false, 00:28:34.353 "zone_append": false, 00:28:34.353 "compare": false, 00:28:34.353 "compare_and_write": false, 00:28:34.353 "abort": true, 00:28:34.353 "seek_hole": false, 00:28:34.353 "seek_data": false, 00:28:34.353 "copy": true, 00:28:34.353 "nvme_iov_md": false 00:28:34.353 }, 00:28:34.353 "memory_domains": [ 00:28:34.353 { 00:28:34.353 "dma_device_id": "system", 00:28:34.353 "dma_device_type": 1 00:28:34.353 }, 00:28:34.353 { 00:28:34.353 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:34.353 "dma_device_type": 2 00:28:34.353 } 00:28:34.353 ], 00:28:34.353 "driver_specific": {} 00:28:34.353 }' 00:28:34.353 05:56:49 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:34.353 05:56:49 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:34.611 05:56:49 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:28:34.611 05:56:49 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:34.611 05:56:49 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- 
# jq .md_size 00:28:34.611 05:56:49 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:28:34.611 05:56:49 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:34.612 05:56:49 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:34.612 05:56:49 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:28:34.612 05:56:49 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:34.612 05:56:49 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:34.612 05:56:49 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:28:34.612 05:56:49 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:28:34.870 05:56:49 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:28:34.870 05:56:49 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:28:34.870 05:56:49 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:28:34.870 "name": "BaseBdev2", 00:28:34.870 "aliases": [ 00:28:34.870 "19c5414e-9c74-4540-9a2c-81a767546169" 00:28:34.870 ], 00:28:34.870 "product_name": "Malloc disk", 00:28:34.870 "block_size": 4096, 00:28:34.870 "num_blocks": 8192, 00:28:34.870 "uuid": "19c5414e-9c74-4540-9a2c-81a767546169", 00:28:34.870 "assigned_rate_limits": { 00:28:34.870 "rw_ios_per_sec": 0, 00:28:34.870 "rw_mbytes_per_sec": 0, 00:28:34.870 "r_mbytes_per_sec": 0, 00:28:34.870 "w_mbytes_per_sec": 0 00:28:34.870 }, 00:28:34.870 "claimed": true, 00:28:34.870 "claim_type": "exclusive_write", 00:28:34.870 "zoned": false, 00:28:34.870 "supported_io_types": { 00:28:34.870 "read": true, 
00:28:34.870 "write": true, 00:28:34.870 "unmap": true, 00:28:34.870 "flush": true, 00:28:34.870 "reset": true, 00:28:34.870 "nvme_admin": false, 00:28:34.870 "nvme_io": false, 00:28:34.870 "nvme_io_md": false, 00:28:34.870 "write_zeroes": true, 00:28:34.870 "zcopy": true, 00:28:34.870 "get_zone_info": false, 00:28:34.870 "zone_management": false, 00:28:34.870 "zone_append": false, 00:28:34.870 "compare": false, 00:28:34.870 "compare_and_write": false, 00:28:34.870 "abort": true, 00:28:34.870 "seek_hole": false, 00:28:34.870 "seek_data": false, 00:28:34.870 "copy": true, 00:28:34.870 "nvme_iov_md": false 00:28:34.870 }, 00:28:34.870 "memory_domains": [ 00:28:34.870 { 00:28:34.870 "dma_device_id": "system", 00:28:34.870 "dma_device_type": 1 00:28:34.870 }, 00:28:34.870 { 00:28:34.870 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:34.870 "dma_device_type": 2 00:28:34.870 } 00:28:34.870 ], 00:28:34.870 "driver_specific": {} 00:28:34.870 }' 00:28:34.870 05:56:49 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:35.129 05:56:49 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:35.129 05:56:49 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:28:35.129 05:56:49 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:35.129 05:56:49 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:35.129 05:56:49 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:28:35.129 05:56:49 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:35.129 05:56:49 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:35.129 05:56:50 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:28:35.129 05:56:50 bdev_raid.raid_state_function_test_sb_4k 
-- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:35.387 05:56:50 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:35.387 05:56:50 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:28:35.387 05:56:50 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:28:35.645 [2024-07-26 05:56:50.310043] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:28:35.645 05:56:50 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@275 -- # local expected_state 00:28:35.645 05:56:50 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:28:35.645 05:56:50 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@213 -- # case $1 in 00:28:35.645 05:56:50 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@214 -- # return 0 00:28:35.645 05:56:50 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:28:35.645 05:56:50 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 1 00:28:35.645 05:56:50 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:28:35.645 05:56:50 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:35.645 05:56:50 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:35.645 05:56:50 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:35.645 05:56:50 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:35.645 05:56:50 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 
00:28:35.645 05:56:50 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:35.645 05:56:50 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:35.645 05:56:50 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:35.645 05:56:50 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:35.645 05:56:50 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:28:35.903 05:56:50 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:35.903 "name": "Existed_Raid", 00:28:35.903 "uuid": "ed81174e-dca0-43be-b17b-13e48aee3f8e", 00:28:35.903 "strip_size_kb": 0, 00:28:35.903 "state": "online", 00:28:35.903 "raid_level": "raid1", 00:28:35.903 "superblock": true, 00:28:35.903 "num_base_bdevs": 2, 00:28:35.903 "num_base_bdevs_discovered": 1, 00:28:35.903 "num_base_bdevs_operational": 1, 00:28:35.903 "base_bdevs_list": [ 00:28:35.903 { 00:28:35.903 "name": null, 00:28:35.903 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:35.903 "is_configured": false, 00:28:35.903 "data_offset": 256, 00:28:35.903 "data_size": 7936 00:28:35.903 }, 00:28:35.903 { 00:28:35.903 "name": "BaseBdev2", 00:28:35.903 "uuid": "19c5414e-9c74-4540-9a2c-81a767546169", 00:28:35.903 "is_configured": true, 00:28:35.903 "data_offset": 256, 00:28:35.903 "data_size": 7936 00:28:35.903 } 00:28:35.903 ] 00:28:35.903 }' 00:28:35.903 05:56:50 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:35.903 05:56:50 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:28:36.466 05:56:51 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:28:36.466 
05:56:51 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:28:36.466 05:56:51 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:28:36.466 05:56:51 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:36.723 05:56:51 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:28:36.723 05:56:51 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:28:36.723 05:56:51 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:28:36.982 [2024-07-26 05:56:51.654729] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:28:36.982 [2024-07-26 05:56:51.654816] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:28:36.982 [2024-07-26 05:56:51.665711] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:28:36.982 [2024-07-26 05:56:51.665747] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:28:36.982 [2024-07-26 05:56:51.665758] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x241c000 name Existed_Raid, state offline 00:28:36.982 05:56:51 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:28:36.982 05:56:51 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:28:36.982 05:56:51 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:36.982 05:56:51 
bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:28:37.240 05:56:51 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:28:37.240 05:56:51 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:28:37.240 05:56:51 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:28:37.240 05:56:51 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@341 -- # killprocess 1265267 00:28:37.240 05:56:51 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@948 -- # '[' -z 1265267 ']' 00:28:37.240 05:56:51 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@952 -- # kill -0 1265267 00:28:37.240 05:56:51 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@953 -- # uname 00:28:37.240 05:56:51 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:28:37.240 05:56:51 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1265267 00:28:37.240 05:56:51 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:28:37.240 05:56:51 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:28:37.240 05:56:51 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1265267' 00:28:37.240 killing process with pid 1265267 00:28:37.240 05:56:51 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@967 -- # kill 1265267 00:28:37.240 [2024-07-26 05:56:51.982686] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:28:37.240 05:56:51 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@972 -- # wait 1265267 00:28:37.240 [2024-07-26 05:56:51.983547] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: 
raid_bdev_exit 00:28:37.498 05:56:52 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@343 -- # return 0 00:28:37.498 00:28:37.498 real 0m10.664s 00:28:37.498 user 0m18.936s 00:28:37.498 sys 0m2.027s 00:28:37.498 05:56:52 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@1124 -- # xtrace_disable 00:28:37.498 05:56:52 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:28:37.498 ************************************ 00:28:37.498 END TEST raid_state_function_test_sb_4k 00:28:37.498 ************************************ 00:28:37.498 05:56:52 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:28:37.498 05:56:52 bdev_raid -- bdev/bdev_raid.sh@899 -- # run_test raid_superblock_test_4k raid_superblock_test raid1 2 00:28:37.498 05:56:52 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:28:37.498 05:56:52 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:28:37.498 05:56:52 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:28:37.498 ************************************ 00:28:37.498 START TEST raid_superblock_test_4k 00:28:37.498 ************************************ 00:28:37.498 05:56:52 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@1123 -- # raid_superblock_test raid1 2 00:28:37.498 05:56:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@392 -- # local raid_level=raid1 00:28:37.498 05:56:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=2 00:28:37.498 05:56:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:28:37.498 05:56:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:28:37.498 05:56:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:28:37.498 05:56:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:28:37.498 05:56:52 
bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:28:37.498 05:56:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:28:37.498 05:56:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:28:37.498 05:56:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@398 -- # local strip_size 00:28:37.498 05:56:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:28:37.498 05:56:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:28:37.498 05:56:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:28:37.498 05:56:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@403 -- # '[' raid1 '!=' raid1 ']' 00:28:37.498 05:56:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@407 -- # strip_size=0 00:28:37.498 05:56:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@411 -- # raid_pid=1266778 00:28:37.498 05:56:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@412 -- # waitforlisten 1266778 /var/tmp/spdk-raid.sock 00:28:37.498 05:56:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:28:37.498 05:56:52 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@829 -- # '[' -z 1266778 ']' 00:28:37.498 05:56:52 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:28:37.498 05:56:52 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@834 -- # local max_retries=100 00:28:37.498 05:56:52 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 
00:28:37.498 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:28:37.498 05:56:52 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@838 -- # xtrace_disable 00:28:37.499 05:56:52 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:28:37.499 [2024-07-26 05:56:52.332196] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 00:28:37.499 [2024-07-26 05:56:52.332257] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1266778 ] 00:28:37.756 [2024-07-26 05:56:52.461721] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:37.756 [2024-07-26 05:56:52.567352] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:28:37.756 [2024-07-26 05:56:52.637262] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:28:37.756 [2024-07-26 05:56:52.637316] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:28:38.691 05:56:53 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:28:38.691 05:56:53 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@862 -- # return 0 00:28:38.692 05:56:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:28:38.692 05:56:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:28:38.692 05:56:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:28:38.692 05:56:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:28:38.692 05:56:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:28:38.692 05:56:53 
bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:28:38.692 05:56:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:28:38.692 05:56:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:28:38.692 05:56:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b malloc1 00:28:38.692 malloc1 00:28:38.692 05:56:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:28:38.949 [2024-07-26 05:56:53.748402] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:28:38.949 [2024-07-26 05:56:53.748450] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:38.949 [2024-07-26 05:56:53.748473] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1b93570 00:28:38.949 [2024-07-26 05:56:53.748486] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:38.949 [2024-07-26 05:56:53.750198] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:38.949 [2024-07-26 05:56:53.750226] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:28:38.949 pt1 00:28:38.949 05:56:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:28:38.949 05:56:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:28:38.949 05:56:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:28:38.949 05:56:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:28:38.949 
05:56:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:28:38.949 05:56:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:28:38.949 05:56:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:28:38.949 05:56:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:28:38.949 05:56:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b malloc2 00:28:39.206 malloc2 00:28:39.206 05:56:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:28:39.464 [2024-07-26 05:56:54.246492] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:28:39.464 [2024-07-26 05:56:54.246537] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:39.464 [2024-07-26 05:56:54.246556] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1b94970 00:28:39.464 [2024-07-26 05:56:54.246569] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:39.464 [2024-07-26 05:56:54.248185] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:39.464 [2024-07-26 05:56:54.248212] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:28:39.464 pt2 00:28:39.464 05:56:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:28:39.464 05:56:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:28:39.464 05:56:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@429 
-- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2' -n raid_bdev1 -s 00:28:39.722 [2024-07-26 05:56:54.491161] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:28:39.722 [2024-07-26 05:56:54.492508] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:28:39.722 [2024-07-26 05:56:54.492673] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1d37270 00:28:39.722 [2024-07-26 05:56:54.492688] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:28:39.722 [2024-07-26 05:56:54.492893] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1b8b0e0 00:28:39.722 [2024-07-26 05:56:54.493040] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1d37270 00:28:39.722 [2024-07-26 05:56:54.493051] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1d37270 00:28:39.722 [2024-07-26 05:56:54.493154] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:39.722 05:56:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:28:39.722 05:56:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:39.722 05:56:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:39.722 05:56:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:39.722 05:56:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:39.722 05:56:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:39.722 05:56:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:39.722 05:56:54 
bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:39.722 05:56:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:39.722 05:56:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:39.722 05:56:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:39.722 05:56:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:39.980 05:56:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:39.980 "name": "raid_bdev1", 00:28:39.980 "uuid": "995a0ac5-2a55-4541-83e6-79615ea26253", 00:28:39.980 "strip_size_kb": 0, 00:28:39.980 "state": "online", 00:28:39.980 "raid_level": "raid1", 00:28:39.980 "superblock": true, 00:28:39.980 "num_base_bdevs": 2, 00:28:39.980 "num_base_bdevs_discovered": 2, 00:28:39.980 "num_base_bdevs_operational": 2, 00:28:39.980 "base_bdevs_list": [ 00:28:39.980 { 00:28:39.980 "name": "pt1", 00:28:39.980 "uuid": "00000000-0000-0000-0000-000000000001", 00:28:39.980 "is_configured": true, 00:28:39.980 "data_offset": 256, 00:28:39.980 "data_size": 7936 00:28:39.980 }, 00:28:39.980 { 00:28:39.980 "name": "pt2", 00:28:39.980 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:39.980 "is_configured": true, 00:28:39.980 "data_offset": 256, 00:28:39.980 "data_size": 7936 00:28:39.980 } 00:28:39.980 ] 00:28:39.980 }' 00:28:39.980 05:56:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:39.980 05:56:54 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:28:40.546 05:56:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:28:40.546 05:56:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@194 
-- # local raid_bdev_name=raid_bdev1 00:28:40.546 05:56:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:28:40.546 05:56:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:28:40.546 05:56:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:28:40.546 05:56:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@198 -- # local name 00:28:40.546 05:56:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:28:40.546 05:56:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:28:40.803 [2024-07-26 05:56:55.594265] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:28:40.803 05:56:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:28:40.803 "name": "raid_bdev1", 00:28:40.803 "aliases": [ 00:28:40.803 "995a0ac5-2a55-4541-83e6-79615ea26253" 00:28:40.803 ], 00:28:40.803 "product_name": "Raid Volume", 00:28:40.803 "block_size": 4096, 00:28:40.803 "num_blocks": 7936, 00:28:40.803 "uuid": "995a0ac5-2a55-4541-83e6-79615ea26253", 00:28:40.803 "assigned_rate_limits": { 00:28:40.803 "rw_ios_per_sec": 0, 00:28:40.803 "rw_mbytes_per_sec": 0, 00:28:40.803 "r_mbytes_per_sec": 0, 00:28:40.803 "w_mbytes_per_sec": 0 00:28:40.803 }, 00:28:40.803 "claimed": false, 00:28:40.803 "zoned": false, 00:28:40.803 "supported_io_types": { 00:28:40.803 "read": true, 00:28:40.803 "write": true, 00:28:40.803 "unmap": false, 00:28:40.803 "flush": false, 00:28:40.804 "reset": true, 00:28:40.804 "nvme_admin": false, 00:28:40.804 "nvme_io": false, 00:28:40.804 "nvme_io_md": false, 00:28:40.804 "write_zeroes": true, 00:28:40.804 "zcopy": false, 00:28:40.804 "get_zone_info": false, 00:28:40.804 "zone_management": false, 00:28:40.804 "zone_append": false, 
00:28:40.804 "compare": false, 00:28:40.804 "compare_and_write": false, 00:28:40.804 "abort": false, 00:28:40.804 "seek_hole": false, 00:28:40.804 "seek_data": false, 00:28:40.804 "copy": false, 00:28:40.804 "nvme_iov_md": false 00:28:40.804 }, 00:28:40.804 "memory_domains": [ 00:28:40.804 { 00:28:40.804 "dma_device_id": "system", 00:28:40.804 "dma_device_type": 1 00:28:40.804 }, 00:28:40.804 { 00:28:40.804 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:40.804 "dma_device_type": 2 00:28:40.804 }, 00:28:40.804 { 00:28:40.804 "dma_device_id": "system", 00:28:40.804 "dma_device_type": 1 00:28:40.804 }, 00:28:40.804 { 00:28:40.804 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:40.804 "dma_device_type": 2 00:28:40.804 } 00:28:40.804 ], 00:28:40.804 "driver_specific": { 00:28:40.804 "raid": { 00:28:40.804 "uuid": "995a0ac5-2a55-4541-83e6-79615ea26253", 00:28:40.804 "strip_size_kb": 0, 00:28:40.804 "state": "online", 00:28:40.804 "raid_level": "raid1", 00:28:40.804 "superblock": true, 00:28:40.804 "num_base_bdevs": 2, 00:28:40.804 "num_base_bdevs_discovered": 2, 00:28:40.804 "num_base_bdevs_operational": 2, 00:28:40.804 "base_bdevs_list": [ 00:28:40.804 { 00:28:40.804 "name": "pt1", 00:28:40.804 "uuid": "00000000-0000-0000-0000-000000000001", 00:28:40.804 "is_configured": true, 00:28:40.804 "data_offset": 256, 00:28:40.804 "data_size": 7936 00:28:40.804 }, 00:28:40.804 { 00:28:40.804 "name": "pt2", 00:28:40.804 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:40.804 "is_configured": true, 00:28:40.804 "data_offset": 256, 00:28:40.804 "data_size": 7936 00:28:40.804 } 00:28:40.804 ] 00:28:40.804 } 00:28:40.804 } 00:28:40.804 }' 00:28:40.804 05:56:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:28:40.804 05:56:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:28:40.804 pt2' 00:28:40.804 05:56:55 
bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:28:40.804 05:56:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:28:40.804 05:56:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:28:41.061 05:56:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:28:41.061 "name": "pt1", 00:28:41.061 "aliases": [ 00:28:41.061 "00000000-0000-0000-0000-000000000001" 00:28:41.061 ], 00:28:41.061 "product_name": "passthru", 00:28:41.061 "block_size": 4096, 00:28:41.061 "num_blocks": 8192, 00:28:41.061 "uuid": "00000000-0000-0000-0000-000000000001", 00:28:41.061 "assigned_rate_limits": { 00:28:41.061 "rw_ios_per_sec": 0, 00:28:41.061 "rw_mbytes_per_sec": 0, 00:28:41.061 "r_mbytes_per_sec": 0, 00:28:41.061 "w_mbytes_per_sec": 0 00:28:41.061 }, 00:28:41.061 "claimed": true, 00:28:41.061 "claim_type": "exclusive_write", 00:28:41.061 "zoned": false, 00:28:41.061 "supported_io_types": { 00:28:41.061 "read": true, 00:28:41.061 "write": true, 00:28:41.061 "unmap": true, 00:28:41.061 "flush": true, 00:28:41.061 "reset": true, 00:28:41.061 "nvme_admin": false, 00:28:41.061 "nvme_io": false, 00:28:41.061 "nvme_io_md": false, 00:28:41.061 "write_zeroes": true, 00:28:41.061 "zcopy": true, 00:28:41.061 "get_zone_info": false, 00:28:41.061 "zone_management": false, 00:28:41.061 "zone_append": false, 00:28:41.061 "compare": false, 00:28:41.061 "compare_and_write": false, 00:28:41.061 "abort": true, 00:28:41.061 "seek_hole": false, 00:28:41.061 "seek_data": false, 00:28:41.061 "copy": true, 00:28:41.061 "nvme_iov_md": false 00:28:41.061 }, 00:28:41.061 "memory_domains": [ 00:28:41.061 { 00:28:41.061 "dma_device_id": "system", 00:28:41.061 "dma_device_type": 1 00:28:41.061 }, 00:28:41.061 { 00:28:41.061 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:41.061 
"dma_device_type": 2 00:28:41.061 } 00:28:41.061 ], 00:28:41.061 "driver_specific": { 00:28:41.061 "passthru": { 00:28:41.061 "name": "pt1", 00:28:41.061 "base_bdev_name": "malloc1" 00:28:41.061 } 00:28:41.061 } 00:28:41.061 }' 00:28:41.061 05:56:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:41.061 05:56:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:41.319 05:56:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:28:41.319 05:56:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:41.319 05:56:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:41.319 05:56:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:28:41.319 05:56:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:41.319 05:56:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:41.319 05:56:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:28:41.319 05:56:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:41.319 05:56:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:41.592 05:56:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:28:41.592 05:56:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:28:41.592 05:56:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:28:41.592 05:56:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:28:41.863 05:56:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:28:41.863 "name": "pt2", 
00:28:41.863 "aliases": [ 00:28:41.863 "00000000-0000-0000-0000-000000000002" 00:28:41.863 ], 00:28:41.863 "product_name": "passthru", 00:28:41.863 "block_size": 4096, 00:28:41.863 "num_blocks": 8192, 00:28:41.863 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:41.863 "assigned_rate_limits": { 00:28:41.863 "rw_ios_per_sec": 0, 00:28:41.863 "rw_mbytes_per_sec": 0, 00:28:41.863 "r_mbytes_per_sec": 0, 00:28:41.863 "w_mbytes_per_sec": 0 00:28:41.863 }, 00:28:41.863 "claimed": true, 00:28:41.863 "claim_type": "exclusive_write", 00:28:41.863 "zoned": false, 00:28:41.863 "supported_io_types": { 00:28:41.863 "read": true, 00:28:41.863 "write": true, 00:28:41.863 "unmap": true, 00:28:41.863 "flush": true, 00:28:41.863 "reset": true, 00:28:41.863 "nvme_admin": false, 00:28:41.863 "nvme_io": false, 00:28:41.863 "nvme_io_md": false, 00:28:41.863 "write_zeroes": true, 00:28:41.863 "zcopy": true, 00:28:41.863 "get_zone_info": false, 00:28:41.863 "zone_management": false, 00:28:41.863 "zone_append": false, 00:28:41.863 "compare": false, 00:28:41.863 "compare_and_write": false, 00:28:41.863 "abort": true, 00:28:41.863 "seek_hole": false, 00:28:41.863 "seek_data": false, 00:28:41.863 "copy": true, 00:28:41.863 "nvme_iov_md": false 00:28:41.863 }, 00:28:41.863 "memory_domains": [ 00:28:41.863 { 00:28:41.863 "dma_device_id": "system", 00:28:41.863 "dma_device_type": 1 00:28:41.863 }, 00:28:41.863 { 00:28:41.863 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:41.863 "dma_device_type": 2 00:28:41.863 } 00:28:41.863 ], 00:28:41.863 "driver_specific": { 00:28:41.863 "passthru": { 00:28:41.863 "name": "pt2", 00:28:41.863 "base_bdev_name": "malloc2" 00:28:41.863 } 00:28:41.863 } 00:28:41.863 }' 00:28:41.863 05:56:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:41.863 05:56:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:41.863 05:56:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # [[ 
4096 == 4096 ]] 00:28:41.863 05:56:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:41.863 05:56:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:41.863 05:56:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:28:41.863 05:56:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:41.863 05:56:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:41.863 05:56:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:28:41.863 05:56:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:42.121 05:56:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:42.121 05:56:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:28:42.121 05:56:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:28:42.121 05:56:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:28:42.379 [2024-07-26 05:56:57.050117] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:28:42.380 05:56:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=995a0ac5-2a55-4541-83e6-79615ea26253 00:28:42.380 05:56:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@435 -- # '[' -z 995a0ac5-2a55-4541-83e6-79615ea26253 ']' 00:28:42.380 05:56:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:28:42.638 [2024-07-26 05:56:57.298529] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:28:42.638 [2024-07-26 05:56:57.298553] 
bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:28:42.638 [2024-07-26 05:56:57.298610] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:28:42.638 [2024-07-26 05:56:57.298677] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:28:42.638 [2024-07-26 05:56:57.298690] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1d37270 name raid_bdev1, state offline 00:28:42.638 05:56:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:42.638 05:56:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:28:42.896 05:56:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:28:42.896 05:56:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:28:42.896 05:56:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:28:42.896 05:56:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:28:43.154 05:56:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:28:43.154 05:56:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:28:43.154 05:56:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:28:43.154 05:56:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:28:43.412 05:56:58 
bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:28:43.412 05:56:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:28:43.412 05:56:58 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@648 -- # local es=0 00:28:43.412 05:56:58 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:28:43.412 05:56:58 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:28:43.412 05:56:58 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:28:43.412 05:56:58 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:28:43.412 05:56:58 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:28:43.412 05:56:58 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:28:43.412 05:56:58 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:28:43.412 05:56:58 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:28:43.412 05:56:58 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:28:43.412 05:56:58 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@651 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:28:43.670 [2024-07-26 05:56:58.525737] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:28:43.670 [2024-07-26 05:56:58.527079] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:28:43.670 [2024-07-26 05:56:58.527134] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:28:43.670 [2024-07-26 05:56:58.527174] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:28:43.670 [2024-07-26 05:56:58.527192] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:28:43.670 [2024-07-26 05:56:58.527201] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1d36ff0 name raid_bdev1, state configuring 00:28:43.670 request: 00:28:43.670 { 00:28:43.670 "name": "raid_bdev1", 00:28:43.670 "raid_level": "raid1", 00:28:43.670 "base_bdevs": [ 00:28:43.670 "malloc1", 00:28:43.670 "malloc2" 00:28:43.670 ], 00:28:43.670 "superblock": false, 00:28:43.670 "method": "bdev_raid_create", 00:28:43.670 "req_id": 1 00:28:43.670 } 00:28:43.670 Got JSON-RPC error response 00:28:43.670 response: 00:28:43.670 { 00:28:43.670 "code": -17, 00:28:43.670 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:28:43.670 } 00:28:43.670 05:56:58 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@651 -- # es=1 00:28:43.670 05:56:58 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:28:43.670 05:56:58 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:28:43.670 05:56:58 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:28:43.670 05:56:58 bdev_raid.raid_superblock_test_4k -- 
bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:43.670 05:56:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:28:43.929 05:56:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:28:43.929 05:56:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:28:43.929 05:56:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:28:44.187 [2024-07-26 05:56:59.002923] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:28:44.187 [2024-07-26 05:56:59.002963] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:44.187 [2024-07-26 05:56:59.002987] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1b937a0 00:28:44.187 [2024-07-26 05:56:59.002999] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:44.187 [2024-07-26 05:56:59.004613] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:44.187 [2024-07-26 05:56:59.004648] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:28:44.187 [2024-07-26 05:56:59.004713] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:28:44.187 [2024-07-26 05:56:59.004738] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:28:44.187 pt1 00:28:44.187 05:56:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:28:44.187 05:56:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:44.187 05:56:59 bdev_raid.raid_superblock_test_4k -- 
bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:28:44.187 05:56:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:44.187 05:56:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:44.187 05:56:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:44.187 05:56:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:44.187 05:56:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:44.187 05:56:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:44.187 05:56:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:44.187 05:56:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:44.187 05:56:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:44.446 05:56:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:44.446 "name": "raid_bdev1", 00:28:44.446 "uuid": "995a0ac5-2a55-4541-83e6-79615ea26253", 00:28:44.446 "strip_size_kb": 0, 00:28:44.446 "state": "configuring", 00:28:44.446 "raid_level": "raid1", 00:28:44.446 "superblock": true, 00:28:44.446 "num_base_bdevs": 2, 00:28:44.446 "num_base_bdevs_discovered": 1, 00:28:44.446 "num_base_bdevs_operational": 2, 00:28:44.446 "base_bdevs_list": [ 00:28:44.446 { 00:28:44.446 "name": "pt1", 00:28:44.446 "uuid": "00000000-0000-0000-0000-000000000001", 00:28:44.446 "is_configured": true, 00:28:44.446 "data_offset": 256, 00:28:44.446 "data_size": 7936 00:28:44.446 }, 00:28:44.446 { 00:28:44.446 "name": null, 00:28:44.446 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:44.446 
"is_configured": false, 00:28:44.446 "data_offset": 256, 00:28:44.446 "data_size": 7936 00:28:44.446 } 00:28:44.446 ] 00:28:44.446 }' 00:28:44.446 05:56:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:44.446 05:56:59 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:28:45.012 05:56:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@469 -- # '[' 2 -gt 2 ']' 00:28:45.012 05:56:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:28:45.012 05:56:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:28:45.012 05:56:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:28:45.269 [2024-07-26 05:57:00.093840] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:28:45.269 [2024-07-26 05:57:00.093897] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:45.269 [2024-07-26 05:57:00.093919] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1d2b6f0 00:28:45.269 [2024-07-26 05:57:00.093938] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:45.269 [2024-07-26 05:57:00.094287] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:45.269 [2024-07-26 05:57:00.094305] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:28:45.269 [2024-07-26 05:57:00.094367] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:28:45.269 [2024-07-26 05:57:00.094386] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:28:45.269 [2024-07-26 05:57:00.094481] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1d2c590 
00:28:45.269 [2024-07-26 05:57:00.094491] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:28:45.269 [2024-07-26 05:57:00.094666] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1b8d540 00:28:45.269 [2024-07-26 05:57:00.094791] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1d2c590 00:28:45.269 [2024-07-26 05:57:00.094801] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1d2c590 00:28:45.269 [2024-07-26 05:57:00.094896] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:45.269 pt2 00:28:45.269 05:57:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:28:45.269 05:57:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:28:45.269 05:57:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:28:45.269 05:57:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:45.269 05:57:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:45.269 05:57:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:45.269 05:57:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:45.269 05:57:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:45.269 05:57:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:45.269 05:57:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:45.269 05:57:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:45.269 05:57:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 
00:28:45.269 05:57:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:45.269 05:57:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:45.527 05:57:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:45.527 "name": "raid_bdev1", 00:28:45.527 "uuid": "995a0ac5-2a55-4541-83e6-79615ea26253", 00:28:45.527 "strip_size_kb": 0, 00:28:45.527 "state": "online", 00:28:45.527 "raid_level": "raid1", 00:28:45.527 "superblock": true, 00:28:45.527 "num_base_bdevs": 2, 00:28:45.527 "num_base_bdevs_discovered": 2, 00:28:45.527 "num_base_bdevs_operational": 2, 00:28:45.527 "base_bdevs_list": [ 00:28:45.527 { 00:28:45.527 "name": "pt1", 00:28:45.527 "uuid": "00000000-0000-0000-0000-000000000001", 00:28:45.527 "is_configured": true, 00:28:45.527 "data_offset": 256, 00:28:45.527 "data_size": 7936 00:28:45.527 }, 00:28:45.527 { 00:28:45.527 "name": "pt2", 00:28:45.527 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:45.527 "is_configured": true, 00:28:45.527 "data_offset": 256, 00:28:45.527 "data_size": 7936 00:28:45.527 } 00:28:45.527 ] 00:28:45.527 }' 00:28:45.527 05:57:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:45.527 05:57:00 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:28:46.094 05:57:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:28:46.094 05:57:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:28:46.094 05:57:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:28:46.094 05:57:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:28:46.094 05:57:00 
bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:28:46.094 05:57:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@198 -- # local name 00:28:46.094 05:57:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:28:46.094 05:57:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:28:46.353 [2024-07-26 05:57:01.193004] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:28:46.353 05:57:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:28:46.353 "name": "raid_bdev1", 00:28:46.353 "aliases": [ 00:28:46.353 "995a0ac5-2a55-4541-83e6-79615ea26253" 00:28:46.353 ], 00:28:46.353 "product_name": "Raid Volume", 00:28:46.353 "block_size": 4096, 00:28:46.353 "num_blocks": 7936, 00:28:46.353 "uuid": "995a0ac5-2a55-4541-83e6-79615ea26253", 00:28:46.353 "assigned_rate_limits": { 00:28:46.353 "rw_ios_per_sec": 0, 00:28:46.353 "rw_mbytes_per_sec": 0, 00:28:46.353 "r_mbytes_per_sec": 0, 00:28:46.353 "w_mbytes_per_sec": 0 00:28:46.353 }, 00:28:46.353 "claimed": false, 00:28:46.353 "zoned": false, 00:28:46.353 "supported_io_types": { 00:28:46.353 "read": true, 00:28:46.353 "write": true, 00:28:46.353 "unmap": false, 00:28:46.353 "flush": false, 00:28:46.353 "reset": true, 00:28:46.353 "nvme_admin": false, 00:28:46.353 "nvme_io": false, 00:28:46.353 "nvme_io_md": false, 00:28:46.353 "write_zeroes": true, 00:28:46.353 "zcopy": false, 00:28:46.353 "get_zone_info": false, 00:28:46.353 "zone_management": false, 00:28:46.353 "zone_append": false, 00:28:46.353 "compare": false, 00:28:46.353 "compare_and_write": false, 00:28:46.353 "abort": false, 00:28:46.353 "seek_hole": false, 00:28:46.353 "seek_data": false, 00:28:46.353 "copy": false, 00:28:46.353 "nvme_iov_md": false 00:28:46.353 }, 00:28:46.353 "memory_domains": [ 
00:28:46.353 { 00:28:46.353 "dma_device_id": "system", 00:28:46.353 "dma_device_type": 1 00:28:46.353 }, 00:28:46.353 { 00:28:46.353 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:46.353 "dma_device_type": 2 00:28:46.353 }, 00:28:46.353 { 00:28:46.353 "dma_device_id": "system", 00:28:46.353 "dma_device_type": 1 00:28:46.353 }, 00:28:46.353 { 00:28:46.353 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:46.353 "dma_device_type": 2 00:28:46.353 } 00:28:46.353 ], 00:28:46.353 "driver_specific": { 00:28:46.353 "raid": { 00:28:46.353 "uuid": "995a0ac5-2a55-4541-83e6-79615ea26253", 00:28:46.353 "strip_size_kb": 0, 00:28:46.353 "state": "online", 00:28:46.353 "raid_level": "raid1", 00:28:46.353 "superblock": true, 00:28:46.353 "num_base_bdevs": 2, 00:28:46.353 "num_base_bdevs_discovered": 2, 00:28:46.353 "num_base_bdevs_operational": 2, 00:28:46.353 "base_bdevs_list": [ 00:28:46.353 { 00:28:46.353 "name": "pt1", 00:28:46.353 "uuid": "00000000-0000-0000-0000-000000000001", 00:28:46.353 "is_configured": true, 00:28:46.353 "data_offset": 256, 00:28:46.353 "data_size": 7936 00:28:46.353 }, 00:28:46.353 { 00:28:46.353 "name": "pt2", 00:28:46.353 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:46.353 "is_configured": true, 00:28:46.353 "data_offset": 256, 00:28:46.353 "data_size": 7936 00:28:46.353 } 00:28:46.353 ] 00:28:46.353 } 00:28:46.353 } 00:28:46.353 }' 00:28:46.353 05:57:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:28:46.612 05:57:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:28:46.612 pt2' 00:28:46.612 05:57:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:28:46.612 05:57:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:28:46.612 
05:57:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:28:46.612 05:57:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:28:46.612 "name": "pt1", 00:28:46.612 "aliases": [ 00:28:46.612 "00000000-0000-0000-0000-000000000001" 00:28:46.612 ], 00:28:46.612 "product_name": "passthru", 00:28:46.612 "block_size": 4096, 00:28:46.612 "num_blocks": 8192, 00:28:46.612 "uuid": "00000000-0000-0000-0000-000000000001", 00:28:46.612 "assigned_rate_limits": { 00:28:46.612 "rw_ios_per_sec": 0, 00:28:46.612 "rw_mbytes_per_sec": 0, 00:28:46.612 "r_mbytes_per_sec": 0, 00:28:46.612 "w_mbytes_per_sec": 0 00:28:46.612 }, 00:28:46.612 "claimed": true, 00:28:46.612 "claim_type": "exclusive_write", 00:28:46.612 "zoned": false, 00:28:46.612 "supported_io_types": { 00:28:46.612 "read": true, 00:28:46.612 "write": true, 00:28:46.612 "unmap": true, 00:28:46.612 "flush": true, 00:28:46.612 "reset": true, 00:28:46.612 "nvme_admin": false, 00:28:46.612 "nvme_io": false, 00:28:46.612 "nvme_io_md": false, 00:28:46.612 "write_zeroes": true, 00:28:46.612 "zcopy": true, 00:28:46.612 "get_zone_info": false, 00:28:46.612 "zone_management": false, 00:28:46.612 "zone_append": false, 00:28:46.612 "compare": false, 00:28:46.612 "compare_and_write": false, 00:28:46.612 "abort": true, 00:28:46.612 "seek_hole": false, 00:28:46.612 "seek_data": false, 00:28:46.612 "copy": true, 00:28:46.612 "nvme_iov_md": false 00:28:46.612 }, 00:28:46.612 "memory_domains": [ 00:28:46.612 { 00:28:46.612 "dma_device_id": "system", 00:28:46.612 "dma_device_type": 1 00:28:46.612 }, 00:28:46.612 { 00:28:46.612 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:46.612 "dma_device_type": 2 00:28:46.612 } 00:28:46.612 ], 00:28:46.612 "driver_specific": { 00:28:46.612 "passthru": { 00:28:46.612 "name": "pt1", 00:28:46.612 "base_bdev_name": "malloc1" 00:28:46.612 } 00:28:46.612 } 00:28:46.612 }' 00:28:46.612 05:57:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 
-- # jq .block_size 00:28:46.872 05:57:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:46.872 05:57:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:28:46.872 05:57:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:46.872 05:57:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:46.872 05:57:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:28:46.872 05:57:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:46.872 05:57:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:47.131 05:57:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:28:47.131 05:57:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:47.131 05:57:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:47.131 05:57:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:28:47.131 05:57:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:28:47.131 05:57:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:28:47.131 05:57:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:28:47.390 05:57:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:28:47.390 "name": "pt2", 00:28:47.390 "aliases": [ 00:28:47.390 "00000000-0000-0000-0000-000000000002" 00:28:47.390 ], 00:28:47.390 "product_name": "passthru", 00:28:47.390 "block_size": 4096, 00:28:47.390 "num_blocks": 8192, 00:28:47.390 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:47.390 "assigned_rate_limits": { 00:28:47.390 
"rw_ios_per_sec": 0, 00:28:47.390 "rw_mbytes_per_sec": 0, 00:28:47.390 "r_mbytes_per_sec": 0, 00:28:47.390 "w_mbytes_per_sec": 0 00:28:47.390 }, 00:28:47.390 "claimed": true, 00:28:47.390 "claim_type": "exclusive_write", 00:28:47.390 "zoned": false, 00:28:47.390 "supported_io_types": { 00:28:47.390 "read": true, 00:28:47.390 "write": true, 00:28:47.390 "unmap": true, 00:28:47.390 "flush": true, 00:28:47.390 "reset": true, 00:28:47.390 "nvme_admin": false, 00:28:47.390 "nvme_io": false, 00:28:47.390 "nvme_io_md": false, 00:28:47.390 "write_zeroes": true, 00:28:47.390 "zcopy": true, 00:28:47.390 "get_zone_info": false, 00:28:47.390 "zone_management": false, 00:28:47.390 "zone_append": false, 00:28:47.390 "compare": false, 00:28:47.390 "compare_and_write": false, 00:28:47.390 "abort": true, 00:28:47.390 "seek_hole": false, 00:28:47.390 "seek_data": false, 00:28:47.390 "copy": true, 00:28:47.390 "nvme_iov_md": false 00:28:47.390 }, 00:28:47.390 "memory_domains": [ 00:28:47.390 { 00:28:47.390 "dma_device_id": "system", 00:28:47.390 "dma_device_type": 1 00:28:47.390 }, 00:28:47.390 { 00:28:47.390 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:47.390 "dma_device_type": 2 00:28:47.390 } 00:28:47.390 ], 00:28:47.390 "driver_specific": { 00:28:47.390 "passthru": { 00:28:47.390 "name": "pt2", 00:28:47.390 "base_bdev_name": "malloc2" 00:28:47.390 } 00:28:47.390 } 00:28:47.390 }' 00:28:47.390 05:57:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:47.390 05:57:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:47.390 05:57:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:28:47.390 05:57:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:47.390 05:57:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:47.649 05:57:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 
00:28:47.649 05:57:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:47.649 05:57:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:47.649 05:57:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:28:47.649 05:57:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:47.649 05:57:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:47.649 05:57:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:28:47.649 05:57:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:28:47.649 05:57:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:28:47.907 [2024-07-26 05:57:02.616748] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:28:47.907 05:57:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@486 -- # '[' 995a0ac5-2a55-4541-83e6-79615ea26253 '!=' 995a0ac5-2a55-4541-83e6-79615ea26253 ']' 00:28:47.907 05:57:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@490 -- # has_redundancy raid1 00:28:47.907 05:57:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@213 -- # case $1 in 00:28:47.907 05:57:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@214 -- # return 0 00:28:47.907 05:57:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@492 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:28:48.166 [2024-07-26 05:57:02.865181] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:28:48.166 05:57:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@495 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:28:48.166 05:57:02 
bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:48.166 05:57:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:48.166 05:57:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:48.166 05:57:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:48.166 05:57:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:48.166 05:57:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:48.166 05:57:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:48.166 05:57:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:48.166 05:57:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:48.166 05:57:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:48.166 05:57:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:48.425 05:57:03 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:48.425 "name": "raid_bdev1", 00:28:48.425 "uuid": "995a0ac5-2a55-4541-83e6-79615ea26253", 00:28:48.425 "strip_size_kb": 0, 00:28:48.425 "state": "online", 00:28:48.425 "raid_level": "raid1", 00:28:48.425 "superblock": true, 00:28:48.425 "num_base_bdevs": 2, 00:28:48.425 "num_base_bdevs_discovered": 1, 00:28:48.425 "num_base_bdevs_operational": 1, 00:28:48.425 "base_bdevs_list": [ 00:28:48.425 { 00:28:48.425 "name": null, 00:28:48.425 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:48.425 "is_configured": false, 00:28:48.425 "data_offset": 256, 00:28:48.425 "data_size": 7936 
00:28:48.425 }, 00:28:48.425 { 00:28:48.425 "name": "pt2", 00:28:48.425 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:48.425 "is_configured": true, 00:28:48.425 "data_offset": 256, 00:28:48.425 "data_size": 7936 00:28:48.425 } 00:28:48.425 ] 00:28:48.425 }' 00:28:48.425 05:57:03 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:48.425 05:57:03 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:28:48.990 05:57:03 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@498 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:28:49.248 [2024-07-26 05:57:03.915955] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:28:49.248 [2024-07-26 05:57:03.915981] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:28:49.248 [2024-07-26 05:57:03.916036] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:28:49.248 [2024-07-26 05:57:03.916077] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:28:49.248 [2024-07-26 05:57:03.916093] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1d2c590 name raid_bdev1, state offline 00:28:49.248 05:57:03 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@499 -- # jq -r '.[]' 00:28:49.248 05:57:03 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@499 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:49.506 05:57:04 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@499 -- # raid_bdev= 00:28:49.506 05:57:04 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@500 -- # '[' -n '' ']' 00:28:49.506 05:57:04 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@505 -- # (( i = 1 )) 00:28:49.506 05:57:04 bdev_raid.raid_superblock_test_4k -- 
bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:28:49.506 05:57:04 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:28:49.765 05:57:04 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:28:49.765 05:57:04 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:28:49.765 05:57:04 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@510 -- # (( i = 1 )) 00:28:49.765 05:57:04 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:28:49.765 05:57:04 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@518 -- # i=1 00:28:49.765 05:57:04 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@519 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:28:49.765 [2024-07-26 05:57:04.657886] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:28:49.765 [2024-07-26 05:57:04.657935] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:49.765 [2024-07-26 05:57:04.657955] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1b94160 00:28:49.765 [2024-07-26 05:57:04.657968] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:49.765 [2024-07-26 05:57:04.659565] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:49.765 [2024-07-26 05:57:04.659595] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:28:49.765 [2024-07-26 05:57:04.659672] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:28:49.765 [2024-07-26 05:57:04.659699] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:28:49.765 
[2024-07-26 05:57:04.659781] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1b8a380 00:28:49.765 [2024-07-26 05:57:04.659791] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:28:49.765 [2024-07-26 05:57:04.659959] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1b8ba80 00:28:49.765 [2024-07-26 05:57:04.660079] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1b8a380 00:28:49.765 [2024-07-26 05:57:04.660089] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1b8a380 00:28:49.765 [2024-07-26 05:57:04.660186] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:49.765 pt2 00:28:50.024 05:57:04 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@522 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:28:50.024 05:57:04 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:50.024 05:57:04 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:50.024 05:57:04 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:50.024 05:57:04 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:50.024 05:57:04 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:50.024 05:57:04 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:50.024 05:57:04 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:50.024 05:57:04 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:50.024 05:57:04 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:50.024 05:57:04 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | 
select(.name == "raid_bdev1")' 00:28:50.024 05:57:04 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:50.024 05:57:04 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:50.024 "name": "raid_bdev1", 00:28:50.024 "uuid": "995a0ac5-2a55-4541-83e6-79615ea26253", 00:28:50.024 "strip_size_kb": 0, 00:28:50.024 "state": "online", 00:28:50.024 "raid_level": "raid1", 00:28:50.024 "superblock": true, 00:28:50.024 "num_base_bdevs": 2, 00:28:50.024 "num_base_bdevs_discovered": 1, 00:28:50.024 "num_base_bdevs_operational": 1, 00:28:50.024 "base_bdevs_list": [ 00:28:50.024 { 00:28:50.024 "name": null, 00:28:50.024 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:50.024 "is_configured": false, 00:28:50.024 "data_offset": 256, 00:28:50.024 "data_size": 7936 00:28:50.024 }, 00:28:50.024 { 00:28:50.024 "name": "pt2", 00:28:50.024 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:50.024 "is_configured": true, 00:28:50.024 "data_offset": 256, 00:28:50.024 "data_size": 7936 00:28:50.024 } 00:28:50.024 ] 00:28:50.024 }' 00:28:50.024 05:57:04 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:50.024 05:57:04 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:28:50.591 05:57:05 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@525 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:28:50.850 [2024-07-26 05:57:05.696738] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:28:50.850 [2024-07-26 05:57:05.696762] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:28:50.850 [2024-07-26 05:57:05.696812] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:28:50.850 [2024-07-26 
05:57:05.696854] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:28:50.850 [2024-07-26 05:57:05.696865] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1b8a380 name raid_bdev1, state offline 00:28:50.850 05:57:05 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@526 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:50.850 05:57:05 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@526 -- # jq -r '.[]' 00:28:51.108 05:57:05 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@526 -- # raid_bdev= 00:28:51.108 05:57:05 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@527 -- # '[' -n '' ']' 00:28:51.108 05:57:05 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@531 -- # '[' 2 -gt 2 ']' 00:28:51.108 05:57:05 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@539 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:28:51.368 [2024-07-26 05:57:06.178001] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:28:51.368 [2024-07-26 05:57:06.178045] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:51.368 [2024-07-26 05:57:06.178065] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1d36520 00:28:51.368 [2024-07-26 05:57:06.178078] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:51.368 [2024-07-26 05:57:06.179659] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:51.368 [2024-07-26 05:57:06.179688] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:28:51.368 [2024-07-26 05:57:06.179750] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:28:51.368 
[2024-07-26 05:57:06.179774] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:28:51.368 [2024-07-26 05:57:06.179869] bdev_raid.c:3639:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:28:51.368 [2024-07-26 05:57:06.179882] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:28:51.368 [2024-07-26 05:57:06.179900] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1b8b3f0 name raid_bdev1, state configuring 00:28:51.368 [2024-07-26 05:57:06.179923] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:28:51.368 [2024-07-26 05:57:06.179979] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1b8d2b0 00:28:51.368 [2024-07-26 05:57:06.179989] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:28:51.368 [2024-07-26 05:57:06.180150] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1b8a350 00:28:51.368 [2024-07-26 05:57:06.180267] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1b8d2b0 00:28:51.368 [2024-07-26 05:57:06.180277] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1b8d2b0 00:28:51.368 [2024-07-26 05:57:06.180373] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:51.368 pt1 00:28:51.368 05:57:06 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@541 -- # '[' 2 -gt 2 ']' 00:28:51.368 05:57:06 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@553 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:28:51.368 05:57:06 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:51.368 05:57:06 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:51.368 05:57:06 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- 
# local raid_level=raid1 00:28:51.368 05:57:06 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:51.368 05:57:06 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:51.368 05:57:06 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:51.368 05:57:06 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:51.368 05:57:06 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:51.368 05:57:06 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:51.368 05:57:06 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:51.368 05:57:06 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:51.627 05:57:06 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:51.627 "name": "raid_bdev1", 00:28:51.627 "uuid": "995a0ac5-2a55-4541-83e6-79615ea26253", 00:28:51.627 "strip_size_kb": 0, 00:28:51.627 "state": "online", 00:28:51.627 "raid_level": "raid1", 00:28:51.627 "superblock": true, 00:28:51.627 "num_base_bdevs": 2, 00:28:51.627 "num_base_bdevs_discovered": 1, 00:28:51.627 "num_base_bdevs_operational": 1, 00:28:51.627 "base_bdevs_list": [ 00:28:51.627 { 00:28:51.627 "name": null, 00:28:51.627 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:51.627 "is_configured": false, 00:28:51.627 "data_offset": 256, 00:28:51.627 "data_size": 7936 00:28:51.627 }, 00:28:51.627 { 00:28:51.627 "name": "pt2", 00:28:51.627 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:51.627 "is_configured": true, 00:28:51.627 "data_offset": 256, 00:28:51.627 "data_size": 7936 00:28:51.627 } 00:28:51.627 ] 00:28:51.627 }' 00:28:51.627 05:57:06 
bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:51.627 05:57:06 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:28:52.193 05:57:06 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@554 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:28:52.193 05:57:06 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@554 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:28:52.452 05:57:07 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@554 -- # [[ false == \f\a\l\s\e ]] 00:28:52.452 05:57:07 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@557 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:28:52.452 05:57:07 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@557 -- # jq -r '.[] | .uuid' 00:28:52.712 [2024-07-26 05:57:07.413501] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:28:52.712 05:57:07 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@557 -- # '[' 995a0ac5-2a55-4541-83e6-79615ea26253 '!=' 995a0ac5-2a55-4541-83e6-79615ea26253 ']' 00:28:52.712 05:57:07 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@562 -- # killprocess 1266778 00:28:52.712 05:57:07 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@948 -- # '[' -z 1266778 ']' 00:28:52.712 05:57:07 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@952 -- # kill -0 1266778 00:28:52.712 05:57:07 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@953 -- # uname 00:28:52.712 05:57:07 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:28:52.712 05:57:07 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1266778 00:28:52.712 05:57:07 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@954 -- # 
process_name=reactor_0 00:28:52.712 05:57:07 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:28:52.712 05:57:07 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1266778' 00:28:52.712 killing process with pid 1266778 00:28:52.712 05:57:07 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@967 -- # kill 1266778 00:28:52.712 [2024-07-26 05:57:07.485913] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:28:52.712 [2024-07-26 05:57:07.485967] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:28:52.712 [2024-07-26 05:57:07.486011] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:28:52.712 [2024-07-26 05:57:07.486022] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1b8d2b0 name raid_bdev1, state offline 00:28:52.712 05:57:07 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@972 -- # wait 1266778 00:28:52.712 [2024-07-26 05:57:07.505346] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:28:52.971 05:57:07 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@564 -- # return 0 00:28:52.971 00:28:52.971 real 0m15.463s 00:28:52.971 user 0m28.061s 00:28:52.971 sys 0m2.840s 00:28:52.971 05:57:07 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@1124 -- # xtrace_disable 00:28:52.971 05:57:07 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:28:52.971 ************************************ 00:28:52.971 END TEST raid_superblock_test_4k 00:28:52.971 ************************************ 00:28:52.971 05:57:07 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:28:52.971 05:57:07 bdev_raid -- bdev/bdev_raid.sh@900 -- # '[' true = true ']' 00:28:52.971 05:57:07 bdev_raid -- bdev/bdev_raid.sh@901 -- # run_test raid_rebuild_test_sb_4k raid_rebuild_test raid1 2 true 
false true 00:28:52.971 05:57:07 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:28:52.971 05:57:07 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:28:52.971 05:57:07 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:28:52.971 ************************************ 00:28:52.971 START TEST raid_rebuild_test_sb_4k 00:28:52.971 ************************************ 00:28:52.971 05:57:07 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 2 true false true 00:28:52.971 05:57:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:28:52.971 05:57:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:28:52.971 05:57:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:28:52.971 05:57:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:28:52.971 05:57:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@572 -- # local verify=true 00:28:52.971 05:57:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:28:52.971 05:57:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:28:52.971 05:57:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:28:52.971 05:57:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:28:52.971 05:57:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:28:52.971 05:57:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:28:52.971 05:57:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:28:52.971 05:57:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:28:52.971 05:57:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- 
# base_bdevs=('BaseBdev1' 'BaseBdev2') 00:28:52.971 05:57:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:28:52.971 05:57:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:28:52.971 05:57:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@575 -- # local strip_size 00:28:52.971 05:57:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@576 -- # local create_arg 00:28:52.971 05:57:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:28:52.971 05:57:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@578 -- # local data_offset 00:28:52.971 05:57:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:28:52.971 05:57:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:28:52.971 05:57:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:28:52.971 05:57:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:28:52.971 05:57:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@596 -- # raid_pid=1269651 00:28:52.971 05:57:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@597 -- # waitforlisten 1269651 /var/tmp/spdk-raid.sock 00:28:52.971 05:57:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:28:52.971 05:57:07 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@829 -- # '[' -z 1269651 ']' 00:28:52.971 05:57:07 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:28:52.971 05:57:07 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@834 -- # local max_retries=100 00:28:52.971 05:57:07 
bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:28:52.971 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:28:52.971 05:57:07 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@838 -- # xtrace_disable 00:28:52.971 05:57:07 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:28:53.230 [2024-07-26 05:57:07.897101] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 00:28:53.230 [2024-07-26 05:57:07.897174] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1269651 ] 00:28:53.230 I/O size of 3145728 is greater than zero copy threshold (65536). 00:28:53.230 Zero copy mechanism will not be used. 
00:28:53.230 [2024-07-26 05:57:08.025647] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:53.230 [2024-07-26 05:57:08.129417] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:28:53.487 [2024-07-26 05:57:08.189466] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:28:53.487 [2024-07-26 05:57:08.189495] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:28:54.051 05:57:08 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:28:54.051 05:57:08 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@862 -- # return 0 00:28:54.051 05:57:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:28:54.051 05:57:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b BaseBdev1_malloc 00:28:54.309 BaseBdev1_malloc 00:28:54.309 05:57:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:28:54.566 [2024-07-26 05:57:09.310002] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:28:54.566 [2024-07-26 05:57:09.310055] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:54.566 [2024-07-26 05:57:09.310076] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x12cad40 00:28:54.566 [2024-07-26 05:57:09.310089] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:54.566 [2024-07-26 05:57:09.311681] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:54.566 [2024-07-26 05:57:09.311710] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:28:54.567 
BaseBdev1 00:28:54.567 05:57:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:28:54.567 05:57:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b BaseBdev2_malloc 00:28:54.825 BaseBdev2_malloc 00:28:54.825 05:57:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:28:55.083 [2024-07-26 05:57:09.804257] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:28:55.083 [2024-07-26 05:57:09.804305] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:55.083 [2024-07-26 05:57:09.804329] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x12cb860 00:28:55.083 [2024-07-26 05:57:09.804341] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:55.083 [2024-07-26 05:57:09.805803] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:55.083 [2024-07-26 05:57:09.805830] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:28:55.083 BaseBdev2 00:28:55.083 05:57:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b spare_malloc 00:28:55.342 spare_malloc 00:28:55.342 05:57:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:28:55.601 spare_delay 00:28:55.601 05:57:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@608 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:28:55.859 [2024-07-26 05:57:10.534952] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:28:55.859 [2024-07-26 05:57:10.534999] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:55.860 [2024-07-26 05:57:10.535019] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1479ec0 00:28:55.860 [2024-07-26 05:57:10.535032] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:55.860 [2024-07-26 05:57:10.536541] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:55.860 [2024-07-26 05:57:10.536569] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:28:55.860 spare 00:28:55.860 05:57:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:28:56.118 [2024-07-26 05:57:10.783632] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:28:56.118 [2024-07-26 05:57:10.784796] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:28:56.118 [2024-07-26 05:57:10.784958] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x147b070 00:28:56.118 [2024-07-26 05:57:10.784971] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:28:56.118 [2024-07-26 05:57:10.785154] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1474490 00:28:56.118 [2024-07-26 05:57:10.785291] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x147b070 00:28:56.118 [2024-07-26 05:57:10.785301] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, 
raid_bdev 0x147b070 00:28:56.118 [2024-07-26 05:57:10.785397] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:56.118 05:57:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:28:56.118 05:57:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:56.118 05:57:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:56.118 05:57:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:56.118 05:57:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:56.118 05:57:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:56.118 05:57:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:56.118 05:57:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:56.118 05:57:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:56.118 05:57:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:56.118 05:57:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:56.118 05:57:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:56.378 05:57:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:56.378 "name": "raid_bdev1", 00:28:56.378 "uuid": "8f873b3e-cd2a-4e5e-91ff-d3136856b099", 00:28:56.378 "strip_size_kb": 0, 00:28:56.378 "state": "online", 00:28:56.378 "raid_level": "raid1", 00:28:56.378 "superblock": true, 00:28:56.378 "num_base_bdevs": 2, 00:28:56.378 
"num_base_bdevs_discovered": 2, 00:28:56.378 "num_base_bdevs_operational": 2, 00:28:56.378 "base_bdevs_list": [ 00:28:56.378 { 00:28:56.378 "name": "BaseBdev1", 00:28:56.378 "uuid": "918bd9b5-107d-5ec0-983f-95174fa76002", 00:28:56.378 "is_configured": true, 00:28:56.378 "data_offset": 256, 00:28:56.378 "data_size": 7936 00:28:56.378 }, 00:28:56.378 { 00:28:56.378 "name": "BaseBdev2", 00:28:56.378 "uuid": "b0751d45-dc8a-57f9-ac23-a76f7827d6c2", 00:28:56.378 "is_configured": true, 00:28:56.378 "data_offset": 256, 00:28:56.378 "data_size": 7936 00:28:56.378 } 00:28:56.378 ] 00:28:56.378 }' 00:28:56.378 05:57:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:56.378 05:57:11 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:28:56.945 05:57:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:28:56.945 05:57:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:28:56.945 [2024-07-26 05:57:11.806667] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:28:56.945 05:57:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=7936 00:28:56.945 05:57:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:28:56.945 05:57:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:57.204 05:57:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@618 -- # data_offset=256 00:28:57.204 05:57:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:28:57.204 05:57:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@623 -- # '[' true = true ']' 00:28:57.204 
05:57:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@624 -- # local write_unit_size 00:28:57.204 05:57:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@627 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:28:57.204 05:57:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:28:57.204 05:57:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:28:57.204 05:57:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@10 -- # local bdev_list 00:28:57.204 05:57:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:28:57.204 05:57:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@11 -- # local nbd_list 00:28:57.204 05:57:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@12 -- # local i 00:28:57.204 05:57:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:28:57.204 05:57:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:28:57.204 05:57:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:28:57.492 [2024-07-26 05:57:12.175441] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1474490 00:28:57.492 /dev/nbd0 00:28:57.492 05:57:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:28:57.492 05:57:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:28:57.492 05:57:12 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:28:57.492 05:57:12 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@867 -- # local i 00:28:57.492 05:57:12 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:28:57.492 05:57:12 
bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:28:57.492 05:57:12 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:28:57.492 05:57:12 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@871 -- # break 00:28:57.492 05:57:12 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:28:57.492 05:57:12 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:28:57.492 05:57:12 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:28:57.492 1+0 records in 00:28:57.492 1+0 records out 00:28:57.492 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000192425 s, 21.3 MB/s 00:28:57.492 05:57:12 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:57.492 05:57:12 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@884 -- # size=4096 00:28:57.492 05:57:12 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:57.492 05:57:12 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:28:57.493 05:57:12 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@887 -- # return 0 00:28:57.493 05:57:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:28:57.493 05:57:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:28:57.493 05:57:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@628 -- # '[' raid1 = raid5f ']' 00:28:57.493 05:57:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@632 -- # write_unit_size=1 00:28:57.493 05:57:12 bdev_raid.raid_rebuild_test_sb_4k -- 
bdev/bdev_raid.sh@634 -- # dd if=/dev/urandom of=/dev/nbd0 bs=4096 count=7936 oflag=direct 00:28:58.428 7936+0 records in 00:28:58.428 7936+0 records out 00:28:58.428 32505856 bytes (33 MB, 31 MiB) copied, 0.756513 s, 43.0 MB/s 00:28:58.428 05:57:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@635 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:28:58.428 05:57:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:28:58.428 05:57:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:28:58.428 05:57:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@50 -- # local nbd_list 00:28:58.428 05:57:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@51 -- # local i 00:28:58.428 05:57:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:28:58.428 05:57:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:28:58.428 [2024-07-26 05:57:13.261340] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:58.428 05:57:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:28:58.428 05:57:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:28:58.428 05:57:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:28:58.428 05:57:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:28:58.428 05:57:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:28:58.428 05:57:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:28:58.428 05:57:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@41 -- # break 00:28:58.428 05:57:13 bdev_raid.raid_rebuild_test_sb_4k 
-- bdev/nbd_common.sh@45 -- # return 0 00:28:58.428 05:57:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:28:58.687 [2024-07-26 05:57:13.502029] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:28:58.687 05:57:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:28:58.687 05:57:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:58.687 05:57:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:58.687 05:57:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:58.687 05:57:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:58.687 05:57:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:58.687 05:57:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:58.687 05:57:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:58.687 05:57:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:58.687 05:57:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:58.687 05:57:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:58.687 05:57:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:58.945 05:57:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:58.945 "name": "raid_bdev1", 00:28:58.945 "uuid": 
"8f873b3e-cd2a-4e5e-91ff-d3136856b099", 00:28:58.945 "strip_size_kb": 0, 00:28:58.945 "state": "online", 00:28:58.945 "raid_level": "raid1", 00:28:58.945 "superblock": true, 00:28:58.945 "num_base_bdevs": 2, 00:28:58.945 "num_base_bdevs_discovered": 1, 00:28:58.945 "num_base_bdevs_operational": 1, 00:28:58.945 "base_bdevs_list": [ 00:28:58.945 { 00:28:58.945 "name": null, 00:28:58.945 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:58.945 "is_configured": false, 00:28:58.945 "data_offset": 256, 00:28:58.945 "data_size": 7936 00:28:58.945 }, 00:28:58.945 { 00:28:58.945 "name": "BaseBdev2", 00:28:58.945 "uuid": "b0751d45-dc8a-57f9-ac23-a76f7827d6c2", 00:28:58.945 "is_configured": true, 00:28:58.945 "data_offset": 256, 00:28:58.945 "data_size": 7936 00:28:58.945 } 00:28:58.945 ] 00:28:58.945 }' 00:28:58.945 05:57:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:58.945 05:57:13 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:28:59.509 05:57:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:28:59.766 [2024-07-26 05:57:14.580893] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:28:59.766 [2024-07-26 05:57:14.586179] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1474490 00:28:59.766 [2024-07-26 05:57:14.588417] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:28:59.766 05:57:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@646 -- # sleep 1 00:29:01.138 05:57:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:29:01.138 05:57:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:01.138 05:57:15 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:29:01.138 05:57:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:29:01.138 05:57:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:01.138 05:57:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:01.138 05:57:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:01.138 05:57:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:01.138 "name": "raid_bdev1", 00:29:01.138 "uuid": "8f873b3e-cd2a-4e5e-91ff-d3136856b099", 00:29:01.138 "strip_size_kb": 0, 00:29:01.138 "state": "online", 00:29:01.138 "raid_level": "raid1", 00:29:01.138 "superblock": true, 00:29:01.138 "num_base_bdevs": 2, 00:29:01.138 "num_base_bdevs_discovered": 2, 00:29:01.138 "num_base_bdevs_operational": 2, 00:29:01.138 "process": { 00:29:01.138 "type": "rebuild", 00:29:01.138 "target": "spare", 00:29:01.138 "progress": { 00:29:01.138 "blocks": 3072, 00:29:01.138 "percent": 38 00:29:01.138 } 00:29:01.138 }, 00:29:01.138 "base_bdevs_list": [ 00:29:01.138 { 00:29:01.138 "name": "spare", 00:29:01.138 "uuid": "3d3d8359-b26d-55d1-aac3-4329c98b0a31", 00:29:01.138 "is_configured": true, 00:29:01.138 "data_offset": 256, 00:29:01.138 "data_size": 7936 00:29:01.138 }, 00:29:01.138 { 00:29:01.138 "name": "BaseBdev2", 00:29:01.138 "uuid": "b0751d45-dc8a-57f9-ac23-a76f7827d6c2", 00:29:01.138 "is_configured": true, 00:29:01.138 "data_offset": 256, 00:29:01.138 "data_size": 7936 00:29:01.138 } 00:29:01.138 ] 00:29:01.138 }' 00:29:01.138 05:57:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:01.138 05:57:15 bdev_raid.raid_rebuild_test_sb_4k -- 
bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:29:01.138 05:57:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:01.138 05:57:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:29:01.138 05:57:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:29:01.396 [2024-07-26 05:57:16.170822] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:29:01.396 [2024-07-26 05:57:16.201195] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:29:01.396 [2024-07-26 05:57:16.201242] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:29:01.396 [2024-07-26 05:57:16.201257] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:29:01.396 [2024-07-26 05:57:16.201265] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:29:01.396 05:57:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:29:01.396 05:57:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:01.396 05:57:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:01.396 05:57:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:01.396 05:57:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:01.396 05:57:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:29:01.396 05:57:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:01.396 05:57:16 bdev_raid.raid_rebuild_test_sb_4k -- 
bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:01.396 05:57:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:01.396 05:57:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:01.396 05:57:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:01.396 05:57:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:01.654 05:57:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:01.654 "name": "raid_bdev1", 00:29:01.654 "uuid": "8f873b3e-cd2a-4e5e-91ff-d3136856b099", 00:29:01.654 "strip_size_kb": 0, 00:29:01.654 "state": "online", 00:29:01.654 "raid_level": "raid1", 00:29:01.654 "superblock": true, 00:29:01.654 "num_base_bdevs": 2, 00:29:01.654 "num_base_bdevs_discovered": 1, 00:29:01.654 "num_base_bdevs_operational": 1, 00:29:01.654 "base_bdevs_list": [ 00:29:01.654 { 00:29:01.654 "name": null, 00:29:01.654 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:01.654 "is_configured": false, 00:29:01.654 "data_offset": 256, 00:29:01.654 "data_size": 7936 00:29:01.654 }, 00:29:01.654 { 00:29:01.654 "name": "BaseBdev2", 00:29:01.654 "uuid": "b0751d45-dc8a-57f9-ac23-a76f7827d6c2", 00:29:01.654 "is_configured": true, 00:29:01.654 "data_offset": 256, 00:29:01.654 "data_size": 7936 00:29:01.654 } 00:29:01.654 ] 00:29:01.654 }' 00:29:01.654 05:57:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:01.654 05:57:16 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:29:02.220 05:57:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:29:02.220 05:57:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local 
raid_bdev_name=raid_bdev1 00:29:02.220 05:57:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:29:02.220 05:57:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:29:02.220 05:57:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:02.220 05:57:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:02.220 05:57:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:02.479 05:57:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:02.479 "name": "raid_bdev1", 00:29:02.479 "uuid": "8f873b3e-cd2a-4e5e-91ff-d3136856b099", 00:29:02.479 "strip_size_kb": 0, 00:29:02.479 "state": "online", 00:29:02.479 "raid_level": "raid1", 00:29:02.479 "superblock": true, 00:29:02.479 "num_base_bdevs": 2, 00:29:02.479 "num_base_bdevs_discovered": 1, 00:29:02.479 "num_base_bdevs_operational": 1, 00:29:02.479 "base_bdevs_list": [ 00:29:02.479 { 00:29:02.479 "name": null, 00:29:02.479 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:02.479 "is_configured": false, 00:29:02.479 "data_offset": 256, 00:29:02.479 "data_size": 7936 00:29:02.479 }, 00:29:02.479 { 00:29:02.479 "name": "BaseBdev2", 00:29:02.479 "uuid": "b0751d45-dc8a-57f9-ac23-a76f7827d6c2", 00:29:02.479 "is_configured": true, 00:29:02.479 "data_offset": 256, 00:29:02.479 "data_size": 7936 00:29:02.479 } 00:29:02.479 ] 00:29:02.479 }' 00:29:02.479 05:57:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:02.479 05:57:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:29:02.479 05:57:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 
00:29:02.479 05:57:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:29:02.479 05:57:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:29:02.737 [2024-07-26 05:57:17.597364] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:29:02.737 [2024-07-26 05:57:17.603028] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1474490 00:29:02.737 [2024-07-26 05:57:17.604546] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:29:02.737 05:57:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@662 -- # sleep 1 00:29:04.113 05:57:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:29:04.113 05:57:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:04.113 05:57:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:29:04.113 05:57:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:29:04.113 05:57:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:04.113 05:57:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:04.113 05:57:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:04.113 05:57:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:04.113 "name": "raid_bdev1", 00:29:04.113 "uuid": "8f873b3e-cd2a-4e5e-91ff-d3136856b099", 00:29:04.113 "strip_size_kb": 0, 00:29:04.113 "state": "online", 00:29:04.113 
"raid_level": "raid1", 00:29:04.113 "superblock": true, 00:29:04.113 "num_base_bdevs": 2, 00:29:04.113 "num_base_bdevs_discovered": 2, 00:29:04.113 "num_base_bdevs_operational": 2, 00:29:04.113 "process": { 00:29:04.113 "type": "rebuild", 00:29:04.113 "target": "spare", 00:29:04.113 "progress": { 00:29:04.113 "blocks": 3072, 00:29:04.113 "percent": 38 00:29:04.113 } 00:29:04.113 }, 00:29:04.113 "base_bdevs_list": [ 00:29:04.113 { 00:29:04.113 "name": "spare", 00:29:04.113 "uuid": "3d3d8359-b26d-55d1-aac3-4329c98b0a31", 00:29:04.113 "is_configured": true, 00:29:04.113 "data_offset": 256, 00:29:04.113 "data_size": 7936 00:29:04.113 }, 00:29:04.113 { 00:29:04.113 "name": "BaseBdev2", 00:29:04.113 "uuid": "b0751d45-dc8a-57f9-ac23-a76f7827d6c2", 00:29:04.113 "is_configured": true, 00:29:04.113 "data_offset": 256, 00:29:04.113 "data_size": 7936 00:29:04.113 } 00:29:04.113 ] 00:29:04.113 }' 00:29:04.113 05:57:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:04.113 05:57:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:29:04.113 05:57:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:04.113 05:57:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:29:04.113 05:57:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:29:04.113 05:57:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:29:04.113 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:29:04.113 05:57:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:29:04.113 05:57:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:29:04.113 05:57:18 bdev_raid.raid_rebuild_test_sb_4k -- 
bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']' 00:29:04.113 05:57:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@705 -- # local timeout=1008 00:29:04.113 05:57:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:29:04.113 05:57:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:29:04.113 05:57:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:04.113 05:57:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:29:04.113 05:57:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:29:04.113 05:57:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:04.113 05:57:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:04.113 05:57:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:04.371 05:57:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:04.372 "name": "raid_bdev1", 00:29:04.372 "uuid": "8f873b3e-cd2a-4e5e-91ff-d3136856b099", 00:29:04.372 "strip_size_kb": 0, 00:29:04.372 "state": "online", 00:29:04.372 "raid_level": "raid1", 00:29:04.372 "superblock": true, 00:29:04.372 "num_base_bdevs": 2, 00:29:04.372 "num_base_bdevs_discovered": 2, 00:29:04.372 "num_base_bdevs_operational": 2, 00:29:04.372 "process": { 00:29:04.372 "type": "rebuild", 00:29:04.372 "target": "spare", 00:29:04.372 "progress": { 00:29:04.372 "blocks": 3584, 00:29:04.372 "percent": 45 00:29:04.372 } 00:29:04.372 }, 00:29:04.372 "base_bdevs_list": [ 00:29:04.372 { 00:29:04.372 "name": "spare", 00:29:04.372 "uuid": "3d3d8359-b26d-55d1-aac3-4329c98b0a31", 00:29:04.372 "is_configured": 
true, 00:29:04.372 "data_offset": 256, 00:29:04.372 "data_size": 7936 00:29:04.372 }, 00:29:04.372 { 00:29:04.372 "name": "BaseBdev2", 00:29:04.372 "uuid": "b0751d45-dc8a-57f9-ac23-a76f7827d6c2", 00:29:04.372 "is_configured": true, 00:29:04.372 "data_offset": 256, 00:29:04.372 "data_size": 7936 00:29:04.372 } 00:29:04.372 ] 00:29:04.372 }' 00:29:04.372 05:57:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:04.372 05:57:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:29:04.372 05:57:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:04.372 05:57:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:29:04.372 05:57:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@710 -- # sleep 1 00:29:05.307 05:57:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:29:05.307 05:57:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:29:05.307 05:57:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:05.307 05:57:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:29:05.307 05:57:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:29:05.566 05:57:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:05.566 05:57:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:05.566 05:57:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:05.566 05:57:20 bdev_raid.raid_rebuild_test_sb_4k -- 
bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:05.566 "name": "raid_bdev1", 00:29:05.566 "uuid": "8f873b3e-cd2a-4e5e-91ff-d3136856b099", 00:29:05.566 "strip_size_kb": 0, 00:29:05.566 "state": "online", 00:29:05.566 "raid_level": "raid1", 00:29:05.566 "superblock": true, 00:29:05.566 "num_base_bdevs": 2, 00:29:05.566 "num_base_bdevs_discovered": 2, 00:29:05.566 "num_base_bdevs_operational": 2, 00:29:05.566 "process": { 00:29:05.566 "type": "rebuild", 00:29:05.566 "target": "spare", 00:29:05.566 "progress": { 00:29:05.566 "blocks": 7168, 00:29:05.566 "percent": 90 00:29:05.566 } 00:29:05.566 }, 00:29:05.566 "base_bdevs_list": [ 00:29:05.566 { 00:29:05.566 "name": "spare", 00:29:05.566 "uuid": "3d3d8359-b26d-55d1-aac3-4329c98b0a31", 00:29:05.566 "is_configured": true, 00:29:05.566 "data_offset": 256, 00:29:05.566 "data_size": 7936 00:29:05.566 }, 00:29:05.566 { 00:29:05.566 "name": "BaseBdev2", 00:29:05.566 "uuid": "b0751d45-dc8a-57f9-ac23-a76f7827d6c2", 00:29:05.566 "is_configured": true, 00:29:05.566 "data_offset": 256, 00:29:05.566 "data_size": 7936 00:29:05.566 } 00:29:05.566 ] 00:29:05.566 }' 00:29:05.566 05:57:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:05.825 05:57:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:29:05.825 05:57:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:05.825 05:57:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:29:05.825 05:57:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@710 -- # sleep 1 00:29:05.825 [2024-07-26 05:57:20.728647] bdev_raid.c:2870:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:29:05.825 [2024-07-26 05:57:20.728706] bdev_raid.c:2532:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:29:05.825 [2024-07-26 05:57:20.728786] 
bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:29:06.760 05:57:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:29:06.760 05:57:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:29:06.760 05:57:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:06.760 05:57:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:29:06.760 05:57:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:29:06.760 05:57:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:06.760 05:57:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:06.760 05:57:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:07.019 05:57:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:07.019 "name": "raid_bdev1", 00:29:07.019 "uuid": "8f873b3e-cd2a-4e5e-91ff-d3136856b099", 00:29:07.019 "strip_size_kb": 0, 00:29:07.019 "state": "online", 00:29:07.019 "raid_level": "raid1", 00:29:07.019 "superblock": true, 00:29:07.019 "num_base_bdevs": 2, 00:29:07.019 "num_base_bdevs_discovered": 2, 00:29:07.019 "num_base_bdevs_operational": 2, 00:29:07.019 "base_bdevs_list": [ 00:29:07.019 { 00:29:07.019 "name": "spare", 00:29:07.019 "uuid": "3d3d8359-b26d-55d1-aac3-4329c98b0a31", 00:29:07.019 "is_configured": true, 00:29:07.019 "data_offset": 256, 00:29:07.019 "data_size": 7936 00:29:07.019 }, 00:29:07.019 { 00:29:07.019 "name": "BaseBdev2", 00:29:07.019 "uuid": "b0751d45-dc8a-57f9-ac23-a76f7827d6c2", 00:29:07.019 "is_configured": true, 00:29:07.019 "data_offset": 256, 00:29:07.019 
"data_size": 7936 00:29:07.019 } 00:29:07.019 ] 00:29:07.019 }' 00:29:07.019 05:57:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:07.019 05:57:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:29:07.019 05:57:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:07.019 05:57:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:29:07.019 05:57:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@708 -- # break 00:29:07.019 05:57:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:29:07.019 05:57:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:07.019 05:57:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:29:07.019 05:57:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:29:07.019 05:57:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:07.019 05:57:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:07.019 05:57:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:07.277 05:57:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:07.277 "name": "raid_bdev1", 00:29:07.277 "uuid": "8f873b3e-cd2a-4e5e-91ff-d3136856b099", 00:29:07.277 "strip_size_kb": 0, 00:29:07.277 "state": "online", 00:29:07.277 "raid_level": "raid1", 00:29:07.277 "superblock": true, 00:29:07.277 "num_base_bdevs": 2, 00:29:07.277 "num_base_bdevs_discovered": 2, 00:29:07.277 "num_base_bdevs_operational": 2, 00:29:07.277 
"base_bdevs_list": [ 00:29:07.277 { 00:29:07.277 "name": "spare", 00:29:07.277 "uuid": "3d3d8359-b26d-55d1-aac3-4329c98b0a31", 00:29:07.277 "is_configured": true, 00:29:07.277 "data_offset": 256, 00:29:07.277 "data_size": 7936 00:29:07.277 }, 00:29:07.277 { 00:29:07.277 "name": "BaseBdev2", 00:29:07.277 "uuid": "b0751d45-dc8a-57f9-ac23-a76f7827d6c2", 00:29:07.277 "is_configured": true, 00:29:07.277 "data_offset": 256, 00:29:07.277 "data_size": 7936 00:29:07.277 } 00:29:07.277 ] 00:29:07.277 }' 00:29:07.277 05:57:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:07.277 05:57:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:29:07.277 05:57:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:07.536 05:57:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:29:07.536 05:57:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:29:07.536 05:57:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:07.536 05:57:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:07.536 05:57:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:07.536 05:57:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:07.536 05:57:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:29:07.536 05:57:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:07.536 05:57:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:07.536 05:57:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 
00:29:07.536 05:57:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:07.536 05:57:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:07.536 05:57:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:07.795 05:57:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:07.795 "name": "raid_bdev1", 00:29:07.795 "uuid": "8f873b3e-cd2a-4e5e-91ff-d3136856b099", 00:29:07.795 "strip_size_kb": 0, 00:29:07.795 "state": "online", 00:29:07.795 "raid_level": "raid1", 00:29:07.795 "superblock": true, 00:29:07.795 "num_base_bdevs": 2, 00:29:07.795 "num_base_bdevs_discovered": 2, 00:29:07.795 "num_base_bdevs_operational": 2, 00:29:07.795 "base_bdevs_list": [ 00:29:07.795 { 00:29:07.795 "name": "spare", 00:29:07.795 "uuid": "3d3d8359-b26d-55d1-aac3-4329c98b0a31", 00:29:07.795 "is_configured": true, 00:29:07.795 "data_offset": 256, 00:29:07.795 "data_size": 7936 00:29:07.795 }, 00:29:07.795 { 00:29:07.795 "name": "BaseBdev2", 00:29:07.795 "uuid": "b0751d45-dc8a-57f9-ac23-a76f7827d6c2", 00:29:07.795 "is_configured": true, 00:29:07.795 "data_offset": 256, 00:29:07.795 "data_size": 7936 00:29:07.795 } 00:29:07.795 ] 00:29:07.795 }' 00:29:07.795 05:57:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:07.795 05:57:22 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:29:08.362 05:57:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:29:08.362 [2024-07-26 05:57:23.223861] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:29:08.362 [2024-07-26 05:57:23.223891] bdev_raid.c:1870:raid_bdev_deconfigure: 
*DEBUG*: raid bdev state changing from online to offline 00:29:08.362 [2024-07-26 05:57:23.223953] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:29:08.362 [2024-07-26 05:57:23.224012] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:29:08.362 [2024-07-26 05:57:23.224023] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x147b070 name raid_bdev1, state offline 00:29:08.362 05:57:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:08.362 05:57:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@719 -- # jq length 00:29:08.620 05:57:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:29:08.620 05:57:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:29:08.620 05:57:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@722 -- # '[' false = true ']' 00:29:08.620 05:57:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@736 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:29:08.620 05:57:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:29:08.620 05:57:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:29:08.620 05:57:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@10 -- # local bdev_list 00:29:08.620 05:57:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:29:08.620 05:57:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@11 -- # local nbd_list 00:29:08.620 05:57:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@12 -- # local i 00:29:08.620 05:57:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i 
= 0 )) 00:29:08.620 05:57:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:29:08.620 05:57:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:29:08.878 /dev/nbd0 00:29:08.878 05:57:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:29:08.878 05:57:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:29:08.878 05:57:23 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:29:08.878 05:57:23 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@867 -- # local i 00:29:08.878 05:57:23 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:29:08.878 05:57:23 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:29:08.878 05:57:23 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:29:08.878 05:57:23 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@871 -- # break 00:29:08.878 05:57:23 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:29:08.878 05:57:23 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:29:08.878 05:57:23 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:29:08.878 1+0 records in 00:29:08.878 1+0 records out 00:29:08.878 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000244113 s, 16.8 MB/s 00:29:08.878 05:57:23 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:29:08.878 05:57:23 bdev_raid.raid_rebuild_test_sb_4k -- 
common/autotest_common.sh@884 -- # size=4096 00:29:08.878 05:57:23 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:29:08.878 05:57:23 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:29:08.878 05:57:23 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@887 -- # return 0 00:29:08.878 05:57:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:29:08.878 05:57:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:29:08.878 05:57:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:29:09.137 /dev/nbd1 00:29:09.137 05:57:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:29:09.137 05:57:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:29:09.137 05:57:23 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:29:09.137 05:57:23 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@867 -- # local i 00:29:09.137 05:57:23 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:29:09.137 05:57:23 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:29:09.137 05:57:23 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:29:09.137 05:57:23 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@871 -- # break 00:29:09.137 05:57:23 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:29:09.137 05:57:23 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:29:09.137 05:57:23 bdev_raid.raid_rebuild_test_sb_4k -- 
common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:29:09.137 1+0 records in 00:29:09.137 1+0 records out 00:29:09.137 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000342539 s, 12.0 MB/s 00:29:09.137 05:57:23 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:29:09.137 05:57:23 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@884 -- # size=4096 00:29:09.137 05:57:23 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:29:09.137 05:57:23 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:29:09.137 05:57:23 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@887 -- # return 0 00:29:09.137 05:57:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:29:09.137 05:57:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:29:09.137 05:57:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@737 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:29:09.137 05:57:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@738 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:29:09.137 05:57:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:29:09.137 05:57:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:29:09.137 05:57:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@50 -- # local nbd_list 00:29:09.137 05:57:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@51 -- # local i 00:29:09.137 05:57:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:29:09.137 05:57:23 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:29:09.397 05:57:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:29:09.397 05:57:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:29:09.397 05:57:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:29:09.397 05:57:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:29:09.397 05:57:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:29:09.397 05:57:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:29:09.397 05:57:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@41 -- # break 00:29:09.397 05:57:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@45 -- # return 0 00:29:09.397 05:57:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:29:09.397 05:57:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:29:09.656 05:57:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:29:09.656 05:57:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:29:09.656 05:57:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:29:09.656 05:57:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:29:09.656 05:57:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:29:09.656 05:57:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:29:09.656 05:57:24 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@41 -- # break 00:29:09.656 05:57:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@45 -- # return 0 00:29:09.656 05:57:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:29:09.656 05:57:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:29:09.656 05:57:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@745 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:29:09.915 [2024-07-26 05:57:24.670842] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:29:09.915 [2024-07-26 05:57:24.670890] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:29:09.916 [2024-07-26 05:57:24.670914] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x13e6b00 00:29:09.916 [2024-07-26 05:57:24.670927] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:29:09.916 [2024-07-26 05:57:24.672562] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:29:09.916 [2024-07-26 05:57:24.672592] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:29:09.916 [2024-07-26 05:57:24.672680] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:29:09.916 [2024-07-26 05:57:24.672709] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:29:09.916 [2024-07-26 05:57:24.672810] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:29:09.916 spare 00:29:09.916 05:57:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:29:09.916 05:57:24 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:09.916 05:57:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:09.916 05:57:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:09.916 05:57:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:09.916 05:57:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:29:09.916 05:57:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:09.916 05:57:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:09.916 05:57:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:09.916 05:57:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:09.916 05:57:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:09.916 05:57:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:09.916 [2024-07-26 05:57:24.773123] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x14790c0 00:29:09.916 [2024-07-26 05:57:24.773139] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:29:09.916 [2024-07-26 05:57:24.773340] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x147c2e0 00:29:09.916 [2024-07-26 05:57:24.773496] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x14790c0 00:29:09.916 [2024-07-26 05:57:24.773507] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x14790c0 00:29:09.916 [2024-07-26 05:57:24.773613] 
bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:29:10.175 05:57:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:10.175 "name": "raid_bdev1", 00:29:10.175 "uuid": "8f873b3e-cd2a-4e5e-91ff-d3136856b099", 00:29:10.175 "strip_size_kb": 0, 00:29:10.175 "state": "online", 00:29:10.175 "raid_level": "raid1", 00:29:10.175 "superblock": true, 00:29:10.175 "num_base_bdevs": 2, 00:29:10.175 "num_base_bdevs_discovered": 2, 00:29:10.175 "num_base_bdevs_operational": 2, 00:29:10.175 "base_bdevs_list": [ 00:29:10.175 { 00:29:10.175 "name": "spare", 00:29:10.175 "uuid": "3d3d8359-b26d-55d1-aac3-4329c98b0a31", 00:29:10.175 "is_configured": true, 00:29:10.175 "data_offset": 256, 00:29:10.175 "data_size": 7936 00:29:10.175 }, 00:29:10.175 { 00:29:10.175 "name": "BaseBdev2", 00:29:10.175 "uuid": "b0751d45-dc8a-57f9-ac23-a76f7827d6c2", 00:29:10.175 "is_configured": true, 00:29:10.175 "data_offset": 256, 00:29:10.175 "data_size": 7936 00:29:10.175 } 00:29:10.175 ] 00:29:10.175 }' 00:29:10.175 05:57:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:10.175 05:57:24 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:29:10.743 05:57:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:29:10.743 05:57:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:10.743 05:57:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:29:10.743 05:57:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:29:10.743 05:57:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:10.743 05:57:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:10.743 05:57:25 bdev_raid.raid_rebuild_test_sb_4k 
-- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:10.743 05:57:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:10.743 "name": "raid_bdev1", 00:29:10.743 "uuid": "8f873b3e-cd2a-4e5e-91ff-d3136856b099", 00:29:10.743 "strip_size_kb": 0, 00:29:10.743 "state": "online", 00:29:10.743 "raid_level": "raid1", 00:29:10.743 "superblock": true, 00:29:10.743 "num_base_bdevs": 2, 00:29:10.743 "num_base_bdevs_discovered": 2, 00:29:10.743 "num_base_bdevs_operational": 2, 00:29:10.743 "base_bdevs_list": [ 00:29:10.743 { 00:29:10.743 "name": "spare", 00:29:10.743 "uuid": "3d3d8359-b26d-55d1-aac3-4329c98b0a31", 00:29:10.743 "is_configured": true, 00:29:10.743 "data_offset": 256, 00:29:10.743 "data_size": 7936 00:29:10.743 }, 00:29:10.743 { 00:29:10.743 "name": "BaseBdev2", 00:29:10.743 "uuid": "b0751d45-dc8a-57f9-ac23-a76f7827d6c2", 00:29:10.743 "is_configured": true, 00:29:10.743 "data_offset": 256, 00:29:10.743 "data_size": 7936 00:29:10.743 } 00:29:10.743 ] 00:29:10.743 }' 00:29:10.743 05:57:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:11.002 05:57:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:29:11.002 05:57:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:11.002 05:57:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:29:11.002 05:57:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:11.002 05:57:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:29:11.261 05:57:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@749 -- # [[ spare == 
\s\p\a\r\e ]] 00:29:11.261 05:57:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:29:11.520 [2024-07-26 05:57:26.194980] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:29:11.520 05:57:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:29:11.520 05:57:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:11.520 05:57:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:11.520 05:57:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:11.520 05:57:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:11.520 05:57:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:29:11.520 05:57:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:11.520 05:57:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:11.520 05:57:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:11.520 05:57:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:11.520 05:57:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:11.520 05:57:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:11.780 05:57:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:11.780 "name": "raid_bdev1", 00:29:11.780 "uuid": "8f873b3e-cd2a-4e5e-91ff-d3136856b099", 
00:29:11.780 "strip_size_kb": 0, 00:29:11.780 "state": "online", 00:29:11.780 "raid_level": "raid1", 00:29:11.780 "superblock": true, 00:29:11.780 "num_base_bdevs": 2, 00:29:11.780 "num_base_bdevs_discovered": 1, 00:29:11.780 "num_base_bdevs_operational": 1, 00:29:11.780 "base_bdevs_list": [ 00:29:11.780 { 00:29:11.780 "name": null, 00:29:11.780 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:11.780 "is_configured": false, 00:29:11.780 "data_offset": 256, 00:29:11.780 "data_size": 7936 00:29:11.780 }, 00:29:11.780 { 00:29:11.780 "name": "BaseBdev2", 00:29:11.780 "uuid": "b0751d45-dc8a-57f9-ac23-a76f7827d6c2", 00:29:11.780 "is_configured": true, 00:29:11.780 "data_offset": 256, 00:29:11.780 "data_size": 7936 00:29:11.780 } 00:29:11.780 ] 00:29:11.780 }' 00:29:11.780 05:57:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:11.780 05:57:26 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:29:12.347 05:57:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:29:12.348 [2024-07-26 05:57:27.185615] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:29:12.348 [2024-07-26 05:57:27.185772] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:29:12.348 [2024-07-26 05:57:27.185789] bdev_raid.c:3712:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
00:29:12.348 [2024-07-26 05:57:27.185817] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:29:12.348 [2024-07-26 05:57:27.190606] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x12ca690 00:29:12.348 [2024-07-26 05:57:27.192936] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:29:12.348 05:57:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@755 -- # sleep 1 00:29:13.724 05:57:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:29:13.724 05:57:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:13.724 05:57:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:29:13.724 05:57:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:29:13.724 05:57:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:13.724 05:57:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:13.724 05:57:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:13.724 05:57:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:13.724 "name": "raid_bdev1", 00:29:13.724 "uuid": "8f873b3e-cd2a-4e5e-91ff-d3136856b099", 00:29:13.724 "strip_size_kb": 0, 00:29:13.724 "state": "online", 00:29:13.724 "raid_level": "raid1", 00:29:13.724 "superblock": true, 00:29:13.724 "num_base_bdevs": 2, 00:29:13.724 "num_base_bdevs_discovered": 2, 00:29:13.724 "num_base_bdevs_operational": 2, 00:29:13.724 "process": { 00:29:13.724 "type": "rebuild", 00:29:13.724 "target": "spare", 00:29:13.724 "progress": { 00:29:13.724 "blocks": 3072, 
00:29:13.724 "percent": 38 00:29:13.724 } 00:29:13.724 }, 00:29:13.724 "base_bdevs_list": [ 00:29:13.724 { 00:29:13.724 "name": "spare", 00:29:13.724 "uuid": "3d3d8359-b26d-55d1-aac3-4329c98b0a31", 00:29:13.724 "is_configured": true, 00:29:13.724 "data_offset": 256, 00:29:13.724 "data_size": 7936 00:29:13.724 }, 00:29:13.724 { 00:29:13.724 "name": "BaseBdev2", 00:29:13.724 "uuid": "b0751d45-dc8a-57f9-ac23-a76f7827d6c2", 00:29:13.724 "is_configured": true, 00:29:13.724 "data_offset": 256, 00:29:13.724 "data_size": 7936 00:29:13.724 } 00:29:13.724 ] 00:29:13.724 }' 00:29:13.724 05:57:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:13.724 05:57:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:29:13.724 05:57:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:13.724 05:57:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:29:13.724 05:57:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@759 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:29:13.983 [2024-07-26 05:57:28.771086] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:29:13.983 [2024-07-26 05:57:28.805406] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:29:13.983 [2024-07-26 05:57:28.805450] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:29:13.983 [2024-07-26 05:57:28.805465] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:29:13.983 [2024-07-26 05:57:28.805473] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:29:13.983 05:57:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online 
raid1 0 1 00:29:13.983 05:57:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:13.983 05:57:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:13.983 05:57:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:13.983 05:57:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:13.983 05:57:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:29:13.983 05:57:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:13.983 05:57:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:13.983 05:57:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:13.983 05:57:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:13.983 05:57:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:13.983 05:57:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:14.242 05:57:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:14.242 "name": "raid_bdev1", 00:29:14.242 "uuid": "8f873b3e-cd2a-4e5e-91ff-d3136856b099", 00:29:14.242 "strip_size_kb": 0, 00:29:14.242 "state": "online", 00:29:14.242 "raid_level": "raid1", 00:29:14.242 "superblock": true, 00:29:14.242 "num_base_bdevs": 2, 00:29:14.242 "num_base_bdevs_discovered": 1, 00:29:14.242 "num_base_bdevs_operational": 1, 00:29:14.242 "base_bdevs_list": [ 00:29:14.242 { 00:29:14.242 "name": null, 00:29:14.242 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:14.242 "is_configured": false, 00:29:14.242 "data_offset": 
256, 00:29:14.242 "data_size": 7936 00:29:14.242 }, 00:29:14.242 { 00:29:14.242 "name": "BaseBdev2", 00:29:14.242 "uuid": "b0751d45-dc8a-57f9-ac23-a76f7827d6c2", 00:29:14.242 "is_configured": true, 00:29:14.242 "data_offset": 256, 00:29:14.242 "data_size": 7936 00:29:14.242 } 00:29:14.242 ] 00:29:14.242 }' 00:29:14.242 05:57:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:14.242 05:57:29 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:29:14.809 05:57:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:29:15.069 [2024-07-26 05:57:29.909442] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:29:15.069 [2024-07-26 05:57:29.909494] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:29:15.069 [2024-07-26 05:57:29.909516] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1475860 00:29:15.069 [2024-07-26 05:57:29.909529] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:29:15.069 [2024-07-26 05:57:29.909922] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:29:15.069 [2024-07-26 05:57:29.909941] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:29:15.069 [2024-07-26 05:57:29.910025] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:29:15.069 [2024-07-26 05:57:29.910038] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:29:15.069 [2024-07-26 05:57:29.910049] bdev_raid.c:3712:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
00:29:15.069 [2024-07-26 05:57:29.910073] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:29:15.069 [2024-07-26 05:57:29.914955] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x147c8f0 00:29:15.069 spare 00:29:15.069 [2024-07-26 05:57:29.916461] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:29:15.069 05:57:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@762 -- # sleep 1 00:29:16.073 05:57:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:29:16.073 05:57:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:16.073 05:57:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:29:16.073 05:57:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:29:16.073 05:57:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:16.073 05:57:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:16.073 05:57:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:16.332 05:57:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:16.332 "name": "raid_bdev1", 00:29:16.332 "uuid": "8f873b3e-cd2a-4e5e-91ff-d3136856b099", 00:29:16.332 "strip_size_kb": 0, 00:29:16.332 "state": "online", 00:29:16.332 "raid_level": "raid1", 00:29:16.332 "superblock": true, 00:29:16.332 "num_base_bdevs": 2, 00:29:16.332 "num_base_bdevs_discovered": 2, 00:29:16.332 "num_base_bdevs_operational": 2, 00:29:16.332 "process": { 00:29:16.332 "type": "rebuild", 00:29:16.332 "target": "spare", 00:29:16.332 "progress": { 00:29:16.332 
"blocks": 3072, 00:29:16.332 "percent": 38 00:29:16.332 } 00:29:16.332 }, 00:29:16.332 "base_bdevs_list": [ 00:29:16.332 { 00:29:16.332 "name": "spare", 00:29:16.332 "uuid": "3d3d8359-b26d-55d1-aac3-4329c98b0a31", 00:29:16.332 "is_configured": true, 00:29:16.332 "data_offset": 256, 00:29:16.332 "data_size": 7936 00:29:16.332 }, 00:29:16.332 { 00:29:16.332 "name": "BaseBdev2", 00:29:16.332 "uuid": "b0751d45-dc8a-57f9-ac23-a76f7827d6c2", 00:29:16.332 "is_configured": true, 00:29:16.332 "data_offset": 256, 00:29:16.332 "data_size": 7936 00:29:16.332 } 00:29:16.332 ] 00:29:16.332 }' 00:29:16.332 05:57:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:16.332 05:57:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:29:16.332 05:57:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:16.590 05:57:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:29:16.590 05:57:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:29:16.849 [2024-07-26 05:57:31.511366] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:29:16.849 [2024-07-26 05:57:31.529162] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:29:16.849 [2024-07-26 05:57:31.529206] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:29:16.849 [2024-07-26 05:57:31.529221] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:29:16.849 [2024-07-26 05:57:31.529230] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:29:16.849 05:57:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state 
raid_bdev1 online raid1 0 1 00:29:16.849 05:57:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:16.849 05:57:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:16.849 05:57:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:16.849 05:57:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:16.849 05:57:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:29:16.849 05:57:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:16.849 05:57:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:16.849 05:57:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:16.849 05:57:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:16.849 05:57:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:16.849 05:57:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:17.108 05:57:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:17.108 "name": "raid_bdev1", 00:29:17.108 "uuid": "8f873b3e-cd2a-4e5e-91ff-d3136856b099", 00:29:17.108 "strip_size_kb": 0, 00:29:17.108 "state": "online", 00:29:17.108 "raid_level": "raid1", 00:29:17.108 "superblock": true, 00:29:17.108 "num_base_bdevs": 2, 00:29:17.108 "num_base_bdevs_discovered": 1, 00:29:17.108 "num_base_bdevs_operational": 1, 00:29:17.108 "base_bdevs_list": [ 00:29:17.108 { 00:29:17.108 "name": null, 00:29:17.108 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:17.108 "is_configured": false, 00:29:17.108 
"data_offset": 256, 00:29:17.108 "data_size": 7936 00:29:17.108 }, 00:29:17.108 { 00:29:17.108 "name": "BaseBdev2", 00:29:17.108 "uuid": "b0751d45-dc8a-57f9-ac23-a76f7827d6c2", 00:29:17.108 "is_configured": true, 00:29:17.108 "data_offset": 256, 00:29:17.108 "data_size": 7936 00:29:17.108 } 00:29:17.108 ] 00:29:17.108 }' 00:29:17.108 05:57:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:17.108 05:57:31 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:29:17.676 05:57:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:29:17.676 05:57:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:17.676 05:57:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:29:17.676 05:57:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:29:17.676 05:57:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:17.676 05:57:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:17.676 05:57:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:17.935 05:57:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:17.935 "name": "raid_bdev1", 00:29:17.935 "uuid": "8f873b3e-cd2a-4e5e-91ff-d3136856b099", 00:29:17.935 "strip_size_kb": 0, 00:29:17.935 "state": "online", 00:29:17.935 "raid_level": "raid1", 00:29:17.935 "superblock": true, 00:29:17.935 "num_base_bdevs": 2, 00:29:17.935 "num_base_bdevs_discovered": 1, 00:29:17.935 "num_base_bdevs_operational": 1, 00:29:17.935 "base_bdevs_list": [ 00:29:17.935 { 00:29:17.935 "name": null, 00:29:17.935 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:29:17.935 "is_configured": false, 00:29:17.935 "data_offset": 256, 00:29:17.935 "data_size": 7936 00:29:17.935 }, 00:29:17.935 { 00:29:17.935 "name": "BaseBdev2", 00:29:17.935 "uuid": "b0751d45-dc8a-57f9-ac23-a76f7827d6c2", 00:29:17.935 "is_configured": true, 00:29:17.935 "data_offset": 256, 00:29:17.935 "data_size": 7936 00:29:17.935 } 00:29:17.935 ] 00:29:17.935 }' 00:29:17.935 05:57:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:17.935 05:57:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:29:17.935 05:57:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:17.935 05:57:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:29:17.935 05:57:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:29:18.195 05:57:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:29:18.453 [2024-07-26 05:57:33.185722] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:29:18.453 [2024-07-26 05:57:33.185774] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:29:18.453 [2024-07-26 05:57:33.185795] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1475aa0 00:29:18.453 [2024-07-26 05:57:33.185808] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:29:18.453 [2024-07-26 05:57:33.186186] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:29:18.453 [2024-07-26 05:57:33.186205] vbdev_passthru.c: 710:vbdev_passthru_register: 
*NOTICE*: created pt_bdev for: BaseBdev1 00:29:18.453 [2024-07-26 05:57:33.186276] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:29:18.453 [2024-07-26 05:57:33.186289] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:29:18.453 [2024-07-26 05:57:33.186299] bdev_raid.c:3673:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:29:18.453 BaseBdev1 00:29:18.453 05:57:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@773 -- # sleep 1 00:29:19.387 05:57:34 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:29:19.387 05:57:34 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:19.387 05:57:34 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:19.387 05:57:34 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:19.387 05:57:34 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:19.387 05:57:34 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:29:19.387 05:57:34 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:19.387 05:57:34 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:19.387 05:57:34 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:19.387 05:57:34 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:19.387 05:57:34 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:19.387 05:57:34 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:19.646 05:57:34 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:19.646 "name": "raid_bdev1", 00:29:19.646 "uuid": "8f873b3e-cd2a-4e5e-91ff-d3136856b099", 00:29:19.646 "strip_size_kb": 0, 00:29:19.646 "state": "online", 00:29:19.646 "raid_level": "raid1", 00:29:19.646 "superblock": true, 00:29:19.646 "num_base_bdevs": 2, 00:29:19.646 "num_base_bdevs_discovered": 1, 00:29:19.646 "num_base_bdevs_operational": 1, 00:29:19.646 "base_bdevs_list": [ 00:29:19.646 { 00:29:19.646 "name": null, 00:29:19.646 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:19.646 "is_configured": false, 00:29:19.646 "data_offset": 256, 00:29:19.646 "data_size": 7936 00:29:19.646 }, 00:29:19.646 { 00:29:19.646 "name": "BaseBdev2", 00:29:19.646 "uuid": "b0751d45-dc8a-57f9-ac23-a76f7827d6c2", 00:29:19.646 "is_configured": true, 00:29:19.646 "data_offset": 256, 00:29:19.646 "data_size": 7936 00:29:19.646 } 00:29:19.646 ] 00:29:19.646 }' 00:29:19.646 05:57:34 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:19.646 05:57:34 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:29:20.213 05:57:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:29:20.213 05:57:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:20.213 05:57:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:29:20.213 05:57:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:29:20.213 05:57:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:20.213 05:57:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
-s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:20.213 05:57:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:20.472 05:57:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:20.472 "name": "raid_bdev1", 00:29:20.472 "uuid": "8f873b3e-cd2a-4e5e-91ff-d3136856b099", 00:29:20.472 "strip_size_kb": 0, 00:29:20.472 "state": "online", 00:29:20.472 "raid_level": "raid1", 00:29:20.472 "superblock": true, 00:29:20.472 "num_base_bdevs": 2, 00:29:20.472 "num_base_bdevs_discovered": 1, 00:29:20.472 "num_base_bdevs_operational": 1, 00:29:20.472 "base_bdevs_list": [ 00:29:20.472 { 00:29:20.472 "name": null, 00:29:20.472 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:20.472 "is_configured": false, 00:29:20.472 "data_offset": 256, 00:29:20.472 "data_size": 7936 00:29:20.472 }, 00:29:20.472 { 00:29:20.472 "name": "BaseBdev2", 00:29:20.472 "uuid": "b0751d45-dc8a-57f9-ac23-a76f7827d6c2", 00:29:20.472 "is_configured": true, 00:29:20.472 "data_offset": 256, 00:29:20.472 "data_size": 7936 00:29:20.472 } 00:29:20.472 ] 00:29:20.472 }' 00:29:20.472 05:57:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:20.472 05:57:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:29:20.472 05:57:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:20.730 05:57:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:29:20.731 05:57:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:29:20.731 05:57:35 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@648 -- # local es=0 00:29:20.731 05:57:35 bdev_raid.raid_rebuild_test_sb_4k -- 
common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:29:20.731 05:57:35 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:29:20.731 05:57:35 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:29:20.731 05:57:35 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:29:20.731 05:57:35 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:29:20.731 05:57:35 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:29:20.731 05:57:35 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:29:20.731 05:57:35 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:29:20.731 05:57:35 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:29:20.731 05:57:35 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:29:20.731 [2024-07-26 05:57:35.636241] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:29:20.731 [2024-07-26 05:57:35.636371] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:29:20.731 [2024-07-26 05:57:35.636387] bdev_raid.c:3673:raid_bdev_examine_sb: 
*DEBUG*: raid superblock does not contain this bdev's uuid 00:29:20.989 request: 00:29:20.989 { 00:29:20.989 "base_bdev": "BaseBdev1", 00:29:20.989 "raid_bdev": "raid_bdev1", 00:29:20.989 "method": "bdev_raid_add_base_bdev", 00:29:20.989 "req_id": 1 00:29:20.989 } 00:29:20.989 Got JSON-RPC error response 00:29:20.989 response: 00:29:20.989 { 00:29:20.989 "code": -22, 00:29:20.989 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:29:20.989 } 00:29:20.989 05:57:35 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@651 -- # es=1 00:29:20.989 05:57:35 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:29:20.989 05:57:35 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:29:20.989 05:57:35 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:29:20.989 05:57:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@777 -- # sleep 1 00:29:21.926 05:57:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:29:21.926 05:57:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:21.926 05:57:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:21.926 05:57:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:21.926 05:57:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:21.926 05:57:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:29:21.926 05:57:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:21.926 05:57:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:21.926 05:57:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local 
num_base_bdevs_discovered 00:29:21.926 05:57:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:21.926 05:57:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:21.926 05:57:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:22.184 05:57:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:22.184 "name": "raid_bdev1", 00:29:22.184 "uuid": "8f873b3e-cd2a-4e5e-91ff-d3136856b099", 00:29:22.184 "strip_size_kb": 0, 00:29:22.184 "state": "online", 00:29:22.184 "raid_level": "raid1", 00:29:22.184 "superblock": true, 00:29:22.184 "num_base_bdevs": 2, 00:29:22.184 "num_base_bdevs_discovered": 1, 00:29:22.185 "num_base_bdevs_operational": 1, 00:29:22.185 "base_bdevs_list": [ 00:29:22.185 { 00:29:22.185 "name": null, 00:29:22.185 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:22.185 "is_configured": false, 00:29:22.185 "data_offset": 256, 00:29:22.185 "data_size": 7936 00:29:22.185 }, 00:29:22.185 { 00:29:22.185 "name": "BaseBdev2", 00:29:22.185 "uuid": "b0751d45-dc8a-57f9-ac23-a76f7827d6c2", 00:29:22.185 "is_configured": true, 00:29:22.185 "data_offset": 256, 00:29:22.185 "data_size": 7936 00:29:22.185 } 00:29:22.185 ] 00:29:22.185 }' 00:29:22.185 05:57:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:22.185 05:57:36 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:29:22.752 05:57:37 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:29:22.752 05:57:37 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:22.752 05:57:37 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:29:22.752 
05:57:37 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:29:22.752 05:57:37 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:22.752 05:57:37 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:22.752 05:57:37 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:23.012 05:57:37 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:23.012 "name": "raid_bdev1", 00:29:23.012 "uuid": "8f873b3e-cd2a-4e5e-91ff-d3136856b099", 00:29:23.012 "strip_size_kb": 0, 00:29:23.012 "state": "online", 00:29:23.012 "raid_level": "raid1", 00:29:23.012 "superblock": true, 00:29:23.012 "num_base_bdevs": 2, 00:29:23.012 "num_base_bdevs_discovered": 1, 00:29:23.012 "num_base_bdevs_operational": 1, 00:29:23.012 "base_bdevs_list": [ 00:29:23.012 { 00:29:23.012 "name": null, 00:29:23.012 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:23.012 "is_configured": false, 00:29:23.012 "data_offset": 256, 00:29:23.012 "data_size": 7936 00:29:23.012 }, 00:29:23.012 { 00:29:23.012 "name": "BaseBdev2", 00:29:23.012 "uuid": "b0751d45-dc8a-57f9-ac23-a76f7827d6c2", 00:29:23.012 "is_configured": true, 00:29:23.012 "data_offset": 256, 00:29:23.012 "data_size": 7936 00:29:23.012 } 00:29:23.012 ] 00:29:23.012 }' 00:29:23.012 05:57:37 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:23.012 05:57:37 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:29:23.012 05:57:37 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:23.012 05:57:37 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:29:23.012 05:57:37 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@782 -- # killprocess 1269651 00:29:23.012 05:57:37 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@948 -- # '[' -z 1269651 ']' 00:29:23.012 05:57:37 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@952 -- # kill -0 1269651 00:29:23.012 05:57:37 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@953 -- # uname 00:29:23.012 05:57:37 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:29:23.012 05:57:37 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1269651 00:29:23.012 05:57:37 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:29:23.012 05:57:37 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:29:23.012 05:57:37 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1269651' 00:29:23.012 killing process with pid 1269651 00:29:23.012 05:57:37 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@967 -- # kill 1269651 00:29:23.012 Received shutdown signal, test time was about 60.000000 seconds 00:29:23.012 00:29:23.012 Latency(us) 00:29:23.012 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:29:23.012 =================================================================================================================== 00:29:23.012 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:29:23.012 [2024-07-26 05:57:37.879486] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:29:23.012 [2024-07-26 05:57:37.879582] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:29:23.012 [2024-07-26 05:57:37.879627] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:29:23.012 [2024-07-26 05:57:37.879645] 
bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x14790c0 name raid_bdev1, state offline 00:29:23.012 05:57:37 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@972 -- # wait 1269651 00:29:23.012 [2024-07-26 05:57:37.906404] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:29:23.289 05:57:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@784 -- # return 0 00:29:23.289 00:29:23.289 real 0m30.286s 00:29:23.289 user 0m46.982s 00:29:23.289 sys 0m4.981s 00:29:23.289 05:57:38 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@1124 -- # xtrace_disable 00:29:23.289 05:57:38 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:29:23.289 ************************************ 00:29:23.289 END TEST raid_rebuild_test_sb_4k 00:29:23.289 ************************************ 00:29:23.289 05:57:38 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:29:23.289 05:57:38 bdev_raid -- bdev/bdev_raid.sh@904 -- # base_malloc_params='-m 32' 00:29:23.289 05:57:38 bdev_raid -- bdev/bdev_raid.sh@905 -- # run_test raid_state_function_test_sb_md_separate raid_state_function_test raid1 2 true 00:29:23.289 05:57:38 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:29:23.289 05:57:38 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:29:23.289 05:57:38 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:29:23.289 ************************************ 00:29:23.289 START TEST raid_state_function_test_sb_md_separate 00:29:23.289 ************************************ 00:29:23.289 05:57:38 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@1123 -- # raid_state_function_test raid1 2 true 00:29:23.289 05:57:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:29:23.289 05:57:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:29:23.289 
05:57:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:29:23.289 05:57:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:29:23.289 05:57:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:29:23.289 05:57:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:29:23.289 05:57:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:29:23.289 05:57:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:29:23.289 05:57:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:29:23.549 05:57:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:29:23.549 05:57:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:29:23.549 05:57:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:29:23.549 05:57:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:29:23.549 05:57:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:29:23.549 05:57:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:29:23.549 05:57:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@226 -- # local strip_size 00:29:23.549 05:57:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:29:23.549 05:57:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:29:23.549 05:57:38 
bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:29:23.549 05:57:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:29:23.549 05:57:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:29:23.549 05:57:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:29:23.549 05:57:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@244 -- # raid_pid=1273981 00:29:23.549 05:57:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1273981' 00:29:23.549 Process raid pid: 1273981 00:29:23.549 05:57:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:29:23.549 05:57:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@246 -- # waitforlisten 1273981 /var/tmp/spdk-raid.sock 00:29:23.549 05:57:38 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@829 -- # '[' -z 1273981 ']' 00:29:23.549 05:57:38 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:29:23.549 05:57:38 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@834 -- # local max_retries=100 00:29:23.549 05:57:38 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:29:23.549 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
00:29:23.549 05:57:38 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@838 -- # xtrace_disable 00:29:23.549 05:57:38 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:29:23.549 [2024-07-26 05:57:38.261222] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 00:29:23.549 [2024-07-26 05:57:38.261290] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:29:23.549 [2024-07-26 05:57:38.381861] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:29:23.807 [2024-07-26 05:57:38.489839] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:29:23.807 [2024-07-26 05:57:38.560167] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:29:23.808 [2024-07-26 05:57:38.560201] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:29:24.375 05:57:39 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:29:24.375 05:57:39 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@862 -- # return 0 00:29:24.375 05:57:39 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:29:24.634 [2024-07-26 05:57:39.419482] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:29:24.634 [2024-07-26 05:57:39.419521] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:29:24.634 [2024-07-26 05:57:39.419531] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:29:24.634 
[2024-07-26 05:57:39.419543] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:29:24.634 05:57:39 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:29:24.634 05:57:39 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:29:24.634 05:57:39 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:29:24.634 05:57:39 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:24.634 05:57:39 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:24.634 05:57:39 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:29:24.634 05:57:39 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:24.634 05:57:39 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:24.634 05:57:39 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:24.634 05:57:39 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:24.634 05:57:39 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:24.634 05:57:39 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:29:24.893 05:57:39 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:24.893 "name": "Existed_Raid", 00:29:24.893 "uuid": 
"1ba3d721-fefc-4d88-8755-274d5197aec4", 00:29:24.893 "strip_size_kb": 0, 00:29:24.893 "state": "configuring", 00:29:24.893 "raid_level": "raid1", 00:29:24.893 "superblock": true, 00:29:24.893 "num_base_bdevs": 2, 00:29:24.893 "num_base_bdevs_discovered": 0, 00:29:24.893 "num_base_bdevs_operational": 2, 00:29:24.893 "base_bdevs_list": [ 00:29:24.893 { 00:29:24.893 "name": "BaseBdev1", 00:29:24.893 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:24.893 "is_configured": false, 00:29:24.893 "data_offset": 0, 00:29:24.893 "data_size": 0 00:29:24.893 }, 00:29:24.893 { 00:29:24.893 "name": "BaseBdev2", 00:29:24.893 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:24.893 "is_configured": false, 00:29:24.893 "data_offset": 0, 00:29:24.893 "data_size": 0 00:29:24.893 } 00:29:24.893 ] 00:29:24.893 }' 00:29:24.893 05:57:39 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:24.893 05:57:39 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:29:25.460 05:57:40 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:29:25.719 [2024-07-26 05:57:40.498192] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:29:25.719 [2024-07-26 05:57:40.498224] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x127ea80 name Existed_Raid, state configuring 00:29:25.719 05:57:40 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:29:25.977 [2024-07-26 05:57:40.738846] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:29:25.977 [2024-07-26 05:57:40.738875] bdev_raid_rpc.c: 
311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:29:25.977 [2024-07-26 05:57:40.738885] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:29:25.977 [2024-07-26 05:57:40.738896] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:29:25.978 05:57:40 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b BaseBdev1 00:29:26.235 [2024-07-26 05:57:40.997963] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:29:26.235 BaseBdev1 00:29:26.235 05:57:41 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:29:26.235 05:57:41 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:29:26.235 05:57:41 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:29:26.235 05:57:41 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@899 -- # local i 00:29:26.235 05:57:41 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:29:26.235 05:57:41 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:29:26.235 05:57:41 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:29:26.492 05:57:41 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:29:26.751 [ 00:29:26.751 { 00:29:26.751 "name": 
"BaseBdev1", 00:29:26.751 "aliases": [ 00:29:26.751 "1ff29a09-41e5-4e9a-8ce8-c8d3b0bdf9ac" 00:29:26.751 ], 00:29:26.751 "product_name": "Malloc disk", 00:29:26.751 "block_size": 4096, 00:29:26.751 "num_blocks": 8192, 00:29:26.751 "uuid": "1ff29a09-41e5-4e9a-8ce8-c8d3b0bdf9ac", 00:29:26.751 "md_size": 32, 00:29:26.751 "md_interleave": false, 00:29:26.751 "dif_type": 0, 00:29:26.751 "assigned_rate_limits": { 00:29:26.751 "rw_ios_per_sec": 0, 00:29:26.751 "rw_mbytes_per_sec": 0, 00:29:26.751 "r_mbytes_per_sec": 0, 00:29:26.751 "w_mbytes_per_sec": 0 00:29:26.751 }, 00:29:26.751 "claimed": true, 00:29:26.751 "claim_type": "exclusive_write", 00:29:26.751 "zoned": false, 00:29:26.751 "supported_io_types": { 00:29:26.751 "read": true, 00:29:26.751 "write": true, 00:29:26.751 "unmap": true, 00:29:26.751 "flush": true, 00:29:26.751 "reset": true, 00:29:26.751 "nvme_admin": false, 00:29:26.751 "nvme_io": false, 00:29:26.751 "nvme_io_md": false, 00:29:26.751 "write_zeroes": true, 00:29:26.751 "zcopy": true, 00:29:26.751 "get_zone_info": false, 00:29:26.751 "zone_management": false, 00:29:26.751 "zone_append": false, 00:29:26.751 "compare": false, 00:29:26.751 "compare_and_write": false, 00:29:26.751 "abort": true, 00:29:26.751 "seek_hole": false, 00:29:26.751 "seek_data": false, 00:29:26.751 "copy": true, 00:29:26.751 "nvme_iov_md": false 00:29:26.751 }, 00:29:26.751 "memory_domains": [ 00:29:26.751 { 00:29:26.751 "dma_device_id": "system", 00:29:26.751 "dma_device_type": 1 00:29:26.751 }, 00:29:26.751 { 00:29:26.751 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:26.751 "dma_device_type": 2 00:29:26.751 } 00:29:26.751 ], 00:29:26.751 "driver_specific": {} 00:29:26.751 } 00:29:26.751 ] 00:29:26.751 05:57:41 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@905 -- # return 0 00:29:26.751 05:57:41 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:29:26.751 
05:57:41 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:29:26.751 05:57:41 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:29:26.751 05:57:41 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:26.751 05:57:41 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:26.751 05:57:41 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:29:26.751 05:57:41 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:26.751 05:57:41 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:26.751 05:57:41 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:26.751 05:57:41 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:26.751 05:57:41 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:29:26.751 05:57:41 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:27.010 05:57:41 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:27.010 "name": "Existed_Raid", 00:29:27.010 "uuid": "4b3a342a-d9bb-4f08-84e1-74dd75f9a906", 00:29:27.010 "strip_size_kb": 0, 00:29:27.010 "state": "configuring", 00:29:27.010 "raid_level": "raid1", 00:29:27.010 "superblock": true, 00:29:27.010 "num_base_bdevs": 2, 00:29:27.010 "num_base_bdevs_discovered": 1, 00:29:27.010 "num_base_bdevs_operational": 2, 00:29:27.010 
"base_bdevs_list": [ 00:29:27.010 { 00:29:27.010 "name": "BaseBdev1", 00:29:27.010 "uuid": "1ff29a09-41e5-4e9a-8ce8-c8d3b0bdf9ac", 00:29:27.010 "is_configured": true, 00:29:27.010 "data_offset": 256, 00:29:27.010 "data_size": 7936 00:29:27.010 }, 00:29:27.010 { 00:29:27.010 "name": "BaseBdev2", 00:29:27.010 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:27.010 "is_configured": false, 00:29:27.010 "data_offset": 0, 00:29:27.010 "data_size": 0 00:29:27.010 } 00:29:27.010 ] 00:29:27.010 }' 00:29:27.010 05:57:41 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:27.010 05:57:41 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:29:27.947 05:57:42 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:29:27.947 [2024-07-26 05:57:42.850893] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:29:27.947 [2024-07-26 05:57:42.850933] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x127e350 name Existed_Raid, state configuring 00:29:28.206 05:57:42 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:29:28.206 [2024-07-26 05:57:43.095580] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:29:28.206 [2024-07-26 05:57:43.097003] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:29:28.206 [2024-07-26 05:57:43.097037] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:29:28.464 05:57:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:29:28.464 
05:57:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:29:28.464 05:57:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:29:28.464 05:57:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:29:28.464 05:57:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:29:28.465 05:57:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:28.465 05:57:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:28.465 05:57:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:29:28.465 05:57:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:28.465 05:57:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:28.465 05:57:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:28.465 05:57:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:28.465 05:57:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:28.465 05:57:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:29:28.465 05:57:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:28.465 "name": "Existed_Raid", 00:29:28.465 "uuid": 
"cb51d24a-2859-40d4-9a95-b42731b87c53", 00:29:28.465 "strip_size_kb": 0, 00:29:28.465 "state": "configuring", 00:29:28.465 "raid_level": "raid1", 00:29:28.465 "superblock": true, 00:29:28.465 "num_base_bdevs": 2, 00:29:28.465 "num_base_bdevs_discovered": 1, 00:29:28.465 "num_base_bdevs_operational": 2, 00:29:28.465 "base_bdevs_list": [ 00:29:28.465 { 00:29:28.465 "name": "BaseBdev1", 00:29:28.465 "uuid": "1ff29a09-41e5-4e9a-8ce8-c8d3b0bdf9ac", 00:29:28.465 "is_configured": true, 00:29:28.465 "data_offset": 256, 00:29:28.465 "data_size": 7936 00:29:28.465 }, 00:29:28.465 { 00:29:28.465 "name": "BaseBdev2", 00:29:28.465 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:28.465 "is_configured": false, 00:29:28.465 "data_offset": 0, 00:29:28.465 "data_size": 0 00:29:28.465 } 00:29:28.465 ] 00:29:28.465 }' 00:29:28.465 05:57:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:28.465 05:57:43 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:29:29.398 05:57:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b BaseBdev2 00:29:29.398 [2024-07-26 05:57:44.190575] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:29:29.398 [2024-07-26 05:57:44.190724] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1280210 00:29:29.398 [2024-07-26 05:57:44.190738] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:29:29.398 [2024-07-26 05:57:44.190801] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x127fc50 00:29:29.398 [2024-07-26 05:57:44.190895] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1280210 00:29:29.398 [2024-07-26 05:57:44.190910] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid 
bdev is created with name Existed_Raid, raid_bdev 0x1280210 00:29:29.398 [2024-07-26 05:57:44.190977] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:29:29.398 BaseBdev2 00:29:29.398 05:57:44 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:29:29.398 05:57:44 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:29:29.398 05:57:44 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:29:29.398 05:57:44 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@899 -- # local i 00:29:29.398 05:57:44 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:29:29.398 05:57:44 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:29:29.398 05:57:44 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:29:29.656 05:57:44 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:29:29.915 [ 00:29:29.915 { 00:29:29.915 "name": "BaseBdev2", 00:29:29.915 "aliases": [ 00:29:29.915 "d7dcc515-3de5-429a-b3f3-070c81120d95" 00:29:29.915 ], 00:29:29.915 "product_name": "Malloc disk", 00:29:29.915 "block_size": 4096, 00:29:29.915 "num_blocks": 8192, 00:29:29.915 "uuid": "d7dcc515-3de5-429a-b3f3-070c81120d95", 00:29:29.915 "md_size": 32, 00:29:29.915 "md_interleave": false, 00:29:29.915 "dif_type": 0, 00:29:29.915 "assigned_rate_limits": { 00:29:29.915 "rw_ios_per_sec": 0, 00:29:29.915 "rw_mbytes_per_sec": 0, 00:29:29.915 "r_mbytes_per_sec": 0, 00:29:29.915 
"w_mbytes_per_sec": 0 00:29:29.915 }, 00:29:29.915 "claimed": true, 00:29:29.915 "claim_type": "exclusive_write", 00:29:29.915 "zoned": false, 00:29:29.915 "supported_io_types": { 00:29:29.915 "read": true, 00:29:29.915 "write": true, 00:29:29.915 "unmap": true, 00:29:29.915 "flush": true, 00:29:29.915 "reset": true, 00:29:29.915 "nvme_admin": false, 00:29:29.915 "nvme_io": false, 00:29:29.915 "nvme_io_md": false, 00:29:29.915 "write_zeroes": true, 00:29:29.915 "zcopy": true, 00:29:29.915 "get_zone_info": false, 00:29:29.915 "zone_management": false, 00:29:29.915 "zone_append": false, 00:29:29.915 "compare": false, 00:29:29.915 "compare_and_write": false, 00:29:29.915 "abort": true, 00:29:29.915 "seek_hole": false, 00:29:29.915 "seek_data": false, 00:29:29.915 "copy": true, 00:29:29.915 "nvme_iov_md": false 00:29:29.915 }, 00:29:29.915 "memory_domains": [ 00:29:29.915 { 00:29:29.915 "dma_device_id": "system", 00:29:29.915 "dma_device_type": 1 00:29:29.915 }, 00:29:29.915 { 00:29:29.915 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:29.915 "dma_device_type": 2 00:29:29.915 } 00:29:29.915 ], 00:29:29.915 "driver_specific": {} 00:29:29.915 } 00:29:29.915 ] 00:29:29.915 05:57:44 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@905 -- # return 0 00:29:29.915 05:57:44 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:29:29.915 05:57:44 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:29:29.915 05:57:44 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:29:29.915 05:57:44 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:29:29.915 05:57:44 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:29.915 05:57:44 
bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:29.915 05:57:44 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:29.915 05:57:44 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:29:29.915 05:57:44 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:29.915 05:57:44 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:29.915 05:57:44 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:29.915 05:57:44 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:29.915 05:57:44 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:29.915 05:57:44 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:29:30.174 05:57:44 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:30.174 "name": "Existed_Raid", 00:29:30.174 "uuid": "cb51d24a-2859-40d4-9a95-b42731b87c53", 00:29:30.174 "strip_size_kb": 0, 00:29:30.174 "state": "online", 00:29:30.174 "raid_level": "raid1", 00:29:30.174 "superblock": true, 00:29:30.174 "num_base_bdevs": 2, 00:29:30.174 "num_base_bdevs_discovered": 2, 00:29:30.174 "num_base_bdevs_operational": 2, 00:29:30.174 "base_bdevs_list": [ 00:29:30.174 { 00:29:30.174 "name": "BaseBdev1", 00:29:30.174 "uuid": "1ff29a09-41e5-4e9a-8ce8-c8d3b0bdf9ac", 00:29:30.174 "is_configured": true, 00:29:30.174 "data_offset": 256, 00:29:30.174 "data_size": 7936 00:29:30.174 }, 00:29:30.174 { 00:29:30.174 "name": 
"BaseBdev2", 00:29:30.174 "uuid": "d7dcc515-3de5-429a-b3f3-070c81120d95", 00:29:30.174 "is_configured": true, 00:29:30.174 "data_offset": 256, 00:29:30.174 "data_size": 7936 00:29:30.174 } 00:29:30.174 ] 00:29:30.174 }' 00:29:30.174 05:57:44 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:30.174 05:57:44 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:29:31.109 05:57:45 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:29:31.109 05:57:45 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:29:31.109 05:57:45 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:29:31.109 05:57:45 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:29:31.109 05:57:45 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:29:31.109 05:57:45 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@198 -- # local name 00:29:31.109 05:57:45 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:29:31.109 05:57:45 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:29:31.368 [2024-07-26 05:57:46.031761] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:29:31.368 05:57:46 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:29:31.368 "name": "Existed_Raid", 00:29:31.368 "aliases": [ 00:29:31.368 "cb51d24a-2859-40d4-9a95-b42731b87c53" 00:29:31.368 ], 00:29:31.368 "product_name": "Raid Volume", 00:29:31.368 "block_size": 4096, 
00:29:31.368 "num_blocks": 7936, 00:29:31.368 "uuid": "cb51d24a-2859-40d4-9a95-b42731b87c53", 00:29:31.368 "md_size": 32, 00:29:31.368 "md_interleave": false, 00:29:31.368 "dif_type": 0, 00:29:31.368 "assigned_rate_limits": { 00:29:31.368 "rw_ios_per_sec": 0, 00:29:31.368 "rw_mbytes_per_sec": 0, 00:29:31.368 "r_mbytes_per_sec": 0, 00:29:31.368 "w_mbytes_per_sec": 0 00:29:31.368 }, 00:29:31.368 "claimed": false, 00:29:31.368 "zoned": false, 00:29:31.368 "supported_io_types": { 00:29:31.368 "read": true, 00:29:31.368 "write": true, 00:29:31.368 "unmap": false, 00:29:31.368 "flush": false, 00:29:31.368 "reset": true, 00:29:31.368 "nvme_admin": false, 00:29:31.368 "nvme_io": false, 00:29:31.368 "nvme_io_md": false, 00:29:31.368 "write_zeroes": true, 00:29:31.368 "zcopy": false, 00:29:31.368 "get_zone_info": false, 00:29:31.368 "zone_management": false, 00:29:31.368 "zone_append": false, 00:29:31.368 "compare": false, 00:29:31.368 "compare_and_write": false, 00:29:31.368 "abort": false, 00:29:31.368 "seek_hole": false, 00:29:31.368 "seek_data": false, 00:29:31.368 "copy": false, 00:29:31.368 "nvme_iov_md": false 00:29:31.368 }, 00:29:31.368 "memory_domains": [ 00:29:31.368 { 00:29:31.368 "dma_device_id": "system", 00:29:31.368 "dma_device_type": 1 00:29:31.368 }, 00:29:31.368 { 00:29:31.368 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:31.368 "dma_device_type": 2 00:29:31.368 }, 00:29:31.368 { 00:29:31.368 "dma_device_id": "system", 00:29:31.368 "dma_device_type": 1 00:29:31.368 }, 00:29:31.368 { 00:29:31.368 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:31.368 "dma_device_type": 2 00:29:31.368 } 00:29:31.368 ], 00:29:31.368 "driver_specific": { 00:29:31.368 "raid": { 00:29:31.368 "uuid": "cb51d24a-2859-40d4-9a95-b42731b87c53", 00:29:31.368 "strip_size_kb": 0, 00:29:31.368 "state": "online", 00:29:31.368 "raid_level": "raid1", 00:29:31.368 "superblock": true, 00:29:31.368 "num_base_bdevs": 2, 00:29:31.368 "num_base_bdevs_discovered": 2, 00:29:31.368 
"num_base_bdevs_operational": 2, 00:29:31.368 "base_bdevs_list": [ 00:29:31.368 { 00:29:31.368 "name": "BaseBdev1", 00:29:31.368 "uuid": "1ff29a09-41e5-4e9a-8ce8-c8d3b0bdf9ac", 00:29:31.368 "is_configured": true, 00:29:31.368 "data_offset": 256, 00:29:31.368 "data_size": 7936 00:29:31.368 }, 00:29:31.368 { 00:29:31.368 "name": "BaseBdev2", 00:29:31.368 "uuid": "d7dcc515-3de5-429a-b3f3-070c81120d95", 00:29:31.368 "is_configured": true, 00:29:31.368 "data_offset": 256, 00:29:31.368 "data_size": 7936 00:29:31.368 } 00:29:31.368 ] 00:29:31.368 } 00:29:31.368 } 00:29:31.368 }' 00:29:31.368 05:57:46 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:29:31.368 05:57:46 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:29:31.368 BaseBdev2' 00:29:31.368 05:57:46 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:29:31.368 05:57:46 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:29:31.368 05:57:46 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:29:31.627 05:57:46 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:29:31.627 "name": "BaseBdev1", 00:29:31.627 "aliases": [ 00:29:31.627 "1ff29a09-41e5-4e9a-8ce8-c8d3b0bdf9ac" 00:29:31.627 ], 00:29:31.627 "product_name": "Malloc disk", 00:29:31.627 "block_size": 4096, 00:29:31.627 "num_blocks": 8192, 00:29:31.627 "uuid": "1ff29a09-41e5-4e9a-8ce8-c8d3b0bdf9ac", 00:29:31.627 "md_size": 32, 00:29:31.627 "md_interleave": false, 00:29:31.627 "dif_type": 0, 00:29:31.627 "assigned_rate_limits": { 00:29:31.627 "rw_ios_per_sec": 0, 00:29:31.627 
"rw_mbytes_per_sec": 0, 00:29:31.627 "r_mbytes_per_sec": 0, 00:29:31.627 "w_mbytes_per_sec": 0 00:29:31.627 }, 00:29:31.627 "claimed": true, 00:29:31.627 "claim_type": "exclusive_write", 00:29:31.627 "zoned": false, 00:29:31.627 "supported_io_types": { 00:29:31.627 "read": true, 00:29:31.627 "write": true, 00:29:31.627 "unmap": true, 00:29:31.627 "flush": true, 00:29:31.627 "reset": true, 00:29:31.627 "nvme_admin": false, 00:29:31.627 "nvme_io": false, 00:29:31.627 "nvme_io_md": false, 00:29:31.627 "write_zeroes": true, 00:29:31.627 "zcopy": true, 00:29:31.627 "get_zone_info": false, 00:29:31.627 "zone_management": false, 00:29:31.627 "zone_append": false, 00:29:31.627 "compare": false, 00:29:31.627 "compare_and_write": false, 00:29:31.627 "abort": true, 00:29:31.627 "seek_hole": false, 00:29:31.627 "seek_data": false, 00:29:31.627 "copy": true, 00:29:31.627 "nvme_iov_md": false 00:29:31.627 }, 00:29:31.627 "memory_domains": [ 00:29:31.627 { 00:29:31.627 "dma_device_id": "system", 00:29:31.627 "dma_device_type": 1 00:29:31.627 }, 00:29:31.627 { 00:29:31.627 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:31.627 "dma_device_type": 2 00:29:31.627 } 00:29:31.627 ], 00:29:31.627 "driver_specific": {} 00:29:31.627 }' 00:29:31.627 05:57:46 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:31.627 05:57:46 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:31.627 05:57:46 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:29:31.627 05:57:46 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:31.627 05:57:46 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:31.885 05:57:46 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:29:31.885 05:57:46 
bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:31.885 05:57:46 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:31.885 05:57:46 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:29:31.885 05:57:46 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:31.885 05:57:46 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:31.885 05:57:46 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:29:31.885 05:57:46 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:29:31.885 05:57:46 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:29:31.885 05:57:46 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:29:32.144 05:57:46 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:29:32.144 "name": "BaseBdev2", 00:29:32.144 "aliases": [ 00:29:32.144 "d7dcc515-3de5-429a-b3f3-070c81120d95" 00:29:32.144 ], 00:29:32.144 "product_name": "Malloc disk", 00:29:32.144 "block_size": 4096, 00:29:32.144 "num_blocks": 8192, 00:29:32.144 "uuid": "d7dcc515-3de5-429a-b3f3-070c81120d95", 00:29:32.144 "md_size": 32, 00:29:32.144 "md_interleave": false, 00:29:32.144 "dif_type": 0, 00:29:32.144 "assigned_rate_limits": { 00:29:32.144 "rw_ios_per_sec": 0, 00:29:32.144 "rw_mbytes_per_sec": 0, 00:29:32.144 "r_mbytes_per_sec": 0, 00:29:32.144 "w_mbytes_per_sec": 0 00:29:32.144 }, 00:29:32.144 "claimed": true, 00:29:32.144 "claim_type": "exclusive_write", 00:29:32.144 "zoned": false, 00:29:32.144 "supported_io_types": { 
00:29:32.144 "read": true, 00:29:32.144 "write": true, 00:29:32.144 "unmap": true, 00:29:32.144 "flush": true, 00:29:32.144 "reset": true, 00:29:32.144 "nvme_admin": false, 00:29:32.144 "nvme_io": false, 00:29:32.144 "nvme_io_md": false, 00:29:32.144 "write_zeroes": true, 00:29:32.144 "zcopy": true, 00:29:32.144 "get_zone_info": false, 00:29:32.144 "zone_management": false, 00:29:32.144 "zone_append": false, 00:29:32.144 "compare": false, 00:29:32.144 "compare_and_write": false, 00:29:32.144 "abort": true, 00:29:32.144 "seek_hole": false, 00:29:32.144 "seek_data": false, 00:29:32.144 "copy": true, 00:29:32.144 "nvme_iov_md": false 00:29:32.144 }, 00:29:32.144 "memory_domains": [ 00:29:32.144 { 00:29:32.144 "dma_device_id": "system", 00:29:32.144 "dma_device_type": 1 00:29:32.144 }, 00:29:32.144 { 00:29:32.144 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:32.144 "dma_device_type": 2 00:29:32.144 } 00:29:32.144 ], 00:29:32.144 "driver_specific": {} 00:29:32.144 }' 00:29:32.144 05:57:46 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:32.144 05:57:47 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:32.403 05:57:47 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:29:32.403 05:57:47 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:32.403 05:57:47 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:32.403 05:57:47 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:29:32.403 05:57:47 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:32.403 05:57:47 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:32.662 05:57:47 bdev_raid.raid_state_function_test_sb_md_separate -- 
bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:29:32.662 05:57:47 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:32.662 05:57:47 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:32.662 05:57:47 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:29:32.662 05:57:47 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:29:32.959 [2024-07-26 05:57:47.643818] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:29:32.959 05:57:47 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@275 -- # local expected_state 00:29:32.959 05:57:47 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:29:32.959 05:57:47 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@213 -- # case $1 in 00:29:32.959 05:57:47 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@214 -- # return 0 00:29:32.959 05:57:47 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:29:32.959 05:57:47 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 1 00:29:32.959 05:57:47 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:29:32.959 05:57:47 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:32.959 05:57:47 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:32.959 05:57:47 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:32.959 
05:57:47 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:29:32.959 05:57:47 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:32.959 05:57:47 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:32.959 05:57:47 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:32.959 05:57:47 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:32.959 05:57:47 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:32.959 05:57:47 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:29:33.237 05:57:47 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:33.237 "name": "Existed_Raid", 00:29:33.237 "uuid": "cb51d24a-2859-40d4-9a95-b42731b87c53", 00:29:33.237 "strip_size_kb": 0, 00:29:33.237 "state": "online", 00:29:33.237 "raid_level": "raid1", 00:29:33.237 "superblock": true, 00:29:33.237 "num_base_bdevs": 2, 00:29:33.237 "num_base_bdevs_discovered": 1, 00:29:33.237 "num_base_bdevs_operational": 1, 00:29:33.237 "base_bdevs_list": [ 00:29:33.237 { 00:29:33.237 "name": null, 00:29:33.237 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:33.237 "is_configured": false, 00:29:33.237 "data_offset": 256, 00:29:33.237 "data_size": 7936 00:29:33.237 }, 00:29:33.237 { 00:29:33.237 "name": "BaseBdev2", 00:29:33.237 "uuid": "d7dcc515-3de5-429a-b3f3-070c81120d95", 00:29:33.237 "is_configured": true, 00:29:33.237 "data_offset": 256, 00:29:33.237 "data_size": 7936 00:29:33.237 } 00:29:33.237 ] 00:29:33.237 }' 00:29:33.237 05:57:47 
bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:33.237 05:57:47 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:29:33.804 05:57:48 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:29:33.804 05:57:48 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:29:33.804 05:57:48 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:29:33.804 05:57:48 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:34.062 05:57:48 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:29:34.062 05:57:48 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:29:34.062 05:57:48 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:29:34.321 [2024-07-26 05:57:49.078842] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:29:34.321 [2024-07-26 05:57:49.078928] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:29:34.321 [2024-07-26 05:57:49.092331] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:29:34.321 [2024-07-26 05:57:49.092364] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:29:34.321 [2024-07-26 05:57:49.092375] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1280210 name Existed_Raid, state offline 00:29:34.321 05:57:49 
bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:29:34.321 05:57:49 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:29:34.321 05:57:49 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:34.321 05:57:49 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:29:34.580 05:57:49 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:29:34.580 05:57:49 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:29:34.580 05:57:49 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:29:34.580 05:57:49 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@341 -- # killprocess 1273981 00:29:34.580 05:57:49 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@948 -- # '[' -z 1273981 ']' 00:29:34.580 05:57:49 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@952 -- # kill -0 1273981 00:29:34.580 05:57:49 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@953 -- # uname 00:29:34.580 05:57:49 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:29:34.580 05:57:49 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1273981 00:29:34.580 05:57:49 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:29:34.580 05:57:49 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:29:34.580 05:57:49 
bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1273981' 00:29:34.580 killing process with pid 1273981 00:29:34.580 05:57:49 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@967 -- # kill 1273981 00:29:34.580 [2024-07-26 05:57:49.417521] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:29:34.580 05:57:49 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@972 -- # wait 1273981 00:29:34.580 [2024-07-26 05:57:49.418434] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:29:34.840 05:57:49 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@343 -- # return 0 00:29:34.840 00:29:34.840 real 0m11.435s 00:29:34.840 user 0m20.346s 00:29:34.840 sys 0m2.149s 00:29:34.840 05:57:49 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@1124 -- # xtrace_disable 00:29:34.840 05:57:49 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:29:34.840 ************************************ 00:29:34.840 END TEST raid_state_function_test_sb_md_separate 00:29:34.840 ************************************ 00:29:34.840 05:57:49 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:29:34.840 05:57:49 bdev_raid -- bdev/bdev_raid.sh@906 -- # run_test raid_superblock_test_md_separate raid_superblock_test raid1 2 00:29:34.840 05:57:49 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:29:34.840 05:57:49 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:29:34.840 05:57:49 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:29:34.840 ************************************ 00:29:34.840 START TEST raid_superblock_test_md_separate 00:29:34.840 ************************************ 00:29:34.840 05:57:49 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@1123 -- # raid_superblock_test raid1 2 
00:29:34.840 05:57:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@392 -- # local raid_level=raid1 00:29:34.840 05:57:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=2 00:29:34.840 05:57:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:29:34.840 05:57:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:29:34.840 05:57:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:29:34.840 05:57:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:29:34.840 05:57:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:29:34.840 05:57:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:29:34.840 05:57:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:29:34.840 05:57:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@398 -- # local strip_size 00:29:34.840 05:57:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:29:34.840 05:57:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:29:34.840 05:57:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:29:34.840 05:57:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@403 -- # '[' raid1 '!=' raid1 ']' 00:29:34.840 05:57:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@407 -- # strip_size=0 00:29:34.840 05:57:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@411 -- # raid_pid=1275618 00:29:34.840 05:57:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@412 -- # waitforlisten 1275618 
/var/tmp/spdk-raid.sock 00:29:34.840 05:57:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:29:34.840 05:57:49 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@829 -- # '[' -z 1275618 ']' 00:29:34.840 05:57:49 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:29:34.840 05:57:49 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@834 -- # local max_retries=100 00:29:34.840 05:57:49 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:29:34.840 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:29:34.840 05:57:49 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@838 -- # xtrace_disable 00:29:34.840 05:57:49 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:29:35.099 [2024-07-26 05:57:49.786885] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 
00:29:35.099 [2024-07-26 05:57:49.786962] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1275618 ] 00:29:35.099 [2024-07-26 05:57:49.919913] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:29:35.357 [2024-07-26 05:57:50.029179] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:29:35.357 [2024-07-26 05:57:50.098317] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:29:35.357 [2024-07-26 05:57:50.098352] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:29:35.922 05:57:50 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:29:35.922 05:57:50 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@862 -- # return 0 00:29:35.922 05:57:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:29:35.922 05:57:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:29:35.922 05:57:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:29:35.922 05:57:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:29:35.922 05:57:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:29:35.922 05:57:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:29:35.922 05:57:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:29:35.922 05:57:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:29:35.922 05:57:50 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b malloc1 00:29:35.922 malloc1 00:29:35.922 05:57:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:29:36.179 [2024-07-26 05:57:50.921863] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:29:36.179 [2024-07-26 05:57:50.921910] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:29:36.179 [2024-07-26 05:57:50.921932] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x19da830 00:29:36.179 [2024-07-26 05:57:50.921946] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:29:36.179 [2024-07-26 05:57:50.923506] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:29:36.179 [2024-07-26 05:57:50.923533] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:29:36.179 pt1 00:29:36.179 05:57:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:29:36.179 05:57:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:29:36.180 05:57:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:29:36.180 05:57:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:29:36.180 05:57:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:29:36.180 05:57:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:29:36.180 05:57:50 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:29:36.180 05:57:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:29:36.180 05:57:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b malloc2 00:29:36.437 malloc2 00:29:36.437 05:57:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:29:36.437 [2024-07-26 05:57:51.280303] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:29:36.437 [2024-07-26 05:57:51.280355] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:29:36.437 [2024-07-26 05:57:51.280376] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x19cc250 00:29:36.437 [2024-07-26 05:57:51.280388] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:29:36.437 [2024-07-26 05:57:51.281727] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:29:36.437 [2024-07-26 05:57:51.281757] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:29:36.437 pt2 00:29:36.437 05:57:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:29:36.437 05:57:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:29:36.437 05:57:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2' -n raid_bdev1 -s 00:29:36.695 [2024-07-26 05:57:51.524987] 
bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:29:36.695 [2024-07-26 05:57:51.526352] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:29:36.695 [2024-07-26 05:57:51.526493] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x19ccd20 00:29:36.695 [2024-07-26 05:57:51.526506] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:29:36.695 [2024-07-26 05:57:51.526582] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x19c0a60 00:29:36.695 [2024-07-26 05:57:51.526710] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x19ccd20 00:29:36.695 [2024-07-26 05:57:51.526721] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x19ccd20 00:29:36.695 [2024-07-26 05:57:51.526792] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:29:36.695 05:57:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:29:36.695 05:57:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:36.695 05:57:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:36.695 05:57:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:36.695 05:57:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:36.695 05:57:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:29:36.695 05:57:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:36.695 05:57:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:36.695 05:57:51 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:36.696 05:57:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:36.696 05:57:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:36.696 05:57:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:36.954 05:57:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:36.954 "name": "raid_bdev1", 00:29:36.954 "uuid": "bf2fc835-cff4-49fc-bde5-bd2402e0f05e", 00:29:36.954 "strip_size_kb": 0, 00:29:36.954 "state": "online", 00:29:36.954 "raid_level": "raid1", 00:29:36.954 "superblock": true, 00:29:36.954 "num_base_bdevs": 2, 00:29:36.954 "num_base_bdevs_discovered": 2, 00:29:36.954 "num_base_bdevs_operational": 2, 00:29:36.954 "base_bdevs_list": [ 00:29:36.954 { 00:29:36.954 "name": "pt1", 00:29:36.954 "uuid": "00000000-0000-0000-0000-000000000001", 00:29:36.954 "is_configured": true, 00:29:36.954 "data_offset": 256, 00:29:36.954 "data_size": 7936 00:29:36.954 }, 00:29:36.954 { 00:29:36.954 "name": "pt2", 00:29:36.954 "uuid": "00000000-0000-0000-0000-000000000002", 00:29:36.954 "is_configured": true, 00:29:36.954 "data_offset": 256, 00:29:36.954 "data_size": 7936 00:29:36.954 } 00:29:36.954 ] 00:29:36.954 }' 00:29:36.954 05:57:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:36.954 05:57:51 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:29:37.519 05:57:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:29:37.519 05:57:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@194 -- # local 
raid_bdev_name=raid_bdev1 00:29:37.519 05:57:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:29:37.519 05:57:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:29:37.519 05:57:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:29:37.519 05:57:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@198 -- # local name 00:29:37.519 05:57:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:29:37.519 05:57:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:29:37.777 [2024-07-26 05:57:52.612098] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:29:37.777 05:57:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:29:37.777 "name": "raid_bdev1", 00:29:37.777 "aliases": [ 00:29:37.777 "bf2fc835-cff4-49fc-bde5-bd2402e0f05e" 00:29:37.777 ], 00:29:37.777 "product_name": "Raid Volume", 00:29:37.777 "block_size": 4096, 00:29:37.777 "num_blocks": 7936, 00:29:37.777 "uuid": "bf2fc835-cff4-49fc-bde5-bd2402e0f05e", 00:29:37.777 "md_size": 32, 00:29:37.777 "md_interleave": false, 00:29:37.777 "dif_type": 0, 00:29:37.777 "assigned_rate_limits": { 00:29:37.777 "rw_ios_per_sec": 0, 00:29:37.777 "rw_mbytes_per_sec": 0, 00:29:37.777 "r_mbytes_per_sec": 0, 00:29:37.777 "w_mbytes_per_sec": 0 00:29:37.777 }, 00:29:37.777 "claimed": false, 00:29:37.777 "zoned": false, 00:29:37.777 "supported_io_types": { 00:29:37.777 "read": true, 00:29:37.777 "write": true, 00:29:37.777 "unmap": false, 00:29:37.777 "flush": false, 00:29:37.777 "reset": true, 00:29:37.777 "nvme_admin": false, 00:29:37.777 "nvme_io": false, 00:29:37.777 "nvme_io_md": false, 00:29:37.777 "write_zeroes": true, 
00:29:37.777 "zcopy": false, 00:29:37.777 "get_zone_info": false, 00:29:37.777 "zone_management": false, 00:29:37.777 "zone_append": false, 00:29:37.777 "compare": false, 00:29:37.777 "compare_and_write": false, 00:29:37.777 "abort": false, 00:29:37.777 "seek_hole": false, 00:29:37.777 "seek_data": false, 00:29:37.777 "copy": false, 00:29:37.777 "nvme_iov_md": false 00:29:37.777 }, 00:29:37.777 "memory_domains": [ 00:29:37.777 { 00:29:37.777 "dma_device_id": "system", 00:29:37.777 "dma_device_type": 1 00:29:37.777 }, 00:29:37.777 { 00:29:37.777 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:37.777 "dma_device_type": 2 00:29:37.777 }, 00:29:37.777 { 00:29:37.777 "dma_device_id": "system", 00:29:37.777 "dma_device_type": 1 00:29:37.777 }, 00:29:37.777 { 00:29:37.777 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:37.777 "dma_device_type": 2 00:29:37.777 } 00:29:37.777 ], 00:29:37.777 "driver_specific": { 00:29:37.777 "raid": { 00:29:37.777 "uuid": "bf2fc835-cff4-49fc-bde5-bd2402e0f05e", 00:29:37.777 "strip_size_kb": 0, 00:29:37.777 "state": "online", 00:29:37.777 "raid_level": "raid1", 00:29:37.777 "superblock": true, 00:29:37.777 "num_base_bdevs": 2, 00:29:37.777 "num_base_bdevs_discovered": 2, 00:29:37.777 "num_base_bdevs_operational": 2, 00:29:37.777 "base_bdevs_list": [ 00:29:37.777 { 00:29:37.777 "name": "pt1", 00:29:37.777 "uuid": "00000000-0000-0000-0000-000000000001", 00:29:37.777 "is_configured": true, 00:29:37.777 "data_offset": 256, 00:29:37.777 "data_size": 7936 00:29:37.777 }, 00:29:37.777 { 00:29:37.777 "name": "pt2", 00:29:37.777 "uuid": "00000000-0000-0000-0000-000000000002", 00:29:37.777 "is_configured": true, 00:29:37.777 "data_offset": 256, 00:29:37.777 "data_size": 7936 00:29:37.777 } 00:29:37.777 ] 00:29:37.777 } 00:29:37.777 } 00:29:37.777 }' 00:29:37.777 05:57:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:29:37.777 05:57:52 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:29:37.777 pt2' 00:29:37.777 05:57:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:29:37.777 05:57:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:29:37.777 05:57:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:29:38.035 05:57:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:29:38.035 "name": "pt1", 00:29:38.035 "aliases": [ 00:29:38.035 "00000000-0000-0000-0000-000000000001" 00:29:38.035 ], 00:29:38.035 "product_name": "passthru", 00:29:38.035 "block_size": 4096, 00:29:38.035 "num_blocks": 8192, 00:29:38.035 "uuid": "00000000-0000-0000-0000-000000000001", 00:29:38.035 "md_size": 32, 00:29:38.035 "md_interleave": false, 00:29:38.035 "dif_type": 0, 00:29:38.035 "assigned_rate_limits": { 00:29:38.035 "rw_ios_per_sec": 0, 00:29:38.035 "rw_mbytes_per_sec": 0, 00:29:38.035 "r_mbytes_per_sec": 0, 00:29:38.035 "w_mbytes_per_sec": 0 00:29:38.035 }, 00:29:38.035 "claimed": true, 00:29:38.035 "claim_type": "exclusive_write", 00:29:38.035 "zoned": false, 00:29:38.035 "supported_io_types": { 00:29:38.035 "read": true, 00:29:38.035 "write": true, 00:29:38.035 "unmap": true, 00:29:38.035 "flush": true, 00:29:38.035 "reset": true, 00:29:38.035 "nvme_admin": false, 00:29:38.035 "nvme_io": false, 00:29:38.035 "nvme_io_md": false, 00:29:38.035 "write_zeroes": true, 00:29:38.035 "zcopy": true, 00:29:38.035 "get_zone_info": false, 00:29:38.035 "zone_management": false, 00:29:38.035 "zone_append": false, 00:29:38.035 "compare": false, 00:29:38.035 "compare_and_write": false, 00:29:38.036 "abort": true, 00:29:38.036 "seek_hole": false, 00:29:38.036 "seek_data": false, 00:29:38.036 "copy": true, 00:29:38.036 
"nvme_iov_md": false 00:29:38.036 }, 00:29:38.036 "memory_domains": [ 00:29:38.036 { 00:29:38.036 "dma_device_id": "system", 00:29:38.036 "dma_device_type": 1 00:29:38.036 }, 00:29:38.036 { 00:29:38.036 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:38.036 "dma_device_type": 2 00:29:38.036 } 00:29:38.036 ], 00:29:38.036 "driver_specific": { 00:29:38.036 "passthru": { 00:29:38.036 "name": "pt1", 00:29:38.036 "base_bdev_name": "malloc1" 00:29:38.036 } 00:29:38.036 } 00:29:38.036 }' 00:29:38.036 05:57:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:38.294 05:57:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:38.294 05:57:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:29:38.294 05:57:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:38.294 05:57:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:38.294 05:57:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:29:38.294 05:57:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:38.294 05:57:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:38.294 05:57:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:29:38.294 05:57:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:38.555 05:57:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:38.556 05:57:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:29:38.556 05:57:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:29:38.556 05:57:53 bdev_raid.raid_superblock_test_md_separate 
-- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:29:38.556 05:57:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:29:38.819 05:57:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:29:38.819 "name": "pt2", 00:29:38.819 "aliases": [ 00:29:38.819 "00000000-0000-0000-0000-000000000002" 00:29:38.819 ], 00:29:38.819 "product_name": "passthru", 00:29:38.819 "block_size": 4096, 00:29:38.819 "num_blocks": 8192, 00:29:38.819 "uuid": "00000000-0000-0000-0000-000000000002", 00:29:38.819 "md_size": 32, 00:29:38.819 "md_interleave": false, 00:29:38.819 "dif_type": 0, 00:29:38.819 "assigned_rate_limits": { 00:29:38.819 "rw_ios_per_sec": 0, 00:29:38.819 "rw_mbytes_per_sec": 0, 00:29:38.819 "r_mbytes_per_sec": 0, 00:29:38.819 "w_mbytes_per_sec": 0 00:29:38.819 }, 00:29:38.819 "claimed": true, 00:29:38.819 "claim_type": "exclusive_write", 00:29:38.819 "zoned": false, 00:29:38.819 "supported_io_types": { 00:29:38.819 "read": true, 00:29:38.819 "write": true, 00:29:38.819 "unmap": true, 00:29:38.819 "flush": true, 00:29:38.819 "reset": true, 00:29:38.819 "nvme_admin": false, 00:29:38.819 "nvme_io": false, 00:29:38.819 "nvme_io_md": false, 00:29:38.819 "write_zeroes": true, 00:29:38.819 "zcopy": true, 00:29:38.819 "get_zone_info": false, 00:29:38.819 "zone_management": false, 00:29:38.819 "zone_append": false, 00:29:38.819 "compare": false, 00:29:38.819 "compare_and_write": false, 00:29:38.819 "abort": true, 00:29:38.819 "seek_hole": false, 00:29:38.819 "seek_data": false, 00:29:38.819 "copy": true, 00:29:38.819 "nvme_iov_md": false 00:29:38.819 }, 00:29:38.819 "memory_domains": [ 00:29:38.819 { 00:29:38.819 "dma_device_id": "system", 00:29:38.819 "dma_device_type": 1 00:29:38.819 }, 00:29:38.819 { 00:29:38.819 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:38.819 "dma_device_type": 2 00:29:38.819 } 
00:29:38.819 ], 00:29:38.819 "driver_specific": { 00:29:38.819 "passthru": { 00:29:38.819 "name": "pt2", 00:29:38.819 "base_bdev_name": "malloc2" 00:29:38.819 } 00:29:38.819 } 00:29:38.819 }' 00:29:38.819 05:57:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:38.819 05:57:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:38.819 05:57:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:29:38.819 05:57:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:38.819 05:57:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:38.819 05:57:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:29:38.819 05:57:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:39.077 05:57:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:39.077 05:57:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:29:39.077 05:57:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:39.077 05:57:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:39.077 05:57:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:29:39.077 05:57:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:29:39.077 05:57:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:29:39.335 [2024-07-26 05:57:54.112093] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:29:39.335 05:57:54 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=bf2fc835-cff4-49fc-bde5-bd2402e0f05e 00:29:39.335 05:57:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@435 -- # '[' -z bf2fc835-cff4-49fc-bde5-bd2402e0f05e ']' 00:29:39.335 05:57:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:29:39.593 [2024-07-26 05:57:54.360487] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:29:39.593 [2024-07-26 05:57:54.360509] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:29:39.593 [2024-07-26 05:57:54.360570] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:29:39.593 [2024-07-26 05:57:54.360624] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:29:39.593 [2024-07-26 05:57:54.360646] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x19ccd20 name raid_bdev1, state offline 00:29:39.593 05:57:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:39.593 05:57:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:29:39.851 05:57:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:29:39.851 05:57:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:29:39.851 05:57:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:29:39.851 05:57:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_passthru_delete pt1 00:29:40.108 05:57:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:29:40.108 05:57:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:29:40.366 05:57:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:29:40.366 05:57:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:29:40.624 05:57:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:29:40.624 05:57:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:29:40.624 05:57:55 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@648 -- # local es=0 00:29:40.624 05:57:55 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:29:40.624 05:57:55 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:29:40.624 05:57:55 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:29:40.624 05:57:55 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:29:40.624 05:57:55 
bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:29:40.624 05:57:55 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:29:40.624 05:57:55 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:29:40.624 05:57:55 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:29:40.624 05:57:55 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:29:40.624 05:57:55 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:29:40.882 [2024-07-26 05:57:55.595702] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:29:40.882 [2024-07-26 05:57:55.597046] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:29:40.882 [2024-07-26 05:57:55.597104] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:29:40.882 [2024-07-26 05:57:55.597146] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:29:40.882 [2024-07-26 05:57:55.597164] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:29:40.882 [2024-07-26 05:57:55.597174] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x183ced0 name raid_bdev1, state configuring 00:29:40.882 request: 00:29:40.882 { 00:29:40.882 "name": "raid_bdev1", 00:29:40.882 "raid_level": "raid1", 00:29:40.882 "base_bdevs": [ 
00:29:40.882 "malloc1", 00:29:40.882 "malloc2" 00:29:40.882 ], 00:29:40.882 "superblock": false, 00:29:40.882 "method": "bdev_raid_create", 00:29:40.882 "req_id": 1 00:29:40.882 } 00:29:40.882 Got JSON-RPC error response 00:29:40.882 response: 00:29:40.882 { 00:29:40.882 "code": -17, 00:29:40.882 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:29:40.882 } 00:29:40.882 05:57:55 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@651 -- # es=1 00:29:40.882 05:57:55 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:29:40.882 05:57:55 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:29:40.882 05:57:55 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:29:40.882 05:57:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:40.882 05:57:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:29:41.139 05:57:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:29:41.140 05:57:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:29:41.140 05:57:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:29:41.398 [2024-07-26 05:57:56.080933] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:29:41.398 [2024-07-26 05:57:56.080984] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:29:41.398 [2024-07-26 05:57:56.081004] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x19daee0 
00:29:41.398 [2024-07-26 05:57:56.081017] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:29:41.398 [2024-07-26 05:57:56.082520] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:29:41.398 [2024-07-26 05:57:56.082548] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:29:41.398 [2024-07-26 05:57:56.082600] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:29:41.398 [2024-07-26 05:57:56.082627] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:29:41.398 pt1 00:29:41.398 05:57:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:29:41.398 05:57:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:41.398 05:57:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:29:41.398 05:57:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:41.398 05:57:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:41.398 05:57:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:29:41.398 05:57:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:41.398 05:57:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:41.398 05:57:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:41.398 05:57:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:41.398 05:57:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:41.398 05:57:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:41.656 05:57:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:41.656 "name": "raid_bdev1", 00:29:41.656 "uuid": "bf2fc835-cff4-49fc-bde5-bd2402e0f05e", 00:29:41.656 "strip_size_kb": 0, 00:29:41.656 "state": "configuring", 00:29:41.656 "raid_level": "raid1", 00:29:41.656 "superblock": true, 00:29:41.656 "num_base_bdevs": 2, 00:29:41.656 "num_base_bdevs_discovered": 1, 00:29:41.656 "num_base_bdevs_operational": 2, 00:29:41.656 "base_bdevs_list": [ 00:29:41.656 { 00:29:41.656 "name": "pt1", 00:29:41.656 "uuid": "00000000-0000-0000-0000-000000000001", 00:29:41.656 "is_configured": true, 00:29:41.656 "data_offset": 256, 00:29:41.656 "data_size": 7936 00:29:41.656 }, 00:29:41.656 { 00:29:41.656 "name": null, 00:29:41.656 "uuid": "00000000-0000-0000-0000-000000000002", 00:29:41.656 "is_configured": false, 00:29:41.656 "data_offset": 256, 00:29:41.656 "data_size": 7936 00:29:41.656 } 00:29:41.656 ] 00:29:41.656 }' 00:29:41.656 05:57:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:41.656 05:57:56 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:29:42.222 05:57:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@469 -- # '[' 2 -gt 2 ']' 00:29:42.222 05:57:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:29:42.222 05:57:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:29:42.222 05:57:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b 
malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:29:42.480 [2024-07-26 05:57:57.184018] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:29:42.480 [2024-07-26 05:57:57.184068] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:29:42.480 [2024-07-26 05:57:57.184088] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x183d490 00:29:42.480 [2024-07-26 05:57:57.184100] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:29:42.480 [2024-07-26 05:57:57.184303] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:29:42.480 [2024-07-26 05:57:57.184321] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:29:42.480 [2024-07-26 05:57:57.184365] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:29:42.480 [2024-07-26 05:57:57.184386] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:29:42.481 [2024-07-26 05:57:57.184479] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x19c15d0 00:29:42.481 [2024-07-26 05:57:57.184489] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:29:42.481 [2024-07-26 05:57:57.184546] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x19c2800 00:29:42.481 [2024-07-26 05:57:57.184654] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x19c15d0 00:29:42.481 [2024-07-26 05:57:57.184665] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x19c15d0 00:29:42.481 [2024-07-26 05:57:57.184735] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:29:42.481 pt2 00:29:42.481 05:57:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:29:42.481 05:57:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@477 -- # (( i < 
num_base_bdevs )) 00:29:42.481 05:57:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:29:42.481 05:57:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:42.481 05:57:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:42.481 05:57:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:42.481 05:57:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:42.481 05:57:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:29:42.481 05:57:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:42.481 05:57:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:42.481 05:57:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:42.481 05:57:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:42.481 05:57:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:42.481 05:57:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:42.739 05:57:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:42.739 "name": "raid_bdev1", 00:29:42.739 "uuid": "bf2fc835-cff4-49fc-bde5-bd2402e0f05e", 00:29:42.739 "strip_size_kb": 0, 00:29:42.739 "state": "online", 00:29:42.739 "raid_level": "raid1", 00:29:42.739 "superblock": true, 00:29:42.739 "num_base_bdevs": 2, 00:29:42.739 
"num_base_bdevs_discovered": 2, 00:29:42.739 "num_base_bdevs_operational": 2, 00:29:42.739 "base_bdevs_list": [ 00:29:42.739 { 00:29:42.739 "name": "pt1", 00:29:42.739 "uuid": "00000000-0000-0000-0000-000000000001", 00:29:42.739 "is_configured": true, 00:29:42.739 "data_offset": 256, 00:29:42.739 "data_size": 7936 00:29:42.739 }, 00:29:42.739 { 00:29:42.739 "name": "pt2", 00:29:42.739 "uuid": "00000000-0000-0000-0000-000000000002", 00:29:42.739 "is_configured": true, 00:29:42.739 "data_offset": 256, 00:29:42.739 "data_size": 7936 00:29:42.739 } 00:29:42.739 ] 00:29:42.739 }' 00:29:42.739 05:57:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:42.739 05:57:57 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:29:43.305 05:57:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:29:43.305 05:57:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:29:43.305 05:57:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:29:43.305 05:57:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:29:43.305 05:57:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:29:43.305 05:57:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@198 -- # local name 00:29:43.305 05:57:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:29:43.305 05:57:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:29:43.564 [2024-07-26 05:57:58.291203] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:29:43.564 05:57:58 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:29:43.564 "name": "raid_bdev1", 00:29:43.564 "aliases": [ 00:29:43.564 "bf2fc835-cff4-49fc-bde5-bd2402e0f05e" 00:29:43.564 ], 00:29:43.564 "product_name": "Raid Volume", 00:29:43.564 "block_size": 4096, 00:29:43.564 "num_blocks": 7936, 00:29:43.564 "uuid": "bf2fc835-cff4-49fc-bde5-bd2402e0f05e", 00:29:43.564 "md_size": 32, 00:29:43.564 "md_interleave": false, 00:29:43.564 "dif_type": 0, 00:29:43.564 "assigned_rate_limits": { 00:29:43.564 "rw_ios_per_sec": 0, 00:29:43.564 "rw_mbytes_per_sec": 0, 00:29:43.564 "r_mbytes_per_sec": 0, 00:29:43.564 "w_mbytes_per_sec": 0 00:29:43.564 }, 00:29:43.564 "claimed": false, 00:29:43.564 "zoned": false, 00:29:43.564 "supported_io_types": { 00:29:43.564 "read": true, 00:29:43.564 "write": true, 00:29:43.564 "unmap": false, 00:29:43.564 "flush": false, 00:29:43.564 "reset": true, 00:29:43.564 "nvme_admin": false, 00:29:43.564 "nvme_io": false, 00:29:43.564 "nvme_io_md": false, 00:29:43.564 "write_zeroes": true, 00:29:43.564 "zcopy": false, 00:29:43.564 "get_zone_info": false, 00:29:43.564 "zone_management": false, 00:29:43.564 "zone_append": false, 00:29:43.564 "compare": false, 00:29:43.564 "compare_and_write": false, 00:29:43.564 "abort": false, 00:29:43.564 "seek_hole": false, 00:29:43.564 "seek_data": false, 00:29:43.564 "copy": false, 00:29:43.564 "nvme_iov_md": false 00:29:43.564 }, 00:29:43.564 "memory_domains": [ 00:29:43.564 { 00:29:43.564 "dma_device_id": "system", 00:29:43.564 "dma_device_type": 1 00:29:43.564 }, 00:29:43.564 { 00:29:43.564 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:43.564 "dma_device_type": 2 00:29:43.564 }, 00:29:43.564 { 00:29:43.564 "dma_device_id": "system", 00:29:43.564 "dma_device_type": 1 00:29:43.564 }, 00:29:43.564 { 00:29:43.564 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:43.564 "dma_device_type": 2 00:29:43.564 } 00:29:43.564 ], 00:29:43.564 "driver_specific": { 00:29:43.564 "raid": { 
00:29:43.564 "uuid": "bf2fc835-cff4-49fc-bde5-bd2402e0f05e", 00:29:43.564 "strip_size_kb": 0, 00:29:43.564 "state": "online", 00:29:43.564 "raid_level": "raid1", 00:29:43.564 "superblock": true, 00:29:43.564 "num_base_bdevs": 2, 00:29:43.564 "num_base_bdevs_discovered": 2, 00:29:43.564 "num_base_bdevs_operational": 2, 00:29:43.564 "base_bdevs_list": [ 00:29:43.564 { 00:29:43.564 "name": "pt1", 00:29:43.564 "uuid": "00000000-0000-0000-0000-000000000001", 00:29:43.564 "is_configured": true, 00:29:43.564 "data_offset": 256, 00:29:43.564 "data_size": 7936 00:29:43.564 }, 00:29:43.564 { 00:29:43.564 "name": "pt2", 00:29:43.564 "uuid": "00000000-0000-0000-0000-000000000002", 00:29:43.564 "is_configured": true, 00:29:43.564 "data_offset": 256, 00:29:43.564 "data_size": 7936 00:29:43.564 } 00:29:43.564 ] 00:29:43.564 } 00:29:43.564 } 00:29:43.564 }' 00:29:43.564 05:57:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:29:43.564 05:57:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:29:43.564 pt2' 00:29:43.564 05:57:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:29:43.564 05:57:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:29:43.564 05:57:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:29:43.823 05:57:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:29:43.823 "name": "pt1", 00:29:43.823 "aliases": [ 00:29:43.823 "00000000-0000-0000-0000-000000000001" 00:29:43.823 ], 00:29:43.823 "product_name": "passthru", 00:29:43.823 "block_size": 4096, 00:29:43.823 "num_blocks": 8192, 00:29:43.823 "uuid": 
"00000000-0000-0000-0000-000000000001", 00:29:43.823 "md_size": 32, 00:29:43.823 "md_interleave": false, 00:29:43.823 "dif_type": 0, 00:29:43.823 "assigned_rate_limits": { 00:29:43.823 "rw_ios_per_sec": 0, 00:29:43.823 "rw_mbytes_per_sec": 0, 00:29:43.823 "r_mbytes_per_sec": 0, 00:29:43.823 "w_mbytes_per_sec": 0 00:29:43.823 }, 00:29:43.823 "claimed": true, 00:29:43.823 "claim_type": "exclusive_write", 00:29:43.823 "zoned": false, 00:29:43.823 "supported_io_types": { 00:29:43.823 "read": true, 00:29:43.823 "write": true, 00:29:43.823 "unmap": true, 00:29:43.823 "flush": true, 00:29:43.823 "reset": true, 00:29:43.823 "nvme_admin": false, 00:29:43.823 "nvme_io": false, 00:29:43.823 "nvme_io_md": false, 00:29:43.823 "write_zeroes": true, 00:29:43.823 "zcopy": true, 00:29:43.823 "get_zone_info": false, 00:29:43.823 "zone_management": false, 00:29:43.823 "zone_append": false, 00:29:43.823 "compare": false, 00:29:43.823 "compare_and_write": false, 00:29:43.823 "abort": true, 00:29:43.823 "seek_hole": false, 00:29:43.823 "seek_data": false, 00:29:43.823 "copy": true, 00:29:43.823 "nvme_iov_md": false 00:29:43.823 }, 00:29:43.823 "memory_domains": [ 00:29:43.823 { 00:29:43.823 "dma_device_id": "system", 00:29:43.823 "dma_device_type": 1 00:29:43.823 }, 00:29:43.823 { 00:29:43.823 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:43.823 "dma_device_type": 2 00:29:43.823 } 00:29:43.823 ], 00:29:43.823 "driver_specific": { 00:29:43.823 "passthru": { 00:29:43.823 "name": "pt1", 00:29:43.823 "base_bdev_name": "malloc1" 00:29:43.823 } 00:29:43.823 } 00:29:43.823 }' 00:29:43.823 05:57:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:43.823 05:57:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:43.823 05:57:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:29:43.823 05:57:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 
-- # jq .md_size 00:29:43.823 05:57:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:44.081 05:57:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:29:44.081 05:57:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:44.081 05:57:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:44.081 05:57:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:29:44.081 05:57:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:44.081 05:57:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:44.081 05:57:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:29:44.081 05:57:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:29:44.081 05:57:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:29:44.081 05:57:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:29:44.339 05:57:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:29:44.339 "name": "pt2", 00:29:44.339 "aliases": [ 00:29:44.339 "00000000-0000-0000-0000-000000000002" 00:29:44.339 ], 00:29:44.339 "product_name": "passthru", 00:29:44.339 "block_size": 4096, 00:29:44.339 "num_blocks": 8192, 00:29:44.339 "uuid": "00000000-0000-0000-0000-000000000002", 00:29:44.339 "md_size": 32, 00:29:44.339 "md_interleave": false, 00:29:44.339 "dif_type": 0, 00:29:44.339 "assigned_rate_limits": { 00:29:44.339 "rw_ios_per_sec": 0, 00:29:44.339 "rw_mbytes_per_sec": 0, 00:29:44.339 "r_mbytes_per_sec": 0, 00:29:44.339 
"w_mbytes_per_sec": 0 00:29:44.339 }, 00:29:44.339 "claimed": true, 00:29:44.339 "claim_type": "exclusive_write", 00:29:44.339 "zoned": false, 00:29:44.339 "supported_io_types": { 00:29:44.339 "read": true, 00:29:44.339 "write": true, 00:29:44.339 "unmap": true, 00:29:44.339 "flush": true, 00:29:44.339 "reset": true, 00:29:44.339 "nvme_admin": false, 00:29:44.339 "nvme_io": false, 00:29:44.339 "nvme_io_md": false, 00:29:44.339 "write_zeroes": true, 00:29:44.339 "zcopy": true, 00:29:44.340 "get_zone_info": false, 00:29:44.340 "zone_management": false, 00:29:44.340 "zone_append": false, 00:29:44.340 "compare": false, 00:29:44.340 "compare_and_write": false, 00:29:44.340 "abort": true, 00:29:44.340 "seek_hole": false, 00:29:44.340 "seek_data": false, 00:29:44.340 "copy": true, 00:29:44.340 "nvme_iov_md": false 00:29:44.340 }, 00:29:44.340 "memory_domains": [ 00:29:44.340 { 00:29:44.340 "dma_device_id": "system", 00:29:44.340 "dma_device_type": 1 00:29:44.340 }, 00:29:44.340 { 00:29:44.340 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:44.340 "dma_device_type": 2 00:29:44.340 } 00:29:44.340 ], 00:29:44.340 "driver_specific": { 00:29:44.340 "passthru": { 00:29:44.340 "name": "pt2", 00:29:44.340 "base_bdev_name": "malloc2" 00:29:44.340 } 00:29:44.340 } 00:29:44.340 }' 00:29:44.340 05:57:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:44.598 05:57:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:44.598 05:57:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:29:44.598 05:57:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:44.598 05:57:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:44.598 05:57:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:29:44.598 05:57:59 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:44.598 05:57:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:44.598 05:57:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:29:44.598 05:57:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:44.856 05:57:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:44.856 05:57:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:29:44.856 05:57:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:29:44.856 05:57:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:29:45.114 [2024-07-26 05:57:59.807243] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:29:45.114 05:57:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@486 -- # '[' bf2fc835-cff4-49fc-bde5-bd2402e0f05e '!=' bf2fc835-cff4-49fc-bde5-bd2402e0f05e ']' 00:29:45.114 05:57:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@490 -- # has_redundancy raid1 00:29:45.114 05:57:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@213 -- # case $1 in 00:29:45.114 05:57:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@214 -- # return 0 00:29:45.114 05:57:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@492 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:29:45.372 [2024-07-26 05:58:00.083748] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:29:45.373 05:58:00 bdev_raid.raid_superblock_test_md_separate -- 
bdev/bdev_raid.sh@495 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:29:45.373 05:58:00 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:45.373 05:58:00 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:45.373 05:58:00 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:45.373 05:58:00 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:45.373 05:58:00 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:29:45.373 05:58:00 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:45.373 05:58:00 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:45.373 05:58:00 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:45.373 05:58:00 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:45.373 05:58:00 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:45.373 05:58:00 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:45.631 05:58:00 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:45.631 "name": "raid_bdev1", 00:29:45.631 "uuid": "bf2fc835-cff4-49fc-bde5-bd2402e0f05e", 00:29:45.631 "strip_size_kb": 0, 00:29:45.631 "state": "online", 00:29:45.631 "raid_level": "raid1", 00:29:45.631 "superblock": true, 00:29:45.631 "num_base_bdevs": 2, 00:29:45.631 "num_base_bdevs_discovered": 1, 00:29:45.631 "num_base_bdevs_operational": 1, 00:29:45.631 
"base_bdevs_list": [ 00:29:45.631 { 00:29:45.631 "name": null, 00:29:45.631 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:45.631 "is_configured": false, 00:29:45.631 "data_offset": 256, 00:29:45.631 "data_size": 7936 00:29:45.631 }, 00:29:45.631 { 00:29:45.631 "name": "pt2", 00:29:45.631 "uuid": "00000000-0000-0000-0000-000000000002", 00:29:45.631 "is_configured": true, 00:29:45.631 "data_offset": 256, 00:29:45.631 "data_size": 7936 00:29:45.631 } 00:29:45.631 ] 00:29:45.631 }' 00:29:45.631 05:58:00 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:45.631 05:58:00 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:29:46.564 05:58:01 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@498 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:29:46.564 [2024-07-26 05:58:01.427287] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:29:46.564 [2024-07-26 05:58:01.427315] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:29:46.564 [2024-07-26 05:58:01.427374] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:29:46.564 [2024-07-26 05:58:01.427420] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:29:46.564 [2024-07-26 05:58:01.427432] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x19c15d0 name raid_bdev1, state offline 00:29:46.564 05:58:01 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@499 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:46.564 05:58:01 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@499 -- # jq -r '.[]' 00:29:46.822 05:58:01 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@499 
-- # raid_bdev= 00:29:46.823 05:58:01 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@500 -- # '[' -n '' ']' 00:29:46.823 05:58:01 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@505 -- # (( i = 1 )) 00:29:46.823 05:58:01 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:29:46.823 05:58:01 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:29:47.080 05:58:01 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:29:47.080 05:58:01 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:29:47.080 05:58:01 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@510 -- # (( i = 1 )) 00:29:47.080 05:58:01 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:29:47.080 05:58:01 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@518 -- # i=1 00:29:47.080 05:58:01 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@519 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:29:47.339 [2024-07-26 05:58:02.169208] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:29:47.339 [2024-07-26 05:58:02.169252] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:29:47.339 [2024-07-26 05:58:02.169271] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x19bf660 00:29:47.339 [2024-07-26 05:58:02.169284] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:29:47.339 [2024-07-26 05:58:02.170736] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:29:47.339 
[2024-07-26 05:58:02.170763] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:29:47.339 [2024-07-26 05:58:02.170808] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:29:47.339 [2024-07-26 05:58:02.170834] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:29:47.339 [2024-07-26 05:58:02.170913] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x19c1d10 00:29:47.339 [2024-07-26 05:58:02.170923] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:29:47.339 [2024-07-26 05:58:02.170978] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x19c2560 00:29:47.339 [2024-07-26 05:58:02.171072] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x19c1d10 00:29:47.339 [2024-07-26 05:58:02.171081] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x19c1d10 00:29:47.339 [2024-07-26 05:58:02.171146] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:29:47.339 pt2 00:29:47.339 05:58:02 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@522 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:29:47.339 05:58:02 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:47.339 05:58:02 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:47.339 05:58:02 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:47.339 05:58:02 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:47.339 05:58:02 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:29:47.339 05:58:02 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local 
raid_bdev_info 00:29:47.339 05:58:02 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:47.339 05:58:02 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:47.339 05:58:02 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:47.339 05:58:02 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:47.339 05:58:02 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:47.597 05:58:02 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:47.597 "name": "raid_bdev1", 00:29:47.597 "uuid": "bf2fc835-cff4-49fc-bde5-bd2402e0f05e", 00:29:47.597 "strip_size_kb": 0, 00:29:47.597 "state": "online", 00:29:47.597 "raid_level": "raid1", 00:29:47.597 "superblock": true, 00:29:47.597 "num_base_bdevs": 2, 00:29:47.597 "num_base_bdevs_discovered": 1, 00:29:47.597 "num_base_bdevs_operational": 1, 00:29:47.597 "base_bdevs_list": [ 00:29:47.597 { 00:29:47.597 "name": null, 00:29:47.597 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:47.597 "is_configured": false, 00:29:47.597 "data_offset": 256, 00:29:47.597 "data_size": 7936 00:29:47.597 }, 00:29:47.597 { 00:29:47.597 "name": "pt2", 00:29:47.597 "uuid": "00000000-0000-0000-0000-000000000002", 00:29:47.597 "is_configured": true, 00:29:47.597 "data_offset": 256, 00:29:47.597 "data_size": 7936 00:29:47.597 } 00:29:47.597 ] 00:29:47.597 }' 00:29:47.597 05:58:02 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:47.597 05:58:02 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:29:48.164 05:58:03 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@525 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:29:48.422 [2024-07-26 05:58:03.248058] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:29:48.422 [2024-07-26 05:58:03.248088] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:29:48.422 [2024-07-26 05:58:03.248144] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:29:48.422 [2024-07-26 05:58:03.248188] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:29:48.422 [2024-07-26 05:58:03.248199] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x19c1d10 name raid_bdev1, state offline 00:29:48.422 05:58:03 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@526 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:48.422 05:58:03 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@526 -- # jq -r '.[]' 00:29:48.680 05:58:03 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@526 -- # raid_bdev= 00:29:48.680 05:58:03 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@527 -- # '[' -n '' ']' 00:29:48.680 05:58:03 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@531 -- # '[' 2 -gt 2 ']' 00:29:48.680 05:58:03 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@539 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:29:48.937 [2024-07-26 05:58:03.737323] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:29:48.937 [2024-07-26 05:58:03.737376] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:29:48.937 [2024-07-26 05:58:03.737396] vbdev_passthru.c: 
681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x19c0760 00:29:48.937 [2024-07-26 05:58:03.737409] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:29:48.937 [2024-07-26 05:58:03.738855] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:29:48.937 [2024-07-26 05:58:03.738883] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:29:48.937 [2024-07-26 05:58:03.738933] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:29:48.937 [2024-07-26 05:58:03.738958] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:29:48.937 [2024-07-26 05:58:03.739049] bdev_raid.c:3639:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:29:48.937 [2024-07-26 05:58:03.739062] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:29:48.937 [2024-07-26 05:58:03.739078] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x19c2850 name raid_bdev1, state configuring 00:29:48.937 [2024-07-26 05:58:03.739101] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:29:48.937 [2024-07-26 05:58:03.739152] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x19c1850 00:29:48.937 [2024-07-26 05:58:03.739162] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:29:48.937 [2024-07-26 05:58:03.739225] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x19c23b0 00:29:48.937 [2024-07-26 05:58:03.739323] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x19c1850 00:29:48.937 [2024-07-26 05:58:03.739332] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x19c1850 00:29:48.937 [2024-07-26 05:58:03.739404] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 
00:29:48.937 pt1 00:29:48.937 05:58:03 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@541 -- # '[' 2 -gt 2 ']' 00:29:48.937 05:58:03 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@553 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:29:48.937 05:58:03 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:48.937 05:58:03 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:48.937 05:58:03 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:48.937 05:58:03 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:48.937 05:58:03 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:29:48.937 05:58:03 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:48.937 05:58:03 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:48.937 05:58:03 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:48.937 05:58:03 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:48.937 05:58:03 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:48.937 05:58:03 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:49.195 05:58:04 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:49.195 "name": "raid_bdev1", 00:29:49.195 "uuid": "bf2fc835-cff4-49fc-bde5-bd2402e0f05e", 00:29:49.195 "strip_size_kb": 0, 00:29:49.195 "state": "online", 00:29:49.195 "raid_level": 
"raid1", 00:29:49.195 "superblock": true, 00:29:49.195 "num_base_bdevs": 2, 00:29:49.195 "num_base_bdevs_discovered": 1, 00:29:49.195 "num_base_bdevs_operational": 1, 00:29:49.195 "base_bdevs_list": [ 00:29:49.195 { 00:29:49.195 "name": null, 00:29:49.195 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:49.195 "is_configured": false, 00:29:49.195 "data_offset": 256, 00:29:49.195 "data_size": 7936 00:29:49.195 }, 00:29:49.195 { 00:29:49.195 "name": "pt2", 00:29:49.195 "uuid": "00000000-0000-0000-0000-000000000002", 00:29:49.195 "is_configured": true, 00:29:49.195 "data_offset": 256, 00:29:49.195 "data_size": 7936 00:29:49.195 } 00:29:49.195 ] 00:29:49.195 }' 00:29:49.195 05:58:04 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:49.195 05:58:04 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:29:49.797 05:58:04 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@554 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:29:49.797 05:58:04 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@554 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:29:50.056 05:58:04 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@554 -- # [[ false == \f\a\l\s\e ]] 00:29:50.056 05:58:04 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@557 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:29:50.056 05:58:04 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@557 -- # jq -r '.[] | .uuid' 00:29:50.315 [2024-07-26 05:58:05.089144] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:29:50.315 05:58:05 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@557 -- # '[' bf2fc835-cff4-49fc-bde5-bd2402e0f05e '!=' bf2fc835-cff4-49fc-bde5-bd2402e0f05e ']' 
00:29:50.315 05:58:05 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@562 -- # killprocess 1275618 00:29:50.315 05:58:05 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@948 -- # '[' -z 1275618 ']' 00:29:50.315 05:58:05 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@952 -- # kill -0 1275618 00:29:50.315 05:58:05 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@953 -- # uname 00:29:50.315 05:58:05 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:29:50.315 05:58:05 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1275618 00:29:50.315 05:58:05 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:29:50.315 05:58:05 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:29:50.315 05:58:05 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1275618' 00:29:50.315 killing process with pid 1275618 00:29:50.315 05:58:05 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@967 -- # kill 1275618 00:29:50.315 [2024-07-26 05:58:05.153078] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:29:50.315 [2024-07-26 05:58:05.153136] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:29:50.315 [2024-07-26 05:58:05.153182] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:29:50.315 [2024-07-26 05:58:05.153194] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x19c1850 name raid_bdev1, state offline 00:29:50.315 05:58:05 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@972 -- # wait 1275618 00:29:50.315 [2024-07-26 05:58:05.175908] 
bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:29:50.574 05:58:05 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@564 -- # return 0 00:29:50.574 00:29:50.574 real 0m15.662s 00:29:50.574 user 0m28.429s 00:29:50.574 sys 0m2.878s 00:29:50.574 05:58:05 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@1124 -- # xtrace_disable 00:29:50.574 05:58:05 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:29:50.574 ************************************ 00:29:50.574 END TEST raid_superblock_test_md_separate 00:29:50.574 ************************************ 00:29:50.574 05:58:05 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:29:50.574 05:58:05 bdev_raid -- bdev/bdev_raid.sh@907 -- # '[' true = true ']' 00:29:50.574 05:58:05 bdev_raid -- bdev/bdev_raid.sh@908 -- # run_test raid_rebuild_test_sb_md_separate raid_rebuild_test raid1 2 true false true 00:29:50.574 05:58:05 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:29:50.574 05:58:05 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:29:50.574 05:58:05 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:29:50.574 ************************************ 00:29:50.574 START TEST raid_rebuild_test_sb_md_separate 00:29:50.574 ************************************ 00:29:50.574 05:58:05 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 2 true false true 00:29:50.574 05:58:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:29:50.574 05:58:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:29:50.574 05:58:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:29:50.574 05:58:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:29:50.574 05:58:05 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@572 -- # local verify=true 00:29:50.574 05:58:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:29:50.574 05:58:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:29:50.574 05:58:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:29:50.574 05:58:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:29:50.574 05:58:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:29:50.574 05:58:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:29:50.574 05:58:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:29:50.574 05:58:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:29:50.574 05:58:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:29:50.574 05:58:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:29:50.574 05:58:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:29:50.574 05:58:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@575 -- # local strip_size 00:29:50.574 05:58:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@576 -- # local create_arg 00:29:50.574 05:58:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:29:50.574 05:58:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@578 -- # local data_offset 00:29:50.574 05:58:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:29:50.574 05:58:05 bdev_raid.raid_rebuild_test_sb_md_separate 
-- bdev/bdev_raid.sh@588 -- # strip_size=0 00:29:50.574 05:58:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:29:50.574 05:58:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:29:50.574 05:58:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@596 -- # raid_pid=1278033 00:29:50.574 05:58:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@597 -- # waitforlisten 1278033 /var/tmp/spdk-raid.sock 00:29:50.574 05:58:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:29:50.574 05:58:05 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@829 -- # '[' -z 1278033 ']' 00:29:50.574 05:58:05 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:29:50.575 05:58:05 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@834 -- # local max_retries=100 00:29:50.575 05:58:05 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:29:50.575 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:29:50.575 05:58:05 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@838 -- # xtrace_disable 00:29:50.575 05:58:05 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:29:50.833 [2024-07-26 05:58:05.537417] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 
00:29:50.833 [2024-07-26 05:58:05.537487] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1278033 ] 00:29:50.833 I/O size of 3145728 is greater than zero copy threshold (65536). 00:29:50.833 Zero copy mechanism will not be used. 00:29:50.833 [2024-07-26 05:58:05.667237] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:29:51.092 [2024-07-26 05:58:05.769828] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:29:51.092 [2024-07-26 05:58:05.824296] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:29:51.092 [2024-07-26 05:58:05.824331] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:29:51.658 05:58:06 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:29:51.658 05:58:06 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@862 -- # return 0 00:29:51.658 05:58:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:29:51.658 05:58:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b BaseBdev1_malloc 00:29:51.916 BaseBdev1_malloc 00:29:51.916 05:58:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:29:52.174 [2024-07-26 05:58:06.947833] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:29:52.174 [2024-07-26 05:58:06.947881] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:29:52.174 [2024-07-26 
05:58:06.947909] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x21066d0 00:29:52.174 [2024-07-26 05:58:06.947921] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:29:52.174 [2024-07-26 05:58:06.949445] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:29:52.174 [2024-07-26 05:58:06.949472] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:29:52.174 BaseBdev1 00:29:52.174 05:58:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:29:52.174 05:58:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b BaseBdev2_malloc 00:29:52.433 BaseBdev2_malloc 00:29:52.433 05:58:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:29:52.692 [2024-07-26 05:58:07.447935] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:29:52.692 [2024-07-26 05:58:07.447980] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:29:52.692 [2024-07-26 05:58:07.448002] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x225e1f0 00:29:52.692 [2024-07-26 05:58:07.448014] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:29:52.692 [2024-07-26 05:58:07.449398] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:29:52.692 [2024-07-26 05:58:07.449425] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:29:52.692 BaseBdev2 00:29:52.692 05:58:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@606 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b spare_malloc 00:29:52.950 spare_malloc 00:29:52.950 05:58:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:29:53.208 spare_delay 00:29:53.208 05:58:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:29:53.467 [2024-07-26 05:58:08.183149] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:29:53.467 [2024-07-26 05:58:08.183193] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:29:53.467 [2024-07-26 05:58:08.183218] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x225a7a0 00:29:53.467 [2024-07-26 05:58:08.183231] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:29:53.467 [2024-07-26 05:58:08.184661] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:29:53.467 [2024-07-26 05:58:08.184688] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:29:53.467 spare 00:29:53.467 05:58:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:29:53.726 [2024-07-26 05:58:08.423815] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:29:53.726 [2024-07-26 05:58:08.425147] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:29:53.726 [2024-07-26 05:58:08.425317] 
bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x225b1c0 00:29:53.726 [2024-07-26 05:58:08.425330] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:29:53.726 [2024-07-26 05:58:08.425408] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x216c360 00:29:53.726 [2024-07-26 05:58:08.425522] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x225b1c0 00:29:53.726 [2024-07-26 05:58:08.425532] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x225b1c0 00:29:53.726 [2024-07-26 05:58:08.425601] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:29:53.726 05:58:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:29:53.726 05:58:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:53.726 05:58:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:53.726 05:58:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:53.726 05:58:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:53.726 05:58:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:29:53.726 05:58:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:53.726 05:58:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:53.726 05:58:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:53.726 05:58:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:53.726 05:58:08 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:53.726 05:58:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:53.984 05:58:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:53.984 "name": "raid_bdev1", 00:29:53.984 "uuid": "7d40c565-a799-4e9a-b876-4cc332aab6ff", 00:29:53.984 "strip_size_kb": 0, 00:29:53.984 "state": "online", 00:29:53.984 "raid_level": "raid1", 00:29:53.984 "superblock": true, 00:29:53.984 "num_base_bdevs": 2, 00:29:53.984 "num_base_bdevs_discovered": 2, 00:29:53.984 "num_base_bdevs_operational": 2, 00:29:53.984 "base_bdevs_list": [ 00:29:53.984 { 00:29:53.984 "name": "BaseBdev1", 00:29:53.984 "uuid": "03526e64-8006-51e2-9be5-e7919676d10e", 00:29:53.984 "is_configured": true, 00:29:53.984 "data_offset": 256, 00:29:53.984 "data_size": 7936 00:29:53.984 }, 00:29:53.984 { 00:29:53.984 "name": "BaseBdev2", 00:29:53.984 "uuid": "fd3dbe42-efb2-59af-8443-6919c5cec2db", 00:29:53.984 "is_configured": true, 00:29:53.984 "data_offset": 256, 00:29:53.984 "data_size": 7936 00:29:53.984 } 00:29:53.984 ] 00:29:53.984 }' 00:29:53.984 05:58:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:53.984 05:58:08 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:29:54.550 05:58:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:29:54.550 05:58:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:29:54.808 [2024-07-26 05:58:09.534991] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:29:54.808 
05:58:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=7936 00:29:54.808 05:58:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:54.808 05:58:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:29:55.067 05:58:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@618 -- # data_offset=256 00:29:55.067 05:58:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:29:55.067 05:58:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@623 -- # '[' true = true ']' 00:29:55.067 05:58:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@624 -- # local write_unit_size 00:29:55.067 05:58:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@627 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:29:55.067 05:58:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:29:55.067 05:58:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:29:55.067 05:58:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@10 -- # local bdev_list 00:29:55.067 05:58:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:29:55.067 05:58:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@11 -- # local nbd_list 00:29:55.067 05:58:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@12 -- # local i 00:29:55.067 05:58:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:29:55.067 05:58:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:29:55.067 05:58:09 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:29:55.067 [2024-07-26 05:58:09.967917] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x216c360 00:29:55.326 /dev/nbd0 00:29:55.326 05:58:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:29:55.326 05:58:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:29:55.326 05:58:09 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:29:55.326 05:58:09 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@867 -- # local i 00:29:55.326 05:58:09 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:29:55.326 05:58:09 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:29:55.326 05:58:09 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:29:55.326 05:58:10 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@871 -- # break 00:29:55.326 05:58:10 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:29:55.326 05:58:10 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:29:55.326 05:58:10 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:29:55.326 1+0 records in 00:29:55.326 1+0 records out 00:29:55.326 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000252933 s, 16.2 MB/s 00:29:55.326 05:58:10 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # stat -c %s 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:29:55.326 05:58:10 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # size=4096 00:29:55.326 05:58:10 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:29:55.326 05:58:10 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:29:55.326 05:58:10 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@887 -- # return 0 00:29:55.326 05:58:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:29:55.326 05:58:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:29:55.326 05:58:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@628 -- # '[' raid1 = raid5f ']' 00:29:55.326 05:58:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@632 -- # write_unit_size=1 00:29:55.326 05:58:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@634 -- # dd if=/dev/urandom of=/dev/nbd0 bs=4096 count=7936 oflag=direct 00:29:55.893 7936+0 records in 00:29:55.893 7936+0 records out 00:29:55.893 32505856 bytes (33 MB, 31 MiB) copied, 0.748704 s, 43.4 MB/s 00:29:55.893 05:58:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@635 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:29:55.893 05:58:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:29:55.893 05:58:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:29:55.893 05:58:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@50 -- # local nbd_list 00:29:55.893 05:58:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@51 -- # local i 00:29:55.893 05:58:10 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:29:55.893 05:58:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:29:56.151 05:58:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:29:56.151 [2024-07-26 05:58:11.037719] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:29:56.151 05:58:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:29:56.151 05:58:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:29:56.151 05:58:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:29:56.151 05:58:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:29:56.151 05:58:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:29:56.151 05:58:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@41 -- # break 00:29:56.151 05:58:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@45 -- # return 0 00:29:56.151 05:58:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:29:56.410 [2024-07-26 05:58:11.274374] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:29:56.410 05:58:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:29:56.410 05:58:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:56.410 05:58:11 bdev_raid.raid_rebuild_test_sb_md_separate -- 
bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:56.410 05:58:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:56.410 05:58:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:56.410 05:58:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:29:56.410 05:58:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:56.410 05:58:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:56.410 05:58:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:56.410 05:58:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:56.410 05:58:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:56.410 05:58:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:56.668 05:58:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:56.668 "name": "raid_bdev1", 00:29:56.668 "uuid": "7d40c565-a799-4e9a-b876-4cc332aab6ff", 00:29:56.668 "strip_size_kb": 0, 00:29:56.668 "state": "online", 00:29:56.668 "raid_level": "raid1", 00:29:56.668 "superblock": true, 00:29:56.668 "num_base_bdevs": 2, 00:29:56.668 "num_base_bdevs_discovered": 1, 00:29:56.668 "num_base_bdevs_operational": 1, 00:29:56.668 "base_bdevs_list": [ 00:29:56.668 { 00:29:56.668 "name": null, 00:29:56.668 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:56.668 "is_configured": false, 00:29:56.668 "data_offset": 256, 00:29:56.668 "data_size": 7936 00:29:56.668 }, 00:29:56.668 { 00:29:56.668 "name": "BaseBdev2", 
00:29:56.668 "uuid": "fd3dbe42-efb2-59af-8443-6919c5cec2db", 00:29:56.668 "is_configured": true, 00:29:56.668 "data_offset": 256, 00:29:56.668 "data_size": 7936 00:29:56.668 } 00:29:56.668 ] 00:29:56.668 }' 00:29:56.669 05:58:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:56.669 05:58:11 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:29:57.235 05:58:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:29:57.493 [2024-07-26 05:58:12.285056] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:29:57.493 [2024-07-26 05:58:12.287350] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x225cb80 00:29:57.493 [2024-07-26 05:58:12.289626] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:29:57.493 05:58:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@646 -- # sleep 1 00:29:58.428 05:58:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:29:58.428 05:58:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:58.428 05:58:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:29:58.428 05:58:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:29:58.428 05:58:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:58.428 05:58:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:58.428 05:58:13 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:58.685 05:58:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:58.685 "name": "raid_bdev1", 00:29:58.685 "uuid": "7d40c565-a799-4e9a-b876-4cc332aab6ff", 00:29:58.685 "strip_size_kb": 0, 00:29:58.685 "state": "online", 00:29:58.685 "raid_level": "raid1", 00:29:58.685 "superblock": true, 00:29:58.685 "num_base_bdevs": 2, 00:29:58.685 "num_base_bdevs_discovered": 2, 00:29:58.685 "num_base_bdevs_operational": 2, 00:29:58.685 "process": { 00:29:58.685 "type": "rebuild", 00:29:58.685 "target": "spare", 00:29:58.686 "progress": { 00:29:58.686 "blocks": 3072, 00:29:58.686 "percent": 38 00:29:58.686 } 00:29:58.686 }, 00:29:58.686 "base_bdevs_list": [ 00:29:58.686 { 00:29:58.686 "name": "spare", 00:29:58.686 "uuid": "63e9ce51-727c-5a73-b826-fd39c4851d63", 00:29:58.686 "is_configured": true, 00:29:58.686 "data_offset": 256, 00:29:58.686 "data_size": 7936 00:29:58.686 }, 00:29:58.686 { 00:29:58.686 "name": "BaseBdev2", 00:29:58.686 "uuid": "fd3dbe42-efb2-59af-8443-6919c5cec2db", 00:29:58.686 "is_configured": true, 00:29:58.686 "data_offset": 256, 00:29:58.686 "data_size": 7936 00:29:58.686 } 00:29:58.686 ] 00:29:58.686 }' 00:29:58.686 05:58:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:58.943 05:58:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:29:58.943 05:58:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:58.943 05:58:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:29:58.943 05:58:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_remove_base_bdev spare 00:29:59.202 [2024-07-26 05:58:13.890509] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:29:59.202 [2024-07-26 05:58:13.902460] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:29:59.202 [2024-07-26 05:58:13.902512] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:29:59.202 [2024-07-26 05:58:13.902528] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:29:59.202 [2024-07-26 05:58:13.902536] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:29:59.202 05:58:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:29:59.202 05:58:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:59.202 05:58:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:59.202 05:58:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:59.202 05:58:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:59.202 05:58:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:29:59.202 05:58:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:59.202 05:58:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:59.202 05:58:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:59.202 05:58:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:59.202 05:58:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:59.202 05:58:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:59.460 05:58:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:59.460 "name": "raid_bdev1", 00:29:59.460 "uuid": "7d40c565-a799-4e9a-b876-4cc332aab6ff", 00:29:59.460 "strip_size_kb": 0, 00:29:59.460 "state": "online", 00:29:59.460 "raid_level": "raid1", 00:29:59.460 "superblock": true, 00:29:59.460 "num_base_bdevs": 2, 00:29:59.460 "num_base_bdevs_discovered": 1, 00:29:59.460 "num_base_bdevs_operational": 1, 00:29:59.460 "base_bdevs_list": [ 00:29:59.460 { 00:29:59.460 "name": null, 00:29:59.460 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:59.460 "is_configured": false, 00:29:59.460 "data_offset": 256, 00:29:59.460 "data_size": 7936 00:29:59.460 }, 00:29:59.460 { 00:29:59.460 "name": "BaseBdev2", 00:29:59.460 "uuid": "fd3dbe42-efb2-59af-8443-6919c5cec2db", 00:29:59.460 "is_configured": true, 00:29:59.460 "data_offset": 256, 00:29:59.461 "data_size": 7936 00:29:59.461 } 00:29:59.461 ] 00:29:59.461 }' 00:29:59.461 05:58:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:59.461 05:58:14 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:30:00.028 05:58:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:30:00.028 05:58:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:30:00.028 05:58:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:30:00.028 05:58:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:30:00.028 05:58:14 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:30:00.028 05:58:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:00.028 05:58:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:00.286 05:58:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:30:00.286 "name": "raid_bdev1", 00:30:00.286 "uuid": "7d40c565-a799-4e9a-b876-4cc332aab6ff", 00:30:00.286 "strip_size_kb": 0, 00:30:00.286 "state": "online", 00:30:00.286 "raid_level": "raid1", 00:30:00.286 "superblock": true, 00:30:00.286 "num_base_bdevs": 2, 00:30:00.286 "num_base_bdevs_discovered": 1, 00:30:00.286 "num_base_bdevs_operational": 1, 00:30:00.286 "base_bdevs_list": [ 00:30:00.286 { 00:30:00.286 "name": null, 00:30:00.286 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:00.286 "is_configured": false, 00:30:00.286 "data_offset": 256, 00:30:00.286 "data_size": 7936 00:30:00.286 }, 00:30:00.286 { 00:30:00.286 "name": "BaseBdev2", 00:30:00.286 "uuid": "fd3dbe42-efb2-59af-8443-6919c5cec2db", 00:30:00.286 "is_configured": true, 00:30:00.286 "data_offset": 256, 00:30:00.286 "data_size": 7936 00:30:00.286 } 00:30:00.286 ] 00:30:00.286 }' 00:30:00.286 05:58:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:30:00.286 05:58:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:30:00.286 05:58:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:30:00.286 05:58:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:30:00.286 05:58:15 bdev_raid.raid_rebuild_test_sb_md_separate -- 
bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:30:00.544 [2024-07-26 05:58:15.314617] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:30:00.544 [2024-07-26 05:58:15.317188] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2106280 00:30:00.544 [2024-07-26 05:58:15.318776] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:30:00.544 05:58:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@662 -- # sleep 1 00:30:01.477 05:58:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:30:01.477 05:58:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:30:01.477 05:58:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:30:01.477 05:58:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:30:01.477 05:58:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:30:01.477 05:58:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:01.477 05:58:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:01.734 05:58:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:30:01.734 "name": "raid_bdev1", 00:30:01.734 "uuid": "7d40c565-a799-4e9a-b876-4cc332aab6ff", 00:30:01.734 "strip_size_kb": 0, 00:30:01.734 "state": "online", 00:30:01.734 "raid_level": "raid1", 00:30:01.734 "superblock": true, 00:30:01.734 "num_base_bdevs": 2, 
00:30:01.734 "num_base_bdevs_discovered": 2, 00:30:01.734 "num_base_bdevs_operational": 2, 00:30:01.734 "process": { 00:30:01.734 "type": "rebuild", 00:30:01.734 "target": "spare", 00:30:01.734 "progress": { 00:30:01.734 "blocks": 3072, 00:30:01.734 "percent": 38 00:30:01.734 } 00:30:01.734 }, 00:30:01.734 "base_bdevs_list": [ 00:30:01.734 { 00:30:01.734 "name": "spare", 00:30:01.734 "uuid": "63e9ce51-727c-5a73-b826-fd39c4851d63", 00:30:01.734 "is_configured": true, 00:30:01.734 "data_offset": 256, 00:30:01.734 "data_size": 7936 00:30:01.734 }, 00:30:01.734 { 00:30:01.734 "name": "BaseBdev2", 00:30:01.734 "uuid": "fd3dbe42-efb2-59af-8443-6919c5cec2db", 00:30:01.734 "is_configured": true, 00:30:01.734 "data_offset": 256, 00:30:01.734 "data_size": 7936 00:30:01.734 } 00:30:01.734 ] 00:30:01.734 }' 00:30:01.734 05:58:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:30:01.734 05:58:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:30:01.734 05:58:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:30:01.990 05:58:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:30:01.990 05:58:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:30:01.990 05:58:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:30:01.990 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:30:01.990 05:58:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:30:01.990 05:58:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:30:01.990 05:58:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@692 
-- # '[' 2 -gt 2 ']' 00:30:01.990 05:58:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@705 -- # local timeout=1066 00:30:01.990 05:58:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:30:01.990 05:58:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:30:01.990 05:58:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:30:01.990 05:58:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:30:01.990 05:58:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:30:01.990 05:58:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:30:01.990 05:58:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:01.990 05:58:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:02.248 05:58:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:30:02.248 "name": "raid_bdev1", 00:30:02.248 "uuid": "7d40c565-a799-4e9a-b876-4cc332aab6ff", 00:30:02.248 "strip_size_kb": 0, 00:30:02.248 "state": "online", 00:30:02.248 "raid_level": "raid1", 00:30:02.248 "superblock": true, 00:30:02.248 "num_base_bdevs": 2, 00:30:02.248 "num_base_bdevs_discovered": 2, 00:30:02.248 "num_base_bdevs_operational": 2, 00:30:02.248 "process": { 00:30:02.248 "type": "rebuild", 00:30:02.248 "target": "spare", 00:30:02.248 "progress": { 00:30:02.248 "blocks": 3840, 00:30:02.248 "percent": 48 00:30:02.248 } 00:30:02.248 }, 00:30:02.248 "base_bdevs_list": [ 00:30:02.248 { 00:30:02.248 "name": "spare", 00:30:02.248 "uuid": 
"63e9ce51-727c-5a73-b826-fd39c4851d63", 00:30:02.248 "is_configured": true, 00:30:02.248 "data_offset": 256, 00:30:02.248 "data_size": 7936 00:30:02.248 }, 00:30:02.248 { 00:30:02.248 "name": "BaseBdev2", 00:30:02.248 "uuid": "fd3dbe42-efb2-59af-8443-6919c5cec2db", 00:30:02.248 "is_configured": true, 00:30:02.248 "data_offset": 256, 00:30:02.248 "data_size": 7936 00:30:02.248 } 00:30:02.248 ] 00:30:02.248 }' 00:30:02.248 05:58:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:30:02.248 05:58:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:30:02.248 05:58:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:30:02.248 05:58:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:30:02.248 05:58:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@710 -- # sleep 1 00:30:03.181 05:58:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:30:03.181 05:58:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:30:03.181 05:58:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:30:03.181 05:58:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:30:03.181 05:58:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:30:03.181 05:58:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:30:03.181 05:58:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:03.181 05:58:18 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:03.439 05:58:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:30:03.439 "name": "raid_bdev1", 00:30:03.439 "uuid": "7d40c565-a799-4e9a-b876-4cc332aab6ff", 00:30:03.439 "strip_size_kb": 0, 00:30:03.439 "state": "online", 00:30:03.439 "raid_level": "raid1", 00:30:03.439 "superblock": true, 00:30:03.439 "num_base_bdevs": 2, 00:30:03.439 "num_base_bdevs_discovered": 2, 00:30:03.439 "num_base_bdevs_operational": 2, 00:30:03.439 "process": { 00:30:03.439 "type": "rebuild", 00:30:03.439 "target": "spare", 00:30:03.439 "progress": { 00:30:03.439 "blocks": 7424, 00:30:03.439 "percent": 93 00:30:03.439 } 00:30:03.439 }, 00:30:03.439 "base_bdevs_list": [ 00:30:03.439 { 00:30:03.439 "name": "spare", 00:30:03.439 "uuid": "63e9ce51-727c-5a73-b826-fd39c4851d63", 00:30:03.439 "is_configured": true, 00:30:03.439 "data_offset": 256, 00:30:03.439 "data_size": 7936 00:30:03.439 }, 00:30:03.439 { 00:30:03.439 "name": "BaseBdev2", 00:30:03.439 "uuid": "fd3dbe42-efb2-59af-8443-6919c5cec2db", 00:30:03.439 "is_configured": true, 00:30:03.439 "data_offset": 256, 00:30:03.439 "data_size": 7936 00:30:03.439 } 00:30:03.439 ] 00:30:03.439 }' 00:30:03.439 05:58:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:30:03.439 05:58:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:30:03.439 05:58:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:30:03.697 05:58:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:30:03.697 05:58:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@710 -- # sleep 1 00:30:03.697 [2024-07-26 05:58:18.443902] bdev_raid.c:2870:raid_bdev_process_thread_run: 
*DEBUG*: process completed on raid_bdev1 00:30:03.697 [2024-07-26 05:58:18.443959] bdev_raid.c:2532:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:30:03.697 [2024-07-26 05:58:18.444041] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:30:04.630 05:58:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:30:04.630 05:58:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:30:04.630 05:58:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:30:04.630 05:58:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:30:04.630 05:58:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:30:04.630 05:58:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:30:04.630 05:58:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:04.630 05:58:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:04.888 05:58:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:30:04.888 "name": "raid_bdev1", 00:30:04.888 "uuid": "7d40c565-a799-4e9a-b876-4cc332aab6ff", 00:30:04.888 "strip_size_kb": 0, 00:30:04.888 "state": "online", 00:30:04.888 "raid_level": "raid1", 00:30:04.888 "superblock": true, 00:30:04.888 "num_base_bdevs": 2, 00:30:04.888 "num_base_bdevs_discovered": 2, 00:30:04.888 "num_base_bdevs_operational": 2, 00:30:04.888 "base_bdevs_list": [ 00:30:04.888 { 00:30:04.888 "name": "spare", 00:30:04.888 "uuid": "63e9ce51-727c-5a73-b826-fd39c4851d63", 
00:30:04.888 "is_configured": true, 00:30:04.888 "data_offset": 256, 00:30:04.888 "data_size": 7936 00:30:04.888 }, 00:30:04.888 { 00:30:04.888 "name": "BaseBdev2", 00:30:04.888 "uuid": "fd3dbe42-efb2-59af-8443-6919c5cec2db", 00:30:04.888 "is_configured": true, 00:30:04.888 "data_offset": 256, 00:30:04.888 "data_size": 7936 00:30:04.888 } 00:30:04.888 ] 00:30:04.888 }' 00:30:04.888 05:58:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:30:04.888 05:58:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:30:04.888 05:58:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:30:04.888 05:58:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:30:04.888 05:58:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@708 -- # break 00:30:04.888 05:58:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:30:04.888 05:58:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:30:04.888 05:58:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:30:04.888 05:58:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:30:04.888 05:58:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:30:04.888 05:58:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:04.888 05:58:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:05.146 05:58:19 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:30:05.146 "name": "raid_bdev1", 00:30:05.146 "uuid": "7d40c565-a799-4e9a-b876-4cc332aab6ff", 00:30:05.146 "strip_size_kb": 0, 00:30:05.146 "state": "online", 00:30:05.146 "raid_level": "raid1", 00:30:05.146 "superblock": true, 00:30:05.146 "num_base_bdevs": 2, 00:30:05.146 "num_base_bdevs_discovered": 2, 00:30:05.146 "num_base_bdevs_operational": 2, 00:30:05.146 "base_bdevs_list": [ 00:30:05.146 { 00:30:05.146 "name": "spare", 00:30:05.146 "uuid": "63e9ce51-727c-5a73-b826-fd39c4851d63", 00:30:05.146 "is_configured": true, 00:30:05.146 "data_offset": 256, 00:30:05.146 "data_size": 7936 00:30:05.146 }, 00:30:05.146 { 00:30:05.146 "name": "BaseBdev2", 00:30:05.146 "uuid": "fd3dbe42-efb2-59af-8443-6919c5cec2db", 00:30:05.146 "is_configured": true, 00:30:05.146 "data_offset": 256, 00:30:05.146 "data_size": 7936 00:30:05.146 } 00:30:05.146 ] 00:30:05.146 }' 00:30:05.146 05:58:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:30:05.146 05:58:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:30:05.146 05:58:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:30:05.146 05:58:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:30:05.146 05:58:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:30:05.146 05:58:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:30:05.146 05:58:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:30:05.146 05:58:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:30:05.146 05:58:20 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:30:05.146 05:58:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:30:05.146 05:58:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:30:05.146 05:58:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:30:05.146 05:58:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:30:05.146 05:58:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:30:05.146 05:58:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:05.146 05:58:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:05.404 05:58:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:30:05.404 "name": "raid_bdev1", 00:30:05.404 "uuid": "7d40c565-a799-4e9a-b876-4cc332aab6ff", 00:30:05.404 "strip_size_kb": 0, 00:30:05.404 "state": "online", 00:30:05.404 "raid_level": "raid1", 00:30:05.404 "superblock": true, 00:30:05.404 "num_base_bdevs": 2, 00:30:05.404 "num_base_bdevs_discovered": 2, 00:30:05.404 "num_base_bdevs_operational": 2, 00:30:05.404 "base_bdevs_list": [ 00:30:05.404 { 00:30:05.404 "name": "spare", 00:30:05.404 "uuid": "63e9ce51-727c-5a73-b826-fd39c4851d63", 00:30:05.404 "is_configured": true, 00:30:05.404 "data_offset": 256, 00:30:05.404 "data_size": 7936 00:30:05.404 }, 00:30:05.404 { 00:30:05.404 "name": "BaseBdev2", 00:30:05.404 "uuid": "fd3dbe42-efb2-59af-8443-6919c5cec2db", 00:30:05.404 "is_configured": true, 00:30:05.404 "data_offset": 256, 00:30:05.404 "data_size": 7936 00:30:05.404 } 00:30:05.404 ] 
00:30:05.404 }' 00:30:05.404 05:58:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:30:05.404 05:58:20 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:30:06.338 05:58:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:30:06.338 [2024-07-26 05:58:21.102793] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:30:06.338 [2024-07-26 05:58:21.102821] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:30:06.338 [2024-07-26 05:58:21.102877] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:30:06.338 [2024-07-26 05:58:21.102934] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:30:06.338 [2024-07-26 05:58:21.102946] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x225b1c0 name raid_bdev1, state offline 00:30:06.338 05:58:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:06.338 05:58:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@719 -- # jq length 00:30:06.644 05:58:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:30:06.644 05:58:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:30:06.644 05:58:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@722 -- # '[' false = true ']' 00:30:06.644 05:58:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@736 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:30:06.644 05:58:21 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:30:06.644 05:58:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:30:06.644 05:58:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@10 -- # local bdev_list 00:30:06.644 05:58:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:30:06.644 05:58:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@11 -- # local nbd_list 00:30:06.644 05:58:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@12 -- # local i 00:30:06.644 05:58:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:30:06.644 05:58:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:30:06.644 05:58:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:30:06.927 /dev/nbd0 00:30:06.927 05:58:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:30:06.927 05:58:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:30:06.927 05:58:21 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:30:06.927 05:58:21 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@867 -- # local i 00:30:06.927 05:58:21 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:30:06.927 05:58:21 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:30:06.927 05:58:21 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:30:06.927 05:58:21 
bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@871 -- # break 00:30:06.927 05:58:21 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:30:06.927 05:58:21 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:30:06.927 05:58:21 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:30:06.927 1+0 records in 00:30:06.927 1+0 records out 00:30:06.927 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000227568 s, 18.0 MB/s 00:30:06.927 05:58:21 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:30:06.927 05:58:21 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # size=4096 00:30:06.927 05:58:21 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:30:06.927 05:58:21 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:30:06.927 05:58:21 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@887 -- # return 0 00:30:06.927 05:58:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:30:06.927 05:58:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:30:06.927 05:58:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:30:07.186 /dev/nbd1 00:30:07.186 05:58:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:30:07.186 05:58:21 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:30:07.186 05:58:21 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:30:07.186 05:58:21 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@867 -- # local i 00:30:07.186 05:58:21 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:30:07.186 05:58:21 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:30:07.186 05:58:21 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:30:07.186 05:58:21 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@871 -- # break 00:30:07.186 05:58:21 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:30:07.186 05:58:21 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:30:07.186 05:58:21 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:30:07.186 1+0 records in 00:30:07.186 1+0 records out 00:30:07.186 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000310302 s, 13.2 MB/s 00:30:07.186 05:58:21 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:30:07.186 05:58:21 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # size=4096 00:30:07.186 05:58:21 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:30:07.186 05:58:21 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:30:07.186 05:58:21 
bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@887 -- # return 0 00:30:07.186 05:58:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:30:07.186 05:58:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:30:07.186 05:58:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@737 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:30:07.186 05:58:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@738 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:30:07.186 05:58:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:30:07.186 05:58:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:30:07.186 05:58:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@50 -- # local nbd_list 00:30:07.186 05:58:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@51 -- # local i 00:30:07.186 05:58:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:30:07.186 05:58:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:30:07.445 05:58:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:30:07.445 05:58:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:30:07.446 05:58:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:30:07.446 05:58:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:30:07.446 05:58:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:30:07.446 05:58:22 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:30:07.446 05:58:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@41 -- # break 00:30:07.446 05:58:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@45 -- # return 0 00:30:07.446 05:58:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:30:07.446 05:58:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:30:07.705 05:58:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:30:07.705 05:58:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:30:07.705 05:58:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:30:07.705 05:58:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:30:07.705 05:58:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:30:07.705 05:58:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:30:07.705 05:58:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@41 -- # break 00:30:07.705 05:58:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@45 -- # return 0 00:30:07.705 05:58:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:30:07.705 05:58:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:30:07.964 05:58:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@745 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:30:08.223 [2024-07-26 05:58:22.951780] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:30:08.223 [2024-07-26 05:58:22.951825] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:30:08.223 [2024-07-26 05:58:22.951847] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x216bca0 00:30:08.223 [2024-07-26 05:58:22.951860] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:30:08.223 [2024-07-26 05:58:22.953335] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:30:08.223 [2024-07-26 05:58:22.953361] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:30:08.223 [2024-07-26 05:58:22.953415] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:30:08.223 [2024-07-26 05:58:22.953439] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:30:08.223 [2024-07-26 05:58:22.953531] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:30:08.223 spare 00:30:08.223 05:58:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:30:08.223 05:58:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:30:08.223 05:58:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:30:08.223 05:58:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:30:08.223 05:58:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:30:08.223 05:58:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local 
num_base_bdevs_operational=2 00:30:08.223 05:58:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:30:08.223 05:58:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:30:08.224 05:58:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:30:08.224 05:58:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:30:08.224 05:58:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:08.224 05:58:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:08.224 [2024-07-26 05:58:23.053834] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x2104df0 00:30:08.224 [2024-07-26 05:58:23.053851] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:30:08.224 [2024-07-26 05:58:23.053921] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x225c460 00:30:08.224 [2024-07-26 05:58:23.054046] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2104df0 00:30:08.224 [2024-07-26 05:58:23.054056] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2104df0 00:30:08.224 [2024-07-26 05:58:23.054132] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:30:08.483 05:58:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:30:08.483 "name": "raid_bdev1", 00:30:08.483 "uuid": "7d40c565-a799-4e9a-b876-4cc332aab6ff", 00:30:08.483 "strip_size_kb": 0, 00:30:08.483 "state": "online", 00:30:08.483 "raid_level": "raid1", 00:30:08.483 "superblock": true, 00:30:08.483 "num_base_bdevs": 2, 00:30:08.483 
"num_base_bdevs_discovered": 2, 00:30:08.483 "num_base_bdevs_operational": 2, 00:30:08.483 "base_bdevs_list": [ 00:30:08.483 { 00:30:08.483 "name": "spare", 00:30:08.483 "uuid": "63e9ce51-727c-5a73-b826-fd39c4851d63", 00:30:08.483 "is_configured": true, 00:30:08.483 "data_offset": 256, 00:30:08.483 "data_size": 7936 00:30:08.483 }, 00:30:08.483 { 00:30:08.483 "name": "BaseBdev2", 00:30:08.483 "uuid": "fd3dbe42-efb2-59af-8443-6919c5cec2db", 00:30:08.483 "is_configured": true, 00:30:08.483 "data_offset": 256, 00:30:08.483 "data_size": 7936 00:30:08.483 } 00:30:08.483 ] 00:30:08.483 }' 00:30:08.483 05:58:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:30:08.483 05:58:23 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:30:09.048 05:58:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:30:09.048 05:58:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:30:09.048 05:58:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:30:09.048 05:58:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:30:09.048 05:58:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:30:09.048 05:58:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:09.048 05:58:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:09.307 05:58:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:30:09.307 "name": "raid_bdev1", 00:30:09.307 "uuid": "7d40c565-a799-4e9a-b876-4cc332aab6ff", 00:30:09.307 
"strip_size_kb": 0, 00:30:09.307 "state": "online", 00:30:09.307 "raid_level": "raid1", 00:30:09.307 "superblock": true, 00:30:09.307 "num_base_bdevs": 2, 00:30:09.307 "num_base_bdevs_discovered": 2, 00:30:09.307 "num_base_bdevs_operational": 2, 00:30:09.307 "base_bdevs_list": [ 00:30:09.307 { 00:30:09.307 "name": "spare", 00:30:09.307 "uuid": "63e9ce51-727c-5a73-b826-fd39c4851d63", 00:30:09.307 "is_configured": true, 00:30:09.307 "data_offset": 256, 00:30:09.307 "data_size": 7936 00:30:09.307 }, 00:30:09.307 { 00:30:09.307 "name": "BaseBdev2", 00:30:09.307 "uuid": "fd3dbe42-efb2-59af-8443-6919c5cec2db", 00:30:09.307 "is_configured": true, 00:30:09.307 "data_offset": 256, 00:30:09.307 "data_size": 7936 00:30:09.307 } 00:30:09.307 ] 00:30:09.307 }' 00:30:09.307 05:58:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:30:09.307 05:58:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:30:09.307 05:58:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:30:09.307 05:58:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:30:09.307 05:58:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:09.307 05:58:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:30:09.566 05:58:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:30:09.566 05:58:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:30:09.824 [2024-07-26 05:58:24.628348] 
bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:30:09.824 05:58:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:30:09.824 05:58:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:30:09.824 05:58:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:30:09.824 05:58:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:30:09.824 05:58:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:30:09.824 05:58:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:30:09.824 05:58:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:30:09.824 05:58:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:30:09.824 05:58:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:30:09.824 05:58:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:30:09.824 05:58:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:09.824 05:58:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:10.082 05:58:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:30:10.082 "name": "raid_bdev1", 00:30:10.082 "uuid": "7d40c565-a799-4e9a-b876-4cc332aab6ff", 00:30:10.082 "strip_size_kb": 0, 00:30:10.082 "state": "online", 00:30:10.082 "raid_level": "raid1", 00:30:10.082 "superblock": true, 00:30:10.082 
"num_base_bdevs": 2, 00:30:10.082 "num_base_bdevs_discovered": 1, 00:30:10.082 "num_base_bdevs_operational": 1, 00:30:10.082 "base_bdevs_list": [ 00:30:10.082 { 00:30:10.082 "name": null, 00:30:10.082 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:10.082 "is_configured": false, 00:30:10.082 "data_offset": 256, 00:30:10.082 "data_size": 7936 00:30:10.082 }, 00:30:10.082 { 00:30:10.082 "name": "BaseBdev2", 00:30:10.082 "uuid": "fd3dbe42-efb2-59af-8443-6919c5cec2db", 00:30:10.082 "is_configured": true, 00:30:10.082 "data_offset": 256, 00:30:10.082 "data_size": 7936 00:30:10.082 } 00:30:10.082 ] 00:30:10.082 }' 00:30:10.082 05:58:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:30:10.082 05:58:24 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:30:10.647 05:58:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:30:10.904 [2024-07-26 05:58:25.651065] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:30:10.904 [2024-07-26 05:58:25.651215] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:30:10.904 [2024-07-26 05:58:25.651232] bdev_raid.c:3712:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
00:30:10.904 [2024-07-26 05:58:25.651260] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:30:10.904 [2024-07-26 05:58:25.653422] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x225c570 00:30:10.904 [2024-07-26 05:58:25.654760] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:30:10.904 05:58:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@755 -- # sleep 1 00:30:11.836 05:58:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:30:11.836 05:58:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:30:11.836 05:58:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:30:11.836 05:58:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:30:11.836 05:58:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:30:11.836 05:58:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:11.836 05:58:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:12.094 05:58:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:30:12.094 "name": "raid_bdev1", 00:30:12.094 "uuid": "7d40c565-a799-4e9a-b876-4cc332aab6ff", 00:30:12.094 "strip_size_kb": 0, 00:30:12.094 "state": "online", 00:30:12.094 "raid_level": "raid1", 00:30:12.094 "superblock": true, 00:30:12.094 "num_base_bdevs": 2, 00:30:12.094 "num_base_bdevs_discovered": 2, 00:30:12.094 "num_base_bdevs_operational": 2, 00:30:12.094 "process": { 00:30:12.094 "type": "rebuild", 00:30:12.094 
"target": "spare", 00:30:12.094 "progress": { 00:30:12.094 "blocks": 2816, 00:30:12.094 "percent": 35 00:30:12.094 } 00:30:12.094 }, 00:30:12.094 "base_bdevs_list": [ 00:30:12.094 { 00:30:12.094 "name": "spare", 00:30:12.094 "uuid": "63e9ce51-727c-5a73-b826-fd39c4851d63", 00:30:12.094 "is_configured": true, 00:30:12.094 "data_offset": 256, 00:30:12.094 "data_size": 7936 00:30:12.094 }, 00:30:12.094 { 00:30:12.094 "name": "BaseBdev2", 00:30:12.094 "uuid": "fd3dbe42-efb2-59af-8443-6919c5cec2db", 00:30:12.094 "is_configured": true, 00:30:12.094 "data_offset": 256, 00:30:12.094 "data_size": 7936 00:30:12.094 } 00:30:12.094 ] 00:30:12.094 }' 00:30:12.094 05:58:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:30:12.094 05:58:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:30:12.094 05:58:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:30:12.094 05:58:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:30:12.094 05:58:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@759 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:30:12.352 [2024-07-26 05:58:27.176848] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:30:12.610 [2024-07-26 05:58:27.267428] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:30:12.610 [2024-07-26 05:58:27.267477] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:30:12.610 [2024-07-26 05:58:27.267493] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:30:12.610 [2024-07-26 05:58:27.267501] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 
00:30:12.610 05:58:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:30:12.610 05:58:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:30:12.610 05:58:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:30:12.610 05:58:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:30:12.610 05:58:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:30:12.610 05:58:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:30:12.610 05:58:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:30:12.610 05:58:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:30:12.610 05:58:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:30:12.610 05:58:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:30:12.610 05:58:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:12.610 05:58:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:12.867 05:58:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:30:12.867 "name": "raid_bdev1", 00:30:12.867 "uuid": "7d40c565-a799-4e9a-b876-4cc332aab6ff", 00:30:12.867 "strip_size_kb": 0, 00:30:12.867 "state": "online", 00:30:12.867 "raid_level": "raid1", 00:30:12.867 "superblock": true, 00:30:12.867 "num_base_bdevs": 2, 00:30:12.867 "num_base_bdevs_discovered": 1, 
00:30:12.867 "num_base_bdevs_operational": 1, 00:30:12.867 "base_bdevs_list": [ 00:30:12.867 { 00:30:12.867 "name": null, 00:30:12.867 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:12.867 "is_configured": false, 00:30:12.867 "data_offset": 256, 00:30:12.867 "data_size": 7936 00:30:12.867 }, 00:30:12.867 { 00:30:12.867 "name": "BaseBdev2", 00:30:12.867 "uuid": "fd3dbe42-efb2-59af-8443-6919c5cec2db", 00:30:12.867 "is_configured": true, 00:30:12.867 "data_offset": 256, 00:30:12.867 "data_size": 7936 00:30:12.867 } 00:30:12.867 ] 00:30:12.867 }' 00:30:12.867 05:58:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:30:12.867 05:58:27 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:30:13.432 05:58:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:30:13.690 [2024-07-26 05:58:28.357450] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:30:13.690 [2024-07-26 05:58:28.357499] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:30:13.690 [2024-07-26 05:58:28.357520] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x216e220 00:30:13.690 [2024-07-26 05:58:28.357532] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:30:13.690 [2024-07-26 05:58:28.357760] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:30:13.690 [2024-07-26 05:58:28.357777] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:30:13.690 [2024-07-26 05:58:28.357833] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:30:13.690 [2024-07-26 05:58:28.357845] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) 
smaller than existing raid bdev raid_bdev1 (5) 00:30:13.690 [2024-07-26 05:58:28.357855] bdev_raid.c:3712:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:30:13.690 [2024-07-26 05:58:28.357874] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:30:13.690 [2024-07-26 05:58:28.360081] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2105980 00:30:13.690 [2024-07-26 05:58:28.361412] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:30:13.690 spare 00:30:13.690 05:58:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@762 -- # sleep 1 00:30:14.624 05:58:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:30:14.624 05:58:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:30:14.624 05:58:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:30:14.624 05:58:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:30:14.624 05:58:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:30:14.624 05:58:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:14.624 05:58:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:14.882 05:58:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:30:14.882 "name": "raid_bdev1", 00:30:14.882 "uuid": "7d40c565-a799-4e9a-b876-4cc332aab6ff", 00:30:14.882 "strip_size_kb": 0, 00:30:14.882 "state": "online", 00:30:14.882 "raid_level": "raid1", 00:30:14.882 "superblock": 
true, 00:30:14.882 "num_base_bdevs": 2, 00:30:14.882 "num_base_bdevs_discovered": 2, 00:30:14.882 "num_base_bdevs_operational": 2, 00:30:14.882 "process": { 00:30:14.882 "type": "rebuild", 00:30:14.882 "target": "spare", 00:30:14.882 "progress": { 00:30:14.882 "blocks": 3072, 00:30:14.882 "percent": 38 00:30:14.882 } 00:30:14.882 }, 00:30:14.882 "base_bdevs_list": [ 00:30:14.882 { 00:30:14.882 "name": "spare", 00:30:14.882 "uuid": "63e9ce51-727c-5a73-b826-fd39c4851d63", 00:30:14.882 "is_configured": true, 00:30:14.882 "data_offset": 256, 00:30:14.882 "data_size": 7936 00:30:14.882 }, 00:30:14.882 { 00:30:14.882 "name": "BaseBdev2", 00:30:14.882 "uuid": "fd3dbe42-efb2-59af-8443-6919c5cec2db", 00:30:14.882 "is_configured": true, 00:30:14.882 "data_offset": 256, 00:30:14.882 "data_size": 7936 00:30:14.882 } 00:30:14.882 ] 00:30:14.882 }' 00:30:14.882 05:58:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:30:14.882 05:58:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:30:14.882 05:58:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:30:14.882 05:58:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:30:14.882 05:58:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:30:15.141 [2024-07-26 05:58:29.955611] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:30:15.141 [2024-07-26 05:58:29.974003] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:30:15.141 [2024-07-26 05:58:29.974061] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:30:15.141 [2024-07-26 05:58:29.974077] 
bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:30:15.141 [2024-07-26 05:58:29.974085] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:30:15.141 05:58:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:30:15.141 05:58:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:30:15.141 05:58:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:30:15.141 05:58:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:30:15.141 05:58:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:30:15.141 05:58:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:30:15.141 05:58:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:30:15.141 05:58:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:30:15.141 05:58:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:30:15.141 05:58:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:30:15.141 05:58:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:15.141 05:58:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:15.399 05:58:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:30:15.399 "name": "raid_bdev1", 00:30:15.399 "uuid": "7d40c565-a799-4e9a-b876-4cc332aab6ff", 
00:30:15.399 "strip_size_kb": 0, 00:30:15.399 "state": "online", 00:30:15.399 "raid_level": "raid1", 00:30:15.399 "superblock": true, 00:30:15.399 "num_base_bdevs": 2, 00:30:15.399 "num_base_bdevs_discovered": 1, 00:30:15.399 "num_base_bdevs_operational": 1, 00:30:15.399 "base_bdevs_list": [ 00:30:15.399 { 00:30:15.399 "name": null, 00:30:15.399 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:15.399 "is_configured": false, 00:30:15.399 "data_offset": 256, 00:30:15.399 "data_size": 7936 00:30:15.399 }, 00:30:15.399 { 00:30:15.399 "name": "BaseBdev2", 00:30:15.399 "uuid": "fd3dbe42-efb2-59af-8443-6919c5cec2db", 00:30:15.399 "is_configured": true, 00:30:15.399 "data_offset": 256, 00:30:15.399 "data_size": 7936 00:30:15.399 } 00:30:15.399 ] 00:30:15.399 }' 00:30:15.399 05:58:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:30:15.399 05:58:30 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:30:15.965 05:58:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:30:15.965 05:58:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:30:15.965 05:58:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:30:15.965 05:58:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:30:15.965 05:58:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:30:15.965 05:58:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:15.965 05:58:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:16.224 05:58:31 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:30:16.224 "name": "raid_bdev1", 00:30:16.224 "uuid": "7d40c565-a799-4e9a-b876-4cc332aab6ff", 00:30:16.224 "strip_size_kb": 0, 00:30:16.224 "state": "online", 00:30:16.224 "raid_level": "raid1", 00:30:16.224 "superblock": true, 00:30:16.224 "num_base_bdevs": 2, 00:30:16.224 "num_base_bdevs_discovered": 1, 00:30:16.224 "num_base_bdevs_operational": 1, 00:30:16.224 "base_bdevs_list": [ 00:30:16.224 { 00:30:16.224 "name": null, 00:30:16.224 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:16.224 "is_configured": false, 00:30:16.224 "data_offset": 256, 00:30:16.224 "data_size": 7936 00:30:16.224 }, 00:30:16.224 { 00:30:16.224 "name": "BaseBdev2", 00:30:16.224 "uuid": "fd3dbe42-efb2-59af-8443-6919c5cec2db", 00:30:16.224 "is_configured": true, 00:30:16.224 "data_offset": 256, 00:30:16.224 "data_size": 7936 00:30:16.224 } 00:30:16.224 ] 00:30:16.224 }' 00:30:16.224 05:58:31 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:30:16.483 05:58:31 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:30:16.483 05:58:31 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:30:16.483 05:58:31 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:30:16.483 05:58:31 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:30:16.741 05:58:31 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:30:17.000 [2024-07-26 05:58:31.654142] vbdev_passthru.c: 607:vbdev_passthru_register: 
*NOTICE*: Match on BaseBdev1_malloc 00:30:17.000 [2024-07-26 05:58:31.654187] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:30:17.000 [2024-07-26 05:58:31.654208] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2106900 00:30:17.000 [2024-07-26 05:58:31.654221] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:30:17.000 [2024-07-26 05:58:31.654414] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:30:17.000 [2024-07-26 05:58:31.654430] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:30:17.000 [2024-07-26 05:58:31.654475] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:30:17.000 [2024-07-26 05:58:31.654487] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:30:17.000 [2024-07-26 05:58:31.654497] bdev_raid.c:3673:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:30:17.000 BaseBdev1 00:30:17.000 05:58:31 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@773 -- # sleep 1 00:30:17.935 05:58:32 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:30:17.935 05:58:32 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:30:17.935 05:58:32 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:30:17.935 05:58:32 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:30:17.935 05:58:32 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:30:17.935 05:58:32 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:30:17.935 
05:58:32 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:30:17.935 05:58:32 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:30:17.935 05:58:32 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:30:17.935 05:58:32 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:30:17.935 05:58:32 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:17.936 05:58:32 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:18.194 05:58:32 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:30:18.194 "name": "raid_bdev1", 00:30:18.194 "uuid": "7d40c565-a799-4e9a-b876-4cc332aab6ff", 00:30:18.194 "strip_size_kb": 0, 00:30:18.194 "state": "online", 00:30:18.194 "raid_level": "raid1", 00:30:18.194 "superblock": true, 00:30:18.194 "num_base_bdevs": 2, 00:30:18.194 "num_base_bdevs_discovered": 1, 00:30:18.194 "num_base_bdevs_operational": 1, 00:30:18.194 "base_bdevs_list": [ 00:30:18.194 { 00:30:18.194 "name": null, 00:30:18.194 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:18.194 "is_configured": false, 00:30:18.194 "data_offset": 256, 00:30:18.194 "data_size": 7936 00:30:18.194 }, 00:30:18.194 { 00:30:18.194 "name": "BaseBdev2", 00:30:18.194 "uuid": "fd3dbe42-efb2-59af-8443-6919c5cec2db", 00:30:18.194 "is_configured": true, 00:30:18.194 "data_offset": 256, 00:30:18.194 "data_size": 7936 00:30:18.194 } 00:30:18.194 ] 00:30:18.194 }' 00:30:18.194 05:58:32 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:30:18.194 05:58:32 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 
00:30:18.762 05:58:33 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:30:18.762 05:58:33 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:30:18.762 05:58:33 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:30:18.762 05:58:33 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:30:18.762 05:58:33 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:30:18.762 05:58:33 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:18.762 05:58:33 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:19.020 05:58:33 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:30:19.020 "name": "raid_bdev1", 00:30:19.020 "uuid": "7d40c565-a799-4e9a-b876-4cc332aab6ff", 00:30:19.020 "strip_size_kb": 0, 00:30:19.020 "state": "online", 00:30:19.020 "raid_level": "raid1", 00:30:19.020 "superblock": true, 00:30:19.020 "num_base_bdevs": 2, 00:30:19.020 "num_base_bdevs_discovered": 1, 00:30:19.020 "num_base_bdevs_operational": 1, 00:30:19.020 "base_bdevs_list": [ 00:30:19.020 { 00:30:19.020 "name": null, 00:30:19.020 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:19.020 "is_configured": false, 00:30:19.020 "data_offset": 256, 00:30:19.020 "data_size": 7936 00:30:19.020 }, 00:30:19.020 { 00:30:19.020 "name": "BaseBdev2", 00:30:19.020 "uuid": "fd3dbe42-efb2-59af-8443-6919c5cec2db", 00:30:19.020 "is_configured": true, 00:30:19.020 "data_offset": 256, 00:30:19.020 "data_size": 7936 00:30:19.020 } 00:30:19.020 ] 00:30:19.020 }' 00:30:19.020 05:58:33 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:30:19.020 05:58:33 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:30:19.020 05:58:33 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:30:19.020 05:58:33 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:30:19.020 05:58:33 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:30:19.020 05:58:33 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@648 -- # local es=0 00:30:19.020 05:58:33 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:30:19.020 05:58:33 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:30:19.020 05:58:33 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:30:19.021 05:58:33 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:30:19.021 05:58:33 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:30:19.021 05:58:33 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:30:19.021 05:58:33 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 
00:30:19.021 05:58:33 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:30:19.021 05:58:33 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:30:19.021 05:58:33 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:30:19.588 [2024-07-26 05:58:34.357392] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:30:19.588 [2024-07-26 05:58:34.357526] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:30:19.588 [2024-07-26 05:58:34.357542] bdev_raid.c:3673:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:30:19.588 request: 00:30:19.588 { 00:30:19.588 "base_bdev": "BaseBdev1", 00:30:19.588 "raid_bdev": "raid_bdev1", 00:30:19.588 "method": "bdev_raid_add_base_bdev", 00:30:19.588 "req_id": 1 00:30:19.588 } 00:30:19.588 Got JSON-RPC error response 00:30:19.588 response: 00:30:19.588 { 00:30:19.588 "code": -22, 00:30:19.588 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:30:19.588 } 00:30:19.588 05:58:34 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@651 -- # es=1 00:30:19.588 05:58:34 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:30:19.588 05:58:34 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:30:19.588 05:58:34 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:30:19.588 05:58:34 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@777 -- # 
sleep 1 00:30:20.523 05:58:35 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:30:20.523 05:58:35 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:30:20.523 05:58:35 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:30:20.523 05:58:35 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:30:20.523 05:58:35 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:30:20.523 05:58:35 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:30:20.523 05:58:35 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:30:20.523 05:58:35 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:30:20.523 05:58:35 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:30:20.523 05:58:35 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:30:20.523 05:58:35 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:20.523 05:58:35 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:21.091 05:58:35 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:30:21.091 "name": "raid_bdev1", 00:30:21.091 "uuid": "7d40c565-a799-4e9a-b876-4cc332aab6ff", 00:30:21.091 "strip_size_kb": 0, 00:30:21.091 "state": "online", 00:30:21.091 "raid_level": "raid1", 00:30:21.091 "superblock": true, 00:30:21.091 "num_base_bdevs": 2, 00:30:21.091 "num_base_bdevs_discovered": 1, 
00:30:21.091 "num_base_bdevs_operational": 1, 00:30:21.091 "base_bdevs_list": [ 00:30:21.091 { 00:30:21.091 "name": null, 00:30:21.091 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:21.091 "is_configured": false, 00:30:21.091 "data_offset": 256, 00:30:21.091 "data_size": 7936 00:30:21.091 }, 00:30:21.091 { 00:30:21.091 "name": "BaseBdev2", 00:30:21.091 "uuid": "fd3dbe42-efb2-59af-8443-6919c5cec2db", 00:30:21.091 "is_configured": true, 00:30:21.091 "data_offset": 256, 00:30:21.091 "data_size": 7936 00:30:21.091 } 00:30:21.091 ] 00:30:21.091 }' 00:30:21.091 05:58:35 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:30:21.091 05:58:35 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:30:21.659 05:58:36 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:30:21.659 05:58:36 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:30:21.659 05:58:36 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:30:21.659 05:58:36 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:30:21.659 05:58:36 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:30:21.659 05:58:36 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:21.659 05:58:36 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:21.918 05:58:36 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:30:21.918 "name": "raid_bdev1", 00:30:21.918 "uuid": "7d40c565-a799-4e9a-b876-4cc332aab6ff", 00:30:21.918 "strip_size_kb": 0, 00:30:21.918 
"state": "online", 00:30:21.918 "raid_level": "raid1", 00:30:21.918 "superblock": true, 00:30:21.918 "num_base_bdevs": 2, 00:30:21.918 "num_base_bdevs_discovered": 1, 00:30:21.918 "num_base_bdevs_operational": 1, 00:30:21.918 "base_bdevs_list": [ 00:30:21.918 { 00:30:21.918 "name": null, 00:30:21.918 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:21.918 "is_configured": false, 00:30:21.918 "data_offset": 256, 00:30:21.918 "data_size": 7936 00:30:21.918 }, 00:30:21.918 { 00:30:21.918 "name": "BaseBdev2", 00:30:21.918 "uuid": "fd3dbe42-efb2-59af-8443-6919c5cec2db", 00:30:21.918 "is_configured": true, 00:30:21.918 "data_offset": 256, 00:30:21.918 "data_size": 7936 00:30:21.918 } 00:30:21.918 ] 00:30:21.918 }' 00:30:21.918 05:58:36 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:30:21.918 05:58:36 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:30:21.918 05:58:36 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:30:22.177 05:58:36 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:30:22.177 05:58:36 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@782 -- # killprocess 1278033 00:30:22.177 05:58:36 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@948 -- # '[' -z 1278033 ']' 00:30:22.177 05:58:36 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@952 -- # kill -0 1278033 00:30:22.177 05:58:36 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@953 -- # uname 00:30:22.177 05:58:36 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:30:22.177 05:58:36 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1278033 00:30:22.177 05:58:36 
bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:30:22.177 05:58:36 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:30:22.177 05:58:36 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1278033' 00:30:22.177 killing process with pid 1278033 00:30:22.177 05:58:36 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@967 -- # kill 1278033 00:30:22.177 Received shutdown signal, test time was about 60.000000 seconds 00:30:22.177 00:30:22.177 Latency(us) 00:30:22.177 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:30:22.177 =================================================================================================================== 00:30:22.177 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:30:22.177 [2024-07-26 05:58:36.882070] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:30:22.177 [2024-07-26 05:58:36.882158] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:30:22.177 [2024-07-26 05:58:36.882202] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:30:22.177 [2024-07-26 05:58:36.882214] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2104df0 name raid_bdev1, state offline 00:30:22.177 05:58:36 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@972 -- # wait 1278033 00:30:22.177 [2024-07-26 05:58:36.920906] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:30:22.437 05:58:37 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@784 -- # return 0 00:30:22.437 00:30:22.437 real 0m31.679s 00:30:22.437 user 0m49.374s 00:30:22.437 sys 0m5.172s 00:30:22.437 05:58:37 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@1124 -- # xtrace_disable 
00:30:22.437 05:58:37 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:30:22.437 ************************************ 00:30:22.437 END TEST raid_rebuild_test_sb_md_separate 00:30:22.437 ************************************ 00:30:22.437 05:58:37 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:30:22.437 05:58:37 bdev_raid -- bdev/bdev_raid.sh@911 -- # base_malloc_params='-m 32 -i' 00:30:22.437 05:58:37 bdev_raid -- bdev/bdev_raid.sh@912 -- # run_test raid_state_function_test_sb_md_interleaved raid_state_function_test raid1 2 true 00:30:22.437 05:58:37 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:30:22.437 05:58:37 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:30:22.437 05:58:37 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:30:22.437 ************************************ 00:30:22.437 START TEST raid_state_function_test_sb_md_interleaved 00:30:22.437 ************************************ 00:30:22.437 05:58:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@1123 -- # raid_state_function_test raid1 2 true 00:30:22.437 05:58:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:30:22.437 05:58:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:30:22.437 05:58:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:30:22.437 05:58:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:30:22.437 05:58:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:30:22.437 05:58:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:30:22.437 05:58:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- 
bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:30:22.437 05:58:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:30:22.437 05:58:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:30:22.437 05:58:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:30:22.437 05:58:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:30:22.437 05:58:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:30:22.437 05:58:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:30:22.437 05:58:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:30:22.437 05:58:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:30:22.437 05:58:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@226 -- # local strip_size 00:30:22.437 05:58:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:30:22.437 05:58:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:30:22.437 05:58:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:30:22.437 05:58:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:30:22.437 05:58:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:30:22.437 05:58:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:30:22.437 05:58:37 
bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@244 -- # raid_pid=1282567 00:30:22.437 05:58:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1282567' 00:30:22.437 Process raid pid: 1282567 00:30:22.437 05:58:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:30:22.437 05:58:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@246 -- # waitforlisten 1282567 /var/tmp/spdk-raid.sock 00:30:22.437 05:58:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@829 -- # '[' -z 1282567 ']' 00:30:22.437 05:58:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:30:22.437 05:58:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@834 -- # local max_retries=100 00:30:22.437 05:58:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:30:22.437 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:30:22.437 05:58:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@838 -- # xtrace_disable 00:30:22.437 05:58:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:30:22.437 [2024-07-26 05:58:37.302350] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 
00:30:22.437 [2024-07-26 05:58:37.302403] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:30:22.696 [2024-07-26 05:58:37.415421] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:30:22.696 [2024-07-26 05:58:37.511906] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:30:22.696 [2024-07-26 05:58:37.573604] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:30:22.696 [2024-07-26 05:58:37.573652] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:30:23.631 05:58:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:30:23.632 05:58:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@862 -- # return 0 00:30:23.632 05:58:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:30:23.632 [2024-07-26 05:58:38.476586] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:30:23.632 [2024-07-26 05:58:38.476628] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:30:23.632 [2024-07-26 05:58:38.476645] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:30:23.632 [2024-07-26 05:58:38.476657] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:30:23.632 05:58:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:30:23.632 05:58:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- 
bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:30:23.632 05:58:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:30:23.632 05:58:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:30:23.632 05:58:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:30:23.632 05:58:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:30:23.632 05:58:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:30:23.632 05:58:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:30:23.632 05:58:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:30:23.632 05:58:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:30:23.632 05:58:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:23.632 05:58:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:30:23.892 05:58:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:30:23.892 "name": "Existed_Raid", 00:30:23.892 "uuid": "e1fe9330-a660-4abe-a241-7f8e960f0760", 00:30:23.892 "strip_size_kb": 0, 00:30:23.892 "state": "configuring", 00:30:23.892 "raid_level": "raid1", 00:30:23.892 "superblock": true, 00:30:23.892 "num_base_bdevs": 2, 00:30:23.892 "num_base_bdevs_discovered": 0, 00:30:23.892 "num_base_bdevs_operational": 2, 00:30:23.892 "base_bdevs_list": [ 00:30:23.892 { 
00:30:23.892 "name": "BaseBdev1", 00:30:23.892 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:23.892 "is_configured": false, 00:30:23.892 "data_offset": 0, 00:30:23.892 "data_size": 0 00:30:23.892 }, 00:30:23.892 { 00:30:23.892 "name": "BaseBdev2", 00:30:23.892 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:23.892 "is_configured": false, 00:30:23.892 "data_offset": 0, 00:30:23.892 "data_size": 0 00:30:23.892 } 00:30:23.892 ] 00:30:23.892 }' 00:30:23.892 05:58:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:30:23.892 05:58:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:30:24.490 05:58:39 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:30:24.748 [2024-07-26 05:58:39.487108] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:30:24.748 [2024-07-26 05:58:39.487138] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x24e3a80 name Existed_Raid, state configuring 00:30:24.748 05:58:39 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:30:25.007 [2024-07-26 05:58:39.739792] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:30:25.007 [2024-07-26 05:58:39.739818] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:30:25.007 [2024-07-26 05:58:39.739828] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:30:25.007 [2024-07-26 05:58:39.739839] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:30:25.007 
05:58:39 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b BaseBdev1 00:30:25.265 [2024-07-26 05:58:39.998467] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:30:25.265 BaseBdev1 00:30:25.265 05:58:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:30:25.265 05:58:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:30:25.265 05:58:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:30:25.265 05:58:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@899 -- # local i 00:30:25.265 05:58:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:30:25.265 05:58:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:30:25.265 05:58:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:30:25.525 05:58:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:30:25.784 [ 00:30:25.784 { 00:30:25.784 "name": "BaseBdev1", 00:30:25.784 "aliases": [ 00:30:25.784 "0cd6b7c8-8939-44fe-8f78-548f616304a9" 00:30:25.784 ], 00:30:25.784 "product_name": "Malloc disk", 00:30:25.784 "block_size": 4128, 00:30:25.784 "num_blocks": 8192, 00:30:25.784 "uuid": "0cd6b7c8-8939-44fe-8f78-548f616304a9", 00:30:25.784 "md_size": 32, 00:30:25.784 
"md_interleave": true, 00:30:25.784 "dif_type": 0, 00:30:25.784 "assigned_rate_limits": { 00:30:25.784 "rw_ios_per_sec": 0, 00:30:25.784 "rw_mbytes_per_sec": 0, 00:30:25.784 "r_mbytes_per_sec": 0, 00:30:25.784 "w_mbytes_per_sec": 0 00:30:25.784 }, 00:30:25.784 "claimed": true, 00:30:25.784 "claim_type": "exclusive_write", 00:30:25.784 "zoned": false, 00:30:25.784 "supported_io_types": { 00:30:25.784 "read": true, 00:30:25.784 "write": true, 00:30:25.784 "unmap": true, 00:30:25.784 "flush": true, 00:30:25.784 "reset": true, 00:30:25.784 "nvme_admin": false, 00:30:25.784 "nvme_io": false, 00:30:25.784 "nvme_io_md": false, 00:30:25.784 "write_zeroes": true, 00:30:25.784 "zcopy": true, 00:30:25.784 "get_zone_info": false, 00:30:25.784 "zone_management": false, 00:30:25.784 "zone_append": false, 00:30:25.784 "compare": false, 00:30:25.784 "compare_and_write": false, 00:30:25.784 "abort": true, 00:30:25.784 "seek_hole": false, 00:30:25.784 "seek_data": false, 00:30:25.784 "copy": true, 00:30:25.784 "nvme_iov_md": false 00:30:25.784 }, 00:30:25.784 "memory_domains": [ 00:30:25.784 { 00:30:25.784 "dma_device_id": "system", 00:30:25.784 "dma_device_type": 1 00:30:25.784 }, 00:30:25.784 { 00:30:25.784 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:30:25.784 "dma_device_type": 2 00:30:25.784 } 00:30:25.784 ], 00:30:25.784 "driver_specific": {} 00:30:25.784 } 00:30:25.784 ] 00:30:25.784 05:58:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@905 -- # return 0 00:30:25.784 05:58:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:30:25.784 05:58:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:30:25.784 05:58:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:30:25.784 05:58:40 
bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:30:25.784 05:58:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:30:25.784 05:58:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:30:25.784 05:58:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:30:25.784 05:58:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:30:25.784 05:58:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:30:25.784 05:58:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:30:25.784 05:58:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:25.784 05:58:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:30:26.043 05:58:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:30:26.043 "name": "Existed_Raid", 00:30:26.043 "uuid": "67306ca2-eb94-4a3e-bab2-38ca74e24242", 00:30:26.043 "strip_size_kb": 0, 00:30:26.043 "state": "configuring", 00:30:26.043 "raid_level": "raid1", 00:30:26.043 "superblock": true, 00:30:26.043 "num_base_bdevs": 2, 00:30:26.043 "num_base_bdevs_discovered": 1, 00:30:26.043 "num_base_bdevs_operational": 2, 00:30:26.043 "base_bdevs_list": [ 00:30:26.043 { 00:30:26.043 "name": "BaseBdev1", 00:30:26.043 "uuid": "0cd6b7c8-8939-44fe-8f78-548f616304a9", 00:30:26.043 "is_configured": true, 00:30:26.043 "data_offset": 256, 00:30:26.043 "data_size": 7936 00:30:26.043 }, 
00:30:26.043 { 00:30:26.043 "name": "BaseBdev2", 00:30:26.043 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:26.043 "is_configured": false, 00:30:26.043 "data_offset": 0, 00:30:26.043 "data_size": 0 00:30:26.043 } 00:30:26.043 ] 00:30:26.043 }' 00:30:26.043 05:58:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:30:26.044 05:58:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:30:26.613 05:58:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:30:26.613 [2024-07-26 05:58:41.518544] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:30:26.613 [2024-07-26 05:58:41.518588] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x24e3350 name Existed_Raid, state configuring 00:30:26.871 05:58:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:30:26.871 [2024-07-26 05:58:41.763223] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:30:26.871 [2024-07-26 05:58:41.764705] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:30:26.871 [2024-07-26 05:58:41.764737] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:30:27.130 05:58:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:30:27.130 05:58:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:30:27.130 05:58:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@266 -- # 
verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:30:27.130 05:58:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:30:27.130 05:58:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:30:27.130 05:58:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:30:27.130 05:58:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:30:27.130 05:58:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:30:27.130 05:58:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:30:27.130 05:58:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:30:27.130 05:58:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:30:27.130 05:58:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:30:27.130 05:58:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:27.130 05:58:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:30:27.130 05:58:42 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:30:27.130 "name": "Existed_Raid", 00:30:27.130 "uuid": "9f796133-5570-4ebc-8a15-77b4dd06a238", 00:30:27.130 "strip_size_kb": 0, 00:30:27.131 "state": "configuring", 00:30:27.131 "raid_level": "raid1", 00:30:27.131 "superblock": true, 00:30:27.131 "num_base_bdevs": 2, 
00:30:27.131 "num_base_bdevs_discovered": 1, 00:30:27.131 "num_base_bdevs_operational": 2, 00:30:27.131 "base_bdevs_list": [ 00:30:27.131 { 00:30:27.131 "name": "BaseBdev1", 00:30:27.131 "uuid": "0cd6b7c8-8939-44fe-8f78-548f616304a9", 00:30:27.131 "is_configured": true, 00:30:27.131 "data_offset": 256, 00:30:27.131 "data_size": 7936 00:30:27.131 }, 00:30:27.131 { 00:30:27.131 "name": "BaseBdev2", 00:30:27.131 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:27.131 "is_configured": false, 00:30:27.131 "data_offset": 0, 00:30:27.131 "data_size": 0 00:30:27.131 } 00:30:27.131 ] 00:30:27.131 }' 00:30:27.131 05:58:42 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:30:27.131 05:58:42 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:30:27.699 05:58:42 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b BaseBdev2 00:30:27.958 [2024-07-26 05:58:42.837668] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:30:27.958 [2024-07-26 05:58:42.837805] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x24e5180 00:30:27.958 [2024-07-26 05:58:42.837819] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:30:27.958 [2024-07-26 05:58:42.837881] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x24e5150 00:30:27.958 [2024-07-26 05:58:42.837955] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x24e5180 00:30:27.958 [2024-07-26 05:58:42.837965] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x24e5180 00:30:27.958 [2024-07-26 05:58:42.838021] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:30:27.958 BaseBdev2 
00:30:27.958 05:58:42 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:30:27.958 05:58:42 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:30:27.958 05:58:42 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:30:27.958 05:58:42 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@899 -- # local i 00:30:27.958 05:58:42 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:30:27.958 05:58:42 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:30:27.958 05:58:42 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:30:28.217 05:58:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:30:28.477 [ 00:30:28.477 { 00:30:28.477 "name": "BaseBdev2", 00:30:28.477 "aliases": [ 00:30:28.477 "28aa3d21-aab9-4ea8-88da-169618df2157" 00:30:28.477 ], 00:30:28.477 "product_name": "Malloc disk", 00:30:28.477 "block_size": 4128, 00:30:28.477 "num_blocks": 8192, 00:30:28.477 "uuid": "28aa3d21-aab9-4ea8-88da-169618df2157", 00:30:28.477 "md_size": 32, 00:30:28.477 "md_interleave": true, 00:30:28.477 "dif_type": 0, 00:30:28.477 "assigned_rate_limits": { 00:30:28.477 "rw_ios_per_sec": 0, 00:30:28.477 "rw_mbytes_per_sec": 0, 00:30:28.477 "r_mbytes_per_sec": 0, 00:30:28.477 "w_mbytes_per_sec": 0 00:30:28.477 }, 00:30:28.477 "claimed": true, 00:30:28.477 "claim_type": "exclusive_write", 00:30:28.477 "zoned": false, 00:30:28.477 "supported_io_types": { 
00:30:28.477 "read": true, 00:30:28.477 "write": true, 00:30:28.477 "unmap": true, 00:30:28.477 "flush": true, 00:30:28.477 "reset": true, 00:30:28.477 "nvme_admin": false, 00:30:28.477 "nvme_io": false, 00:30:28.477 "nvme_io_md": false, 00:30:28.477 "write_zeroes": true, 00:30:28.477 "zcopy": true, 00:30:28.477 "get_zone_info": false, 00:30:28.477 "zone_management": false, 00:30:28.477 "zone_append": false, 00:30:28.477 "compare": false, 00:30:28.477 "compare_and_write": false, 00:30:28.477 "abort": true, 00:30:28.477 "seek_hole": false, 00:30:28.477 "seek_data": false, 00:30:28.477 "copy": true, 00:30:28.477 "nvme_iov_md": false 00:30:28.477 }, 00:30:28.477 "memory_domains": [ 00:30:28.477 { 00:30:28.477 "dma_device_id": "system", 00:30:28.477 "dma_device_type": 1 00:30:28.477 }, 00:30:28.477 { 00:30:28.477 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:30:28.477 "dma_device_type": 2 00:30:28.477 } 00:30:28.477 ], 00:30:28.477 "driver_specific": {} 00:30:28.477 } 00:30:28.477 ] 00:30:28.477 05:58:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@905 -- # return 0 00:30:28.477 05:58:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:30:28.477 05:58:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:30:28.477 05:58:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:30:28.477 05:58:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:30:28.477 05:58:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:30:28.477 05:58:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:30:28.477 05:58:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- 
bdev/bdev_raid.sh@119 -- # local strip_size=0 00:30:28.477 05:58:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:30:28.477 05:58:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:30:28.477 05:58:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:30:28.477 05:58:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:30:28.477 05:58:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:30:28.477 05:58:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:28.477 05:58:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:30:28.736 05:58:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:30:28.736 "name": "Existed_Raid", 00:30:28.736 "uuid": "9f796133-5570-4ebc-8a15-77b4dd06a238", 00:30:28.736 "strip_size_kb": 0, 00:30:28.736 "state": "online", 00:30:28.736 "raid_level": "raid1", 00:30:28.736 "superblock": true, 00:30:28.736 "num_base_bdevs": 2, 00:30:28.736 "num_base_bdevs_discovered": 2, 00:30:28.736 "num_base_bdevs_operational": 2, 00:30:28.736 "base_bdevs_list": [ 00:30:28.736 { 00:30:28.736 "name": "BaseBdev1", 00:30:28.736 "uuid": "0cd6b7c8-8939-44fe-8f78-548f616304a9", 00:30:28.736 "is_configured": true, 00:30:28.736 "data_offset": 256, 00:30:28.736 "data_size": 7936 00:30:28.736 }, 00:30:28.736 { 00:30:28.736 "name": "BaseBdev2", 00:30:28.736 "uuid": "28aa3d21-aab9-4ea8-88da-169618df2157", 00:30:28.736 "is_configured": true, 00:30:28.736 "data_offset": 256, 00:30:28.736 
"data_size": 7936 00:30:28.736 } 00:30:28.736 ] 00:30:28.736 }' 00:30:28.736 05:58:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:30:28.736 05:58:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:30:29.303 05:58:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:30:29.303 05:58:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:30:29.303 05:58:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:30:29.562 05:58:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:30:29.562 05:58:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:30:29.562 05:58:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@198 -- # local name 00:30:29.562 05:58:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:30:29.562 05:58:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:30:29.562 [2024-07-26 05:58:44.438274] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:30:29.563 05:58:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:30:29.563 "name": "Existed_Raid", 00:30:29.563 "aliases": [ 00:30:29.563 "9f796133-5570-4ebc-8a15-77b4dd06a238" 00:30:29.563 ], 00:30:29.563 "product_name": "Raid Volume", 00:30:29.563 "block_size": 4128, 00:30:29.563 "num_blocks": 7936, 00:30:29.563 "uuid": "9f796133-5570-4ebc-8a15-77b4dd06a238", 00:30:29.563 "md_size": 32, 
00:30:29.563 "md_interleave": true, 00:30:29.563 "dif_type": 0, 00:30:29.563 "assigned_rate_limits": { 00:30:29.563 "rw_ios_per_sec": 0, 00:30:29.563 "rw_mbytes_per_sec": 0, 00:30:29.563 "r_mbytes_per_sec": 0, 00:30:29.563 "w_mbytes_per_sec": 0 00:30:29.563 }, 00:30:29.563 "claimed": false, 00:30:29.563 "zoned": false, 00:30:29.563 "supported_io_types": { 00:30:29.563 "read": true, 00:30:29.563 "write": true, 00:30:29.563 "unmap": false, 00:30:29.563 "flush": false, 00:30:29.563 "reset": true, 00:30:29.563 "nvme_admin": false, 00:30:29.563 "nvme_io": false, 00:30:29.563 "nvme_io_md": false, 00:30:29.563 "write_zeroes": true, 00:30:29.563 "zcopy": false, 00:30:29.563 "get_zone_info": false, 00:30:29.563 "zone_management": false, 00:30:29.563 "zone_append": false, 00:30:29.563 "compare": false, 00:30:29.563 "compare_and_write": false, 00:30:29.563 "abort": false, 00:30:29.563 "seek_hole": false, 00:30:29.563 "seek_data": false, 00:30:29.563 "copy": false, 00:30:29.563 "nvme_iov_md": false 00:30:29.563 }, 00:30:29.563 "memory_domains": [ 00:30:29.563 { 00:30:29.563 "dma_device_id": "system", 00:30:29.563 "dma_device_type": 1 00:30:29.563 }, 00:30:29.563 { 00:30:29.563 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:30:29.563 "dma_device_type": 2 00:30:29.563 }, 00:30:29.563 { 00:30:29.563 "dma_device_id": "system", 00:30:29.563 "dma_device_type": 1 00:30:29.563 }, 00:30:29.563 { 00:30:29.563 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:30:29.563 "dma_device_type": 2 00:30:29.563 } 00:30:29.563 ], 00:30:29.563 "driver_specific": { 00:30:29.563 "raid": { 00:30:29.563 "uuid": "9f796133-5570-4ebc-8a15-77b4dd06a238", 00:30:29.563 "strip_size_kb": 0, 00:30:29.563 "state": "online", 00:30:29.563 "raid_level": "raid1", 00:30:29.563 "superblock": true, 00:30:29.563 "num_base_bdevs": 2, 00:30:29.563 "num_base_bdevs_discovered": 2, 00:30:29.563 "num_base_bdevs_operational": 2, 00:30:29.563 "base_bdevs_list": [ 00:30:29.563 { 00:30:29.563 "name": "BaseBdev1", 00:30:29.563 "uuid": 
"0cd6b7c8-8939-44fe-8f78-548f616304a9", 00:30:29.563 "is_configured": true, 00:30:29.563 "data_offset": 256, 00:30:29.563 "data_size": 7936 00:30:29.563 }, 00:30:29.563 { 00:30:29.563 "name": "BaseBdev2", 00:30:29.563 "uuid": "28aa3d21-aab9-4ea8-88da-169618df2157", 00:30:29.563 "is_configured": true, 00:30:29.563 "data_offset": 256, 00:30:29.563 "data_size": 7936 00:30:29.563 } 00:30:29.563 ] 00:30:29.563 } 00:30:29.563 } 00:30:29.563 }' 00:30:29.563 05:58:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:30:29.823 05:58:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:30:29.823 BaseBdev2' 00:30:29.823 05:58:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:30:29.823 05:58:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:30:29.823 05:58:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:30:29.823 05:58:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:30:29.823 "name": "BaseBdev1", 00:30:29.823 "aliases": [ 00:30:29.823 "0cd6b7c8-8939-44fe-8f78-548f616304a9" 00:30:29.823 ], 00:30:29.823 "product_name": "Malloc disk", 00:30:29.823 "block_size": 4128, 00:30:29.823 "num_blocks": 8192, 00:30:29.823 "uuid": "0cd6b7c8-8939-44fe-8f78-548f616304a9", 00:30:29.823 "md_size": 32, 00:30:29.823 "md_interleave": true, 00:30:29.823 "dif_type": 0, 00:30:29.823 "assigned_rate_limits": { 00:30:29.823 "rw_ios_per_sec": 0, 00:30:29.823 "rw_mbytes_per_sec": 0, 00:30:29.823 "r_mbytes_per_sec": 0, 00:30:29.823 "w_mbytes_per_sec": 0 00:30:29.823 }, 00:30:29.823 "claimed": 
true, 00:30:29.823 "claim_type": "exclusive_write", 00:30:29.823 "zoned": false, 00:30:29.823 "supported_io_types": { 00:30:29.823 "read": true, 00:30:29.823 "write": true, 00:30:29.823 "unmap": true, 00:30:29.823 "flush": true, 00:30:29.823 "reset": true, 00:30:29.823 "nvme_admin": false, 00:30:29.823 "nvme_io": false, 00:30:29.823 "nvme_io_md": false, 00:30:29.823 "write_zeroes": true, 00:30:29.823 "zcopy": true, 00:30:29.823 "get_zone_info": false, 00:30:29.823 "zone_management": false, 00:30:29.823 "zone_append": false, 00:30:29.823 "compare": false, 00:30:29.823 "compare_and_write": false, 00:30:29.823 "abort": true, 00:30:29.823 "seek_hole": false, 00:30:29.823 "seek_data": false, 00:30:29.823 "copy": true, 00:30:29.823 "nvme_iov_md": false 00:30:29.823 }, 00:30:29.823 "memory_domains": [ 00:30:29.823 { 00:30:29.823 "dma_device_id": "system", 00:30:29.823 "dma_device_type": 1 00:30:29.823 }, 00:30:29.823 { 00:30:29.823 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:30:29.823 "dma_device_type": 2 00:30:29.823 } 00:30:29.823 ], 00:30:29.823 "driver_specific": {} 00:30:29.823 }' 00:30:29.823 05:58:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:30:29.823 05:58:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:30:30.082 05:58:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:30:30.082 05:58:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:30:30.082 05:58:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:30:30.082 05:58:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:30:30.082 05:58:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:30:30.082 05:58:44 
bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:30:30.082 05:58:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:30:30.082 05:58:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:30:30.082 05:58:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:30:30.341 05:58:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:30:30.341 05:58:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:30:30.341 05:58:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:30:30.341 05:58:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:30:30.341 05:58:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:30:30.341 "name": "BaseBdev2", 00:30:30.341 "aliases": [ 00:30:30.341 "28aa3d21-aab9-4ea8-88da-169618df2157" 00:30:30.341 ], 00:30:30.341 "product_name": "Malloc disk", 00:30:30.341 "block_size": 4128, 00:30:30.341 "num_blocks": 8192, 00:30:30.341 "uuid": "28aa3d21-aab9-4ea8-88da-169618df2157", 00:30:30.341 "md_size": 32, 00:30:30.341 "md_interleave": true, 00:30:30.341 "dif_type": 0, 00:30:30.341 "assigned_rate_limits": { 00:30:30.341 "rw_ios_per_sec": 0, 00:30:30.341 "rw_mbytes_per_sec": 0, 00:30:30.341 "r_mbytes_per_sec": 0, 00:30:30.341 "w_mbytes_per_sec": 0 00:30:30.341 }, 00:30:30.341 "claimed": true, 00:30:30.341 "claim_type": "exclusive_write", 00:30:30.341 "zoned": false, 00:30:30.341 "supported_io_types": { 00:30:30.341 "read": true, 00:30:30.341 "write": true, 00:30:30.341 "unmap": true, 00:30:30.341 
"flush": true, 00:30:30.341 "reset": true, 00:30:30.341 "nvme_admin": false, 00:30:30.341 "nvme_io": false, 00:30:30.341 "nvme_io_md": false, 00:30:30.341 "write_zeroes": true, 00:30:30.341 "zcopy": true, 00:30:30.341 "get_zone_info": false, 00:30:30.341 "zone_management": false, 00:30:30.341 "zone_append": false, 00:30:30.341 "compare": false, 00:30:30.341 "compare_and_write": false, 00:30:30.341 "abort": true, 00:30:30.341 "seek_hole": false, 00:30:30.341 "seek_data": false, 00:30:30.341 "copy": true, 00:30:30.341 "nvme_iov_md": false 00:30:30.341 }, 00:30:30.341 "memory_domains": [ 00:30:30.341 { 00:30:30.341 "dma_device_id": "system", 00:30:30.341 "dma_device_type": 1 00:30:30.341 }, 00:30:30.341 { 00:30:30.341 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:30:30.341 "dma_device_type": 2 00:30:30.341 } 00:30:30.341 ], 00:30:30.341 "driver_specific": {} 00:30:30.341 }' 00:30:30.341 05:58:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:30:30.341 05:58:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:30:30.600 05:58:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:30:30.600 05:58:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:30:30.600 05:58:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:30:30.600 05:58:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:30:30.600 05:58:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:30:30.600 05:58:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:30:30.600 05:58:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:30:30.600 05:58:45 
bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:30:30.860 05:58:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:30:30.860 05:58:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:30:30.860 05:58:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:30:31.119 [2024-07-26 05:58:45.781616] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:30:31.119 05:58:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@275 -- # local expected_state 00:30:31.119 05:58:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:30:31.119 05:58:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@213 -- # case $1 in 00:30:31.119 05:58:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@214 -- # return 0 00:30:31.119 05:58:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:30:31.119 05:58:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 1 00:30:31.119 05:58:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:30:31.119 05:58:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:30:31.119 05:58:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:30:31.119 05:58:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:30:31.119 05:58:45 
bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:30:31.119 05:58:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:30:31.119 05:58:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:30:31.119 05:58:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:30:31.119 05:58:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:30:31.119 05:58:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:31.119 05:58:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:30:31.379 05:58:46 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:30:31.379 "name": "Existed_Raid", 00:30:31.379 "uuid": "9f796133-5570-4ebc-8a15-77b4dd06a238", 00:30:31.379 "strip_size_kb": 0, 00:30:31.379 "state": "online", 00:30:31.379 "raid_level": "raid1", 00:30:31.379 "superblock": true, 00:30:31.379 "num_base_bdevs": 2, 00:30:31.379 "num_base_bdevs_discovered": 1, 00:30:31.379 "num_base_bdevs_operational": 1, 00:30:31.379 "base_bdevs_list": [ 00:30:31.379 { 00:30:31.379 "name": null, 00:30:31.379 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:31.379 "is_configured": false, 00:30:31.379 "data_offset": 256, 00:30:31.379 "data_size": 7936 00:30:31.379 }, 00:30:31.379 { 00:30:31.379 "name": "BaseBdev2", 00:30:31.379 "uuid": "28aa3d21-aab9-4ea8-88da-169618df2157", 00:30:31.379 "is_configured": true, 00:30:31.379 "data_offset": 256, 00:30:31.379 "data_size": 7936 00:30:31.379 } 00:30:31.379 ] 00:30:31.379 }' 00:30:31.379 
05:58:46 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:30:31.379 05:58:46 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:30:31.948 05:58:46 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:30:31.948 05:58:46 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:30:31.948 05:58:46 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:31.948 05:58:46 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:30:31.948 05:58:46 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:30:31.948 05:58:46 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:30:31.948 05:58:46 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:30:32.209 [2024-07-26 05:58:47.075020] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:30:32.209 [2024-07-26 05:58:47.075115] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:30:32.209 [2024-07-26 05:58:47.086282] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:30:32.209 [2024-07-26 05:58:47.086319] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:30:32.209 [2024-07-26 05:58:47.086331] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x24e5180 name Existed_Raid, state offline 00:30:32.209 05:58:47 
bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:30:32.209 05:58:47 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:30:32.468 05:58:47 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:30:32.468 05:58:47 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:32.468 05:58:47 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:30:32.468 05:58:47 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:30:32.468 05:58:47 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:30:32.468 05:58:47 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@341 -- # killprocess 1282567 00:30:32.468 05:58:47 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@948 -- # '[' -z 1282567 ']' 00:30:32.468 05:58:47 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@952 -- # kill -0 1282567 00:30:32.468 05:58:47 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@953 -- # uname 00:30:32.468 05:58:47 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:30:32.468 05:58:47 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1282567 00:30:32.727 05:58:47 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:30:32.727 05:58:47 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 
00:30:32.727 05:58:47 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1282567' 00:30:32.727 killing process with pid 1282567 00:30:32.727 05:58:47 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@967 -- # kill 1282567 00:30:32.727 [2024-07-26 05:58:47.414076] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:30:32.727 05:58:47 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@972 -- # wait 1282567 00:30:32.727 [2024-07-26 05:58:47.415057] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:30:32.986 05:58:47 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@343 -- # return 0 00:30:32.986 00:30:32.986 real 0m10.403s 00:30:32.986 user 0m18.449s 00:30:32.986 sys 0m1.984s 00:30:32.986 05:58:47 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@1124 -- # xtrace_disable 00:30:32.986 05:58:47 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:30:32.986 ************************************ 00:30:32.986 END TEST raid_state_function_test_sb_md_interleaved 00:30:32.986 ************************************ 00:30:32.986 05:58:47 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:30:32.986 05:58:47 bdev_raid -- bdev/bdev_raid.sh@913 -- # run_test raid_superblock_test_md_interleaved raid_superblock_test raid1 2 00:30:32.986 05:58:47 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:30:32.986 05:58:47 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:30:32.986 05:58:47 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:30:32.986 ************************************ 00:30:32.986 START TEST raid_superblock_test_md_interleaved 00:30:32.986 ************************************ 00:30:32.986 05:58:47 bdev_raid.raid_superblock_test_md_interleaved -- 
common/autotest_common.sh@1123 -- # raid_superblock_test raid1 2 00:30:32.986 05:58:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@392 -- # local raid_level=raid1 00:30:32.986 05:58:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=2 00:30:32.986 05:58:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:30:32.986 05:58:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:30:32.986 05:58:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:30:32.986 05:58:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:30:32.986 05:58:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:30:32.986 05:58:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:30:32.986 05:58:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:30:32.986 05:58:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@398 -- # local strip_size 00:30:32.986 05:58:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:30:32.987 05:58:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:30:32.987 05:58:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:30:32.987 05:58:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@403 -- # '[' raid1 '!=' raid1 ']' 00:30:32.987 05:58:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@407 -- # strip_size=0 00:30:32.987 05:58:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@411 -- # raid_pid=1284030 00:30:32.987 05:58:47 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@412 -- # waitforlisten 1284030 /var/tmp/spdk-raid.sock 00:30:32.987 05:58:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:30:32.987 05:58:47 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@829 -- # '[' -z 1284030 ']' 00:30:32.987 05:58:47 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:30:32.987 05:58:47 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@834 -- # local max_retries=100 00:30:32.987 05:58:47 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:30:32.987 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:30:32.987 05:58:47 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@838 -- # xtrace_disable 00:30:32.987 05:58:47 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:30:32.987 [2024-07-26 05:58:47.796487] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 
00:30:32.987 [2024-07-26 05:58:47.796552] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1284030 ] 00:30:33.245 [2024-07-26 05:58:47.929490] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:30:33.245 [2024-07-26 05:58:48.039071] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:30:33.245 [2024-07-26 05:58:48.105791] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:30:33.245 [2024-07-26 05:58:48.105820] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:30:34.182 05:58:48 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:30:34.182 05:58:48 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@862 -- # return 0 00:30:34.182 05:58:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:30:34.182 05:58:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:30:34.182 05:58:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:30:34.182 05:58:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:30:34.182 05:58:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:30:34.182 05:58:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:30:34.182 05:58:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:30:34.182 05:58:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 
00:30:34.183 05:58:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b malloc1 00:30:34.183 malloc1 00:30:34.183 05:58:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:30:34.442 [2024-07-26 05:58:49.223279] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:30:34.442 [2024-07-26 05:58:49.223329] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:30:34.442 [2024-07-26 05:58:49.223351] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x15c14e0 00:30:34.442 [2024-07-26 05:58:49.223364] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:30:34.442 [2024-07-26 05:58:49.224939] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:30:34.442 [2024-07-26 05:58:49.224968] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:30:34.442 pt1 00:30:34.442 05:58:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:30:34.442 05:58:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:30:34.442 05:58:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:30:34.442 05:58:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:30:34.442 05:58:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:30:34.442 05:58:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@420 -- # 
base_bdevs_malloc+=($bdev_malloc) 00:30:34.442 05:58:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:30:34.442 05:58:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:30:34.442 05:58:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b malloc2 00:30:34.700 malloc2 00:30:34.700 05:58:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:30:34.959 [2024-07-26 05:58:49.722938] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:30:34.959 [2024-07-26 05:58:49.722984] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:30:34.959 [2024-07-26 05:58:49.723005] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x15a6570 00:30:34.959 [2024-07-26 05:58:49.723017] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:30:34.959 [2024-07-26 05:58:49.724511] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:30:34.959 [2024-07-26 05:58:49.724542] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:30:34.959 pt2 00:30:34.959 05:58:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:30:34.959 05:58:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:30:34.959 05:58:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 
'pt1 pt2' -n raid_bdev1 -s 00:30:35.218 [2024-07-26 05:58:49.967614] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:30:35.218 [2024-07-26 05:58:49.969129] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:30:35.218 [2024-07-26 05:58:49.969282] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x15a7f20 00:30:35.218 [2024-07-26 05:58:49.969296] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:30:35.218 [2024-07-26 05:58:49.969364] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1424050 00:30:35.218 [2024-07-26 05:58:49.969449] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x15a7f20 00:30:35.218 [2024-07-26 05:58:49.969460] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x15a7f20 00:30:35.218 [2024-07-26 05:58:49.969520] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:30:35.218 05:58:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:30:35.218 05:58:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:30:35.218 05:58:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:30:35.218 05:58:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:30:35.218 05:58:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:30:35.218 05:58:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:30:35.218 05:58:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:30:35.218 05:58:49 bdev_raid.raid_superblock_test_md_interleaved -- 
bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:30:35.218 05:58:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:30:35.218 05:58:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:30:35.219 05:58:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:35.219 05:58:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:35.477 05:58:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:30:35.477 "name": "raid_bdev1", 00:30:35.477 "uuid": "8773fb7d-d689-4a28-bc6d-f46314e70f2c", 00:30:35.477 "strip_size_kb": 0, 00:30:35.477 "state": "online", 00:30:35.477 "raid_level": "raid1", 00:30:35.477 "superblock": true, 00:30:35.477 "num_base_bdevs": 2, 00:30:35.477 "num_base_bdevs_discovered": 2, 00:30:35.477 "num_base_bdevs_operational": 2, 00:30:35.477 "base_bdevs_list": [ 00:30:35.477 { 00:30:35.477 "name": "pt1", 00:30:35.477 "uuid": "00000000-0000-0000-0000-000000000001", 00:30:35.477 "is_configured": true, 00:30:35.477 "data_offset": 256, 00:30:35.477 "data_size": 7936 00:30:35.477 }, 00:30:35.477 { 00:30:35.477 "name": "pt2", 00:30:35.477 "uuid": "00000000-0000-0000-0000-000000000002", 00:30:35.477 "is_configured": true, 00:30:35.477 "data_offset": 256, 00:30:35.477 "data_size": 7936 00:30:35.477 } 00:30:35.477 ] 00:30:35.477 }' 00:30:35.477 05:58:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:30:35.477 05:58:50 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:30:36.043 05:58:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:30:36.043 05:58:50 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:30:36.043 05:58:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:30:36.043 05:58:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:30:36.043 05:58:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:30:36.043 05:58:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@198 -- # local name 00:30:36.043 05:58:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:30:36.043 05:58:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:30:36.302 [2024-07-26 05:58:51.070773] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:30:36.302 05:58:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:30:36.302 "name": "raid_bdev1", 00:30:36.302 "aliases": [ 00:30:36.302 "8773fb7d-d689-4a28-bc6d-f46314e70f2c" 00:30:36.302 ], 00:30:36.302 "product_name": "Raid Volume", 00:30:36.302 "block_size": 4128, 00:30:36.302 "num_blocks": 7936, 00:30:36.302 "uuid": "8773fb7d-d689-4a28-bc6d-f46314e70f2c", 00:30:36.302 "md_size": 32, 00:30:36.302 "md_interleave": true, 00:30:36.302 "dif_type": 0, 00:30:36.302 "assigned_rate_limits": { 00:30:36.302 "rw_ios_per_sec": 0, 00:30:36.302 "rw_mbytes_per_sec": 0, 00:30:36.302 "r_mbytes_per_sec": 0, 00:30:36.302 "w_mbytes_per_sec": 0 00:30:36.302 }, 00:30:36.302 "claimed": false, 00:30:36.302 "zoned": false, 00:30:36.302 "supported_io_types": { 00:30:36.302 "read": true, 00:30:36.302 "write": true, 00:30:36.302 "unmap": false, 00:30:36.302 "flush": false, 00:30:36.302 "reset": true, 00:30:36.302 "nvme_admin": false, 
00:30:36.302 "nvme_io": false, 00:30:36.302 "nvme_io_md": false, 00:30:36.302 "write_zeroes": true, 00:30:36.302 "zcopy": false, 00:30:36.302 "get_zone_info": false, 00:30:36.302 "zone_management": false, 00:30:36.302 "zone_append": false, 00:30:36.302 "compare": false, 00:30:36.302 "compare_and_write": false, 00:30:36.302 "abort": false, 00:30:36.302 "seek_hole": false, 00:30:36.302 "seek_data": false, 00:30:36.302 "copy": false, 00:30:36.302 "nvme_iov_md": false 00:30:36.302 }, 00:30:36.302 "memory_domains": [ 00:30:36.302 { 00:30:36.302 "dma_device_id": "system", 00:30:36.302 "dma_device_type": 1 00:30:36.302 }, 00:30:36.302 { 00:30:36.302 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:30:36.302 "dma_device_type": 2 00:30:36.302 }, 00:30:36.302 { 00:30:36.302 "dma_device_id": "system", 00:30:36.302 "dma_device_type": 1 00:30:36.302 }, 00:30:36.302 { 00:30:36.302 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:30:36.302 "dma_device_type": 2 00:30:36.302 } 00:30:36.302 ], 00:30:36.302 "driver_specific": { 00:30:36.302 "raid": { 00:30:36.302 "uuid": "8773fb7d-d689-4a28-bc6d-f46314e70f2c", 00:30:36.302 "strip_size_kb": 0, 00:30:36.302 "state": "online", 00:30:36.302 "raid_level": "raid1", 00:30:36.302 "superblock": true, 00:30:36.302 "num_base_bdevs": 2, 00:30:36.302 "num_base_bdevs_discovered": 2, 00:30:36.302 "num_base_bdevs_operational": 2, 00:30:36.302 "base_bdevs_list": [ 00:30:36.302 { 00:30:36.302 "name": "pt1", 00:30:36.302 "uuid": "00000000-0000-0000-0000-000000000001", 00:30:36.302 "is_configured": true, 00:30:36.302 "data_offset": 256, 00:30:36.302 "data_size": 7936 00:30:36.302 }, 00:30:36.302 { 00:30:36.302 "name": "pt2", 00:30:36.302 "uuid": "00000000-0000-0000-0000-000000000002", 00:30:36.302 "is_configured": true, 00:30:36.302 "data_offset": 256, 00:30:36.302 "data_size": 7936 00:30:36.302 } 00:30:36.302 ] 00:30:36.302 } 00:30:36.302 } 00:30:36.302 }' 00:30:36.302 05:58:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@201 -- # jq -r 
'.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:30:36.302 05:58:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:30:36.302 pt2' 00:30:36.302 05:58:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:30:36.302 05:58:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:30:36.302 05:58:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:30:36.562 05:58:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:30:36.562 "name": "pt1", 00:30:36.562 "aliases": [ 00:30:36.562 "00000000-0000-0000-0000-000000000001" 00:30:36.562 ], 00:30:36.562 "product_name": "passthru", 00:30:36.562 "block_size": 4128, 00:30:36.562 "num_blocks": 8192, 00:30:36.562 "uuid": "00000000-0000-0000-0000-000000000001", 00:30:36.562 "md_size": 32, 00:30:36.562 "md_interleave": true, 00:30:36.562 "dif_type": 0, 00:30:36.562 "assigned_rate_limits": { 00:30:36.562 "rw_ios_per_sec": 0, 00:30:36.562 "rw_mbytes_per_sec": 0, 00:30:36.562 "r_mbytes_per_sec": 0, 00:30:36.562 "w_mbytes_per_sec": 0 00:30:36.562 }, 00:30:36.562 "claimed": true, 00:30:36.562 "claim_type": "exclusive_write", 00:30:36.562 "zoned": false, 00:30:36.562 "supported_io_types": { 00:30:36.562 "read": true, 00:30:36.562 "write": true, 00:30:36.562 "unmap": true, 00:30:36.562 "flush": true, 00:30:36.562 "reset": true, 00:30:36.562 "nvme_admin": false, 00:30:36.562 "nvme_io": false, 00:30:36.562 "nvme_io_md": false, 00:30:36.562 "write_zeroes": true, 00:30:36.562 "zcopy": true, 00:30:36.562 "get_zone_info": false, 00:30:36.562 "zone_management": false, 00:30:36.562 "zone_append": false, 00:30:36.562 "compare": false, 00:30:36.562 "compare_and_write": false, 00:30:36.562 
"abort": true, 00:30:36.562 "seek_hole": false, 00:30:36.562 "seek_data": false, 00:30:36.562 "copy": true, 00:30:36.562 "nvme_iov_md": false 00:30:36.562 }, 00:30:36.562 "memory_domains": [ 00:30:36.562 { 00:30:36.562 "dma_device_id": "system", 00:30:36.562 "dma_device_type": 1 00:30:36.562 }, 00:30:36.562 { 00:30:36.562 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:30:36.562 "dma_device_type": 2 00:30:36.562 } 00:30:36.562 ], 00:30:36.562 "driver_specific": { 00:30:36.562 "passthru": { 00:30:36.562 "name": "pt1", 00:30:36.562 "base_bdev_name": "malloc1" 00:30:36.562 } 00:30:36.562 } 00:30:36.562 }' 00:30:36.562 05:58:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:30:36.562 05:58:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:30:36.822 05:58:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:30:36.822 05:58:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:30:36.822 05:58:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:30:36.822 05:58:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:30:36.822 05:58:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:30:36.822 05:58:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:30:36.822 05:58:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:30:36.822 05:58:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:30:36.822 05:58:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:30:37.081 05:58:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:30:37.081 05:58:51 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:30:37.081 05:58:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:30:37.081 05:58:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:30:37.081 05:58:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:30:37.081 "name": "pt2", 00:30:37.081 "aliases": [ 00:30:37.081 "00000000-0000-0000-0000-000000000002" 00:30:37.081 ], 00:30:37.081 "product_name": "passthru", 00:30:37.081 "block_size": 4128, 00:30:37.081 "num_blocks": 8192, 00:30:37.081 "uuid": "00000000-0000-0000-0000-000000000002", 00:30:37.081 "md_size": 32, 00:30:37.081 "md_interleave": true, 00:30:37.081 "dif_type": 0, 00:30:37.081 "assigned_rate_limits": { 00:30:37.081 "rw_ios_per_sec": 0, 00:30:37.081 "rw_mbytes_per_sec": 0, 00:30:37.081 "r_mbytes_per_sec": 0, 00:30:37.081 "w_mbytes_per_sec": 0 00:30:37.081 }, 00:30:37.081 "claimed": true, 00:30:37.081 "claim_type": "exclusive_write", 00:30:37.081 "zoned": false, 00:30:37.081 "supported_io_types": { 00:30:37.081 "read": true, 00:30:37.081 "write": true, 00:30:37.081 "unmap": true, 00:30:37.081 "flush": true, 00:30:37.081 "reset": true, 00:30:37.081 "nvme_admin": false, 00:30:37.081 "nvme_io": false, 00:30:37.081 "nvme_io_md": false, 00:30:37.081 "write_zeroes": true, 00:30:37.081 "zcopy": true, 00:30:37.081 "get_zone_info": false, 00:30:37.081 "zone_management": false, 00:30:37.081 "zone_append": false, 00:30:37.081 "compare": false, 00:30:37.081 "compare_and_write": false, 00:30:37.081 "abort": true, 00:30:37.081 "seek_hole": false, 00:30:37.081 "seek_data": false, 00:30:37.081 "copy": true, 00:30:37.081 "nvme_iov_md": false 00:30:37.081 }, 00:30:37.081 "memory_domains": [ 00:30:37.081 { 00:30:37.081 "dma_device_id": 
"system", 00:30:37.081 "dma_device_type": 1 00:30:37.081 }, 00:30:37.081 { 00:30:37.081 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:30:37.081 "dma_device_type": 2 00:30:37.081 } 00:30:37.081 ], 00:30:37.081 "driver_specific": { 00:30:37.081 "passthru": { 00:30:37.081 "name": "pt2", 00:30:37.081 "base_bdev_name": "malloc2" 00:30:37.081 } 00:30:37.081 } 00:30:37.081 }' 00:30:37.081 05:58:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:30:37.081 05:58:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:30:37.340 05:58:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:30:37.340 05:58:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:30:37.340 05:58:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:30:37.340 05:58:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:30:37.340 05:58:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:30:37.340 05:58:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:30:37.340 05:58:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:30:37.340 05:58:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:30:37.340 05:58:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:30:37.599 05:58:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:30:37.599 05:58:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:30:37.599 05:58:52 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:30:37.599 [2024-07-26 05:58:52.498533] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:30:37.857 05:58:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=8773fb7d-d689-4a28-bc6d-f46314e70f2c 00:30:37.858 05:58:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@435 -- # '[' -z 8773fb7d-d689-4a28-bc6d-f46314e70f2c ']' 00:30:37.858 05:58:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:30:37.858 [2024-07-26 05:58:52.742923] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:30:37.858 [2024-07-26 05:58:52.742947] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:30:37.858 [2024-07-26 05:58:52.743002] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:30:37.858 [2024-07-26 05:58:52.743057] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:30:37.858 [2024-07-26 05:58:52.743069] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x15a7f20 name raid_bdev1, state offline 00:30:38.116 05:58:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:38.116 05:58:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:30:38.116 05:58:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:30:38.117 05:58:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:30:38.117 05:58:53 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:30:38.117 05:58:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:30:38.376 05:58:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:30:38.376 05:58:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:30:38.635 05:58:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:30:38.635 05:58:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:30:38.894 05:58:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:30:38.894 05:58:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:30:38.894 05:58:53 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@648 -- # local es=0 00:30:38.894 05:58:53 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:30:38.894 05:58:53 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:30:38.894 05:58:53 
bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:30:38.894 05:58:53 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:30:38.894 05:58:53 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:30:38.894 05:58:53 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:30:38.894 05:58:53 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:30:38.894 05:58:53 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:30:38.894 05:58:53 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:30:38.894 05:58:53 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:30:39.152 [2024-07-26 05:58:53.966091] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:30:39.152 [2024-07-26 05:58:53.967447] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:30:39.152 [2024-07-26 05:58:53.967501] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:30:39.152 [2024-07-26 05:58:53.967540] bdev_raid.c:3196:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:30:39.152 [2024-07-26 05:58:53.967558] 
bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:30:39.152 [2024-07-26 05:58:53.967568] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x15b2260 name raid_bdev1, state configuring 00:30:39.152 request: 00:30:39.152 { 00:30:39.152 "name": "raid_bdev1", 00:30:39.152 "raid_level": "raid1", 00:30:39.152 "base_bdevs": [ 00:30:39.152 "malloc1", 00:30:39.152 "malloc2" 00:30:39.152 ], 00:30:39.152 "superblock": false, 00:30:39.152 "method": "bdev_raid_create", 00:30:39.152 "req_id": 1 00:30:39.152 } 00:30:39.152 Got JSON-RPC error response 00:30:39.152 response: 00:30:39.152 { 00:30:39.152 "code": -17, 00:30:39.152 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:30:39.152 } 00:30:39.152 05:58:53 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@651 -- # es=1 00:30:39.152 05:58:53 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:30:39.152 05:58:53 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:30:39.152 05:58:53 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:30:39.152 05:58:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:39.152 05:58:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:30:39.410 05:58:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:30:39.410 05:58:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:30:39.410 05:58:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 
00000000-0000-0000-0000-000000000001 00:30:39.668 [2024-07-26 05:58:54.459329] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:30:39.668 [2024-07-26 05:58:54.459370] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:30:39.668 [2024-07-26 05:58:54.459386] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x15a9000 00:30:39.668 [2024-07-26 05:58:54.459399] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:30:39.668 [2024-07-26 05:58:54.460790] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:30:39.668 [2024-07-26 05:58:54.460815] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:30:39.668 [2024-07-26 05:58:54.460858] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:30:39.668 [2024-07-26 05:58:54.460881] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:30:39.668 pt1 00:30:39.668 05:58:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:30:39.668 05:58:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:30:39.668 05:58:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:30:39.668 05:58:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:30:39.668 05:58:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:30:39.668 05:58:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:30:39.668 05:58:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:30:39.668 05:58:54 bdev_raid.raid_superblock_test_md_interleaved -- 
bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:30:39.668 05:58:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:30:39.668 05:58:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:30:39.668 05:58:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:39.668 05:58:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:39.926 05:58:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:30:39.926 "name": "raid_bdev1", 00:30:39.926 "uuid": "8773fb7d-d689-4a28-bc6d-f46314e70f2c", 00:30:39.926 "strip_size_kb": 0, 00:30:39.926 "state": "configuring", 00:30:39.926 "raid_level": "raid1", 00:30:39.926 "superblock": true, 00:30:39.926 "num_base_bdevs": 2, 00:30:39.926 "num_base_bdevs_discovered": 1, 00:30:39.926 "num_base_bdevs_operational": 2, 00:30:39.926 "base_bdevs_list": [ 00:30:39.926 { 00:30:39.926 "name": "pt1", 00:30:39.926 "uuid": "00000000-0000-0000-0000-000000000001", 00:30:39.926 "is_configured": true, 00:30:39.926 "data_offset": 256, 00:30:39.926 "data_size": 7936 00:30:39.926 }, 00:30:39.926 { 00:30:39.926 "name": null, 00:30:39.926 "uuid": "00000000-0000-0000-0000-000000000002", 00:30:39.926 "is_configured": false, 00:30:39.926 "data_offset": 256, 00:30:39.926 "data_size": 7936 00:30:39.926 } 00:30:39.926 ] 00:30:39.926 }' 00:30:39.926 05:58:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:30:39.926 05:58:54 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:30:40.494 05:58:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@469 -- # '[' 2 -gt 2 ']' 00:30:40.494 05:58:55 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:30:40.494 05:58:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:30:40.494 05:58:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:30:40.808 [2024-07-26 05:58:55.502162] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:30:40.808 [2024-07-26 05:58:55.502217] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:30:40.808 [2024-07-26 05:58:55.502237] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x15ab270 00:30:40.808 [2024-07-26 05:58:55.502250] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:30:40.808 [2024-07-26 05:58:55.502413] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:30:40.808 [2024-07-26 05:58:55.502429] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:30:40.808 [2024-07-26 05:58:55.502469] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:30:40.808 [2024-07-26 05:58:55.502486] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:30:40.808 [2024-07-26 05:58:55.502565] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x1424c10 00:30:40.808 [2024-07-26 05:58:55.502575] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:30:40.808 [2024-07-26 05:58:55.502628] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x15a6d40 00:30:40.808 [2024-07-26 05:58:55.502707] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1424c10 00:30:40.808 [2024-07-26 05:58:55.502717] 
bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1424c10 00:30:40.808 [2024-07-26 05:58:55.502773] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:30:40.808 pt2 00:30:40.808 05:58:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:30:40.808 05:58:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:30:40.808 05:58:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:30:40.808 05:58:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:30:40.808 05:58:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:30:40.808 05:58:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:30:40.808 05:58:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:30:40.808 05:58:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:30:40.808 05:58:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:30:40.808 05:58:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:30:40.808 05:58:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:30:40.808 05:58:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:30:40.808 05:58:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:40.808 05:58:55 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:40.808 05:58:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:30:40.808 "name": "raid_bdev1", 00:30:40.808 "uuid": "8773fb7d-d689-4a28-bc6d-f46314e70f2c", 00:30:40.808 "strip_size_kb": 0, 00:30:40.808 "state": "online", 00:30:40.808 "raid_level": "raid1", 00:30:40.808 "superblock": true, 00:30:40.808 "num_base_bdevs": 2, 00:30:40.808 "num_base_bdevs_discovered": 2, 00:30:40.808 "num_base_bdevs_operational": 2, 00:30:40.808 "base_bdevs_list": [ 00:30:40.808 { 00:30:40.808 "name": "pt1", 00:30:40.808 "uuid": "00000000-0000-0000-0000-000000000001", 00:30:40.808 "is_configured": true, 00:30:40.808 "data_offset": 256, 00:30:40.808 "data_size": 7936 00:30:40.808 }, 00:30:40.808 { 00:30:40.808 "name": "pt2", 00:30:40.808 "uuid": "00000000-0000-0000-0000-000000000002", 00:30:40.808 "is_configured": true, 00:30:40.808 "data_offset": 256, 00:30:40.808 "data_size": 7936 00:30:40.808 } 00:30:40.808 ] 00:30:40.808 }' 00:30:40.808 05:58:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:30:40.808 05:58:55 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:30:41.745 05:58:56 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:30:41.745 05:58:56 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:30:41.745 05:58:56 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:30:41.745 05:58:56 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:30:41.745 05:58:56 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:30:41.745 05:58:56 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@198 -- # local name 00:30:41.745 05:58:56 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:30:41.745 05:58:56 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:30:41.745 [2024-07-26 05:58:56.525120] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:30:41.745 05:58:56 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:30:41.745 "name": "raid_bdev1", 00:30:41.745 "aliases": [ 00:30:41.745 "8773fb7d-d689-4a28-bc6d-f46314e70f2c" 00:30:41.745 ], 00:30:41.746 "product_name": "Raid Volume", 00:30:41.746 "block_size": 4128, 00:30:41.746 "num_blocks": 7936, 00:30:41.746 "uuid": "8773fb7d-d689-4a28-bc6d-f46314e70f2c", 00:30:41.746 "md_size": 32, 00:30:41.746 "md_interleave": true, 00:30:41.746 "dif_type": 0, 00:30:41.746 "assigned_rate_limits": { 00:30:41.746 "rw_ios_per_sec": 0, 00:30:41.746 "rw_mbytes_per_sec": 0, 00:30:41.746 "r_mbytes_per_sec": 0, 00:30:41.746 "w_mbytes_per_sec": 0 00:30:41.746 }, 00:30:41.746 "claimed": false, 00:30:41.746 "zoned": false, 00:30:41.746 "supported_io_types": { 00:30:41.746 "read": true, 00:30:41.746 "write": true, 00:30:41.746 "unmap": false, 00:30:41.746 "flush": false, 00:30:41.746 "reset": true, 00:30:41.746 "nvme_admin": false, 00:30:41.746 "nvme_io": false, 00:30:41.746 "nvme_io_md": false, 00:30:41.746 "write_zeroes": true, 00:30:41.746 "zcopy": false, 00:30:41.746 "get_zone_info": false, 00:30:41.746 "zone_management": false, 00:30:41.746 "zone_append": false, 00:30:41.746 "compare": false, 00:30:41.746 "compare_and_write": false, 00:30:41.746 "abort": false, 00:30:41.746 "seek_hole": false, 00:30:41.746 "seek_data": false, 00:30:41.746 "copy": false, 00:30:41.746 "nvme_iov_md": false 00:30:41.746 }, 
00:30:41.746 "memory_domains": [ 00:30:41.746 { 00:30:41.746 "dma_device_id": "system", 00:30:41.746 "dma_device_type": 1 00:30:41.746 }, 00:30:41.746 { 00:30:41.746 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:30:41.746 "dma_device_type": 2 00:30:41.746 }, 00:30:41.746 { 00:30:41.746 "dma_device_id": "system", 00:30:41.746 "dma_device_type": 1 00:30:41.746 }, 00:30:41.746 { 00:30:41.746 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:30:41.746 "dma_device_type": 2 00:30:41.746 } 00:30:41.746 ], 00:30:41.746 "driver_specific": { 00:30:41.746 "raid": { 00:30:41.746 "uuid": "8773fb7d-d689-4a28-bc6d-f46314e70f2c", 00:30:41.746 "strip_size_kb": 0, 00:30:41.746 "state": "online", 00:30:41.746 "raid_level": "raid1", 00:30:41.746 "superblock": true, 00:30:41.746 "num_base_bdevs": 2, 00:30:41.746 "num_base_bdevs_discovered": 2, 00:30:41.746 "num_base_bdevs_operational": 2, 00:30:41.746 "base_bdevs_list": [ 00:30:41.746 { 00:30:41.746 "name": "pt1", 00:30:41.746 "uuid": "00000000-0000-0000-0000-000000000001", 00:30:41.746 "is_configured": true, 00:30:41.746 "data_offset": 256, 00:30:41.746 "data_size": 7936 00:30:41.746 }, 00:30:41.746 { 00:30:41.746 "name": "pt2", 00:30:41.746 "uuid": "00000000-0000-0000-0000-000000000002", 00:30:41.746 "is_configured": true, 00:30:41.746 "data_offset": 256, 00:30:41.746 "data_size": 7936 00:30:41.746 } 00:30:41.746 ] 00:30:41.746 } 00:30:41.746 } 00:30:41.746 }' 00:30:41.746 05:58:56 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:30:41.746 05:58:56 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:30:41.746 pt2' 00:30:41.746 05:58:56 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:30:41.746 05:58:56 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:30:41.746 05:58:56 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:30:42.006 05:58:56 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:30:42.006 "name": "pt1", 00:30:42.006 "aliases": [ 00:30:42.006 "00000000-0000-0000-0000-000000000001" 00:30:42.006 ], 00:30:42.006 "product_name": "passthru", 00:30:42.006 "block_size": 4128, 00:30:42.006 "num_blocks": 8192, 00:30:42.006 "uuid": "00000000-0000-0000-0000-000000000001", 00:30:42.006 "md_size": 32, 00:30:42.006 "md_interleave": true, 00:30:42.006 "dif_type": 0, 00:30:42.006 "assigned_rate_limits": { 00:30:42.006 "rw_ios_per_sec": 0, 00:30:42.006 "rw_mbytes_per_sec": 0, 00:30:42.006 "r_mbytes_per_sec": 0, 00:30:42.006 "w_mbytes_per_sec": 0 00:30:42.006 }, 00:30:42.006 "claimed": true, 00:30:42.006 "claim_type": "exclusive_write", 00:30:42.006 "zoned": false, 00:30:42.006 "supported_io_types": { 00:30:42.006 "read": true, 00:30:42.006 "write": true, 00:30:42.006 "unmap": true, 00:30:42.006 "flush": true, 00:30:42.006 "reset": true, 00:30:42.006 "nvme_admin": false, 00:30:42.006 "nvme_io": false, 00:30:42.006 "nvme_io_md": false, 00:30:42.006 "write_zeroes": true, 00:30:42.006 "zcopy": true, 00:30:42.006 "get_zone_info": false, 00:30:42.006 "zone_management": false, 00:30:42.006 "zone_append": false, 00:30:42.006 "compare": false, 00:30:42.006 "compare_and_write": false, 00:30:42.006 "abort": true, 00:30:42.006 "seek_hole": false, 00:30:42.006 "seek_data": false, 00:30:42.006 "copy": true, 00:30:42.006 "nvme_iov_md": false 00:30:42.006 }, 00:30:42.006 "memory_domains": [ 00:30:42.006 { 00:30:42.006 "dma_device_id": "system", 00:30:42.006 "dma_device_type": 1 00:30:42.006 }, 00:30:42.006 { 00:30:42.006 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:30:42.006 "dma_device_type": 2 00:30:42.006 } 00:30:42.006 ], 00:30:42.006 
"driver_specific": { 00:30:42.006 "passthru": { 00:30:42.006 "name": "pt1", 00:30:42.006 "base_bdev_name": "malloc1" 00:30:42.006 } 00:30:42.006 } 00:30:42.006 }' 00:30:42.006 05:58:56 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:30:42.006 05:58:56 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:30:42.265 05:58:56 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:30:42.265 05:58:56 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:30:42.265 05:58:56 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:30:42.265 05:58:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:30:42.265 05:58:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:30:42.265 05:58:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:30:42.265 05:58:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:30:42.265 05:58:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:30:42.265 05:58:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:30:42.524 05:58:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:30:42.524 05:58:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:30:42.524 05:58:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:30:42.524 05:58:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:30:42.783 05:58:57 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:30:42.783 "name": "pt2", 00:30:42.783 "aliases": [ 00:30:42.783 "00000000-0000-0000-0000-000000000002" 00:30:42.783 ], 00:30:42.783 "product_name": "passthru", 00:30:42.783 "block_size": 4128, 00:30:42.783 "num_blocks": 8192, 00:30:42.783 "uuid": "00000000-0000-0000-0000-000000000002", 00:30:42.783 "md_size": 32, 00:30:42.783 "md_interleave": true, 00:30:42.783 "dif_type": 0, 00:30:42.783 "assigned_rate_limits": { 00:30:42.783 "rw_ios_per_sec": 0, 00:30:42.783 "rw_mbytes_per_sec": 0, 00:30:42.783 "r_mbytes_per_sec": 0, 00:30:42.783 "w_mbytes_per_sec": 0 00:30:42.783 }, 00:30:42.783 "claimed": true, 00:30:42.783 "claim_type": "exclusive_write", 00:30:42.783 "zoned": false, 00:30:42.783 "supported_io_types": { 00:30:42.783 "read": true, 00:30:42.783 "write": true, 00:30:42.783 "unmap": true, 00:30:42.783 "flush": true, 00:30:42.783 "reset": true, 00:30:42.783 "nvme_admin": false, 00:30:42.783 "nvme_io": false, 00:30:42.783 "nvme_io_md": false, 00:30:42.783 "write_zeroes": true, 00:30:42.783 "zcopy": true, 00:30:42.783 "get_zone_info": false, 00:30:42.783 "zone_management": false, 00:30:42.783 "zone_append": false, 00:30:42.783 "compare": false, 00:30:42.783 "compare_and_write": false, 00:30:42.783 "abort": true, 00:30:42.783 "seek_hole": false, 00:30:42.783 "seek_data": false, 00:30:42.783 "copy": true, 00:30:42.783 "nvme_iov_md": false 00:30:42.783 }, 00:30:42.783 "memory_domains": [ 00:30:42.783 { 00:30:42.783 "dma_device_id": "system", 00:30:42.783 "dma_device_type": 1 00:30:42.783 }, 00:30:42.783 { 00:30:42.784 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:30:42.784 "dma_device_type": 2 00:30:42.784 } 00:30:42.784 ], 00:30:42.784 "driver_specific": { 00:30:42.784 "passthru": { 00:30:42.784 "name": "pt2", 00:30:42.784 "base_bdev_name": "malloc2" 00:30:42.784 } 00:30:42.784 } 00:30:42.784 }' 00:30:42.784 05:58:57 bdev_raid.raid_superblock_test_md_interleaved -- 
bdev/bdev_raid.sh@205 -- # jq .block_size 00:30:42.784 05:58:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:30:42.784 05:58:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:30:42.784 05:58:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:30:42.784 05:58:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:30:42.784 05:58:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:30:42.784 05:58:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:30:42.784 05:58:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:30:42.784 05:58:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:30:42.784 05:58:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:30:43.043 05:58:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:30:43.043 05:58:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:30:43.043 05:58:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:30:43.043 05:58:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:30:43.302 [2024-07-26 05:58:57.997026] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:30:43.302 05:58:58 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@486 -- # '[' 8773fb7d-d689-4a28-bc6d-f46314e70f2c '!=' 8773fb7d-d689-4a28-bc6d-f46314e70f2c ']' 00:30:43.302 05:58:58 bdev_raid.raid_superblock_test_md_interleaved -- 
bdev/bdev_raid.sh@490 -- # has_redundancy raid1 00:30:43.302 05:58:58 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@213 -- # case $1 in 00:30:43.302 05:58:58 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@214 -- # return 0 00:30:43.302 05:58:58 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@492 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:30:43.562 [2024-07-26 05:58:58.245452] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:30:43.562 05:58:58 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@495 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:30:43.562 05:58:58 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:30:43.562 05:58:58 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:30:43.562 05:58:58 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:30:43.562 05:58:58 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:30:43.562 05:58:58 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:30:43.562 05:58:58 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:30:43.562 05:58:58 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:30:43.562 05:58:58 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:30:43.562 05:58:58 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:30:43.562 05:58:58 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:43.562 05:58:58 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:43.821 05:58:58 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:30:43.821 "name": "raid_bdev1", 00:30:43.821 "uuid": "8773fb7d-d689-4a28-bc6d-f46314e70f2c", 00:30:43.821 "strip_size_kb": 0, 00:30:43.821 "state": "online", 00:30:43.821 "raid_level": "raid1", 00:30:43.821 "superblock": true, 00:30:43.821 "num_base_bdevs": 2, 00:30:43.821 "num_base_bdevs_discovered": 1, 00:30:43.821 "num_base_bdevs_operational": 1, 00:30:43.821 "base_bdevs_list": [ 00:30:43.821 { 00:30:43.821 "name": null, 00:30:43.821 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:43.821 "is_configured": false, 00:30:43.821 "data_offset": 256, 00:30:43.821 "data_size": 7936 00:30:43.821 }, 00:30:43.821 { 00:30:43.821 "name": "pt2", 00:30:43.821 "uuid": "00000000-0000-0000-0000-000000000002", 00:30:43.821 "is_configured": true, 00:30:43.821 "data_offset": 256, 00:30:43.821 "data_size": 7936 00:30:43.821 } 00:30:43.821 ] 00:30:43.821 }' 00:30:43.821 05:58:58 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:30:43.821 05:58:58 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:30:44.389 05:58:59 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@498 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:30:44.648 [2024-07-26 05:58:59.312260] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:30:44.648 [2024-07-26 05:58:59.312286] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:30:44.649 [2024-07-26 05:58:59.312341] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:30:44.649 [2024-07-26 
05:58:59.312388] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:30:44.649 [2024-07-26 05:58:59.312399] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1424c10 name raid_bdev1, state offline 00:30:44.649 05:58:59 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@499 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:44.649 05:58:59 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@499 -- # jq -r '.[]' 00:30:44.908 05:58:59 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@499 -- # raid_bdev= 00:30:44.908 05:58:59 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@500 -- # '[' -n '' ']' 00:30:44.908 05:58:59 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@505 -- # (( i = 1 )) 00:30:44.908 05:58:59 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:30:44.908 05:58:59 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:30:45.167 05:58:59 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:30:45.167 05:58:59 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:30:45.167 05:58:59 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@510 -- # (( i = 1 )) 00:30:45.167 05:58:59 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:30:45.167 05:58:59 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@518 -- # i=1 00:30:45.167 05:58:59 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@519 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:30:45.167 [2024-07-26 05:59:00.058207] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:30:45.167 [2024-07-26 05:59:00.058259] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:30:45.167 [2024-07-26 05:59:00.058281] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x15a99f0 00:30:45.167 [2024-07-26 05:59:00.058294] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:30:45.167 [2024-07-26 05:59:00.059741] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:30:45.167 [2024-07-26 05:59:00.059776] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:30:45.167 [2024-07-26 05:59:00.059824] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:30:45.167 [2024-07-26 05:59:00.059850] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:30:45.167 [2024-07-26 05:59:00.059925] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x15aaea0 00:30:45.167 [2024-07-26 05:59:00.059936] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:30:45.167 [2024-07-26 05:59:00.059995] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x15a8bc0 00:30:45.167 [2024-07-26 05:59:00.060066] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x15aaea0 00:30:45.167 [2024-07-26 05:59:00.060076] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x15aaea0 00:30:45.167 [2024-07-26 05:59:00.060130] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:30:45.167 pt2 00:30:45.427 05:59:00 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@522 
-- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:30:45.427 05:59:00 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:30:45.427 05:59:00 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:30:45.427 05:59:00 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:30:45.427 05:59:00 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:30:45.427 05:59:00 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:30:45.427 05:59:00 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:30:45.427 05:59:00 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:30:45.427 05:59:00 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:30:45.427 05:59:00 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:30:45.427 05:59:00 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:45.427 05:59:00 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:45.427 05:59:00 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:30:45.427 "name": "raid_bdev1", 00:30:45.427 "uuid": "8773fb7d-d689-4a28-bc6d-f46314e70f2c", 00:30:45.427 "strip_size_kb": 0, 00:30:45.427 "state": "online", 00:30:45.427 "raid_level": "raid1", 00:30:45.427 "superblock": true, 00:30:45.427 "num_base_bdevs": 2, 00:30:45.427 "num_base_bdevs_discovered": 1, 00:30:45.427 "num_base_bdevs_operational": 1, 00:30:45.427 
"base_bdevs_list": [ 00:30:45.427 { 00:30:45.427 "name": null, 00:30:45.427 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:45.427 "is_configured": false, 00:30:45.427 "data_offset": 256, 00:30:45.427 "data_size": 7936 00:30:45.427 }, 00:30:45.427 { 00:30:45.427 "name": "pt2", 00:30:45.427 "uuid": "00000000-0000-0000-0000-000000000002", 00:30:45.427 "is_configured": true, 00:30:45.427 "data_offset": 256, 00:30:45.427 "data_size": 7936 00:30:45.427 } 00:30:45.427 ] 00:30:45.427 }' 00:30:45.427 05:59:00 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:30:45.427 05:59:00 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:30:46.364 05:59:00 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@525 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:30:46.364 [2024-07-26 05:59:01.153096] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:30:46.364 [2024-07-26 05:59:01.153124] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:30:46.364 [2024-07-26 05:59:01.153181] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:30:46.364 [2024-07-26 05:59:01.153228] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:30:46.364 [2024-07-26 05:59:01.153239] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x15aaea0 name raid_bdev1, state offline 00:30:46.364 05:59:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@526 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:46.364 05:59:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@526 -- # jq -r '.[]' 00:30:46.623 05:59:01 bdev_raid.raid_superblock_test_md_interleaved -- 
bdev/bdev_raid.sh@526 -- # raid_bdev= 00:30:46.623 05:59:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@527 -- # '[' -n '' ']' 00:30:46.623 05:59:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@531 -- # '[' 2 -gt 2 ']' 00:30:46.623 05:59:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@539 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:30:46.883 [2024-07-26 05:59:01.646375] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:30:46.883 [2024-07-26 05:59:01.646417] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:30:46.883 [2024-07-26 05:59:01.646434] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x15a9620 00:30:46.883 [2024-07-26 05:59:01.646447] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:30:46.883 [2024-07-26 05:59:01.647868] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:30:46.883 [2024-07-26 05:59:01.647894] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:30:46.883 [2024-07-26 05:59:01.647940] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:30:46.883 [2024-07-26 05:59:01.647963] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:30:46.883 [2024-07-26 05:59:01.648038] bdev_raid.c:3639:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:30:46.883 [2024-07-26 05:59:01.648050] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:30:46.883 [2024-07-26 05:59:01.648065] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x15ab640 name raid_bdev1, state configuring 00:30:46.883 [2024-07-26 05:59:01.648088] 
bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:30:46.883 [2024-07-26 05:59:01.648138] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x15ab640 00:30:46.883 [2024-07-26 05:59:01.648149] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:30:46.883 [2024-07-26 05:59:01.648200] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x15aa810 00:30:46.883 [2024-07-26 05:59:01.648270] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x15ab640 00:30:46.883 [2024-07-26 05:59:01.648279] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x15ab640 00:30:46.883 [2024-07-26 05:59:01.648336] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:30:46.883 pt1 00:30:46.883 05:59:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@541 -- # '[' 2 -gt 2 ']' 00:30:46.883 05:59:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@553 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:30:46.883 05:59:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:30:46.883 05:59:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:30:46.883 05:59:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:30:46.883 05:59:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:30:46.883 05:59:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:30:46.883 05:59:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:30:46.883 05:59:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:30:46.883 05:59:01 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:30:46.883 05:59:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:30:46.883 05:59:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:46.883 05:59:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:47.141 05:59:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:30:47.141 "name": "raid_bdev1", 00:30:47.141 "uuid": "8773fb7d-d689-4a28-bc6d-f46314e70f2c", 00:30:47.141 "strip_size_kb": 0, 00:30:47.141 "state": "online", 00:30:47.141 "raid_level": "raid1", 00:30:47.141 "superblock": true, 00:30:47.141 "num_base_bdevs": 2, 00:30:47.141 "num_base_bdevs_discovered": 1, 00:30:47.141 "num_base_bdevs_operational": 1, 00:30:47.141 "base_bdevs_list": [ 00:30:47.141 { 00:30:47.141 "name": null, 00:30:47.141 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:47.141 "is_configured": false, 00:30:47.141 "data_offset": 256, 00:30:47.141 "data_size": 7936 00:30:47.141 }, 00:30:47.141 { 00:30:47.141 "name": "pt2", 00:30:47.141 "uuid": "00000000-0000-0000-0000-000000000002", 00:30:47.141 "is_configured": true, 00:30:47.141 "data_offset": 256, 00:30:47.141 "data_size": 7936 00:30:47.141 } 00:30:47.141 ] 00:30:47.141 }' 00:30:47.141 05:59:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:30:47.141 05:59:01 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:30:47.708 05:59:02 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@554 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:30:47.708 
05:59:02 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@554 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:30:47.967 05:59:02 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@554 -- # [[ false == \f\a\l\s\e ]] 00:30:47.967 05:59:02 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@557 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:30:47.967 05:59:02 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@557 -- # jq -r '.[] | .uuid' 00:30:48.226 [2024-07-26 05:59:02.986165] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:30:48.226 05:59:03 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@557 -- # '[' 8773fb7d-d689-4a28-bc6d-f46314e70f2c '!=' 8773fb7d-d689-4a28-bc6d-f46314e70f2c ']' 00:30:48.226 05:59:03 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@562 -- # killprocess 1284030 00:30:48.226 05:59:03 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@948 -- # '[' -z 1284030 ']' 00:30:48.226 05:59:03 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@952 -- # kill -0 1284030 00:30:48.226 05:59:03 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@953 -- # uname 00:30:48.226 05:59:03 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:30:48.226 05:59:03 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1284030 00:30:48.226 05:59:03 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:30:48.226 05:59:03 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:30:48.226 05:59:03 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@966 
-- # echo 'killing process with pid 1284030' 00:30:48.226 killing process with pid 1284030 00:30:48.226 05:59:03 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@967 -- # kill 1284030 00:30:48.226 [2024-07-26 05:59:03.052595] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:30:48.227 [2024-07-26 05:59:03.052654] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:30:48.227 05:59:03 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@972 -- # wait 1284030 00:30:48.227 [2024-07-26 05:59:03.052700] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:30:48.227 [2024-07-26 05:59:03.052712] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x15ab640 name raid_bdev1, state offline 00:30:48.227 [2024-07-26 05:59:03.069374] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:30:48.486 05:59:03 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@564 -- # return 0 00:30:48.486 00:30:48.486 real 0m15.541s 00:30:48.486 user 0m28.140s 00:30:48.486 sys 0m2.947s 00:30:48.486 05:59:03 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@1124 -- # xtrace_disable 00:30:48.486 05:59:03 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:30:48.486 ************************************ 00:30:48.486 END TEST raid_superblock_test_md_interleaved 00:30:48.486 ************************************ 00:30:48.486 05:59:03 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:30:48.486 05:59:03 bdev_raid -- bdev/bdev_raid.sh@914 -- # run_test raid_rebuild_test_sb_md_interleaved raid_rebuild_test raid1 2 true false false 00:30:48.486 05:59:03 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:30:48.486 05:59:03 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:30:48.486 05:59:03 bdev_raid -- 
common/autotest_common.sh@10 -- # set +x 00:30:48.486 ************************************ 00:30:48.486 START TEST raid_rebuild_test_sb_md_interleaved 00:30:48.486 ************************************ 00:30:48.486 05:59:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 2 true false false 00:30:48.486 05:59:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:30:48.486 05:59:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:30:48.486 05:59:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:30:48.486 05:59:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:30:48.486 05:59:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@572 -- # local verify=false 00:30:48.486 05:59:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:30:48.486 05:59:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:30:48.486 05:59:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:30:48.486 05:59:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:30:48.486 05:59:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:30:48.486 05:59:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:30:48.486 05:59:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:30:48.486 05:59:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:30:48.486 05:59:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 
'BaseBdev2') 00:30:48.486 05:59:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:30:48.487 05:59:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:30:48.487 05:59:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@575 -- # local strip_size 00:30:48.487 05:59:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@576 -- # local create_arg 00:30:48.487 05:59:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:30:48.487 05:59:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@578 -- # local data_offset 00:30:48.487 05:59:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:30:48.487 05:59:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:30:48.487 05:59:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:30:48.487 05:59:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:30:48.487 05:59:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@596 -- # raid_pid=1286449 00:30:48.487 05:59:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@597 -- # waitforlisten 1286449 /var/tmp/spdk-raid.sock 00:30:48.487 05:59:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:30:48.487 05:59:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@829 -- # '[' -z 1286449 ']' 00:30:48.487 05:59:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:30:48.487 
05:59:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@834 -- # local max_retries=100 00:30:48.487 05:59:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:30:48.487 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:30:48.487 05:59:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@838 -- # xtrace_disable 00:30:48.487 05:59:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:30:48.746 [2024-07-26 05:59:03.425482] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 00:30:48.746 [2024-07-26 05:59:03.425531] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1286449 ] 00:30:48.746 I/O size of 3145728 is greater than zero copy threshold (65536). 00:30:48.746 Zero copy mechanism will not be used. 
00:30:48.746 [2024-07-26 05:59:03.527645] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:30:48.746 [2024-07-26 05:59:03.631131] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:30:49.005 [2024-07-26 05:59:03.693673] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:30:49.005 [2024-07-26 05:59:03.693711] bdev_raid.c:1442:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:30:49.574 05:59:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:30:49.574 05:59:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@862 -- # return 0 00:30:49.574 05:59:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:30:49.574 05:59:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b BaseBdev1_malloc 00:30:49.834 BaseBdev1_malloc 00:30:49.834 05:59:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:30:50.093 [2024-07-26 05:59:04.849863] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:30:50.093 [2024-07-26 05:59:04.849910] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:30:50.093 [2024-07-26 05:59:04.849936] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x23e2ce0 00:30:50.093 [2024-07-26 05:59:04.849948] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:30:50.093 [2024-07-26 05:59:04.851392] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:30:50.093 [2024-07-26 05:59:04.851421] vbdev_passthru.c: 
710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:30:50.093 BaseBdev1 00:30:50.093 05:59:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:30:50.093 05:59:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b BaseBdev2_malloc 00:30:50.352 BaseBdev2_malloc 00:30:50.352 05:59:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:30:50.611 [2024-07-26 05:59:05.352466] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:30:50.611 [2024-07-26 05:59:05.352514] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:30:50.611 [2024-07-26 05:59:05.352536] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x23da2d0 00:30:50.611 [2024-07-26 05:59:05.352555] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:30:50.611 [2024-07-26 05:59:05.354292] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:30:50.611 [2024-07-26 05:59:05.354322] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:30:50.611 BaseBdev2 00:30:50.611 05:59:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b spare_malloc 00:30:50.870 spare_malloc 00:30:50.870 05:59:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay 
-r 0 -t 0 -w 100000 -n 100000 00:30:51.128 spare_delay 00:30:51.128 05:59:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:30:51.388 [2024-07-26 05:59:06.091268] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:30:51.388 [2024-07-26 05:59:06.091314] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:30:51.388 [2024-07-26 05:59:06.091335] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x23dd070 00:30:51.388 [2024-07-26 05:59:06.091348] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:30:51.388 [2024-07-26 05:59:06.092665] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:30:51.388 [2024-07-26 05:59:06.092693] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:30:51.388 spare 00:30:51.388 05:59:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:30:51.647 [2024-07-26 05:59:06.339953] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:30:51.647 [2024-07-26 05:59:06.341174] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:30:51.647 [2024-07-26 05:59:06.341342] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x23df370 00:30:51.647 [2024-07-26 05:59:06.341355] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:30:51.647 [2024-07-26 05:59:06.341432] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x22459c0 00:30:51.647 [2024-07-26 05:59:06.341517] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: 
raid bdev generic 0x23df370 00:30:51.647 [2024-07-26 05:59:06.341526] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x23df370 00:30:51.647 [2024-07-26 05:59:06.341579] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:30:51.647 05:59:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:30:51.647 05:59:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:30:51.647 05:59:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:30:51.647 05:59:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:30:51.647 05:59:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:30:51.647 05:59:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:30:51.647 05:59:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:30:51.647 05:59:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:30:51.647 05:59:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:30:51.647 05:59:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:30:51.647 05:59:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:51.647 05:59:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:51.906 05:59:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # 
raid_bdev_info='{ 00:30:51.906 "name": "raid_bdev1", 00:30:51.906 "uuid": "35fd5a6d-cb08-4b0c-8da3-51b9166f5c8f", 00:30:51.906 "strip_size_kb": 0, 00:30:51.906 "state": "online", 00:30:51.906 "raid_level": "raid1", 00:30:51.906 "superblock": true, 00:30:51.906 "num_base_bdevs": 2, 00:30:51.906 "num_base_bdevs_discovered": 2, 00:30:51.906 "num_base_bdevs_operational": 2, 00:30:51.906 "base_bdevs_list": [ 00:30:51.906 { 00:30:51.906 "name": "BaseBdev1", 00:30:51.906 "uuid": "46199516-b658-5a3c-b015-81cc7ff427fb", 00:30:51.906 "is_configured": true, 00:30:51.906 "data_offset": 256, 00:30:51.906 "data_size": 7936 00:30:51.906 }, 00:30:51.906 { 00:30:51.906 "name": "BaseBdev2", 00:30:51.906 "uuid": "9439ffac-7255-502a-b79c-f086ea74fbf1", 00:30:51.906 "is_configured": true, 00:30:51.906 "data_offset": 256, 00:30:51.906 "data_size": 7936 00:30:51.906 } 00:30:51.906 ] 00:30:51.906 }' 00:30:51.906 05:59:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:30:51.906 05:59:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:30:52.475 05:59:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:30:52.475 05:59:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:30:52.734 [2024-07-26 05:59:07.427060] bdev_raid.c:1119:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:30:52.734 05:59:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=7936 00:30:52.734 05:59:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:52.734 05:59:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@618 -- 
# jq -r '.[].base_bdevs_list[0].data_offset' 00:30:52.993 05:59:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@618 -- # data_offset=256 00:30:52.993 05:59:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:30:52.993 05:59:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@623 -- # '[' false = true ']' 00:30:52.993 05:59:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:30:53.252 [2024-07-26 05:59:07.916085] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:30:53.252 05:59:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:30:53.252 05:59:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:30:53.252 05:59:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:30:53.252 05:59:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:30:53.252 05:59:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:30:53.252 05:59:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:30:53.252 05:59:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:30:53.252 05:59:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:30:53.252 05:59:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:30:53.252 05:59:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:30:53.252 
05:59:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:53.252 05:59:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:53.511 05:59:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:30:53.511 "name": "raid_bdev1", 00:30:53.511 "uuid": "35fd5a6d-cb08-4b0c-8da3-51b9166f5c8f", 00:30:53.511 "strip_size_kb": 0, 00:30:53.511 "state": "online", 00:30:53.511 "raid_level": "raid1", 00:30:53.511 "superblock": true, 00:30:53.511 "num_base_bdevs": 2, 00:30:53.511 "num_base_bdevs_discovered": 1, 00:30:53.511 "num_base_bdevs_operational": 1, 00:30:53.511 "base_bdevs_list": [ 00:30:53.511 { 00:30:53.511 "name": null, 00:30:53.511 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:53.511 "is_configured": false, 00:30:53.511 "data_offset": 256, 00:30:53.511 "data_size": 7936 00:30:53.511 }, 00:30:53.511 { 00:30:53.511 "name": "BaseBdev2", 00:30:53.511 "uuid": "9439ffac-7255-502a-b79c-f086ea74fbf1", 00:30:53.511 "is_configured": true, 00:30:53.511 "data_offset": 256, 00:30:53.511 "data_size": 7936 00:30:53.511 } 00:30:53.511 ] 00:30:53.511 }' 00:30:53.511 05:59:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:30:53.511 05:59:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:30:54.079 05:59:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:30:54.338 [2024-07-26 05:59:09.002988] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:30:54.338 [2024-07-26 05:59:09.006627] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: 
raid_bdev_create_cb, 0x23df280 00:30:54.338 [2024-07-26 05:59:09.008680] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:30:54.338 05:59:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@646 -- # sleep 1 00:30:55.274 05:59:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:30:55.274 05:59:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:30:55.274 05:59:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:30:55.274 05:59:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:30:55.274 05:59:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:30:55.274 05:59:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:55.274 05:59:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:55.533 05:59:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:30:55.533 "name": "raid_bdev1", 00:30:55.533 "uuid": "35fd5a6d-cb08-4b0c-8da3-51b9166f5c8f", 00:30:55.533 "strip_size_kb": 0, 00:30:55.533 "state": "online", 00:30:55.533 "raid_level": "raid1", 00:30:55.533 "superblock": true, 00:30:55.533 "num_base_bdevs": 2, 00:30:55.533 "num_base_bdevs_discovered": 2, 00:30:55.533 "num_base_bdevs_operational": 2, 00:30:55.533 "process": { 00:30:55.533 "type": "rebuild", 00:30:55.533 "target": "spare", 00:30:55.533 "progress": { 00:30:55.533 "blocks": 3072, 00:30:55.533 "percent": 38 00:30:55.533 } 00:30:55.533 }, 00:30:55.533 "base_bdevs_list": [ 00:30:55.533 { 
00:30:55.533 "name": "spare", 00:30:55.533 "uuid": "13472c1c-b0af-5182-8953-1585dc905d8f", 00:30:55.533 "is_configured": true, 00:30:55.533 "data_offset": 256, 00:30:55.533 "data_size": 7936 00:30:55.533 }, 00:30:55.533 { 00:30:55.533 "name": "BaseBdev2", 00:30:55.533 "uuid": "9439ffac-7255-502a-b79c-f086ea74fbf1", 00:30:55.533 "is_configured": true, 00:30:55.533 "data_offset": 256, 00:30:55.533 "data_size": 7936 00:30:55.533 } 00:30:55.533 ] 00:30:55.533 }' 00:30:55.533 05:59:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:30:55.533 05:59:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:30:55.533 05:59:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:30:55.533 05:59:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:30:55.533 05:59:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:30:55.792 [2024-07-26 05:59:10.590257] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:30:55.792 [2024-07-26 05:59:10.621085] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:30:55.792 [2024-07-26 05:59:10.621132] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:30:55.792 [2024-07-26 05:59:10.621147] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:30:55.792 [2024-07-26 05:59:10.621155] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:30:55.792 05:59:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:30:55.792 05:59:10 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:30:55.792 05:59:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:30:55.792 05:59:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:30:55.792 05:59:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:30:55.792 05:59:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:30:55.792 05:59:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:30:55.792 05:59:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:30:55.793 05:59:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:30:55.793 05:59:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:30:55.793 05:59:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:55.793 05:59:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:56.052 05:59:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:30:56.052 "name": "raid_bdev1", 00:30:56.052 "uuid": "35fd5a6d-cb08-4b0c-8da3-51b9166f5c8f", 00:30:56.052 "strip_size_kb": 0, 00:30:56.052 "state": "online", 00:30:56.052 "raid_level": "raid1", 00:30:56.052 "superblock": true, 00:30:56.052 "num_base_bdevs": 2, 00:30:56.052 "num_base_bdevs_discovered": 1, 00:30:56.052 "num_base_bdevs_operational": 1, 00:30:56.052 "base_bdevs_list": [ 00:30:56.052 { 00:30:56.052 "name": null, 00:30:56.052 
"uuid": "00000000-0000-0000-0000-000000000000", 00:30:56.052 "is_configured": false, 00:30:56.052 "data_offset": 256, 00:30:56.052 "data_size": 7936 00:30:56.052 }, 00:30:56.052 { 00:30:56.052 "name": "BaseBdev2", 00:30:56.052 "uuid": "9439ffac-7255-502a-b79c-f086ea74fbf1", 00:30:56.052 "is_configured": true, 00:30:56.052 "data_offset": 256, 00:30:56.052 "data_size": 7936 00:30:56.052 } 00:30:56.052 ] 00:30:56.052 }' 00:30:56.052 05:59:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:30:56.052 05:59:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:30:56.618 05:59:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:30:56.618 05:59:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:30:56.618 05:59:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:30:56.618 05:59:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:30:56.618 05:59:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:30:56.618 05:59:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:56.618 05:59:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:56.876 05:59:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:30:56.877 "name": "raid_bdev1", 00:30:56.877 "uuid": "35fd5a6d-cb08-4b0c-8da3-51b9166f5c8f", 00:30:56.877 "strip_size_kb": 0, 00:30:56.877 "state": "online", 00:30:56.877 "raid_level": "raid1", 00:30:56.877 "superblock": true, 00:30:56.877 
"num_base_bdevs": 2, 00:30:56.877 "num_base_bdevs_discovered": 1, 00:30:56.877 "num_base_bdevs_operational": 1, 00:30:56.877 "base_bdevs_list": [ 00:30:56.877 { 00:30:56.877 "name": null, 00:30:56.877 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:56.877 "is_configured": false, 00:30:56.877 "data_offset": 256, 00:30:56.877 "data_size": 7936 00:30:56.877 }, 00:30:56.877 { 00:30:56.877 "name": "BaseBdev2", 00:30:56.877 "uuid": "9439ffac-7255-502a-b79c-f086ea74fbf1", 00:30:56.877 "is_configured": true, 00:30:56.877 "data_offset": 256, 00:30:56.877 "data_size": 7936 00:30:56.877 } 00:30:56.877 ] 00:30:56.877 }' 00:30:56.877 05:59:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:30:57.170 05:59:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:30:57.170 05:59:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:30:57.170 05:59:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:30:57.170 05:59:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:30:57.170 [2024-07-26 05:59:12.064999] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:30:57.170 [2024-07-26 05:59:12.068607] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x23e1df0 00:30:57.170 [2024-07-26 05:59:12.070074] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:30:57.428 05:59:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@662 -- # sleep 1 00:30:58.432 05:59:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 
00:30:58.432 05:59:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:30:58.432 05:59:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:30:58.432 05:59:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:30:58.432 05:59:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:30:58.432 05:59:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:58.432 05:59:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:58.689 05:59:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:30:58.689 "name": "raid_bdev1", 00:30:58.689 "uuid": "35fd5a6d-cb08-4b0c-8da3-51b9166f5c8f", 00:30:58.689 "strip_size_kb": 0, 00:30:58.689 "state": "online", 00:30:58.689 "raid_level": "raid1", 00:30:58.689 "superblock": true, 00:30:58.689 "num_base_bdevs": 2, 00:30:58.689 "num_base_bdevs_discovered": 2, 00:30:58.689 "num_base_bdevs_operational": 2, 00:30:58.689 "process": { 00:30:58.689 "type": "rebuild", 00:30:58.689 "target": "spare", 00:30:58.689 "progress": { 00:30:58.689 "blocks": 3072, 00:30:58.689 "percent": 38 00:30:58.689 } 00:30:58.689 }, 00:30:58.689 "base_bdevs_list": [ 00:30:58.689 { 00:30:58.689 "name": "spare", 00:30:58.689 "uuid": "13472c1c-b0af-5182-8953-1585dc905d8f", 00:30:58.689 "is_configured": true, 00:30:58.689 "data_offset": 256, 00:30:58.689 "data_size": 7936 00:30:58.689 }, 00:30:58.689 { 00:30:58.689 "name": "BaseBdev2", 00:30:58.689 "uuid": "9439ffac-7255-502a-b79c-f086ea74fbf1", 00:30:58.689 "is_configured": true, 00:30:58.689 "data_offset": 256, 00:30:58.689 "data_size": 7936 00:30:58.689 
} 00:30:58.689 ] 00:30:58.689 }' 00:30:58.689 05:59:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:30:58.689 05:59:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:30:58.689 05:59:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:30:58.689 05:59:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:30:58.689 05:59:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:30:58.689 05:59:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:30:58.689 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:30:58.689 05:59:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:30:58.689 05:59:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:30:58.689 05:59:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']' 00:30:58.689 05:59:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@705 -- # local timeout=1123 00:30:58.689 05:59:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:30:58.689 05:59:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:30:58.689 05:59:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:30:58.689 05:59:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:30:58.689 05:59:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- 
bdev/bdev_raid.sh@184 -- # local target=spare 00:30:58.689 05:59:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:30:58.689 05:59:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:58.689 05:59:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:58.948 05:59:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:30:58.948 "name": "raid_bdev1", 00:30:58.948 "uuid": "35fd5a6d-cb08-4b0c-8da3-51b9166f5c8f", 00:30:58.948 "strip_size_kb": 0, 00:30:58.948 "state": "online", 00:30:58.948 "raid_level": "raid1", 00:30:58.948 "superblock": true, 00:30:58.948 "num_base_bdevs": 2, 00:30:58.948 "num_base_bdevs_discovered": 2, 00:30:58.948 "num_base_bdevs_operational": 2, 00:30:58.948 "process": { 00:30:58.948 "type": "rebuild", 00:30:58.948 "target": "spare", 00:30:58.948 "progress": { 00:30:58.948 "blocks": 3840, 00:30:58.948 "percent": 48 00:30:58.948 } 00:30:58.948 }, 00:30:58.948 "base_bdevs_list": [ 00:30:58.948 { 00:30:58.948 "name": "spare", 00:30:58.948 "uuid": "13472c1c-b0af-5182-8953-1585dc905d8f", 00:30:58.948 "is_configured": true, 00:30:58.948 "data_offset": 256, 00:30:58.948 "data_size": 7936 00:30:58.948 }, 00:30:58.948 { 00:30:58.948 "name": "BaseBdev2", 00:30:58.948 "uuid": "9439ffac-7255-502a-b79c-f086ea74fbf1", 00:30:58.948 "is_configured": true, 00:30:58.948 "data_offset": 256, 00:30:58.948 "data_size": 7936 00:30:58.948 } 00:30:58.948 ] 00:30:58.948 }' 00:30:58.948 05:59:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:30:58.948 05:59:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:30:58.948 05:59:13 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:30:58.948 05:59:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:30:58.948 05:59:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@710 -- # sleep 1 00:30:59.885 05:59:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:30:59.885 05:59:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:30:59.885 05:59:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:30:59.885 05:59:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:30:59.885 05:59:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:30:59.885 05:59:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:30:59.885 05:59:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:59.885 05:59:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:31:00.144 05:59:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:31:00.144 "name": "raid_bdev1", 00:31:00.144 "uuid": "35fd5a6d-cb08-4b0c-8da3-51b9166f5c8f", 00:31:00.144 "strip_size_kb": 0, 00:31:00.144 "state": "online", 00:31:00.144 "raid_level": "raid1", 00:31:00.144 "superblock": true, 00:31:00.144 "num_base_bdevs": 2, 00:31:00.144 "num_base_bdevs_discovered": 2, 00:31:00.144 "num_base_bdevs_operational": 2, 00:31:00.144 "process": { 00:31:00.144 "type": "rebuild", 00:31:00.144 
"target": "spare", 00:31:00.144 "progress": { 00:31:00.144 "blocks": 7424, 00:31:00.144 "percent": 93 00:31:00.144 } 00:31:00.144 }, 00:31:00.144 "base_bdevs_list": [ 00:31:00.144 { 00:31:00.144 "name": "spare", 00:31:00.144 "uuid": "13472c1c-b0af-5182-8953-1585dc905d8f", 00:31:00.144 "is_configured": true, 00:31:00.144 "data_offset": 256, 00:31:00.144 "data_size": 7936 00:31:00.144 }, 00:31:00.144 { 00:31:00.144 "name": "BaseBdev2", 00:31:00.144 "uuid": "9439ffac-7255-502a-b79c-f086ea74fbf1", 00:31:00.144 "is_configured": true, 00:31:00.144 "data_offset": 256, 00:31:00.144 "data_size": 7936 00:31:00.144 } 00:31:00.144 ] 00:31:00.144 }' 00:31:00.144 05:59:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:31:00.402 05:59:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:31:00.403 05:59:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:31:00.403 05:59:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:31:00.403 05:59:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@710 -- # sleep 1 00:31:00.403 [2024-07-26 05:59:15.194280] bdev_raid.c:2870:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:31:00.403 [2024-07-26 05:59:15.194340] bdev_raid.c:2532:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:31:00.403 [2024-07-26 05:59:15.194426] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:31:01.339 05:59:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:31:01.339 05:59:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:31:01.339 05:59:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- 
bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:31:01.339 05:59:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:31:01.339 05:59:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:31:01.340 05:59:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:31:01.340 05:59:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:01.340 05:59:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:31:01.598 05:59:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:31:01.599 "name": "raid_bdev1", 00:31:01.599 "uuid": "35fd5a6d-cb08-4b0c-8da3-51b9166f5c8f", 00:31:01.599 "strip_size_kb": 0, 00:31:01.599 "state": "online", 00:31:01.599 "raid_level": "raid1", 00:31:01.599 "superblock": true, 00:31:01.599 "num_base_bdevs": 2, 00:31:01.599 "num_base_bdevs_discovered": 2, 00:31:01.599 "num_base_bdevs_operational": 2, 00:31:01.599 "base_bdevs_list": [ 00:31:01.599 { 00:31:01.599 "name": "spare", 00:31:01.599 "uuid": "13472c1c-b0af-5182-8953-1585dc905d8f", 00:31:01.599 "is_configured": true, 00:31:01.599 "data_offset": 256, 00:31:01.599 "data_size": 7936 00:31:01.599 }, 00:31:01.599 { 00:31:01.599 "name": "BaseBdev2", 00:31:01.599 "uuid": "9439ffac-7255-502a-b79c-f086ea74fbf1", 00:31:01.599 "is_configured": true, 00:31:01.599 "data_offset": 256, 00:31:01.599 "data_size": 7936 00:31:01.599 } 00:31:01.599 ] 00:31:01.599 }' 00:31:01.599 05:59:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:31:01.599 05:59:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == 
\r\e\b\u\i\l\d ]] 00:31:01.599 05:59:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:31:01.599 05:59:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:31:01.599 05:59:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@708 -- # break 00:31:01.599 05:59:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:31:01.599 05:59:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:31:01.599 05:59:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:31:01.599 05:59:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:31:01.599 05:59:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:31:01.599 05:59:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:01.599 05:59:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:31:01.857 05:59:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:31:01.857 "name": "raid_bdev1", 00:31:01.857 "uuid": "35fd5a6d-cb08-4b0c-8da3-51b9166f5c8f", 00:31:01.857 "strip_size_kb": 0, 00:31:01.857 "state": "online", 00:31:01.857 "raid_level": "raid1", 00:31:01.857 "superblock": true, 00:31:01.857 "num_base_bdevs": 2, 00:31:01.857 "num_base_bdevs_discovered": 2, 00:31:01.857 "num_base_bdevs_operational": 2, 00:31:01.857 "base_bdevs_list": [ 00:31:01.857 { 00:31:01.857 "name": "spare", 00:31:01.857 "uuid": "13472c1c-b0af-5182-8953-1585dc905d8f", 00:31:01.857 
"is_configured": true, 00:31:01.857 "data_offset": 256, 00:31:01.857 "data_size": 7936 00:31:01.857 }, 00:31:01.857 { 00:31:01.857 "name": "BaseBdev2", 00:31:01.857 "uuid": "9439ffac-7255-502a-b79c-f086ea74fbf1", 00:31:01.857 "is_configured": true, 00:31:01.857 "data_offset": 256, 00:31:01.857 "data_size": 7936 00:31:01.857 } 00:31:01.857 ] 00:31:01.857 }' 00:31:01.857 05:59:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:31:01.857 05:59:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:31:01.857 05:59:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:31:02.116 05:59:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:31:02.116 05:59:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:31:02.116 05:59:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:31:02.116 05:59:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:31:02.116 05:59:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:31:02.116 05:59:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:31:02.116 05:59:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:31:02.116 05:59:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:31:02.116 05:59:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:31:02.116 05:59:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local 
num_base_bdevs_discovered 00:31:02.116 05:59:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:31:02.116 05:59:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:02.116 05:59:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:31:02.376 05:59:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:31:02.376 "name": "raid_bdev1", 00:31:02.376 "uuid": "35fd5a6d-cb08-4b0c-8da3-51b9166f5c8f", 00:31:02.376 "strip_size_kb": 0, 00:31:02.376 "state": "online", 00:31:02.376 "raid_level": "raid1", 00:31:02.376 "superblock": true, 00:31:02.376 "num_base_bdevs": 2, 00:31:02.376 "num_base_bdevs_discovered": 2, 00:31:02.376 "num_base_bdevs_operational": 2, 00:31:02.376 "base_bdevs_list": [ 00:31:02.376 { 00:31:02.376 "name": "spare", 00:31:02.376 "uuid": "13472c1c-b0af-5182-8953-1585dc905d8f", 00:31:02.376 "is_configured": true, 00:31:02.376 "data_offset": 256, 00:31:02.376 "data_size": 7936 00:31:02.376 }, 00:31:02.376 { 00:31:02.376 "name": "BaseBdev2", 00:31:02.376 "uuid": "9439ffac-7255-502a-b79c-f086ea74fbf1", 00:31:02.376 "is_configured": true, 00:31:02.376 "data_offset": 256, 00:31:02.376 "data_size": 7936 00:31:02.376 } 00:31:02.376 ] 00:31:02.376 }' 00:31:02.376 05:59:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:31:02.376 05:59:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:31:02.944 05:59:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:31:02.944 [2024-07-26 05:59:17.821813] bdev_raid.c:2382:raid_bdev_delete: *DEBUG*: 
delete raid bdev: raid_bdev1 00:31:02.944 [2024-07-26 05:59:17.821842] bdev_raid.c:1870:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:31:02.944 [2024-07-26 05:59:17.821903] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:31:02.944 [2024-07-26 05:59:17.821960] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:31:02.944 [2024-07-26 05:59:17.821972] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x23df370 name raid_bdev1, state offline 00:31:02.944 05:59:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:02.944 05:59:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@719 -- # jq length 00:31:03.203 05:59:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:31:03.203 05:59:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@721 -- # '[' false = true ']' 00:31:03.203 05:59:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:31:03.203 05:59:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:31:03.462 05:59:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@745 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:31:03.722 [2024-07-26 05:59:18.563734] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:31:03.722 [2024-07-26 05:59:18.563786] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:31:03.722 [2024-07-26 05:59:18.563809] vbdev_passthru.c: 
681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x23dec80 00:31:03.722 [2024-07-26 05:59:18.563821] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:31:03.722 [2024-07-26 05:59:18.565345] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:31:03.722 [2024-07-26 05:59:18.565374] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:31:03.722 [2024-07-26 05:59:18.565434] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:31:03.722 [2024-07-26 05:59:18.565461] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:31:03.722 [2024-07-26 05:59:18.565556] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:31:03.722 spare 00:31:03.722 05:59:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:31:03.722 05:59:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:31:03.722 05:59:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:31:03.722 05:59:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:31:03.722 05:59:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:31:03.722 05:59:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:31:03.722 05:59:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:31:03.722 05:59:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:31:03.722 05:59:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:31:03.722 05:59:18 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:31:03.722 05:59:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:03.722 05:59:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:31:03.981 [2024-07-26 05:59:18.665867] bdev_raid.c:1720:raid_bdev_configure_cont: *DEBUG*: io device register 0x23e0100 00:31:03.981 [2024-07-26 05:59:18.665888] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:31:03.981 [2024-07-26 05:59:18.665975] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2246690 00:31:03.981 [2024-07-26 05:59:18.666076] bdev_raid.c:1750:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x23e0100 00:31:03.981 [2024-07-26 05:59:18.666086] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x23e0100 00:31:03.981 [2024-07-26 05:59:18.666156] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:31:03.981 05:59:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:31:03.981 "name": "raid_bdev1", 00:31:03.981 "uuid": "35fd5a6d-cb08-4b0c-8da3-51b9166f5c8f", 00:31:03.981 "strip_size_kb": 0, 00:31:03.981 "state": "online", 00:31:03.981 "raid_level": "raid1", 00:31:03.981 "superblock": true, 00:31:03.981 "num_base_bdevs": 2, 00:31:03.981 "num_base_bdevs_discovered": 2, 00:31:03.981 "num_base_bdevs_operational": 2, 00:31:03.981 "base_bdevs_list": [ 00:31:03.981 { 00:31:03.981 "name": "spare", 00:31:03.981 "uuid": "13472c1c-b0af-5182-8953-1585dc905d8f", 00:31:03.981 "is_configured": true, 00:31:03.981 "data_offset": 256, 00:31:03.981 "data_size": 7936 00:31:03.981 }, 00:31:03.981 { 00:31:03.981 "name": "BaseBdev2", 00:31:03.981 "uuid": 
"9439ffac-7255-502a-b79c-f086ea74fbf1", 00:31:03.981 "is_configured": true, 00:31:03.981 "data_offset": 256, 00:31:03.981 "data_size": 7936 00:31:03.981 } 00:31:03.982 ] 00:31:03.982 }' 00:31:03.982 05:59:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:31:03.982 05:59:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:31:04.549 05:59:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:31:04.549 05:59:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:31:04.549 05:59:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:31:04.549 05:59:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:31:04.549 05:59:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:31:04.549 05:59:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:31:04.549 05:59:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:04.807 05:59:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:31:04.807 "name": "raid_bdev1", 00:31:04.807 "uuid": "35fd5a6d-cb08-4b0c-8da3-51b9166f5c8f", 00:31:04.807 "strip_size_kb": 0, 00:31:04.807 "state": "online", 00:31:04.807 "raid_level": "raid1", 00:31:04.807 "superblock": true, 00:31:04.807 "num_base_bdevs": 2, 00:31:04.807 "num_base_bdevs_discovered": 2, 00:31:04.807 "num_base_bdevs_operational": 2, 00:31:04.807 "base_bdevs_list": [ 00:31:04.807 { 00:31:04.807 "name": "spare", 00:31:04.807 "uuid": 
"13472c1c-b0af-5182-8953-1585dc905d8f", 00:31:04.807 "is_configured": true, 00:31:04.807 "data_offset": 256, 00:31:04.807 "data_size": 7936 00:31:04.807 }, 00:31:04.807 { 00:31:04.807 "name": "BaseBdev2", 00:31:04.807 "uuid": "9439ffac-7255-502a-b79c-f086ea74fbf1", 00:31:04.807 "is_configured": true, 00:31:04.807 "data_offset": 256, 00:31:04.807 "data_size": 7936 00:31:04.807 } 00:31:04.807 ] 00:31:04.807 }' 00:31:04.807 05:59:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:31:05.064 05:59:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:31:05.064 05:59:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:31:05.064 05:59:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:31:05.064 05:59:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:05.064 05:59:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:31:05.322 05:59:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:31:05.322 05:59:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:31:05.581 [2024-07-26 05:59:20.252332] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:31:05.581 05:59:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:31:05.581 05:59:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 
00:31:05.581 05:59:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:31:05.581 05:59:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:31:05.581 05:59:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:31:05.581 05:59:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:31:05.581 05:59:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:31:05.581 05:59:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:31:05.581 05:59:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:31:05.581 05:59:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:31:05.581 05:59:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:05.581 05:59:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:31:05.839 05:59:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:31:05.839 "name": "raid_bdev1", 00:31:05.839 "uuid": "35fd5a6d-cb08-4b0c-8da3-51b9166f5c8f", 00:31:05.839 "strip_size_kb": 0, 00:31:05.839 "state": "online", 00:31:05.839 "raid_level": "raid1", 00:31:05.839 "superblock": true, 00:31:05.839 "num_base_bdevs": 2, 00:31:05.839 "num_base_bdevs_discovered": 1, 00:31:05.839 "num_base_bdevs_operational": 1, 00:31:05.839 "base_bdevs_list": [ 00:31:05.839 { 00:31:05.839 "name": null, 00:31:05.839 "uuid": "00000000-0000-0000-0000-000000000000", 00:31:05.839 "is_configured": false, 00:31:05.839 "data_offset": 
256, 00:31:05.839 "data_size": 7936 00:31:05.839 }, 00:31:05.839 { 00:31:05.839 "name": "BaseBdev2", 00:31:05.839 "uuid": "9439ffac-7255-502a-b79c-f086ea74fbf1", 00:31:05.839 "is_configured": true, 00:31:05.839 "data_offset": 256, 00:31:05.839 "data_size": 7936 00:31:05.839 } 00:31:05.839 ] 00:31:05.839 }' 00:31:05.839 05:59:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:31:05.839 05:59:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:31:06.406 05:59:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:31:06.665 [2024-07-26 05:59:21.331198] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:31:06.665 [2024-07-26 05:59:21.331371] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:31:06.665 [2024-07-26 05:59:21.331390] bdev_raid.c:3712:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
00:31:06.665 [2024-07-26 05:59:21.331420] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:31:06.665 [2024-07-26 05:59:21.335456] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x23dff10 00:31:06.665 [2024-07-26 05:59:21.336959] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:31:06.665 05:59:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@755 -- # sleep 1 00:31:07.600 05:59:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:31:07.600 05:59:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:31:07.600 05:59:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:31:07.600 05:59:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:31:07.600 05:59:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:31:07.600 05:59:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:07.601 05:59:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:31:07.859 05:59:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:31:07.859 "name": "raid_bdev1", 00:31:07.859 "uuid": "35fd5a6d-cb08-4b0c-8da3-51b9166f5c8f", 00:31:07.859 "strip_size_kb": 0, 00:31:07.859 "state": "online", 00:31:07.859 "raid_level": "raid1", 00:31:07.859 "superblock": true, 00:31:07.859 "num_base_bdevs": 2, 00:31:07.859 "num_base_bdevs_discovered": 2, 00:31:07.859 "num_base_bdevs_operational": 2, 00:31:07.859 "process": { 00:31:07.859 "type": 
"rebuild", 00:31:07.859 "target": "spare", 00:31:07.859 "progress": { 00:31:07.859 "blocks": 3072, 00:31:07.859 "percent": 38 00:31:07.859 } 00:31:07.859 }, 00:31:07.859 "base_bdevs_list": [ 00:31:07.859 { 00:31:07.859 "name": "spare", 00:31:07.859 "uuid": "13472c1c-b0af-5182-8953-1585dc905d8f", 00:31:07.859 "is_configured": true, 00:31:07.859 "data_offset": 256, 00:31:07.859 "data_size": 7936 00:31:07.859 }, 00:31:07.859 { 00:31:07.859 "name": "BaseBdev2", 00:31:07.859 "uuid": "9439ffac-7255-502a-b79c-f086ea74fbf1", 00:31:07.859 "is_configured": true, 00:31:07.859 "data_offset": 256, 00:31:07.859 "data_size": 7936 00:31:07.859 } 00:31:07.859 ] 00:31:07.859 }' 00:31:07.859 05:59:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:31:07.859 05:59:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:31:07.859 05:59:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:31:07.859 05:59:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:31:07.859 05:59:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@759 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:31:08.118 [2024-07-26 05:59:22.928547] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:31:08.118 [2024-07-26 05:59:22.949570] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:31:08.118 [2024-07-26 05:59:22.949613] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:31:08.118 [2024-07-26 05:59:22.949628] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:31:08.118 [2024-07-26 05:59:22.949636] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to 
remove target bdev: No such device 00:31:08.118 05:59:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:31:08.118 05:59:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:31:08.118 05:59:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:31:08.118 05:59:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:31:08.118 05:59:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:31:08.118 05:59:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:31:08.118 05:59:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:31:08.118 05:59:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:31:08.118 05:59:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:31:08.118 05:59:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:31:08.118 05:59:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:08.118 05:59:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:31:08.376 05:59:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:31:08.376 "name": "raid_bdev1", 00:31:08.376 "uuid": "35fd5a6d-cb08-4b0c-8da3-51b9166f5c8f", 00:31:08.376 "strip_size_kb": 0, 00:31:08.376 "state": "online", 00:31:08.376 "raid_level": "raid1", 00:31:08.376 "superblock": true, 00:31:08.376 
"num_base_bdevs": 2, 00:31:08.376 "num_base_bdevs_discovered": 1, 00:31:08.376 "num_base_bdevs_operational": 1, 00:31:08.376 "base_bdevs_list": [ 00:31:08.376 { 00:31:08.376 "name": null, 00:31:08.376 "uuid": "00000000-0000-0000-0000-000000000000", 00:31:08.376 "is_configured": false, 00:31:08.376 "data_offset": 256, 00:31:08.376 "data_size": 7936 00:31:08.376 }, 00:31:08.376 { 00:31:08.376 "name": "BaseBdev2", 00:31:08.376 "uuid": "9439ffac-7255-502a-b79c-f086ea74fbf1", 00:31:08.376 "is_configured": true, 00:31:08.376 "data_offset": 256, 00:31:08.376 "data_size": 7936 00:31:08.376 } 00:31:08.376 ] 00:31:08.376 }' 00:31:08.376 05:59:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:31:08.376 05:59:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:31:08.942 05:59:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:31:09.200 [2024-07-26 05:59:24.040815] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:31:09.201 [2024-07-26 05:59:24.040872] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:31:09.201 [2024-07-26 05:59:24.040896] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x23e1200 00:31:09.201 [2024-07-26 05:59:24.040909] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:31:09.201 [2024-07-26 05:59:24.041116] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:31:09.201 [2024-07-26 05:59:24.041132] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:31:09.201 [2024-07-26 05:59:24.041189] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:31:09.201 [2024-07-26 05:59:24.041200] 
bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:31:09.201 [2024-07-26 05:59:24.041211] bdev_raid.c:3712:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:31:09.201 [2024-07-26 05:59:24.041231] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:31:09.201 [2024-07-26 05:59:24.044797] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2246400 00:31:09.201 spare 00:31:09.201 [2024-07-26 05:59:24.046160] bdev_raid.c:2905:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:31:09.201 05:59:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@762 -- # sleep 1 00:31:10.576 05:59:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:31:10.576 05:59:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:31:10.576 05:59:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:31:10.576 05:59:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:31:10.576 05:59:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:31:10.576 05:59:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:10.576 05:59:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:31:10.576 05:59:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:31:10.576 "name": "raid_bdev1", 00:31:10.576 "uuid": "35fd5a6d-cb08-4b0c-8da3-51b9166f5c8f", 
00:31:10.576 "strip_size_kb": 0, 00:31:10.576 "state": "online", 00:31:10.576 "raid_level": "raid1", 00:31:10.576 "superblock": true, 00:31:10.576 "num_base_bdevs": 2, 00:31:10.576 "num_base_bdevs_discovered": 2, 00:31:10.576 "num_base_bdevs_operational": 2, 00:31:10.576 "process": { 00:31:10.576 "type": "rebuild", 00:31:10.576 "target": "spare", 00:31:10.576 "progress": { 00:31:10.576 "blocks": 2816, 00:31:10.576 "percent": 35 00:31:10.576 } 00:31:10.576 }, 00:31:10.576 "base_bdevs_list": [ 00:31:10.576 { 00:31:10.576 "name": "spare", 00:31:10.576 "uuid": "13472c1c-b0af-5182-8953-1585dc905d8f", 00:31:10.576 "is_configured": true, 00:31:10.576 "data_offset": 256, 00:31:10.576 "data_size": 7936 00:31:10.576 }, 00:31:10.576 { 00:31:10.576 "name": "BaseBdev2", 00:31:10.576 "uuid": "9439ffac-7255-502a-b79c-f086ea74fbf1", 00:31:10.576 "is_configured": true, 00:31:10.576 "data_offset": 256, 00:31:10.576 "data_size": 7936 00:31:10.576 } 00:31:10.576 ] 00:31:10.576 }' 00:31:10.576 05:59:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:31:10.576 05:59:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:31:10.576 05:59:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:31:10.576 05:59:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:31:10.577 05:59:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:31:10.835 [2024-07-26 05:59:25.570273] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:31:10.835 [2024-07-26 05:59:25.658947] bdev_raid.c:2541:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:31:10.835 [2024-07-26 
05:59:25.658993] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:31:10.835 [2024-07-26 05:59:25.659009] bdev_raid.c:2146:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:31:10.835 [2024-07-26 05:59:25.659017] bdev_raid.c:2479:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:31:10.835 05:59:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:31:10.835 05:59:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:31:10.835 05:59:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:31:10.835 05:59:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:31:10.835 05:59:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:31:10.835 05:59:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:31:10.835 05:59:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:31:10.835 05:59:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:31:10.835 05:59:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:31:10.835 05:59:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:31:10.835 05:59:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:10.835 05:59:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:31:11.094 05:59:25 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:31:11.094 "name": "raid_bdev1", 00:31:11.094 "uuid": "35fd5a6d-cb08-4b0c-8da3-51b9166f5c8f", 00:31:11.094 "strip_size_kb": 0, 00:31:11.094 "state": "online", 00:31:11.094 "raid_level": "raid1", 00:31:11.094 "superblock": true, 00:31:11.094 "num_base_bdevs": 2, 00:31:11.094 "num_base_bdevs_discovered": 1, 00:31:11.094 "num_base_bdevs_operational": 1, 00:31:11.094 "base_bdevs_list": [ 00:31:11.094 { 00:31:11.094 "name": null, 00:31:11.094 "uuid": "00000000-0000-0000-0000-000000000000", 00:31:11.094 "is_configured": false, 00:31:11.094 "data_offset": 256, 00:31:11.094 "data_size": 7936 00:31:11.094 }, 00:31:11.094 { 00:31:11.094 "name": "BaseBdev2", 00:31:11.094 "uuid": "9439ffac-7255-502a-b79c-f086ea74fbf1", 00:31:11.094 "is_configured": true, 00:31:11.094 "data_offset": 256, 00:31:11.094 "data_size": 7936 00:31:11.094 } 00:31:11.094 ] 00:31:11.094 }' 00:31:11.094 05:59:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:31:11.094 05:59:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:31:11.661 05:59:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:31:11.661 05:59:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:31:11.661 05:59:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:31:11.661 05:59:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:31:11.661 05:59:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:31:11.661 05:59:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:11.661 05:59:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:31:11.920 05:59:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:31:11.920 "name": "raid_bdev1", 00:31:11.920 "uuid": "35fd5a6d-cb08-4b0c-8da3-51b9166f5c8f", 00:31:11.920 "strip_size_kb": 0, 00:31:11.920 "state": "online", 00:31:11.920 "raid_level": "raid1", 00:31:11.920 "superblock": true, 00:31:11.920 "num_base_bdevs": 2, 00:31:11.920 "num_base_bdevs_discovered": 1, 00:31:11.920 "num_base_bdevs_operational": 1, 00:31:11.920 "base_bdevs_list": [ 00:31:11.920 { 00:31:11.920 "name": null, 00:31:11.920 "uuid": "00000000-0000-0000-0000-000000000000", 00:31:11.920 "is_configured": false, 00:31:11.920 "data_offset": 256, 00:31:11.920 "data_size": 7936 00:31:11.920 }, 00:31:11.920 { 00:31:11.920 "name": "BaseBdev2", 00:31:11.920 "uuid": "9439ffac-7255-502a-b79c-f086ea74fbf1", 00:31:11.920 "is_configured": true, 00:31:11.920 "data_offset": 256, 00:31:11.920 "data_size": 7936 00:31:11.920 } 00:31:11.920 ] 00:31:11.920 }' 00:31:11.920 05:59:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:31:11.920 05:59:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:31:11.920 05:59:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:31:12.179 05:59:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:31:12.179 05:59:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:31:12.439 05:59:27 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@772 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:31:12.439 [2024-07-26 05:59:27.319113] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:31:12.439 [2024-07-26 05:59:27.319163] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:31:12.439 [2024-07-26 05:59:27.319190] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2246fa0 00:31:12.439 [2024-07-26 05:59:27.319203] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:31:12.439 [2024-07-26 05:59:27.319387] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:31:12.439 [2024-07-26 05:59:27.319403] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:31:12.439 [2024-07-26 05:59:27.319449] bdev_raid.c:3844:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:31:12.439 [2024-07-26 05:59:27.319460] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:31:12.439 [2024-07-26 05:59:27.319470] bdev_raid.c:3673:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:31:12.439 BaseBdev1 00:31:12.439 05:59:27 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@773 -- # sleep 1 00:31:13.817 05:59:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:31:13.817 05:59:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:31:13.817 05:59:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:31:13.817 05:59:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 
00:31:13.817 05:59:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:31:13.817 05:59:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:31:13.817 05:59:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:31:13.817 05:59:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:31:13.818 05:59:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:31:13.818 05:59:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:31:13.818 05:59:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:13.818 05:59:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:31:13.818 05:59:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:31:13.818 "name": "raid_bdev1", 00:31:13.818 "uuid": "35fd5a6d-cb08-4b0c-8da3-51b9166f5c8f", 00:31:13.818 "strip_size_kb": 0, 00:31:13.818 "state": "online", 00:31:13.818 "raid_level": "raid1", 00:31:13.818 "superblock": true, 00:31:13.818 "num_base_bdevs": 2, 00:31:13.818 "num_base_bdevs_discovered": 1, 00:31:13.818 "num_base_bdevs_operational": 1, 00:31:13.818 "base_bdevs_list": [ 00:31:13.818 { 00:31:13.818 "name": null, 00:31:13.818 "uuid": "00000000-0000-0000-0000-000000000000", 00:31:13.818 "is_configured": false, 00:31:13.818 "data_offset": 256, 00:31:13.818 "data_size": 7936 00:31:13.818 }, 00:31:13.818 { 00:31:13.818 "name": "BaseBdev2", 00:31:13.818 "uuid": "9439ffac-7255-502a-b79c-f086ea74fbf1", 00:31:13.818 "is_configured": true, 00:31:13.818 "data_offset": 256, 00:31:13.818 
"data_size": 7936 00:31:13.818 } 00:31:13.818 ] 00:31:13.818 }' 00:31:13.818 05:59:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:31:13.818 05:59:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:31:14.386 05:59:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:31:14.386 05:59:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:31:14.386 05:59:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:31:14.386 05:59:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:31:14.386 05:59:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:31:14.386 05:59:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:14.386 05:59:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:31:14.645 05:59:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:31:14.645 "name": "raid_bdev1", 00:31:14.645 "uuid": "35fd5a6d-cb08-4b0c-8da3-51b9166f5c8f", 00:31:14.645 "strip_size_kb": 0, 00:31:14.645 "state": "online", 00:31:14.645 "raid_level": "raid1", 00:31:14.645 "superblock": true, 00:31:14.645 "num_base_bdevs": 2, 00:31:14.645 "num_base_bdevs_discovered": 1, 00:31:14.645 "num_base_bdevs_operational": 1, 00:31:14.645 "base_bdevs_list": [ 00:31:14.645 { 00:31:14.645 "name": null, 00:31:14.645 "uuid": "00000000-0000-0000-0000-000000000000", 00:31:14.645 "is_configured": false, 00:31:14.645 "data_offset": 256, 00:31:14.645 "data_size": 7936 00:31:14.645 }, 
00:31:14.645 { 00:31:14.645 "name": "BaseBdev2", 00:31:14.645 "uuid": "9439ffac-7255-502a-b79c-f086ea74fbf1", 00:31:14.645 "is_configured": true, 00:31:14.645 "data_offset": 256, 00:31:14.645 "data_size": 7936 00:31:14.645 } 00:31:14.645 ] 00:31:14.645 }' 00:31:14.645 05:59:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:31:14.645 05:59:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:31:14.645 05:59:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:31:14.645 05:59:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:31:14.645 05:59:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:31:14.645 05:59:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@648 -- # local es=0 00:31:14.646 05:59:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:31:14.646 05:59:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:31:14.646 05:59:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:31:14.646 05:59:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:31:14.646 05:59:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 
00:31:14.646 05:59:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:31:14.646 05:59:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:31:14.646 05:59:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:31:14.646 05:59:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:31:14.646 05:59:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:31:14.905 [2024-07-26 05:59:29.725522] bdev_raid.c:3288:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:31:14.905 [2024-07-26 05:59:29.725658] bdev_raid.c:3654:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:31:14.905 [2024-07-26 05:59:29.725674] bdev_raid.c:3673:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:31:14.905 request: 00:31:14.905 { 00:31:14.905 "base_bdev": "BaseBdev1", 00:31:14.905 "raid_bdev": "raid_bdev1", 00:31:14.905 "method": "bdev_raid_add_base_bdev", 00:31:14.905 "req_id": 1 00:31:14.905 } 00:31:14.905 Got JSON-RPC error response 00:31:14.905 response: 00:31:14.905 { 00:31:14.905 "code": -22, 00:31:14.905 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:31:14.905 } 00:31:14.905 05:59:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@651 -- # es=1 00:31:14.905 05:59:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@659 -- # (( es > 128 )) 
00:31:14.905 05:59:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:31:14.905 05:59:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:31:14.905 05:59:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@777 -- # sleep 1 00:31:15.878 05:59:30 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:31:15.878 05:59:30 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:31:15.878 05:59:30 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:31:15.878 05:59:30 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:31:15.878 05:59:30 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:31:15.878 05:59:30 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:31:15.878 05:59:30 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:31:15.878 05:59:30 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:31:15.878 05:59:30 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:31:15.878 05:59:30 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:31:15.878 05:59:30 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:15.878 05:59:30 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:31:16.145 05:59:31 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:31:16.145 "name": "raid_bdev1", 00:31:16.145 "uuid": "35fd5a6d-cb08-4b0c-8da3-51b9166f5c8f", 00:31:16.145 "strip_size_kb": 0, 00:31:16.145 "state": "online", 00:31:16.145 "raid_level": "raid1", 00:31:16.145 "superblock": true, 00:31:16.145 "num_base_bdevs": 2, 00:31:16.145 "num_base_bdevs_discovered": 1, 00:31:16.145 "num_base_bdevs_operational": 1, 00:31:16.145 "base_bdevs_list": [ 00:31:16.145 { 00:31:16.145 "name": null, 00:31:16.145 "uuid": "00000000-0000-0000-0000-000000000000", 00:31:16.145 "is_configured": false, 00:31:16.145 "data_offset": 256, 00:31:16.145 "data_size": 7936 00:31:16.145 }, 00:31:16.145 { 00:31:16.145 "name": "BaseBdev2", 00:31:16.145 "uuid": "9439ffac-7255-502a-b79c-f086ea74fbf1", 00:31:16.145 "is_configured": true, 00:31:16.145 "data_offset": 256, 00:31:16.145 "data_size": 7936 00:31:16.145 } 00:31:16.145 ] 00:31:16.145 }' 00:31:16.145 05:59:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:31:16.145 05:59:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:31:17.093 05:59:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:31:17.093 05:59:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:31:17.093 05:59:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:31:17.093 05:59:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:31:17.093 05:59:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:31:17.093 05:59:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:17.093 05:59:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:31:17.093 05:59:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:31:17.094 "name": "raid_bdev1", 00:31:17.094 "uuid": "35fd5a6d-cb08-4b0c-8da3-51b9166f5c8f", 00:31:17.094 "strip_size_kb": 0, 00:31:17.094 "state": "online", 00:31:17.094 "raid_level": "raid1", 00:31:17.094 "superblock": true, 00:31:17.094 "num_base_bdevs": 2, 00:31:17.094 "num_base_bdevs_discovered": 1, 00:31:17.094 "num_base_bdevs_operational": 1, 00:31:17.094 "base_bdevs_list": [ 00:31:17.094 { 00:31:17.094 "name": null, 00:31:17.094 "uuid": "00000000-0000-0000-0000-000000000000", 00:31:17.094 "is_configured": false, 00:31:17.094 "data_offset": 256, 00:31:17.094 "data_size": 7936 00:31:17.094 }, 00:31:17.094 { 00:31:17.094 "name": "BaseBdev2", 00:31:17.094 "uuid": "9439ffac-7255-502a-b79c-f086ea74fbf1", 00:31:17.094 "is_configured": true, 00:31:17.094 "data_offset": 256, 00:31:17.094 "data_size": 7936 00:31:17.094 } 00:31:17.094 ] 00:31:17.094 }' 00:31:17.094 05:59:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:31:17.094 05:59:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:31:17.094 05:59:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:31:17.094 05:59:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:31:17.094 05:59:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@782 -- # killprocess 1286449 00:31:17.094 05:59:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@948 -- # '[' -z 1286449 ']' 00:31:17.094 05:59:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- 
common/autotest_common.sh@952 -- # kill -0 1286449 00:31:17.094 05:59:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@953 -- # uname 00:31:17.094 05:59:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:31:17.094 05:59:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1286449 00:31:17.353 05:59:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:31:17.353 05:59:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:31:17.353 05:59:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1286449' 00:31:17.353 killing process with pid 1286449 00:31:17.353 05:59:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@967 -- # kill 1286449 00:31:17.353 Received shutdown signal, test time was about 60.000000 seconds 00:31:17.353 00:31:17.353 Latency(us) 00:31:17.353 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:31:17.353 =================================================================================================================== 00:31:17.353 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:31:17.353 [2024-07-26 05:59:32.025339] bdev_raid.c:1373:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:31:17.353 [2024-07-26 05:59:32.025428] bdev_raid.c: 486:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:31:17.353 [2024-07-26 05:59:32.025473] bdev_raid.c: 463:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:31:17.353 [2024-07-26 05:59:32.025484] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x23e0100 name raid_bdev1, state offline 00:31:17.353 05:59:32 bdev_raid.raid_rebuild_test_sb_md_interleaved 
-- common/autotest_common.sh@972 -- # wait 1286449 00:31:17.353 [2024-07-26 05:59:32.052682] bdev_raid.c:1399:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:31:17.613 05:59:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@784 -- # return 0 00:31:17.613 00:31:17.613 real 0m28.910s 00:31:17.613 user 0m46.017s 00:31:17.613 sys 0m3.878s 00:31:17.613 05:59:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@1124 -- # xtrace_disable 00:31:17.613 05:59:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:31:17.613 ************************************ 00:31:17.613 END TEST raid_rebuild_test_sb_md_interleaved 00:31:17.613 ************************************ 00:31:17.613 05:59:32 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:31:17.613 05:59:32 bdev_raid -- bdev/bdev_raid.sh@916 -- # trap - EXIT 00:31:17.613 05:59:32 bdev_raid -- bdev/bdev_raid.sh@917 -- # cleanup 00:31:17.613 05:59:32 bdev_raid -- bdev/bdev_raid.sh@58 -- # '[' -n 1286449 ']' 00:31:17.613 05:59:32 bdev_raid -- bdev/bdev_raid.sh@58 -- # ps -p 1286449 00:31:17.613 05:59:32 bdev_raid -- bdev/bdev_raid.sh@62 -- # rm -rf /raidtest 00:31:17.613 00:31:17.613 real 18m31.383s 00:31:17.613 user 31m24.104s 00:31:17.613 sys 3m22.673s 00:31:17.613 05:59:32 bdev_raid -- common/autotest_common.sh@1124 -- # xtrace_disable 00:31:17.613 05:59:32 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:31:17.613 ************************************ 00:31:17.613 END TEST bdev_raid 00:31:17.613 ************************************ 00:31:17.613 05:59:32 -- common/autotest_common.sh@1142 -- # return 0 00:31:17.613 05:59:32 -- spdk/autotest.sh@191 -- # run_test bdevperf_config /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test_config.sh 00:31:17.613 05:59:32 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:31:17.613 05:59:32 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:31:17.613 05:59:32 -- 
common/autotest_common.sh@10 -- # set +x 00:31:17.613 ************************************ 00:31:17.613 START TEST bdevperf_config 00:31:17.613 ************************************ 00:31:17.613 05:59:32 bdevperf_config -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test_config.sh 00:31:17.873 * Looking for test storage... 00:31:17.873 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf 00:31:17.873 05:59:32 bdevperf_config -- bdevperf/test_config.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/common.sh 00:31:17.873 05:59:32 bdevperf_config -- bdevperf/common.sh@5 -- # bdevperf=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf 00:31:17.873 05:59:32 bdevperf_config -- bdevperf/test_config.sh@12 -- # jsonconf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/conf.json 00:31:17.873 05:59:32 bdevperf_config -- bdevperf/test_config.sh@13 -- # testconf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:31:17.873 05:59:32 bdevperf_config -- bdevperf/test_config.sh@15 -- # trap 'cleanup; exit 1' SIGINT SIGTERM EXIT 00:31:17.873 05:59:32 bdevperf_config -- bdevperf/test_config.sh@17 -- # create_job global read Malloc0 00:31:17.873 05:59:32 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=global 00:31:17.873 05:59:32 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=read 00:31:17.873 05:59:32 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=Malloc0 00:31:17.873 05:59:32 bdevperf_config -- bdevperf/common.sh@12 -- # [[ global == \g\l\o\b\a\l ]] 00:31:17.873 05:59:32 bdevperf_config -- bdevperf/common.sh@13 -- # cat 00:31:17.873 05:59:32 bdevperf_config -- bdevperf/common.sh@18 -- # job='[global]' 00:31:17.873 05:59:32 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:31:17.873 00:31:17.873 05:59:32 bdevperf_config -- bdevperf/common.sh@20 -- # 
cat 00:31:17.873 05:59:32 bdevperf_config -- bdevperf/test_config.sh@18 -- # create_job job0 00:31:17.873 05:59:32 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job0 00:31:17.873 05:59:32 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:31:17.873 05:59:32 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:31:17.873 05:59:32 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job0 == \g\l\o\b\a\l ]] 00:31:17.873 05:59:32 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job0]' 00:31:17.873 05:59:32 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:31:17.873 00:31:17.873 05:59:32 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:31:17.873 05:59:32 bdevperf_config -- bdevperf/test_config.sh@19 -- # create_job job1 00:31:17.873 05:59:32 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job1 00:31:17.873 05:59:32 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:31:17.873 05:59:32 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:31:17.873 05:59:32 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job1 == \g\l\o\b\a\l ]] 00:31:17.873 05:59:32 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job1]' 00:31:17.873 05:59:32 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:31:17.873 00:31:17.873 05:59:32 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:31:17.873 05:59:32 bdevperf_config -- bdevperf/test_config.sh@20 -- # create_job job2 00:31:17.873 05:59:32 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job2 00:31:17.873 05:59:32 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:31:17.873 05:59:32 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:31:17.873 05:59:32 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job2 == \g\l\o\b\a\l ]] 00:31:17.874 05:59:32 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job2]' 00:31:17.874 05:59:32 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:31:17.874 00:31:17.874 05:59:32 bdevperf_config -- 
bdevperf/common.sh@20 -- # cat 00:31:17.874 05:59:32 bdevperf_config -- bdevperf/test_config.sh@21 -- # create_job job3 00:31:17.874 05:59:32 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job3 00:31:17.874 05:59:32 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:31:17.874 05:59:32 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:31:17.874 05:59:32 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job3 == \g\l\o\b\a\l ]] 00:31:17.874 05:59:32 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job3]' 00:31:17.874 05:59:32 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:31:17.874 00:31:17.874 05:59:32 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:31:17.874 05:59:32 bdevperf_config -- bdevperf/test_config.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -t 2 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/conf.json -j /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:31:21.168 05:59:35 bdevperf_config -- bdevperf/test_config.sh@22 -- # bdevperf_output='[2024-07-26 05:59:32.663905] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 
00:31:21.168 [2024-07-26 05:59:32.663975] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1290600 ] 00:31:21.168 Using job config with 4 jobs 00:31:21.168 [2024-07-26 05:59:32.804913] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:31:21.168 [2024-07-26 05:59:32.921706] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:31:21.168 cpumask for '\''job0'\'' is too big 00:31:21.168 cpumask for '\''job1'\'' is too big 00:31:21.168 cpumask for '\''job2'\'' is too big 00:31:21.168 cpumask for '\''job3'\'' is too big 00:31:21.168 Running I/O for 2 seconds... 00:31:21.168 00:31:21.168 Latency(us) 00:31:21.168 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:31:21.168 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:31:21.168 Malloc0 : 2.01 23780.55 23.22 0.00 0.00 10755.44 1894.85 16526.47 00:31:21.168 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:31:21.168 Malloc0 : 2.02 23789.95 23.23 0.00 0.00 10726.54 1894.85 14702.86 00:31:21.168 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:31:21.168 Malloc0 : 2.02 23768.12 23.21 0.00 0.00 10712.93 1852.10 12765.27 00:31:21.168 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:31:21.168 Malloc0 : 2.03 23746.36 23.19 0.00 0.00 10697.45 1866.35 11055.64 00:31:21.168 =================================================================================================================== 00:31:21.168 Total : 95084.98 92.86 0.00 0.00 10723.05 1852.10 16526.47' 00:31:21.168 05:59:35 bdevperf_config -- bdevperf/test_config.sh@23 -- # get_num_jobs '[2024-07-26 05:59:32.663905] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 
00:31:21.168 [2024-07-26 05:59:32.663975] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1290600 ] 00:31:21.168 Using job config with 4 jobs 00:31:21.168 [2024-07-26 05:59:32.804913] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:31:21.168 [2024-07-26 05:59:32.921706] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:31:21.168 cpumask for '\''job0'\'' is too big 00:31:21.168 cpumask for '\''job1'\'' is too big 00:31:21.168 cpumask for '\''job2'\'' is too big 00:31:21.168 cpumask for '\''job3'\'' is too big 00:31:21.168 Running I/O for 2 seconds... 00:31:21.168 00:31:21.168 Latency(us) 00:31:21.168 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:31:21.168 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:31:21.168 Malloc0 : 2.01 23780.55 23.22 0.00 0.00 10755.44 1894.85 16526.47 00:31:21.168 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:31:21.168 Malloc0 : 2.02 23789.95 23.23 0.00 0.00 10726.54 1894.85 14702.86 00:31:21.168 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:31:21.168 Malloc0 : 2.02 23768.12 23.21 0.00 0.00 10712.93 1852.10 12765.27 00:31:21.168 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:31:21.168 Malloc0 : 2.03 23746.36 23.19 0.00 0.00 10697.45 1866.35 11055.64 00:31:21.168 =================================================================================================================== 00:31:21.168 Total : 95084.98 92.86 0.00 0.00 10723.05 1852.10 16526.47' 00:31:21.168 05:59:35 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE 'Using job config with [0-9]+ jobs' 00:31:21.168 05:59:35 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE '[0-9]+' 00:31:21.168 05:59:35 
bdevperf_config -- bdevperf/common.sh@32 -- # echo '[2024-07-26 05:59:32.663905] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 00:31:21.169 [2024-07-26 05:59:32.663975] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1290600 ] 00:31:21.169 Using job config with 4 jobs 00:31:21.169 [2024-07-26 05:59:32.804913] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:31:21.169 [2024-07-26 05:59:32.921706] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:31:21.169 cpumask for '\''job0'\'' is too big 00:31:21.169 cpumask for '\''job1'\'' is too big 00:31:21.169 cpumask for '\''job2'\'' is too big 00:31:21.169 cpumask for '\''job3'\'' is too big 00:31:21.169 Running I/O for 2 seconds... 00:31:21.169 00:31:21.169 Latency(us) 00:31:21.169 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:31:21.169 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:31:21.169 Malloc0 : 2.01 23780.55 23.22 0.00 0.00 10755.44 1894.85 16526.47 00:31:21.169 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:31:21.169 Malloc0 : 2.02 23789.95 23.23 0.00 0.00 10726.54 1894.85 14702.86 00:31:21.169 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:31:21.169 Malloc0 : 2.02 23768.12 23.21 0.00 0.00 10712.93 1852.10 12765.27 00:31:21.169 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:31:21.169 Malloc0 : 2.03 23746.36 23.19 0.00 0.00 10697.45 1866.35 11055.64 00:31:21.169 =================================================================================================================== 00:31:21.169 Total : 95084.98 92.86 0.00 0.00 10723.05 1852.10 16526.47' 00:31:21.169 05:59:35 bdevperf_config -- bdevperf/test_config.sh@23 
-- # [[ 4 == \4 ]] 00:31:21.169 05:59:35 bdevperf_config -- bdevperf/test_config.sh@25 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -C -t 2 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/conf.json -j /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:31:21.169 [2024-07-26 05:59:35.467365] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 00:31:21.169 [2024-07-26 05:59:35.467502] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1290957 ] 00:31:21.169 [2024-07-26 05:59:35.679646] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:31:21.169 [2024-07-26 05:59:35.796417] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:31:21.169 cpumask for 'job0' is too big 00:31:21.169 cpumask for 'job1' is too big 00:31:21.169 cpumask for 'job2' is too big 00:31:21.169 cpumask for 'job3' is too big 00:31:23.706 05:59:38 bdevperf_config -- bdevperf/test_config.sh@25 -- # bdevperf_output='Using job config with 4 jobs 00:31:23.706 Running I/O for 2 seconds... 
00:31:23.706 00:31:23.706 Latency(us) 00:31:23.706 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:31:23.706 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:31:23.707 Malloc0 : 2.02 23813.66 23.26 0.00 0.00 10740.56 1866.35 16412.49 00:31:23.707 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:31:23.707 Malloc0 : 2.02 23791.77 23.23 0.00 0.00 10726.02 1866.35 14531.90 00:31:23.707 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:31:23.707 Malloc0 : 2.02 23770.07 23.21 0.00 0.00 10712.96 1866.35 12708.29 00:31:23.707 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:31:23.707 Malloc0 : 2.03 23748.41 23.19 0.00 0.00 10697.71 1866.35 10998.65 00:31:23.707 =================================================================================================================== 00:31:23.707 Total : 95123.91 92.89 0.00 0.00 10719.31 1866.35 16412.49' 00:31:23.707 05:59:38 bdevperf_config -- bdevperf/test_config.sh@27 -- # cleanup 00:31:23.707 05:59:38 bdevperf_config -- bdevperf/common.sh@36 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:31:23.707 05:59:38 bdevperf_config -- bdevperf/test_config.sh@29 -- # create_job job0 write Malloc0 00:31:23.707 05:59:38 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job0 00:31:23.707 05:59:38 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=write 00:31:23.707 05:59:38 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=Malloc0 00:31:23.707 05:59:38 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job0 == \g\l\o\b\a\l ]] 00:31:23.707 05:59:38 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job0]' 00:31:23.707 05:59:38 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:31:23.707 00:31:23.707 05:59:38 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:31:23.707 05:59:38 bdevperf_config -- bdevperf/test_config.sh@30 -- # create_job 
job1 write Malloc0 00:31:23.707 05:59:38 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job1 00:31:23.707 05:59:38 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=write 00:31:23.707 05:59:38 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=Malloc0 00:31:23.707 05:59:38 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job1 == \g\l\o\b\a\l ]] 00:31:23.707 05:59:38 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job1]' 00:31:23.707 05:59:38 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:31:23.707 00:31:23.707 05:59:38 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:31:23.707 05:59:38 bdevperf_config -- bdevperf/test_config.sh@31 -- # create_job job2 write Malloc0 00:31:23.707 05:59:38 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job2 00:31:23.707 05:59:38 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=write 00:31:23.707 05:59:38 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=Malloc0 00:31:23.707 05:59:38 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job2 == \g\l\o\b\a\l ]] 00:31:23.707 05:59:38 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job2]' 00:31:23.707 05:59:38 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:31:23.707 00:31:23.707 05:59:38 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:31:23.707 05:59:38 bdevperf_config -- bdevperf/test_config.sh@32 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -t 2 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/conf.json -j /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:31:26.242 05:59:40 bdevperf_config -- bdevperf/test_config.sh@32 -- # bdevperf_output='[2024-07-26 05:59:38.321776] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 
00:31:26.242 [2024-07-26 05:59:38.321844] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1291313 ] 00:31:26.242 Using job config with 3 jobs 00:31:26.242 [2024-07-26 05:59:38.459209] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:31:26.242 [2024-07-26 05:59:38.574908] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:31:26.242 cpumask for '\''job0'\'' is too big 00:31:26.242 cpumask for '\''job1'\'' is too big 00:31:26.242 cpumask for '\''job2'\'' is too big 00:31:26.242 Running I/O for 2 seconds... 00:31:26.242 00:31:26.242 Latency(us) 00:31:26.242 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:31:26.242 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:31:26.242 Malloc0 : 2.02 32008.65 31.26 0.00 0.00 7983.07 1837.86 11739.49 00:31:26.242 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:31:26.242 Malloc0 : 2.02 31979.05 31.23 0.00 0.00 7972.42 1837.86 9858.89 00:31:26.242 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:31:26.242 Malloc0 : 2.02 31949.60 31.20 0.00 0.00 7962.19 1816.49 8719.14 00:31:26.242 =================================================================================================================== 00:31:26.242 Total : 95937.30 93.69 0.00 0.00 7972.56 1816.49 11739.49' 00:31:26.242 05:59:40 bdevperf_config -- bdevperf/test_config.sh@33 -- # get_num_jobs '[2024-07-26 05:59:38.321776] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 
00:31:26.242 [2024-07-26 05:59:38.321844] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1291313 ] 00:31:26.242 Using job config with 3 jobs 00:31:26.242 [2024-07-26 05:59:38.459209] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:31:26.242 [2024-07-26 05:59:38.574908] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:31:26.242 cpumask for '\''job0'\'' is too big 00:31:26.242 cpumask for '\''job1'\'' is too big 00:31:26.242 cpumask for '\''job2'\'' is too big 00:31:26.242 Running I/O for 2 seconds... 00:31:26.242 00:31:26.242 Latency(us) 00:31:26.242 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:31:26.242 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:31:26.242 Malloc0 : 2.02 32008.65 31.26 0.00 0.00 7983.07 1837.86 11739.49 00:31:26.242 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:31:26.242 Malloc0 : 2.02 31979.05 31.23 0.00 0.00 7972.42 1837.86 9858.89 00:31:26.242 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:31:26.242 Malloc0 : 2.02 31949.60 31.20 0.00 0.00 7962.19 1816.49 8719.14 00:31:26.242 =================================================================================================================== 00:31:26.242 Total : 95937.30 93.69 0.00 0.00 7972.56 1816.49 11739.49' 00:31:26.242 05:59:40 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE '[0-9]+' 00:31:26.242 05:59:40 bdevperf_config -- bdevperf/common.sh@32 -- # echo '[2024-07-26 05:59:38.321776] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 
00:31:26.242 [2024-07-26 05:59:38.321844] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1291313 ] 00:31:26.242 Using job config with 3 jobs 00:31:26.242 [2024-07-26 05:59:38.459209] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:31:26.242 [2024-07-26 05:59:38.574908] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:31:26.242 cpumask for '\''job0'\'' is too big 00:31:26.242 cpumask for '\''job1'\'' is too big 00:31:26.242 cpumask for '\''job2'\'' is too big 00:31:26.242 Running I/O for 2 seconds... 00:31:26.242 00:31:26.242 Latency(us) 00:31:26.242 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:31:26.242 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:31:26.243 Malloc0 : 2.02 32008.65 31.26 0.00 0.00 7983.07 1837.86 11739.49 00:31:26.243 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:31:26.243 Malloc0 : 2.02 31979.05 31.23 0.00 0.00 7972.42 1837.86 9858.89 00:31:26.243 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:31:26.243 Malloc0 : 2.02 31949.60 31.20 0.00 0.00 7962.19 1816.49 8719.14 00:31:26.243 =================================================================================================================== 00:31:26.243 Total : 95937.30 93.69 0.00 0.00 7972.56 1816.49 11739.49' 00:31:26.243 05:59:40 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE 'Using job config with [0-9]+ jobs' 00:31:26.243 05:59:40 bdevperf_config -- bdevperf/test_config.sh@33 -- # [[ 3 == \3 ]] 00:31:26.243 05:59:40 bdevperf_config -- bdevperf/test_config.sh@35 -- # cleanup 00:31:26.243 05:59:40 bdevperf_config -- bdevperf/common.sh@36 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 
00:31:26.243 05:59:40 bdevperf_config -- bdevperf/test_config.sh@37 -- # create_job global rw Malloc0:Malloc1 00:31:26.243 05:59:40 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=global 00:31:26.243 05:59:40 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=rw 00:31:26.243 05:59:40 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=Malloc0:Malloc1 00:31:26.243 05:59:40 bdevperf_config -- bdevperf/common.sh@12 -- # [[ global == \g\l\o\b\a\l ]] 00:31:26.243 05:59:40 bdevperf_config -- bdevperf/common.sh@13 -- # cat 00:31:26.243 05:59:40 bdevperf_config -- bdevperf/common.sh@18 -- # job='[global]' 00:31:26.243 05:59:40 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:31:26.243 00:31:26.243 05:59:40 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:31:26.243 05:59:41 bdevperf_config -- bdevperf/test_config.sh@38 -- # create_job job0 00:31:26.243 05:59:41 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job0 00:31:26.243 05:59:41 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:31:26.243 05:59:41 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:31:26.243 05:59:41 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job0 == \g\l\o\b\a\l ]] 00:31:26.243 05:59:41 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job0]' 00:31:26.243 05:59:41 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:31:26.243 00:31:26.243 05:59:41 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:31:26.243 05:59:41 bdevperf_config -- bdevperf/test_config.sh@39 -- # create_job job1 00:31:26.243 05:59:41 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job1 00:31:26.243 05:59:41 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:31:26.243 05:59:41 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:31:26.243 05:59:41 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job1 == \g\l\o\b\a\l ]] 00:31:26.243 05:59:41 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job1]' 00:31:26.243 
05:59:41 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:31:26.243 00:31:26.243 05:59:41 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:31:26.243 05:59:41 bdevperf_config -- bdevperf/test_config.sh@40 -- # create_job job2 00:31:26.243 05:59:41 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job2 00:31:26.243 05:59:41 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:31:26.243 05:59:41 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:31:26.243 05:59:41 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job2 == \g\l\o\b\a\l ]] 00:31:26.243 05:59:41 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job2]' 00:31:26.243 05:59:41 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:31:26.243 00:31:26.243 05:59:41 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:31:26.243 05:59:41 bdevperf_config -- bdevperf/test_config.sh@41 -- # create_job job3 00:31:26.243 05:59:41 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job3 00:31:26.243 05:59:41 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:31:26.243 05:59:41 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:31:26.243 05:59:41 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job3 == \g\l\o\b\a\l ]] 00:31:26.243 05:59:41 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job3]' 00:31:26.243 05:59:41 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:31:26.243 00:31:26.243 05:59:41 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:31:26.243 05:59:41 bdevperf_config -- bdevperf/test_config.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -t 2 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/conf.json -j /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:31:29.534 05:59:43 bdevperf_config -- bdevperf/test_config.sh@42 -- # bdevperf_output='[2024-07-26 05:59:41.083901] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 
initialization... 00:31:29.534 [2024-07-26 05:59:41.083968] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1291673 ] 00:31:29.534 Using job config with 4 jobs 00:31:29.534 [2024-07-26 05:59:41.226075] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:31:29.534 [2024-07-26 05:59:41.339230] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:31:29.534 cpumask for '\''job0'\'' is too big 00:31:29.534 cpumask for '\''job1'\'' is too big 00:31:29.534 cpumask for '\''job2'\'' is too big 00:31:29.534 cpumask for '\''job3'\'' is too big 00:31:29.534 Running I/O for 2 seconds... 00:31:29.534 00:31:29.534 Latency(us) 00:31:29.534 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:31:29.534 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:31:29.534 Malloc0 : 2.03 11878.77 11.60 0.00 0.00 21532.44 3846.68 33280.89 00:31:29.534 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:31:29.534 Malloc1 : 2.03 11867.60 11.59 0.00 0.00 21532.17 4673.00 33280.89 00:31:29.534 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:31:29.534 Malloc0 : 2.05 11887.45 11.61 0.00 0.00 21420.99 3789.69 29405.72 00:31:29.534 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:31:29.534 Malloc1 : 2.05 11876.33 11.60 0.00 0.00 21421.46 4616.01 29405.72 00:31:29.534 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:31:29.534 Malloc0 : 2.05 11865.57 11.59 0.00 0.00 21363.07 3761.20 25530.55 00:31:29.534 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:31:29.534 Malloc1 : 2.05 11854.58 11.58 0.00 0.00 21362.37 4644.51 
25530.55 00:31:29.534 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:31:29.534 Malloc0 : 2.05 11843.80 11.57 0.00 0.00 21304.72 3761.20 21883.33 00:31:29.534 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:31:29.534 Malloc1 : 2.06 11832.85 11.56 0.00 0.00 21304.32 4616.01 22111.28 00:31:29.534 =================================================================================================================== 00:31:29.534 Total : 94906.96 92.68 0.00 0.00 21404.86 3761.20 33280.89' 00:31:29.534 05:59:43 bdevperf_config -- bdevperf/test_config.sh@43 -- # get_num_jobs '[2024-07-26 05:59:41.083901] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 00:31:29.534 [2024-07-26 05:59:41.083968] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1291673 ] 00:31:29.534 Using job config with 4 jobs 00:31:29.534 [2024-07-26 05:59:41.226075] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:31:29.534 [2024-07-26 05:59:41.339230] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:31:29.534 cpumask for '\''job0'\'' is too big 00:31:29.534 cpumask for '\''job1'\'' is too big 00:31:29.534 cpumask for '\''job2'\'' is too big 00:31:29.534 cpumask for '\''job3'\'' is too big 00:31:29.534 Running I/O for 2 seconds... 
00:31:29.534 00:31:29.534 Latency(us) 00:31:29.534 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:31:29.534 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:31:29.534 Malloc0 : 2.03 11878.77 11.60 0.00 0.00 21532.44 3846.68 33280.89 00:31:29.534 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:31:29.534 Malloc1 : 2.03 11867.60 11.59 0.00 0.00 21532.17 4673.00 33280.89 00:31:29.534 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:31:29.534 Malloc0 : 2.05 11887.45 11.61 0.00 0.00 21420.99 3789.69 29405.72 00:31:29.534 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:31:29.534 Malloc1 : 2.05 11876.33 11.60 0.00 0.00 21421.46 4616.01 29405.72 00:31:29.534 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:31:29.534 Malloc0 : 2.05 11865.57 11.59 0.00 0.00 21363.07 3761.20 25530.55 00:31:29.534 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:31:29.534 Malloc1 : 2.05 11854.58 11.58 0.00 0.00 21362.37 4644.51 25530.55 00:31:29.534 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:31:29.534 Malloc0 : 2.05 11843.80 11.57 0.00 0.00 21304.72 3761.20 21883.33 00:31:29.534 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:31:29.534 Malloc1 : 2.06 11832.85 11.56 0.00 0.00 21304.32 4616.01 22111.28 00:31:29.534 =================================================================================================================== 00:31:29.534 Total : 94906.96 92.68 0.00 0.00 21404.86 3761.20 33280.89' 00:31:29.534 05:59:43 bdevperf_config -- bdevperf/common.sh@32 -- # echo '[2024-07-26 05:59:41.083901] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 
00:31:29.534 [2024-07-26 05:59:41.083968] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1291673 ] 00:31:29.534 Using job config with 4 jobs 00:31:29.534 [2024-07-26 05:59:41.226075] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:31:29.534 [2024-07-26 05:59:41.339230] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:31:29.534 cpumask for '\''job0'\'' is too big 00:31:29.534 cpumask for '\''job1'\'' is too big 00:31:29.534 cpumask for '\''job2'\'' is too big 00:31:29.534 cpumask for '\''job3'\'' is too big 00:31:29.534 Running I/O for 2 seconds... 00:31:29.534 00:31:29.534 Latency(us) 00:31:29.534 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:31:29.534 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:31:29.534 Malloc0 : 2.03 11878.77 11.60 0.00 0.00 21532.44 3846.68 33280.89 00:31:29.534 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:31:29.534 Malloc1 : 2.03 11867.60 11.59 0.00 0.00 21532.17 4673.00 33280.89 00:31:29.534 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:31:29.534 Malloc0 : 2.05 11887.45 11.61 0.00 0.00 21420.99 3789.69 29405.72 00:31:29.534 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:31:29.534 Malloc1 : 2.05 11876.33 11.60 0.00 0.00 21421.46 4616.01 29405.72 00:31:29.534 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:31:29.534 Malloc0 : 2.05 11865.57 11.59 0.00 0.00 21363.07 3761.20 25530.55 00:31:29.534 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:31:29.534 Malloc1 : 2.05 11854.58 11.58 0.00 0.00 21362.37 4644.51 25530.55 00:31:29.534 
Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:31:29.535 Malloc0 : 2.05 11843.80 11.57 0.00 0.00 21304.72 3761.20 21883.33 00:31:29.535 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:31:29.535 Malloc1 : 2.06 11832.85 11.56 0.00 0.00 21304.32 4616.01 22111.28 00:31:29.535 =================================================================================================================== 00:31:29.535 Total : 94906.96 92.68 0.00 0.00 21404.86 3761.20 33280.89' 00:31:29.535 05:59:43 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE 'Using job config with [0-9]+ jobs' 00:31:29.535 05:59:43 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE '[0-9]+' 00:31:29.535 05:59:43 bdevperf_config -- bdevperf/test_config.sh@43 -- # [[ 4 == \4 ]] 00:31:29.535 05:59:43 bdevperf_config -- bdevperf/test_config.sh@44 -- # cleanup 00:31:29.535 05:59:43 bdevperf_config -- bdevperf/common.sh@36 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:31:29.535 05:59:43 bdevperf_config -- bdevperf/test_config.sh@45 -- # trap - SIGINT SIGTERM EXIT 00:31:29.535 00:31:29.535 real 0m11.371s 00:31:29.535 user 0m10.004s 00:31:29.535 sys 0m1.227s 00:31:29.535 05:59:43 bdevperf_config -- common/autotest_common.sh@1124 -- # xtrace_disable 00:31:29.535 05:59:43 bdevperf_config -- common/autotest_common.sh@10 -- # set +x 00:31:29.535 ************************************ 00:31:29.535 END TEST bdevperf_config 00:31:29.535 ************************************ 00:31:29.535 05:59:43 -- common/autotest_common.sh@1142 -- # return 0 00:31:29.535 05:59:43 -- spdk/autotest.sh@192 -- # uname -s 00:31:29.535 05:59:43 -- spdk/autotest.sh@192 -- # [[ Linux == Linux ]] 00:31:29.535 05:59:43 -- spdk/autotest.sh@193 -- # run_test reactor_set_interrupt /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reactor_set_interrupt.sh 00:31:29.535 05:59:43 -- common/autotest_common.sh@1099 
-- # '[' 2 -le 1 ']' 00:31:29.535 05:59:43 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:31:29.535 05:59:43 -- common/autotest_common.sh@10 -- # set +x 00:31:29.535 ************************************ 00:31:29.535 START TEST reactor_set_interrupt 00:31:29.535 ************************************ 00:31:29.535 05:59:43 reactor_set_interrupt -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reactor_set_interrupt.sh 00:31:29.535 * Looking for test storage... 00:31:29.535 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:31:29.535 05:59:44 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/interrupt_common.sh 00:31:29.535 05:59:44 reactor_set_interrupt -- interrupt/interrupt_common.sh@5 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reactor_set_interrupt.sh 00:31:29.535 05:59:44 reactor_set_interrupt -- interrupt/interrupt_common.sh@5 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:31:29.535 05:59:44 reactor_set_interrupt -- interrupt/interrupt_common.sh@5 -- # testdir=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:31:29.535 05:59:44 reactor_set_interrupt -- interrupt/interrupt_common.sh@6 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/../.. 
00:31:29.535 05:59:44 reactor_set_interrupt -- interrupt/interrupt_common.sh@6 -- # rootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:31:29.535 05:59:44 reactor_set_interrupt -- interrupt/interrupt_common.sh@7 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/autotest_common.sh 00:31:29.535 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@7 -- # rpc_py=rpc_cmd 00:31:29.535 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@34 -- # set -e 00:31:29.535 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@35 -- # shopt -s nullglob 00:31:29.535 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@36 -- # shopt -s extglob 00:31:29.535 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@37 -- # shopt -s inherit_errexit 00:31:29.535 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@39 -- # '[' -z /var/jenkins/workspace/crypto-phy-autotest/spdk/../output ']' 00:31:29.535 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@44 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/build_config.sh ]] 00:31:29.535 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/build_config.sh 00:31:29.535 05:59:44 reactor_set_interrupt -- common/build_config.sh@1 -- # CONFIG_WPDK_DIR= 00:31:29.535 05:59:44 reactor_set_interrupt -- common/build_config.sh@2 -- # CONFIG_ASAN=n 00:31:29.535 05:59:44 reactor_set_interrupt -- common/build_config.sh@3 -- # CONFIG_VBDEV_COMPRESS=y 00:31:29.535 05:59:44 reactor_set_interrupt -- common/build_config.sh@4 -- # CONFIG_HAVE_EXECINFO_H=y 00:31:29.535 05:59:44 reactor_set_interrupt -- common/build_config.sh@5 -- # CONFIG_USDT=n 00:31:29.535 05:59:44 reactor_set_interrupt -- common/build_config.sh@6 -- # CONFIG_CUSTOMOCF=n 00:31:29.535 05:59:44 reactor_set_interrupt -- common/build_config.sh@7 -- # CONFIG_PREFIX=/usr/local 00:31:29.535 05:59:44 reactor_set_interrupt -- 
common/build_config.sh@8 -- # CONFIG_RBD=n 00:31:29.535 05:59:44 reactor_set_interrupt -- common/build_config.sh@9 -- # CONFIG_LIBDIR= 00:31:29.535 05:59:44 reactor_set_interrupt -- common/build_config.sh@10 -- # CONFIG_IDXD=y 00:31:29.535 05:59:44 reactor_set_interrupt -- common/build_config.sh@11 -- # CONFIG_NVME_CUSE=y 00:31:29.535 05:59:44 reactor_set_interrupt -- common/build_config.sh@12 -- # CONFIG_SMA=n 00:31:29.535 05:59:44 reactor_set_interrupt -- common/build_config.sh@13 -- # CONFIG_VTUNE=n 00:31:29.535 05:59:44 reactor_set_interrupt -- common/build_config.sh@14 -- # CONFIG_TSAN=n 00:31:29.535 05:59:44 reactor_set_interrupt -- common/build_config.sh@15 -- # CONFIG_RDMA_SEND_WITH_INVAL=y 00:31:29.535 05:59:44 reactor_set_interrupt -- common/build_config.sh@16 -- # CONFIG_VFIO_USER_DIR= 00:31:29.535 05:59:44 reactor_set_interrupt -- common/build_config.sh@17 -- # CONFIG_PGO_CAPTURE=n 00:31:29.535 05:59:44 reactor_set_interrupt -- common/build_config.sh@18 -- # CONFIG_HAVE_UUID_GENERATE_SHA1=y 00:31:29.535 05:59:44 reactor_set_interrupt -- common/build_config.sh@19 -- # CONFIG_ENV=/var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk 00:31:29.535 05:59:44 reactor_set_interrupt -- common/build_config.sh@20 -- # CONFIG_LTO=n 00:31:29.535 05:59:44 reactor_set_interrupt -- common/build_config.sh@21 -- # CONFIG_ISCSI_INITIATOR=y 00:31:29.535 05:59:44 reactor_set_interrupt -- common/build_config.sh@22 -- # CONFIG_CET=n 00:31:29.535 05:59:44 reactor_set_interrupt -- common/build_config.sh@23 -- # CONFIG_VBDEV_COMPRESS_MLX5=y 00:31:29.535 05:59:44 reactor_set_interrupt -- common/build_config.sh@24 -- # CONFIG_OCF_PATH= 00:31:29.535 05:59:44 reactor_set_interrupt -- common/build_config.sh@25 -- # CONFIG_RDMA_SET_TOS=y 00:31:29.535 05:59:44 reactor_set_interrupt -- common/build_config.sh@26 -- # CONFIG_HAVE_ARC4RANDOM=y 00:31:29.535 05:59:44 reactor_set_interrupt -- common/build_config.sh@27 -- # CONFIG_HAVE_LIBARCHIVE=n 00:31:29.535 05:59:44 
reactor_set_interrupt -- common/build_config.sh@28 -- # CONFIG_UBLK=y 00:31:29.535 05:59:44 reactor_set_interrupt -- common/build_config.sh@29 -- # CONFIG_ISAL_CRYPTO=y 00:31:29.535 05:59:44 reactor_set_interrupt -- common/build_config.sh@30 -- # CONFIG_OPENSSL_PATH= 00:31:29.535 05:59:44 reactor_set_interrupt -- common/build_config.sh@31 -- # CONFIG_OCF=n 00:31:29.535 05:59:44 reactor_set_interrupt -- common/build_config.sh@32 -- # CONFIG_FUSE=n 00:31:29.535 05:59:44 reactor_set_interrupt -- common/build_config.sh@33 -- # CONFIG_VTUNE_DIR= 00:31:29.535 05:59:44 reactor_set_interrupt -- common/build_config.sh@34 -- # CONFIG_FUZZER_LIB= 00:31:29.535 05:59:44 reactor_set_interrupt -- common/build_config.sh@35 -- # CONFIG_FUZZER=n 00:31:29.535 05:59:44 reactor_set_interrupt -- common/build_config.sh@36 -- # CONFIG_DPDK_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build 00:31:29.535 05:59:44 reactor_set_interrupt -- common/build_config.sh@37 -- # CONFIG_CRYPTO=y 00:31:29.535 05:59:44 reactor_set_interrupt -- common/build_config.sh@38 -- # CONFIG_PGO_USE=n 00:31:29.535 05:59:44 reactor_set_interrupt -- common/build_config.sh@39 -- # CONFIG_VHOST=y 00:31:29.535 05:59:44 reactor_set_interrupt -- common/build_config.sh@40 -- # CONFIG_DAOS=n 00:31:29.535 05:59:44 reactor_set_interrupt -- common/build_config.sh@41 -- # CONFIG_DPDK_INC_DIR= 00:31:29.535 05:59:44 reactor_set_interrupt -- common/build_config.sh@42 -- # CONFIG_DAOS_DIR= 00:31:29.535 05:59:44 reactor_set_interrupt -- common/build_config.sh@43 -- # CONFIG_UNIT_TESTS=n 00:31:29.535 05:59:44 reactor_set_interrupt -- common/build_config.sh@44 -- # CONFIG_RDMA_SET_ACK_TIMEOUT=y 00:31:29.535 05:59:44 reactor_set_interrupt -- common/build_config.sh@45 -- # CONFIG_VIRTIO=y 00:31:29.535 05:59:44 reactor_set_interrupt -- common/build_config.sh@46 -- # CONFIG_DPDK_UADK=n 00:31:29.535 05:59:44 reactor_set_interrupt -- common/build_config.sh@47 -- # CONFIG_COVERAGE=y 00:31:29.535 05:59:44 reactor_set_interrupt -- 
common/build_config.sh@48 -- # CONFIG_RDMA=y 00:31:29.535 05:59:44 reactor_set_interrupt -- common/build_config.sh@49 -- # CONFIG_FIO_SOURCE_DIR=/usr/src/fio 00:31:29.535 05:59:44 reactor_set_interrupt -- common/build_config.sh@50 -- # CONFIG_URING_PATH= 00:31:29.535 05:59:44 reactor_set_interrupt -- common/build_config.sh@51 -- # CONFIG_XNVME=n 00:31:29.535 05:59:44 reactor_set_interrupt -- common/build_config.sh@52 -- # CONFIG_VFIO_USER=n 00:31:29.535 05:59:44 reactor_set_interrupt -- common/build_config.sh@53 -- # CONFIG_ARCH=native 00:31:29.535 05:59:44 reactor_set_interrupt -- common/build_config.sh@54 -- # CONFIG_HAVE_EVP_MAC=y 00:31:29.535 05:59:44 reactor_set_interrupt -- common/build_config.sh@55 -- # CONFIG_URING_ZNS=n 00:31:29.535 05:59:44 reactor_set_interrupt -- common/build_config.sh@56 -- # CONFIG_WERROR=y 00:31:29.535 05:59:44 reactor_set_interrupt -- common/build_config.sh@57 -- # CONFIG_HAVE_LIBBSD=n 00:31:29.535 05:59:44 reactor_set_interrupt -- common/build_config.sh@58 -- # CONFIG_UBSAN=y 00:31:29.535 05:59:44 reactor_set_interrupt -- common/build_config.sh@59 -- # CONFIG_IPSEC_MB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib 00:31:29.535 05:59:44 reactor_set_interrupt -- common/build_config.sh@60 -- # CONFIG_GOLANG=n 00:31:29.535 05:59:44 reactor_set_interrupt -- common/build_config.sh@61 -- # CONFIG_ISAL=y 00:31:29.535 05:59:44 reactor_set_interrupt -- common/build_config.sh@62 -- # CONFIG_IDXD_KERNEL=y 00:31:29.535 05:59:44 reactor_set_interrupt -- common/build_config.sh@63 -- # CONFIG_DPDK_LIB_DIR= 00:31:29.535 05:59:44 reactor_set_interrupt -- common/build_config.sh@64 -- # CONFIG_RDMA_PROV=verbs 00:31:29.535 05:59:44 reactor_set_interrupt -- common/build_config.sh@65 -- # CONFIG_APPS=y 00:31:29.535 05:59:44 reactor_set_interrupt -- common/build_config.sh@66 -- # CONFIG_SHARED=y 00:31:29.535 05:59:44 reactor_set_interrupt -- common/build_config.sh@67 -- # CONFIG_HAVE_KEYUTILS=y 00:31:29.535 05:59:44 
reactor_set_interrupt -- common/build_config.sh@68 -- # CONFIG_FC_PATH= 00:31:29.535 05:59:44 reactor_set_interrupt -- common/build_config.sh@69 -- # CONFIG_DPDK_PKG_CONFIG=n 00:31:29.535 05:59:44 reactor_set_interrupt -- common/build_config.sh@70 -- # CONFIG_FC=n 00:31:29.535 05:59:44 reactor_set_interrupt -- common/build_config.sh@71 -- # CONFIG_AVAHI=n 00:31:29.535 05:59:44 reactor_set_interrupt -- common/build_config.sh@72 -- # CONFIG_FIO_PLUGIN=y 00:31:29.535 05:59:44 reactor_set_interrupt -- common/build_config.sh@73 -- # CONFIG_RAID5F=n 00:31:29.536 05:59:44 reactor_set_interrupt -- common/build_config.sh@74 -- # CONFIG_EXAMPLES=y 00:31:29.536 05:59:44 reactor_set_interrupt -- common/build_config.sh@75 -- # CONFIG_TESTS=y 00:31:29.536 05:59:44 reactor_set_interrupt -- common/build_config.sh@76 -- # CONFIG_CRYPTO_MLX5=y 00:31:29.536 05:59:44 reactor_set_interrupt -- common/build_config.sh@77 -- # CONFIG_MAX_LCORES=128 00:31:29.536 05:59:44 reactor_set_interrupt -- common/build_config.sh@78 -- # CONFIG_IPSEC_MB=y 00:31:29.536 05:59:44 reactor_set_interrupt -- common/build_config.sh@79 -- # CONFIG_PGO_DIR= 00:31:29.536 05:59:44 reactor_set_interrupt -- common/build_config.sh@80 -- # CONFIG_DEBUG=y 00:31:29.536 05:59:44 reactor_set_interrupt -- common/build_config.sh@81 -- # CONFIG_DPDK_COMPRESSDEV=y 00:31:29.536 05:59:44 reactor_set_interrupt -- common/build_config.sh@82 -- # CONFIG_CROSS_PREFIX= 00:31:29.536 05:59:44 reactor_set_interrupt -- common/build_config.sh@83 -- # CONFIG_URING=n 00:31:29.536 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@54 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/applications.sh 00:31:29.536 05:59:44 reactor_set_interrupt -- common/applications.sh@8 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/applications.sh 00:31:29.536 05:59:44 reactor_set_interrupt -- common/applications.sh@8 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common 
00:31:29.536 05:59:44 reactor_set_interrupt -- common/applications.sh@8 -- # _root=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/common 00:31:29.536 05:59:44 reactor_set_interrupt -- common/applications.sh@9 -- # _root=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:31:29.536 05:59:44 reactor_set_interrupt -- common/applications.sh@10 -- # _app_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:31:29.536 05:59:44 reactor_set_interrupt -- common/applications.sh@11 -- # _test_app_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/app 00:31:29.536 05:59:44 reactor_set_interrupt -- common/applications.sh@12 -- # _examples_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:31:29.536 05:59:44 reactor_set_interrupt -- common/applications.sh@14 -- # VHOST_FUZZ_APP=("$_test_app_dir/fuzz/vhost_fuzz/vhost_fuzz") 00:31:29.536 05:59:44 reactor_set_interrupt -- common/applications.sh@15 -- # ISCSI_APP=("$_app_dir/iscsi_tgt") 00:31:29.536 05:59:44 reactor_set_interrupt -- common/applications.sh@16 -- # NVMF_APP=("$_app_dir/nvmf_tgt") 00:31:29.536 05:59:44 reactor_set_interrupt -- common/applications.sh@17 -- # VHOST_APP=("$_app_dir/vhost") 00:31:29.536 05:59:44 reactor_set_interrupt -- common/applications.sh@18 -- # DD_APP=("$_app_dir/spdk_dd") 00:31:29.536 05:59:44 reactor_set_interrupt -- common/applications.sh@19 -- # SPDK_APP=("$_app_dir/spdk_tgt") 00:31:29.536 05:59:44 reactor_set_interrupt -- common/applications.sh@22 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/config.h ]] 00:31:29.536 05:59:44 reactor_set_interrupt -- common/applications.sh@23 -- # [[ #ifndef SPDK_CONFIG_H 00:31:29.536 #define SPDK_CONFIG_H 00:31:29.536 #define SPDK_CONFIG_APPS 1 00:31:29.536 #define SPDK_CONFIG_ARCH native 00:31:29.536 #undef SPDK_CONFIG_ASAN 00:31:29.536 #undef SPDK_CONFIG_AVAHI 00:31:29.536 #undef SPDK_CONFIG_CET 00:31:29.536 #define SPDK_CONFIG_COVERAGE 1 00:31:29.536 #define SPDK_CONFIG_CROSS_PREFIX 
00:31:29.536 #define SPDK_CONFIG_CRYPTO 1 00:31:29.536 #define SPDK_CONFIG_CRYPTO_MLX5 1 00:31:29.536 #undef SPDK_CONFIG_CUSTOMOCF 00:31:29.536 #undef SPDK_CONFIG_DAOS 00:31:29.536 #define SPDK_CONFIG_DAOS_DIR 00:31:29.536 #define SPDK_CONFIG_DEBUG 1 00:31:29.536 #define SPDK_CONFIG_DPDK_COMPRESSDEV 1 00:31:29.536 #define SPDK_CONFIG_DPDK_DIR /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build 00:31:29.536 #define SPDK_CONFIG_DPDK_INC_DIR 00:31:29.536 #define SPDK_CONFIG_DPDK_LIB_DIR 00:31:29.536 #undef SPDK_CONFIG_DPDK_PKG_CONFIG 00:31:29.536 #undef SPDK_CONFIG_DPDK_UADK 00:31:29.536 #define SPDK_CONFIG_ENV /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk 00:31:29.536 #define SPDK_CONFIG_EXAMPLES 1 00:31:29.536 #undef SPDK_CONFIG_FC 00:31:29.536 #define SPDK_CONFIG_FC_PATH 00:31:29.536 #define SPDK_CONFIG_FIO_PLUGIN 1 00:31:29.536 #define SPDK_CONFIG_FIO_SOURCE_DIR /usr/src/fio 00:31:29.536 #undef SPDK_CONFIG_FUSE 00:31:29.536 #undef SPDK_CONFIG_FUZZER 00:31:29.536 #define SPDK_CONFIG_FUZZER_LIB 00:31:29.536 #undef SPDK_CONFIG_GOLANG 00:31:29.536 #define SPDK_CONFIG_HAVE_ARC4RANDOM 1 00:31:29.536 #define SPDK_CONFIG_HAVE_EVP_MAC 1 00:31:29.536 #define SPDK_CONFIG_HAVE_EXECINFO_H 1 00:31:29.536 #define SPDK_CONFIG_HAVE_KEYUTILS 1 00:31:29.536 #undef SPDK_CONFIG_HAVE_LIBARCHIVE 00:31:29.536 #undef SPDK_CONFIG_HAVE_LIBBSD 00:31:29.536 #define SPDK_CONFIG_HAVE_UUID_GENERATE_SHA1 1 00:31:29.536 #define SPDK_CONFIG_IDXD 1 00:31:29.536 #define SPDK_CONFIG_IDXD_KERNEL 1 00:31:29.536 #define SPDK_CONFIG_IPSEC_MB 1 00:31:29.536 #define SPDK_CONFIG_IPSEC_MB_DIR /var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib 00:31:29.536 #define SPDK_CONFIG_ISAL 1 00:31:29.536 #define SPDK_CONFIG_ISAL_CRYPTO 1 00:31:29.536 #define SPDK_CONFIG_ISCSI_INITIATOR 1 00:31:29.536 #define SPDK_CONFIG_LIBDIR 00:31:29.536 #undef SPDK_CONFIG_LTO 00:31:29.536 #define SPDK_CONFIG_MAX_LCORES 128 00:31:29.536 #define SPDK_CONFIG_NVME_CUSE 1 00:31:29.536 #undef 
SPDK_CONFIG_OCF 00:31:29.536 #define SPDK_CONFIG_OCF_PATH 00:31:29.536 #define SPDK_CONFIG_OPENSSL_PATH 00:31:29.536 #undef SPDK_CONFIG_PGO_CAPTURE 00:31:29.536 #define SPDK_CONFIG_PGO_DIR 00:31:29.536 #undef SPDK_CONFIG_PGO_USE 00:31:29.536 #define SPDK_CONFIG_PREFIX /usr/local 00:31:29.536 #undef SPDK_CONFIG_RAID5F 00:31:29.536 #undef SPDK_CONFIG_RBD 00:31:29.536 #define SPDK_CONFIG_RDMA 1 00:31:29.536 #define SPDK_CONFIG_RDMA_PROV verbs 00:31:29.536 #define SPDK_CONFIG_RDMA_SEND_WITH_INVAL 1 00:31:29.536 #define SPDK_CONFIG_RDMA_SET_ACK_TIMEOUT 1 00:31:29.536 #define SPDK_CONFIG_RDMA_SET_TOS 1 00:31:29.536 #define SPDK_CONFIG_SHARED 1 00:31:29.536 #undef SPDK_CONFIG_SMA 00:31:29.536 #define SPDK_CONFIG_TESTS 1 00:31:29.536 #undef SPDK_CONFIG_TSAN 00:31:29.536 #define SPDK_CONFIG_UBLK 1 00:31:29.536 #define SPDK_CONFIG_UBSAN 1 00:31:29.536 #undef SPDK_CONFIG_UNIT_TESTS 00:31:29.536 #undef SPDK_CONFIG_URING 00:31:29.536 #define SPDK_CONFIG_URING_PATH 00:31:29.536 #undef SPDK_CONFIG_URING_ZNS 00:31:29.536 #undef SPDK_CONFIG_USDT 00:31:29.536 #define SPDK_CONFIG_VBDEV_COMPRESS 1 00:31:29.536 #define SPDK_CONFIG_VBDEV_COMPRESS_MLX5 1 00:31:29.536 #undef SPDK_CONFIG_VFIO_USER 00:31:29.536 #define SPDK_CONFIG_VFIO_USER_DIR 00:31:29.536 #define SPDK_CONFIG_VHOST 1 00:31:29.536 #define SPDK_CONFIG_VIRTIO 1 00:31:29.536 #undef SPDK_CONFIG_VTUNE 00:31:29.536 #define SPDK_CONFIG_VTUNE_DIR 00:31:29.536 #define SPDK_CONFIG_WERROR 1 00:31:29.536 #define SPDK_CONFIG_WPDK_DIR 00:31:29.536 #undef SPDK_CONFIG_XNVME 00:31:29.536 #endif /* SPDK_CONFIG_H */ == *\#\d\e\f\i\n\e\ \S\P\D\K\_\C\O\N\F\I\G\_\D\E\B\U\G* ]] 00:31:29.536 05:59:44 reactor_set_interrupt -- common/applications.sh@24 -- # (( SPDK_AUTOTEST_DEBUG_APPS )) 00:31:29.536 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@55 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:31:29.536 05:59:44 reactor_set_interrupt -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 
00:31:29.536 05:59:44 reactor_set_interrupt -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:31:29.536 05:59:44 reactor_set_interrupt -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:31:29.536 05:59:44 reactor_set_interrupt -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:31:29.536 05:59:44 reactor_set_interrupt -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:31:29.536 05:59:44 reactor_set_interrupt -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:31:29.536 05:59:44 reactor_set_interrupt -- paths/export.sh@5 -- # export PATH 00:31:29.536 05:59:44 reactor_set_interrupt -- paths/export.sh@6 -- # echo 
/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:31:29.536 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@56 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/common 00:31:29.536 05:59:44 reactor_set_interrupt -- pm/common@6 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/common 00:31:29.536 05:59:44 reactor_set_interrupt -- pm/common@6 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm 00:31:29.536 05:59:44 reactor_set_interrupt -- pm/common@6 -- # _pmdir=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm 00:31:29.536 05:59:44 reactor_set_interrupt -- pm/common@7 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/../../../ 00:31:29.536 05:59:44 reactor_set_interrupt -- pm/common@7 -- # _pmrootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:31:29.536 05:59:44 reactor_set_interrupt -- pm/common@64 -- # TEST_TAG=N/A 00:31:29.536 05:59:44 reactor_set_interrupt -- pm/common@65 -- # TEST_TAG_FILE=/var/jenkins/workspace/crypto-phy-autotest/spdk/.run_test_name 00:31:29.536 05:59:44 reactor_set_interrupt -- pm/common@67 -- # PM_OUTPUTDIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power 00:31:29.536 05:59:44 reactor_set_interrupt -- pm/common@68 -- # uname -s 00:31:29.536 05:59:44 reactor_set_interrupt -- pm/common@68 -- # PM_OS=Linux 00:31:29.536 05:59:44 reactor_set_interrupt -- pm/common@70 -- # MONITOR_RESOURCES_SUDO=() 00:31:29.536 05:59:44 reactor_set_interrupt -- pm/common@70 -- # declare -A MONITOR_RESOURCES_SUDO 00:31:29.536 05:59:44 
reactor_set_interrupt -- pm/common@71 -- # MONITOR_RESOURCES_SUDO["collect-bmc-pm"]=1 00:31:29.536 05:59:44 reactor_set_interrupt -- pm/common@72 -- # MONITOR_RESOURCES_SUDO["collect-cpu-load"]=0 00:31:29.536 05:59:44 reactor_set_interrupt -- pm/common@73 -- # MONITOR_RESOURCES_SUDO["collect-cpu-temp"]=0 00:31:29.536 05:59:44 reactor_set_interrupt -- pm/common@74 -- # MONITOR_RESOURCES_SUDO["collect-vmstat"]=0 00:31:29.536 05:59:44 reactor_set_interrupt -- pm/common@76 -- # SUDO[0]= 00:31:29.536 05:59:44 reactor_set_interrupt -- pm/common@76 -- # SUDO[1]='sudo -E' 00:31:29.537 05:59:44 reactor_set_interrupt -- pm/common@78 -- # MONITOR_RESOURCES=(collect-cpu-load collect-vmstat) 00:31:29.537 05:59:44 reactor_set_interrupt -- pm/common@79 -- # [[ Linux == FreeBSD ]] 00:31:29.537 05:59:44 reactor_set_interrupt -- pm/common@81 -- # [[ Linux == Linux ]] 00:31:29.537 05:59:44 reactor_set_interrupt -- pm/common@81 -- # [[ ............................... != QEMU ]] 00:31:29.537 05:59:44 reactor_set_interrupt -- pm/common@81 -- # [[ ! -e /.dockerenv ]] 00:31:29.537 05:59:44 reactor_set_interrupt -- pm/common@84 -- # MONITOR_RESOURCES+=(collect-cpu-temp) 00:31:29.537 05:59:44 reactor_set_interrupt -- pm/common@85 -- # MONITOR_RESOURCES+=(collect-bmc-pm) 00:31:29.537 05:59:44 reactor_set_interrupt -- pm/common@88 -- # [[ ! 
-d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power ]] 00:31:29.537 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@58 -- # : 0 00:31:29.537 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@59 -- # export RUN_NIGHTLY 00:31:29.537 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@62 -- # : 0 00:31:29.537 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@63 -- # export SPDK_AUTOTEST_DEBUG_APPS 00:31:29.537 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@64 -- # : 0 00:31:29.537 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@65 -- # export SPDK_RUN_VALGRIND 00:31:29.537 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@66 -- # : 1 00:31:29.537 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@67 -- # export SPDK_RUN_FUNCTIONAL_TEST 00:31:29.537 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@68 -- # : 0 00:31:29.537 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@69 -- # export SPDK_TEST_UNITTEST 00:31:29.537 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@70 -- # : 00:31:29.537 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@71 -- # export SPDK_TEST_AUTOBUILD 00:31:29.537 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@72 -- # : 0 00:31:29.537 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@73 -- # export SPDK_TEST_RELEASE_BUILD 00:31:29.537 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@74 -- # : 1 00:31:29.537 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@75 -- # export SPDK_TEST_ISAL 00:31:29.537 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@76 -- # : 0 00:31:29.537 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@77 -- # export SPDK_TEST_ISCSI 00:31:29.537 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@78 -- # : 0 00:31:29.537 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@79 -- # export SPDK_TEST_ISCSI_INITIATOR 
00:31:29.537 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@80 -- # : 0 00:31:29.537 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@81 -- # export SPDK_TEST_NVME 00:31:29.537 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@82 -- # : 0 00:31:29.537 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@83 -- # export SPDK_TEST_NVME_PMR 00:31:29.537 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@84 -- # : 0 00:31:29.537 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@85 -- # export SPDK_TEST_NVME_BP 00:31:29.537 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@86 -- # : 0 00:31:29.537 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@87 -- # export SPDK_TEST_NVME_CLI 00:31:29.537 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@88 -- # : 0 00:31:29.537 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@89 -- # export SPDK_TEST_NVME_CUSE 00:31:29.537 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@90 -- # : 0 00:31:29.537 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@91 -- # export SPDK_TEST_NVME_FDP 00:31:29.537 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@92 -- # : 0 00:31:29.537 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@93 -- # export SPDK_TEST_NVMF 00:31:29.537 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@94 -- # : 0 00:31:29.537 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@95 -- # export SPDK_TEST_VFIOUSER 00:31:29.537 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@96 -- # : 0 00:31:29.537 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@97 -- # export SPDK_TEST_VFIOUSER_QEMU 00:31:29.537 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@98 -- # : 0 00:31:29.537 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@99 -- # export SPDK_TEST_FUZZER 00:31:29.537 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@100 -- # : 
0 00:31:29.537 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@101 -- # export SPDK_TEST_FUZZER_SHORT 00:31:29.537 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@102 -- # : rdma 00:31:29.537 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@103 -- # export SPDK_TEST_NVMF_TRANSPORT 00:31:29.537 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@104 -- # : 0 00:31:29.537 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@105 -- # export SPDK_TEST_RBD 00:31:29.537 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@106 -- # : 0 00:31:29.537 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@107 -- # export SPDK_TEST_VHOST 00:31:29.537 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@108 -- # : 1 00:31:29.537 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@109 -- # export SPDK_TEST_BLOCKDEV 00:31:29.537 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@110 -- # : 0 00:31:29.537 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@111 -- # export SPDK_TEST_IOAT 00:31:29.537 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@112 -- # : 0 00:31:29.537 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@113 -- # export SPDK_TEST_BLOBFS 00:31:29.537 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@114 -- # : 0 00:31:29.537 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@115 -- # export SPDK_TEST_VHOST_INIT 00:31:29.537 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@116 -- # : 0 00:31:29.537 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@117 -- # export SPDK_TEST_LVOL 00:31:29.537 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@118 -- # : 1 00:31:29.537 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@119 -- # export SPDK_TEST_VBDEV_COMPRESS 00:31:29.537 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@120 -- # : 0 00:31:29.537 05:59:44 reactor_set_interrupt -- 
common/autotest_common.sh@121 -- # export SPDK_RUN_ASAN 00:31:29.537 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@122 -- # : 1 00:31:29.537 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@123 -- # export SPDK_RUN_UBSAN 00:31:29.537 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@124 -- # : 00:31:29.537 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@125 -- # export SPDK_RUN_EXTERNAL_DPDK 00:31:29.537 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@126 -- # : 0 00:31:29.537 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@127 -- # export SPDK_RUN_NON_ROOT 00:31:29.537 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@128 -- # : 1 00:31:29.537 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@129 -- # export SPDK_TEST_CRYPTO 00:31:29.537 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@130 -- # : 0 00:31:29.537 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@131 -- # export SPDK_TEST_FTL 00:31:29.537 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@132 -- # : 0 00:31:29.537 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@133 -- # export SPDK_TEST_OCF 00:31:29.537 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@134 -- # : 0 00:31:29.537 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@135 -- # export SPDK_TEST_VMD 00:31:29.537 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@136 -- # : 0 00:31:29.537 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@137 -- # export SPDK_TEST_OPAL 00:31:29.537 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@138 -- # : 00:31:29.537 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@139 -- # export SPDK_TEST_NATIVE_DPDK 00:31:29.537 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@140 -- # : true 00:31:29.537 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@141 -- # export SPDK_AUTOTEST_X 00:31:29.537 05:59:44 
reactor_set_interrupt -- common/autotest_common.sh@142 -- # : 0 00:31:29.537 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@143 -- # export SPDK_TEST_RAID5 00:31:29.537 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@144 -- # : 0 00:31:29.537 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@145 -- # export SPDK_TEST_URING 00:31:29.537 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@146 -- # : 0 00:31:29.537 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@147 -- # export SPDK_TEST_USDT 00:31:29.537 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@148 -- # : 0 00:31:29.537 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@149 -- # export SPDK_TEST_USE_IGB_UIO 00:31:29.537 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@150 -- # : 0 00:31:29.537 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@151 -- # export SPDK_TEST_SCHEDULER 00:31:29.537 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@152 -- # : 0 00:31:29.537 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@153 -- # export SPDK_TEST_SCANBUILD 00:31:29.537 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@154 -- # : 00:31:29.537 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@155 -- # export SPDK_TEST_NVMF_NICS 00:31:29.537 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@156 -- # : 0 00:31:29.537 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@157 -- # export SPDK_TEST_SMA 00:31:29.537 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@158 -- # : 0 00:31:29.537 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@159 -- # export SPDK_TEST_DAOS 00:31:29.537 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@160 -- # : 0 00:31:29.537 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@161 -- # export SPDK_TEST_XNVME 00:31:29.537 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@162 -- # : 0 00:31:29.537 
05:59:44 reactor_set_interrupt -- common/autotest_common.sh@163 -- # export SPDK_TEST_ACCEL_DSA 00:31:29.537 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@164 -- # : 0 00:31:29.537 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@165 -- # export SPDK_TEST_ACCEL_IAA 00:31:29.537 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@167 -- # : 00:31:29.537 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@168 -- # export SPDK_TEST_FUZZER_TARGET 00:31:29.537 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@169 -- # : 0 00:31:29.537 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@170 -- # export SPDK_TEST_NVMF_MDNS 00:31:29.537 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@171 -- # : 0 00:31:29.537 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@172 -- # export SPDK_JSONRPC_GO_CLIENT 00:31:29.537 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@175 -- # export SPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib 00:31:29.537 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@175 -- # SPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib 00:31:29.538 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@176 -- # export DPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib 00:31:29.538 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@176 -- # DPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib 00:31:29.538 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@177 -- # export VFIO_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:31:29.538 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@177 -- # VFIO_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:31:29.538 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@178 -- # export 
LD_LIBRARY_PATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:31:29.538 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@178 -- # LD_LIBRARY_PATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:31:29.538 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@181 -- # export PCI_BLOCK_SYNC_ON_RESET=yes 00:31:29.538 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@181 -- # PCI_BLOCK_SYNC_ON_RESET=yes 00:31:29.538 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@185 -- # export PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python 00:31:29.538 05:59:44 reactor_set_interrupt -- 
common/autotest_common.sh@185 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python 00:31:29.538 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@189 -- # export PYTHONDONTWRITEBYTECODE=1 00:31:29.538 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@189 -- # PYTHONDONTWRITEBYTECODE=1 00:31:29.538 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@193 -- # export ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:31:29.538 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@193 -- # ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:31:29.538 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@194 -- # export UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:31:29.538 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@194 -- # UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:31:29.538 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@198 -- # asan_suppression_file=/var/tmp/asan_suppression_file 00:31:29.538 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@199 -- # rm -rf /var/tmp/asan_suppression_file 00:31:29.538 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@200 -- # cat 00:31:29.538 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@236 -- # echo leak:libfuse3.so 00:31:29.538 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@238 -- # export LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:31:29.538 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@238 -- # 
LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:31:29.538 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@240 -- # export DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:31:29.538 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@240 -- # DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:31:29.538 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@242 -- # '[' -z /var/spdk/dependencies ']' 00:31:29.538 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@245 -- # export DEPENDENCY_DIR 00:31:29.538 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@249 -- # export SPDK_BIN_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:31:29.538 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@249 -- # SPDK_BIN_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:31:29.538 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@250 -- # export SPDK_EXAMPLE_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:31:29.538 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@250 -- # SPDK_EXAMPLE_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:31:29.538 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@253 -- # export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:31:29.538 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@253 -- # QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:31:29.538 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@254 -- # export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:31:29.538 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@254 -- # VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:31:29.538 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@256 -- # export AR_TOOL=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:31:29.538 05:59:44 reactor_set_interrupt -- 
common/autotest_common.sh@256 -- # AR_TOOL=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:31:29.538 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@259 -- # export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:31:29.538 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@259 -- # UNBIND_ENTIRE_IOMMU_GROUP=yes 00:31:29.538 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@262 -- # '[' 0 -eq 0 ']' 00:31:29.538 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@263 -- # export valgrind= 00:31:29.538 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@263 -- # valgrind= 00:31:29.538 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@269 -- # uname -s 00:31:29.538 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@269 -- # '[' Linux = Linux ']' 00:31:29.538 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@270 -- # HUGEMEM=4096 00:31:29.538 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@271 -- # export CLEAR_HUGE=yes 00:31:29.538 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@271 -- # CLEAR_HUGE=yes 00:31:29.538 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@272 -- # [[ 1 -eq 1 ]] 00:31:29.538 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@276 -- # export HUGE_EVEN_ALLOC=yes 00:31:29.538 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@276 -- # HUGE_EVEN_ALLOC=yes 00:31:29.538 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@279 -- # MAKE=make 00:31:29.538 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@280 -- # MAKEFLAGS=-j72 00:31:29.538 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@296 -- # export HUGEMEM=4096 00:31:29.538 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@296 -- # HUGEMEM=4096 00:31:29.538 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@298 -- # NO_HUGE=() 00:31:29.538 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@299 -- # 
TEST_MODE= 00:31:29.538 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@318 -- # [[ -z 1292061 ]] 00:31:29.538 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@318 -- # kill -0 1292061 00:31:29.538 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@1680 -- # set_test_storage 2147483648 00:31:29.538 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@328 -- # [[ -v testdir ]] 00:31:29.538 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@330 -- # local requested_size=2147483648 00:31:29.538 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@331 -- # local mount target_dir 00:31:29.538 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@333 -- # local -A mounts fss sizes avails uses 00:31:29.538 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@334 -- # local source fs size avail mount use 00:31:29.538 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@336 -- # local storage_fallback storage_candidates 00:31:29.538 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@338 -- # mktemp -udt spdk.XXXXXX 00:31:29.538 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@338 -- # storage_fallback=/tmp/spdk.BXvAhK 00:31:29.538 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@343 -- # storage_candidates=("$testdir" "$storage_fallback/tests/${testdir##*/}" "$storage_fallback") 00:31:29.538 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@345 -- # [[ -n '' ]] 00:31:29.538 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@350 -- # [[ -n '' ]] 00:31:29.538 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@355 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt /tmp/spdk.BXvAhK/tests/interrupt /tmp/spdk.BXvAhK 00:31:29.538 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@358 -- # requested_size=2214592512 00:31:29.538 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs 
size use avail _ mount 00:31:29.538 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@327 -- # df -T 00:31:29.538 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@327 -- # grep -v Filesystem 00:31:29.538 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=spdk_devtmpfs 00:31:29.538 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=devtmpfs 00:31:29.538 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=67108864 00:31:29.538 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=67108864 00:31:29.538 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@363 -- # uses["$mount"]=0 00:31:29.538 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:31:29.539 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=/dev/pmem0 00:31:29.539 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=ext2 00:31:29.539 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=946290688 00:31:29.539 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=5284429824 00:31:29.539 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@363 -- # uses["$mount"]=4338139136 00:31:29.539 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:31:29.539 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=spdk_root 00:31:29.539 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=overlay 00:31:29.539 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=88891727872 00:31:29.539 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=94508527616 00:31:29.539 05:59:44 
reactor_set_interrupt -- common/autotest_common.sh@363 -- # uses["$mount"]=5616799744 00:31:29.539 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:31:29.539 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:31:29.539 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:31:29.539 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=47249551360 00:31:29.539 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=47254261760 00:31:29.539 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@363 -- # uses["$mount"]=4710400 00:31:29.539 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:31:29.539 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:31:29.539 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:31:29.539 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=18890014720 00:31:29.539 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=18901708800 00:31:29.539 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@363 -- # uses["$mount"]=11694080 00:31:29.539 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:31:29.539 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:31:29.539 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:31:29.539 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=47253553152 00:31:29.539 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=47254265856 00:31:29.539 05:59:44 reactor_set_interrupt -- 
common/autotest_common.sh@363 -- # uses["$mount"]=712704 00:31:29.539 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:31:29.539 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:31:29.539 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:31:29.539 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=9450848256 00:31:29.539 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=9450852352 00:31:29.539 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@363 -- # uses["$mount"]=4096 00:31:29.539 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:31:29.539 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:31:29.539 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:31:29.539 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=9450848256 00:31:29.539 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=9450852352 00:31:29.539 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@363 -- # uses["$mount"]=4096 00:31:29.539 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:31:29.539 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@366 -- # printf '* Looking for test storage...\n' 00:31:29.539 * Looking for test storage... 
00:31:29.539 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@368 -- # local target_space new_size 00:31:29.539 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@369 -- # for target_dir in "${storage_candidates[@]}" 00:31:29.539 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@372 -- # df /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:31:29.539 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@372 -- # awk '$1 !~ /Filesystem/{print $6}' 00:31:29.539 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@372 -- # mount=/ 00:31:29.539 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@374 -- # target_space=88891727872 00:31:29.539 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@375 -- # (( target_space == 0 || target_space < requested_size )) 00:31:29.539 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@378 -- # (( target_space >= requested_size )) 00:31:29.539 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@380 -- # [[ overlay == tmpfs ]] 00:31:29.539 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@380 -- # [[ overlay == ramfs ]] 00:31:29.539 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@380 -- # [[ / == / ]] 00:31:29.539 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@381 -- # new_size=7831392256 00:31:29.539 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@382 -- # (( new_size * 100 / sizes[/] > 95 )) 00:31:29.539 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@387 -- # export SPDK_TEST_STORAGE=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:31:29.539 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@387 -- # SPDK_TEST_STORAGE=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:31:29.539 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@388 -- # printf '* Found test storage at %s\n' /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 
00:31:29.539 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:31:29.539 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@389 -- # return 0 00:31:29.539 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@1682 -- # set -o errtrace 00:31:29.539 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@1683 -- # shopt -s extdebug 00:31:29.539 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@1684 -- # trap 'trap - ERR; print_backtrace >&2' ERR 00:31:29.539 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@1686 -- # PS4=' \t ${test_domain:-} -- ${BASH_SOURCE#${BASH_SOURCE%/*/*}/}@${LINENO} -- \$ ' 00:31:29.539 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@1687 -- # true 00:31:29.539 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@1689 -- # xtrace_fd 00:31:29.539 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@25 -- # [[ -n 13 ]] 00:31:29.539 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@25 -- # [[ -e /proc/self/fd/13 ]] 00:31:29.539 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@27 -- # exec 00:31:29.539 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@29 -- # exec 00:31:29.539 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@31 -- # xtrace_restore 00:31:29.539 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@16 -- # unset -v 'X_STACK[0 - 1 < 0 ? 
0 : 0 - 1]' 00:31:29.539 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@17 -- # (( 0 == 0 )) 00:31:29.539 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@18 -- # set -x 00:31:29.539 05:59:44 reactor_set_interrupt -- interrupt/interrupt_common.sh@8 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/common.sh 00:31:29.539 05:59:44 reactor_set_interrupt -- interrupt/interrupt_common.sh@10 -- # rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:31:29.539 05:59:44 reactor_set_interrupt -- interrupt/interrupt_common.sh@12 -- # r0_mask=0x1 00:31:29.539 05:59:44 reactor_set_interrupt -- interrupt/interrupt_common.sh@13 -- # r1_mask=0x2 00:31:29.539 05:59:44 reactor_set_interrupt -- interrupt/interrupt_common.sh@14 -- # r2_mask=0x4 00:31:29.539 05:59:44 reactor_set_interrupt -- interrupt/interrupt_common.sh@16 -- # cpu_server_mask=0x07 00:31:29.539 05:59:44 reactor_set_interrupt -- interrupt/interrupt_common.sh@17 -- # rpc_server_addr=/var/tmp/spdk.sock 00:31:29.539 05:59:44 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@11 -- # export PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/examples/interrupt_tgt 00:31:29.539 05:59:44 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@11 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/examples/interrupt_tgt 00:31:29.539 05:59:44 
reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@86 -- # start_intr_tgt 00:31:29.539 05:59:44 reactor_set_interrupt -- interrupt/interrupt_common.sh@20 -- # local rpc_addr=/var/tmp/spdk.sock 00:31:29.539 05:59:44 reactor_set_interrupt -- interrupt/interrupt_common.sh@21 -- # local cpu_mask=0x07 00:31:29.539 05:59:44 reactor_set_interrupt -- interrupt/interrupt_common.sh@24 -- # intr_tgt_pid=1292111 00:31:29.539 05:59:44 reactor_set_interrupt -- interrupt/interrupt_common.sh@25 -- # trap 'killprocess "$intr_tgt_pid"; cleanup; exit 1' SIGINT SIGTERM EXIT 00:31:29.539 05:59:44 reactor_set_interrupt -- interrupt/interrupt_common.sh@23 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/interrupt_tgt -m 0x07 -r /var/tmp/spdk.sock -E -g 00:31:29.539 05:59:44 reactor_set_interrupt -- interrupt/interrupt_common.sh@26 -- # waitforlisten 1292111 /var/tmp/spdk.sock 00:31:29.539 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@829 -- # '[' -z 1292111 ']' 00:31:29.539 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:31:29.540 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@834 -- # local max_retries=100 00:31:29.540 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:31:29.540 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:31:29.540 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@838 -- # xtrace_disable 00:31:29.540 05:59:44 reactor_set_interrupt -- common/autotest_common.sh@10 -- # set +x 00:31:29.540 [2024-07-26 05:59:44.238141] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 
00:31:29.540 [2024-07-26 05:59:44.238199] [ DPDK EAL parameters: interrupt_tgt --no-shconf -c 0x07 --single-file-segments --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1292111 ] 00:31:29.540 [2024-07-26 05:59:44.351564] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:31:29.798 [2024-07-26 05:59:44.460917] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:31:29.798 [2024-07-26 05:59:44.464657] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:31:29.798 [2024-07-26 05:59:44.464664] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:31:29.798 [2024-07-26 05:59:44.538567] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from intr mode. 00:31:30.366 05:59:45 reactor_set_interrupt -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:31:30.366 05:59:45 reactor_set_interrupt -- common/autotest_common.sh@862 -- # return 0 00:31:30.366 05:59:45 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@87 -- # setup_bdev_mem 00:31:30.366 05:59:45 reactor_set_interrupt -- interrupt/common.sh@67 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:31:30.625 Malloc0 00:31:30.625 Malloc1 00:31:30.625 Malloc2 00:31:30.625 05:59:45 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@88 -- # setup_bdev_aio 00:31:30.625 05:59:45 reactor_set_interrupt -- interrupt/common.sh@75 -- # uname -s 00:31:30.625 05:59:45 reactor_set_interrupt -- interrupt/common.sh@75 -- # [[ Linux != \F\r\e\e\B\S\D ]] 00:31:30.625 05:59:45 reactor_set_interrupt -- interrupt/common.sh@76 -- # dd if=/dev/zero of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile bs=2048 count=5000 00:31:30.625 5000+0 records in 00:31:30.625 5000+0 records out 00:31:30.625 10240000 bytes (10 MB, 9.8 MiB) copied, 0.026013 s, 394 MB/s 
00:31:30.625 05:59:45 reactor_set_interrupt -- interrupt/common.sh@77 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_aio_create /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile AIO0 2048 00:31:30.884 AIO0 00:31:30.884 05:59:45 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@90 -- # reactor_set_mode_without_threads 1292111 00:31:30.884 05:59:45 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@76 -- # reactor_set_intr_mode 1292111 without_thd 00:31:30.884 05:59:45 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@14 -- # local spdk_pid=1292111 00:31:30.884 05:59:45 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@15 -- # local without_thd=without_thd 00:31:30.884 05:59:45 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@17 -- # thd0_ids=($(reactor_get_thread_ids $r0_mask)) 00:31:30.884 05:59:45 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@17 -- # reactor_get_thread_ids 0x1 00:31:30.884 05:59:45 reactor_set_interrupt -- interrupt/common.sh@55 -- # local reactor_cpumask=0x1 00:31:30.884 05:59:45 reactor_set_interrupt -- interrupt/common.sh@56 -- # local grep_str 00:31:30.884 05:59:45 reactor_set_interrupt -- interrupt/common.sh@58 -- # reactor_cpumask=1 00:31:30.884 05:59:45 reactor_set_interrupt -- interrupt/common.sh@59 -- # jq_str='.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:31:30.884 05:59:45 reactor_set_interrupt -- interrupt/common.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_get_stats 00:31:30.884 05:59:45 reactor_set_interrupt -- interrupt/common.sh@62 -- # jq --arg reactor_cpumask 1 '.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:31:31.143 05:59:46 reactor_set_interrupt -- interrupt/common.sh@62 -- # echo 1 00:31:31.143 05:59:46 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@18 -- # thd2_ids=($(reactor_get_thread_ids $r2_mask)) 00:31:31.143 05:59:46 reactor_set_interrupt 
-- interrupt/reactor_set_interrupt.sh@18 -- # reactor_get_thread_ids 0x4 00:31:31.143 05:59:46 reactor_set_interrupt -- interrupt/common.sh@55 -- # local reactor_cpumask=0x4 00:31:31.143 05:59:46 reactor_set_interrupt -- interrupt/common.sh@56 -- # local grep_str 00:31:31.143 05:59:46 reactor_set_interrupt -- interrupt/common.sh@58 -- # reactor_cpumask=4 00:31:31.143 05:59:46 reactor_set_interrupt -- interrupt/common.sh@59 -- # jq_str='.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:31:31.143 05:59:46 reactor_set_interrupt -- interrupt/common.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_get_stats 00:31:31.143 05:59:46 reactor_set_interrupt -- interrupt/common.sh@62 -- # jq --arg reactor_cpumask 4 '.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:31:31.403 05:59:46 reactor_set_interrupt -- interrupt/common.sh@62 -- # echo '' 00:31:31.403 05:59:46 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@21 -- # [[ 1 -eq 0 ]] 00:31:31.403 05:59:46 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@25 -- # echo 'spdk_thread ids are 1 on reactor0.' 00:31:31.403 spdk_thread ids are 1 on reactor0. 
00:31:31.403 05:59:46 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:31:31.403 05:59:46 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 1292111 0 00:31:31.403 05:59:46 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 1292111 0 idle 00:31:31.403 05:59:46 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1292111 00:31:31.403 05:59:46 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:31:31.403 05:59:46 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:31:31.403 05:59:46 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:31:31.403 05:59:46 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:31:31.403 05:59:46 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:31:31.403 05:59:46 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:31:31.403 05:59:46 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:31:31.403 05:59:46 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 1292111 -w 256 00:31:31.403 05:59:46 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:31:31.662 05:59:46 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1292111 root 20 0 128.2g 36288 23040 S 0.0 0.0 0:00.38 reactor_0' 00:31:31.662 05:59:46 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 1292111 root 20 0 128.2g 36288 23040 S 0.0 0.0 0:00.38 reactor_0 00:31:31.662 05:59:46 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:31:31.662 05:59:46 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:31:31.662 05:59:46 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:31:31.662 05:59:46 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:31:31.662 05:59:46 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle 
= \b\u\s\y ]] 00:31:31.662 05:59:46 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:31:31.662 05:59:46 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:31:31.662 05:59:46 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:31:31.662 05:59:46 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:31:31.662 05:59:46 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 1292111 1 00:31:31.662 05:59:46 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 1292111 1 idle 00:31:31.662 05:59:46 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1292111 00:31:31.662 05:59:46 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=1 00:31:31.662 05:59:46 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:31:31.662 05:59:46 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:31:31.662 05:59:46 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:31:31.662 05:59:46 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:31:31.662 05:59:46 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:31:31.662 05:59:46 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:31:31.662 05:59:46 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 1292111 -w 256 00:31:31.662 05:59:46 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_1 00:31:31.921 05:59:46 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1292114 root 20 0 128.2g 36288 23040 S 0.0 0.0 0:00.00 reactor_1' 00:31:31.921 05:59:46 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 1292114 root 20 0 128.2g 36288 23040 S 0.0 0.0 0:00.00 reactor_1 00:31:31.921 05:59:46 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:31:31.921 05:59:46 reactor_set_interrupt -- 
interrupt/common.sh@25 -- # awk '{print $9}' 00:31:31.921 05:59:46 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:31:31.921 05:59:46 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:31:31.921 05:59:46 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:31:31.921 05:59:46 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:31:31.921 05:59:46 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:31:31.921 05:59:46 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:31:31.921 05:59:46 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:31:31.921 05:59:46 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 1292111 2 00:31:31.921 05:59:46 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 1292111 2 idle 00:31:31.921 05:59:46 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1292111 00:31:31.921 05:59:46 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:31:31.921 05:59:46 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:31:31.921 05:59:46 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:31:31.921 05:59:46 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:31:31.921 05:59:46 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:31:31.922 05:59:46 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:31:31.922 05:59:46 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:31:31.922 05:59:46 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 1292111 -w 256 00:31:31.922 05:59:46 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:31:31.922 05:59:46 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1292115 root 20 0 128.2g 36288 23040 S 0.0 0.0 0:00.00 reactor_2' 
00:31:31.922 05:59:46 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 1292115 root 20 0 128.2g 36288 23040 S 0.0 0.0 0:00.00 reactor_2 00:31:31.922 05:59:46 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:31:31.922 05:59:46 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:31:31.922 05:59:46 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:31:31.922 05:59:46 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:31:31.922 05:59:46 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:31:31.922 05:59:46 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:31:31.922 05:59:46 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:31:31.922 05:59:46 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:31:31.922 05:59:46 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@33 -- # '[' without_thdx '!=' x ']' 00:31:31.922 05:59:46 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@35 -- # for i in "${thd0_ids[@]}" 00:31:31.922 05:59:46 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@36 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_set_cpumask -i 1 -m 0x2 00:31:32.181 [2024-07-26 05:59:47.037550] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from intr mode. 00:31:32.181 05:59:47 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@43 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 0 -d 00:31:32.439 [2024-07-26 05:59:47.289247] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to disable interrupt mode on reactor 0. 
00:31:32.439 [2024-07-26 05:59:47.289636] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:31:32.439 05:59:47 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 2 -d 00:31:32.726 [2024-07-26 05:59:47.533153] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to disable interrupt mode on reactor 2. 00:31:32.726 [2024-07-26 05:59:47.533282] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:31:32.726 05:59:47 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@46 -- # for i in 0 2 00:31:32.726 05:59:47 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@47 -- # reactor_is_busy 1292111 0 00:31:32.726 05:59:47 reactor_set_interrupt -- interrupt/common.sh@47 -- # reactor_is_busy_or_idle 1292111 0 busy 00:31:32.727 05:59:47 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1292111 00:31:32.727 05:59:47 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:31:32.727 05:59:47 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=busy 00:31:32.727 05:59:47 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ busy != \b\u\s\y ]] 00:31:32.727 05:59:47 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:31:32.727 05:59:47 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:31:32.727 05:59:47 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:31:32.727 05:59:47 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 1292111 -w 256 00:31:32.727 05:59:47 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:31:32.985 05:59:47 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1292111 root 20 0 128.2g 36288 23040 R 99.9 0.0 0:00.81 reactor_0' 00:31:32.985 05:59:47 reactor_set_interrupt -- 
interrupt/common.sh@25 -- # echo 1292111 root 20 0 128.2g 36288 23040 R 99.9 0.0 0:00.81 reactor_0 00:31:32.985 05:59:47 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:31:32.985 05:59:47 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:31:32.985 05:59:47 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=99.9 00:31:32.985 05:59:47 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=99 00:31:32.985 05:59:47 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ busy = \b\u\s\y ]] 00:31:32.985 05:59:47 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ 99 -lt 70 ]] 00:31:32.985 05:59:47 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ busy = \i\d\l\e ]] 00:31:32.985 05:59:47 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:31:32.985 05:59:47 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@46 -- # for i in 0 2 00:31:32.985 05:59:47 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@47 -- # reactor_is_busy 1292111 2 00:31:32.985 05:59:47 reactor_set_interrupt -- interrupt/common.sh@47 -- # reactor_is_busy_or_idle 1292111 2 busy 00:31:32.985 05:59:47 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1292111 00:31:32.985 05:59:47 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:31:32.985 05:59:47 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=busy 00:31:32.985 05:59:47 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ busy != \b\u\s\y ]] 00:31:32.985 05:59:47 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:31:32.985 05:59:47 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:31:32.985 05:59:47 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:31:32.985 05:59:47 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:31:32.985 05:59:47 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 1292111 -w 256 00:31:33.243 
05:59:47 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1292115 root 20 0 128.2g 36288 23040 R 99.9 0.0 0:00.36 reactor_2' 00:31:33.243 05:59:47 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 1292115 root 20 0 128.2g 36288 23040 R 99.9 0.0 0:00.36 reactor_2 00:31:33.243 05:59:47 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:31:33.243 05:59:47 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:31:33.243 05:59:47 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=99.9 00:31:33.243 05:59:47 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=99 00:31:33.243 05:59:47 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ busy = \b\u\s\y ]] 00:31:33.243 05:59:47 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ 99 -lt 70 ]] 00:31:33.243 05:59:47 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ busy = \i\d\l\e ]] 00:31:33.243 05:59:47 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:31:33.243 05:59:47 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@51 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 2 00:31:33.243 [2024-07-26 05:59:48.141153] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to enable interrupt mode on reactor 2. 
00:31:33.243 [2024-07-26 05:59:48.141258] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:31:33.502 05:59:48 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@52 -- # '[' without_thdx '!=' x ']' 00:31:33.502 05:59:48 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@59 -- # reactor_is_idle 1292111 2 00:31:33.502 05:59:48 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 1292111 2 idle 00:31:33.502 05:59:48 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1292111 00:31:33.502 05:59:48 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:31:33.502 05:59:48 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:31:33.502 05:59:48 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:31:33.502 05:59:48 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:31:33.502 05:59:48 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:31:33.502 05:59:48 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:31:33.502 05:59:48 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:31:33.502 05:59:48 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 1292111 -w 256 00:31:33.502 05:59:48 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:31:33.502 05:59:48 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1292115 root 20 0 128.2g 36288 23040 S 0.0 0.0 0:00.60 reactor_2' 00:31:33.502 05:59:48 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 1292115 root 20 0 128.2g 36288 23040 S 0.0 0.0 0:00.60 reactor_2 00:31:33.502 05:59:48 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:31:33.502 05:59:48 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:31:33.502 05:59:48 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:31:33.502 05:59:48 
reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:31:33.502 05:59:48 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:31:33.502 05:59:48 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:31:33.502 05:59:48 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:31:33.502 05:59:48 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:31:33.502 05:59:48 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 0 00:31:33.760 [2024-07-26 05:59:48.505145] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to enable interrupt mode on reactor 0. 00:31:33.760 [2024-07-26 05:59:48.508690] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:31:33.760 05:59:48 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@63 -- # '[' without_thdx '!=' x ']' 00:31:33.760 05:59:48 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@65 -- # for i in "${thd0_ids[@]}" 00:31:33.760 05:59:48 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@66 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_set_cpumask -i 1 -m 0x1 00:31:34.019 [2024-07-26 05:59:48.753324] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from intr mode. 
00:31:34.019 05:59:48 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@70 -- # reactor_is_idle 1292111 0 00:31:34.019 05:59:48 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 1292111 0 idle 00:31:34.019 05:59:48 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1292111 00:31:34.019 05:59:48 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:31:34.019 05:59:48 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:31:34.019 05:59:48 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:31:34.019 05:59:48 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:31:34.019 05:59:48 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:31:34.019 05:59:48 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:31:34.019 05:59:48 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:31:34.019 05:59:48 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 1292111 -w 256 00:31:34.019 05:59:48 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:31:34.279 05:59:48 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1292111 root 20 0 128.2g 36288 23040 S 6.7 0.0 0:01.61 reactor_0' 00:31:34.279 05:59:48 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 1292111 root 20 0 128.2g 36288 23040 S 6.7 0.0 0:01.61 reactor_0 00:31:34.279 05:59:48 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:31:34.279 05:59:48 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:31:34.279 05:59:48 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=6.7 00:31:34.279 05:59:48 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=6 00:31:34.279 05:59:48 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:31:34.279 05:59:48 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = 
\i\d\l\e ]] 00:31:34.279 05:59:48 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 6 -gt 30 ]] 00:31:34.279 05:59:48 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:31:34.279 05:59:48 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@72 -- # return 0 00:31:34.279 05:59:48 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@77 -- # return 0 00:31:34.279 05:59:48 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@92 -- # trap - SIGINT SIGTERM EXIT 00:31:34.279 05:59:48 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@93 -- # killprocess 1292111 00:31:34.279 05:59:48 reactor_set_interrupt -- common/autotest_common.sh@948 -- # '[' -z 1292111 ']' 00:31:34.279 05:59:48 reactor_set_interrupt -- common/autotest_common.sh@952 -- # kill -0 1292111 00:31:34.279 05:59:48 reactor_set_interrupt -- common/autotest_common.sh@953 -- # uname 00:31:34.279 05:59:48 reactor_set_interrupt -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:31:34.279 05:59:48 reactor_set_interrupt -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1292111 00:31:34.279 05:59:49 reactor_set_interrupt -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:31:34.279 05:59:49 reactor_set_interrupt -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:31:34.279 05:59:49 reactor_set_interrupt -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1292111' 00:31:34.279 killing process with pid 1292111 00:31:34.279 05:59:49 reactor_set_interrupt -- common/autotest_common.sh@967 -- # kill 1292111 00:31:34.279 05:59:49 reactor_set_interrupt -- common/autotest_common.sh@972 -- # wait 1292111 00:31:34.538 05:59:49 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@94 -- # cleanup 00:31:34.538 05:59:49 reactor_set_interrupt -- interrupt/common.sh@6 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile 00:31:34.538 05:59:49 reactor_set_interrupt -- 
interrupt/reactor_set_interrupt.sh@97 -- # start_intr_tgt 00:31:34.538 05:59:49 reactor_set_interrupt -- interrupt/interrupt_common.sh@20 -- # local rpc_addr=/var/tmp/spdk.sock 00:31:34.538 05:59:49 reactor_set_interrupt -- interrupt/interrupt_common.sh@21 -- # local cpu_mask=0x07 00:31:34.538 05:59:49 reactor_set_interrupt -- interrupt/interrupt_common.sh@24 -- # intr_tgt_pid=1292874 00:31:34.538 05:59:49 reactor_set_interrupt -- interrupt/interrupt_common.sh@25 -- # trap 'killprocess "$intr_tgt_pid"; cleanup; exit 1' SIGINT SIGTERM EXIT 00:31:34.538 05:59:49 reactor_set_interrupt -- interrupt/interrupt_common.sh@23 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/interrupt_tgt -m 0x07 -r /var/tmp/spdk.sock -E -g 00:31:34.538 05:59:49 reactor_set_interrupt -- interrupt/interrupt_common.sh@26 -- # waitforlisten 1292874 /var/tmp/spdk.sock 00:31:34.538 05:59:49 reactor_set_interrupt -- common/autotest_common.sh@829 -- # '[' -z 1292874 ']' 00:31:34.538 05:59:49 reactor_set_interrupt -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:31:34.538 05:59:49 reactor_set_interrupt -- common/autotest_common.sh@834 -- # local max_retries=100 00:31:34.538 05:59:49 reactor_set_interrupt -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:31:34.538 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:31:34.538 05:59:49 reactor_set_interrupt -- common/autotest_common.sh@838 -- # xtrace_disable 00:31:34.538 05:59:49 reactor_set_interrupt -- common/autotest_common.sh@10 -- # set +x 00:31:34.538 [2024-07-26 05:59:49.311888] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 
00:31:34.538 [2024-07-26 05:59:49.311963] [ DPDK EAL parameters: interrupt_tgt --no-shconf -c 0x07 --single-file-segments --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1292874 ] 00:31:34.538 [2024-07-26 05:59:49.445456] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:31:34.797 [2024-07-26 05:59:49.544131] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:31:34.797 [2024-07-26 05:59:49.544215] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:31:34.797 [2024-07-26 05:59:49.544219] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:31:34.797 [2024-07-26 05:59:49.618117] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from intr mode. 00:31:35.365 05:59:50 reactor_set_interrupt -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:31:35.625 05:59:50 reactor_set_interrupt -- common/autotest_common.sh@862 -- # return 0 00:31:35.625 05:59:50 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@98 -- # setup_bdev_mem 00:31:35.625 05:59:50 reactor_set_interrupt -- interrupt/common.sh@67 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:31:35.883 Malloc0 00:31:35.883 Malloc1 00:31:35.883 Malloc2 00:31:35.883 05:59:50 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@99 -- # setup_bdev_aio 00:31:35.883 05:59:50 reactor_set_interrupt -- interrupt/common.sh@75 -- # uname -s 00:31:35.883 05:59:50 reactor_set_interrupt -- interrupt/common.sh@75 -- # [[ Linux != \F\r\e\e\B\S\D ]] 00:31:35.883 05:59:50 reactor_set_interrupt -- interrupt/common.sh@76 -- # dd if=/dev/zero of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile bs=2048 count=5000 00:31:35.883 5000+0 records in 00:31:35.883 5000+0 records out 00:31:35.883 10240000 bytes (10 MB, 9.8 MiB) copied, 0.0259143 s, 395 MB/s 
00:31:35.883 05:59:50 reactor_set_interrupt -- interrupt/common.sh@77 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_aio_create /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile AIO0 2048 00:31:36.142 AIO0 00:31:36.142 05:59:50 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@101 -- # reactor_set_mode_with_threads 1292874 00:31:36.142 05:59:50 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@81 -- # reactor_set_intr_mode 1292874 00:31:36.142 05:59:50 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@14 -- # local spdk_pid=1292874 00:31:36.142 05:59:50 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@15 -- # local without_thd= 00:31:36.142 05:59:50 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@17 -- # thd0_ids=($(reactor_get_thread_ids $r0_mask)) 00:31:36.142 05:59:50 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@17 -- # reactor_get_thread_ids 0x1 00:31:36.142 05:59:50 reactor_set_interrupt -- interrupt/common.sh@55 -- # local reactor_cpumask=0x1 00:31:36.142 05:59:50 reactor_set_interrupt -- interrupt/common.sh@56 -- # local grep_str 00:31:36.142 05:59:50 reactor_set_interrupt -- interrupt/common.sh@58 -- # reactor_cpumask=1 00:31:36.142 05:59:50 reactor_set_interrupt -- interrupt/common.sh@59 -- # jq_str='.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:31:36.142 05:59:50 reactor_set_interrupt -- interrupt/common.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_get_stats 00:31:36.142 05:59:50 reactor_set_interrupt -- interrupt/common.sh@62 -- # jq --arg reactor_cpumask 1 '.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:31:36.401 05:59:51 reactor_set_interrupt -- interrupt/common.sh@62 -- # echo 1 00:31:36.401 05:59:51 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@18 -- # thd2_ids=($(reactor_get_thread_ids $r2_mask)) 00:31:36.401 05:59:51 reactor_set_interrupt -- 
interrupt/reactor_set_interrupt.sh@18 -- # reactor_get_thread_ids 0x4 00:31:36.401 05:59:51 reactor_set_interrupt -- interrupt/common.sh@55 -- # local reactor_cpumask=0x4 00:31:36.401 05:59:51 reactor_set_interrupt -- interrupt/common.sh@56 -- # local grep_str 00:31:36.401 05:59:51 reactor_set_interrupt -- interrupt/common.sh@58 -- # reactor_cpumask=4 00:31:36.401 05:59:51 reactor_set_interrupt -- interrupt/common.sh@59 -- # jq_str='.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:31:36.401 05:59:51 reactor_set_interrupt -- interrupt/common.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_get_stats 00:31:36.401 05:59:51 reactor_set_interrupt -- interrupt/common.sh@62 -- # jq --arg reactor_cpumask 4 '.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:31:36.660 05:59:51 reactor_set_interrupt -- interrupt/common.sh@62 -- # echo '' 00:31:36.660 05:59:51 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@21 -- # [[ 1 -eq 0 ]] 00:31:36.660 05:59:51 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@25 -- # echo 'spdk_thread ids are 1 on reactor0.' 00:31:36.660 spdk_thread ids are 1 on reactor0. 
00:31:36.660 05:59:51 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:31:36.660 05:59:51 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 1292874 0 00:31:36.660 05:59:51 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 1292874 0 idle 00:31:36.660 05:59:51 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1292874 00:31:36.660 05:59:51 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:31:36.660 05:59:51 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:31:36.660 05:59:51 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:31:36.660 05:59:51 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:31:36.660 05:59:51 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:31:36.660 05:59:51 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:31:36.660 05:59:51 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:31:36.660 05:59:51 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 1292874 -w 256 00:31:36.660 05:59:51 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:31:36.660 05:59:51 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1292874 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.38 reactor_0' 00:31:36.660 05:59:51 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 1292874 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.38 reactor_0 00:31:36.660 05:59:51 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:31:36.660 05:59:51 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:31:36.660 05:59:51 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:31:36.660 05:59:51 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:31:36.660 05:59:51 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle 
= \b\u\s\y ]] 00:31:36.660 05:59:51 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:31:36.660 05:59:51 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:31:36.660 05:59:51 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:31:36.661 05:59:51 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:31:36.661 05:59:51 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 1292874 1 00:31:36.661 05:59:51 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 1292874 1 idle 00:31:36.661 05:59:51 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1292874 00:31:36.661 05:59:51 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=1 00:31:36.661 05:59:51 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:31:36.661 05:59:51 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:31:36.661 05:59:51 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:31:36.661 05:59:51 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:31:36.661 05:59:51 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:31:36.661 05:59:51 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:31:36.661 05:59:51 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 1292874 -w 256 00:31:36.661 05:59:51 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_1 00:31:36.922 05:59:51 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1292883 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.00 reactor_1' 00:31:36.922 05:59:51 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 1292883 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.00 reactor_1 00:31:36.922 05:59:51 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:31:36.922 05:59:51 reactor_set_interrupt -- 
interrupt/common.sh@25 -- # awk '{print $9}' 00:31:36.922 05:59:51 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:31:36.922 05:59:51 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:31:36.922 05:59:51 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:31:36.922 05:59:51 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:31:36.922 05:59:51 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:31:36.922 05:59:51 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:31:36.922 05:59:51 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:31:36.922 05:59:51 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 1292874 2 00:31:36.922 05:59:51 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 1292874 2 idle 00:31:36.922 05:59:51 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1292874 00:31:36.922 05:59:51 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:31:36.922 05:59:51 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:31:36.922 05:59:51 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:31:36.922 05:59:51 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:31:36.922 05:59:51 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:31:36.922 05:59:51 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:31:36.922 05:59:51 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:31:36.922 05:59:51 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 1292874 -w 256 00:31:36.922 05:59:51 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:31:37.239 05:59:51 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1292884 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.00 reactor_2' 
00:31:37.239 05:59:51 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 1292884 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.00 reactor_2 00:31:37.239 05:59:51 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:31:37.239 05:59:51 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:31:37.239 05:59:51 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:31:37.239 05:59:51 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:31:37.239 05:59:51 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:31:37.239 05:59:51 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:31:37.239 05:59:51 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:31:37.239 05:59:51 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:31:37.239 05:59:51 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@33 -- # '[' x '!=' x ']' 00:31:37.239 05:59:51 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@43 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 0 -d 00:31:37.512 [2024-07-26 05:59:52.140807] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to disable interrupt mode on reactor 0. 00:31:37.512 [2024-07-26 05:59:52.141009] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to poll mode from intr mode. 00:31:37.512 [2024-07-26 05:59:52.141196] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:31:37.512 05:59:52 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 2 -d 00:31:37.512 [2024-07-26 05:59:52.385347] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to disable interrupt mode on reactor 2. 
00:31:37.512 [2024-07-26 05:59:52.385475] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:31:37.512 05:59:52 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@46 -- # for i in 0 2 00:31:37.512 05:59:52 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@47 -- # reactor_is_busy 1292874 0 00:31:37.512 05:59:52 reactor_set_interrupt -- interrupt/common.sh@47 -- # reactor_is_busy_or_idle 1292874 0 busy 00:31:37.512 05:59:52 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1292874 00:31:37.512 05:59:52 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:31:37.512 05:59:52 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=busy 00:31:37.512 05:59:52 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ busy != \b\u\s\y ]] 00:31:37.512 05:59:52 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:31:37.512 05:59:52 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:31:37.512 05:59:52 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:31:37.512 05:59:52 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:31:37.512 05:59:52 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 1292874 -w 256 00:31:37.778 05:59:52 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1292874 root 20 0 128.2g 36864 23616 R 99.9 0.0 0:00.82 reactor_0' 00:31:37.778 05:59:52 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 1292874 root 20 0 128.2g 36864 23616 R 99.9 0.0 0:00.82 reactor_0 00:31:37.778 05:59:52 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:31:37.778 05:59:52 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:31:37.778 05:59:52 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=99.9 00:31:37.778 05:59:52 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=99 00:31:37.778 05:59:52 reactor_set_interrupt -- 
interrupt/common.sh@28 -- # [[ busy = \b\u\s\y ]] 00:31:37.778 05:59:52 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ 99 -lt 70 ]] 00:31:37.778 05:59:52 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ busy = \i\d\l\e ]] 00:31:37.778 05:59:52 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:31:37.778 05:59:52 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@46 -- # for i in 0 2 00:31:37.778 05:59:52 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@47 -- # reactor_is_busy 1292874 2 00:31:37.778 05:59:52 reactor_set_interrupt -- interrupt/common.sh@47 -- # reactor_is_busy_or_idle 1292874 2 busy 00:31:37.778 05:59:52 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1292874 00:31:37.778 05:59:52 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:31:37.778 05:59:52 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=busy 00:31:37.778 05:59:52 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ busy != \b\u\s\y ]] 00:31:37.778 05:59:52 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:31:37.778 05:59:52 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:31:37.778 05:59:52 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:31:37.778 05:59:52 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 1292874 -w 256 00:31:37.778 05:59:52 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:31:38.037 05:59:52 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1292884 root 20 0 128.2g 36864 23616 R 99.9 0.0 0:00.36 reactor_2' 00:31:38.037 05:59:52 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 1292884 root 20 0 128.2g 36864 23616 R 99.9 0.0 0:00.36 reactor_2 00:31:38.037 05:59:52 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:31:38.037 05:59:52 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:31:38.037 05:59:52 
reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=99.9 00:31:38.037 05:59:52 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=99 00:31:38.037 05:59:52 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ busy = \b\u\s\y ]] 00:31:38.037 05:59:52 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ 99 -lt 70 ]] 00:31:38.037 05:59:52 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ busy = \i\d\l\e ]] 00:31:38.037 05:59:52 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:31:38.037 05:59:52 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@51 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 2 00:31:38.296 [2024-07-26 05:59:52.991066] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to enable interrupt mode on reactor 2. 00:31:38.296 [2024-07-26 05:59:52.991177] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:31:38.296 05:59:53 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@52 -- # '[' x '!=' x ']' 00:31:38.296 05:59:53 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@59 -- # reactor_is_idle 1292874 2 00:31:38.296 05:59:53 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 1292874 2 idle 00:31:38.296 05:59:53 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1292874 00:31:38.296 05:59:53 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:31:38.296 05:59:53 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:31:38.296 05:59:53 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:31:38.296 05:59:53 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:31:38.296 05:59:53 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:31:38.296 05:59:53 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 
00:31:38.296 05:59:53 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:31:38.296 05:59:53 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 1292874 -w 256 00:31:38.296 05:59:53 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:31:38.296 05:59:53 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1292884 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.60 reactor_2' 00:31:38.296 05:59:53 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:31:38.296 05:59:53 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 1292884 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.60 reactor_2 00:31:38.296 05:59:53 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:31:38.296 05:59:53 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:31:38.296 05:59:53 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:31:38.296 05:59:53 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:31:38.296 05:59:53 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:31:38.296 05:59:53 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:31:38.296 05:59:53 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:31:38.296 05:59:53 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 0 00:31:38.555 [2024-07-26 05:59:53.351987] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to enable interrupt mode on reactor 0. 00:31:38.555 [2024-07-26 05:59:53.352153] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from poll mode. 
00:31:38.555 [2024-07-26 05:59:53.352176] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:31:38.555 05:59:53 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@63 -- # '[' x '!=' x ']' 00:31:38.555 05:59:53 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@70 -- # reactor_is_idle 1292874 0 00:31:38.555 05:59:53 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 1292874 0 idle 00:31:38.555 05:59:53 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1292874 00:31:38.555 05:59:53 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:31:38.555 05:59:53 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:31:38.555 05:59:53 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:31:38.555 05:59:53 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:31:38.555 05:59:53 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:31:38.555 05:59:53 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:31:38.555 05:59:53 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:31:38.555 05:59:53 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 1292874 -w 256 00:31:38.555 05:59:53 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:31:38.814 05:59:53 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1292874 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:01.60 reactor_0' 00:31:38.814 05:59:53 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 1292874 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:01.60 reactor_0 00:31:38.814 05:59:53 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:31:38.814 05:59:53 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:31:38.814 05:59:53 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:31:38.814 05:59:53 
reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:31:38.814 05:59:53 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:31:38.814 05:59:53 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:31:38.814 05:59:53 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:31:38.814 05:59:53 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:31:38.814 05:59:53 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@72 -- # return 0 00:31:38.814 05:59:53 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@82 -- # return 0 00:31:38.814 05:59:53 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@103 -- # trap - SIGINT SIGTERM EXIT 00:31:38.814 05:59:53 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@104 -- # killprocess 1292874 00:31:38.814 05:59:53 reactor_set_interrupt -- common/autotest_common.sh@948 -- # '[' -z 1292874 ']' 00:31:38.814 05:59:53 reactor_set_interrupt -- common/autotest_common.sh@952 -- # kill -0 1292874 00:31:38.814 05:59:53 reactor_set_interrupt -- common/autotest_common.sh@953 -- # uname 00:31:38.814 05:59:53 reactor_set_interrupt -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:31:38.814 05:59:53 reactor_set_interrupt -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1292874 00:31:38.814 05:59:53 reactor_set_interrupt -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:31:38.814 05:59:53 reactor_set_interrupt -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:31:38.814 05:59:53 reactor_set_interrupt -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1292874' 00:31:38.814 killing process with pid 1292874 00:31:38.814 05:59:53 reactor_set_interrupt -- common/autotest_common.sh@967 -- # kill 1292874 00:31:38.814 05:59:53 reactor_set_interrupt -- common/autotest_common.sh@972 -- # wait 1292874 00:31:39.073 05:59:53 reactor_set_interrupt -- 
interrupt/reactor_set_interrupt.sh@105 -- # cleanup 00:31:39.073 05:59:53 reactor_set_interrupt -- interrupt/common.sh@6 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile 00:31:39.073 00:31:39.073 real 0m9.965s 00:31:39.073 user 0m9.304s 00:31:39.073 sys 0m2.082s 00:31:39.073 05:59:53 reactor_set_interrupt -- common/autotest_common.sh@1124 -- # xtrace_disable 00:31:39.073 05:59:53 reactor_set_interrupt -- common/autotest_common.sh@10 -- # set +x 00:31:39.073 ************************************ 00:31:39.073 END TEST reactor_set_interrupt 00:31:39.073 ************************************ 00:31:39.073 05:59:53 -- common/autotest_common.sh@1142 -- # return 0 00:31:39.074 05:59:53 -- spdk/autotest.sh@194 -- # run_test reap_unregistered_poller /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reap_unregistered_poller.sh 00:31:39.074 05:59:53 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:31:39.074 05:59:53 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:31:39.074 05:59:53 -- common/autotest_common.sh@10 -- # set +x 00:31:39.074 ************************************ 00:31:39.074 START TEST reap_unregistered_poller 00:31:39.074 ************************************ 00:31:39.074 05:59:53 reap_unregistered_poller -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reap_unregistered_poller.sh 00:31:39.335 * Looking for test storage... 
00:31:39.335 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:31:39.335 05:59:54 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/interrupt_common.sh 00:31:39.335 05:59:54 reap_unregistered_poller -- interrupt/interrupt_common.sh@5 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reap_unregistered_poller.sh 00:31:39.335 05:59:54 reap_unregistered_poller -- interrupt/interrupt_common.sh@5 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:31:39.335 05:59:54 reap_unregistered_poller -- interrupt/interrupt_common.sh@5 -- # testdir=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:31:39.335 05:59:54 reap_unregistered_poller -- interrupt/interrupt_common.sh@6 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/../.. 00:31:39.335 05:59:54 reap_unregistered_poller -- interrupt/interrupt_common.sh@6 -- # rootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:31:39.335 05:59:54 reap_unregistered_poller -- interrupt/interrupt_common.sh@7 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/autotest_common.sh 00:31:39.335 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@7 -- # rpc_py=rpc_cmd 00:31:39.335 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@34 -- # set -e 00:31:39.335 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@35 -- # shopt -s nullglob 00:31:39.335 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@36 -- # shopt -s extglob 00:31:39.335 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@37 -- # shopt -s inherit_errexit 00:31:39.335 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@39 -- # '[' -z /var/jenkins/workspace/crypto-phy-autotest/spdk/../output ']' 00:31:39.335 05:59:54 reap_unregistered_poller -- 
common/autotest_common.sh@44 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/build_config.sh ]] 00:31:39.335 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/build_config.sh 00:31:39.335 05:59:54 reap_unregistered_poller -- common/build_config.sh@1 -- # CONFIG_WPDK_DIR= 00:31:39.335 05:59:54 reap_unregistered_poller -- common/build_config.sh@2 -- # CONFIG_ASAN=n 00:31:39.335 05:59:54 reap_unregistered_poller -- common/build_config.sh@3 -- # CONFIG_VBDEV_COMPRESS=y 00:31:39.335 05:59:54 reap_unregistered_poller -- common/build_config.sh@4 -- # CONFIG_HAVE_EXECINFO_H=y 00:31:39.335 05:59:54 reap_unregistered_poller -- common/build_config.sh@5 -- # CONFIG_USDT=n 00:31:39.335 05:59:54 reap_unregistered_poller -- common/build_config.sh@6 -- # CONFIG_CUSTOMOCF=n 00:31:39.335 05:59:54 reap_unregistered_poller -- common/build_config.sh@7 -- # CONFIG_PREFIX=/usr/local 00:31:39.335 05:59:54 reap_unregistered_poller -- common/build_config.sh@8 -- # CONFIG_RBD=n 00:31:39.335 05:59:54 reap_unregistered_poller -- common/build_config.sh@9 -- # CONFIG_LIBDIR= 00:31:39.335 05:59:54 reap_unregistered_poller -- common/build_config.sh@10 -- # CONFIG_IDXD=y 00:31:39.335 05:59:54 reap_unregistered_poller -- common/build_config.sh@11 -- # CONFIG_NVME_CUSE=y 00:31:39.335 05:59:54 reap_unregistered_poller -- common/build_config.sh@12 -- # CONFIG_SMA=n 00:31:39.335 05:59:54 reap_unregistered_poller -- common/build_config.sh@13 -- # CONFIG_VTUNE=n 00:31:39.335 05:59:54 reap_unregistered_poller -- common/build_config.sh@14 -- # CONFIG_TSAN=n 00:31:39.335 05:59:54 reap_unregistered_poller -- common/build_config.sh@15 -- # CONFIG_RDMA_SEND_WITH_INVAL=y 00:31:39.335 05:59:54 reap_unregistered_poller -- common/build_config.sh@16 -- # CONFIG_VFIO_USER_DIR= 00:31:39.335 05:59:54 reap_unregistered_poller -- common/build_config.sh@17 -- # CONFIG_PGO_CAPTURE=n 00:31:39.335 05:59:54 
reap_unregistered_poller -- common/build_config.sh@18 -- # CONFIG_HAVE_UUID_GENERATE_SHA1=y 00:31:39.335 05:59:54 reap_unregistered_poller -- common/build_config.sh@19 -- # CONFIG_ENV=/var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk 00:31:39.335 05:59:54 reap_unregistered_poller -- common/build_config.sh@20 -- # CONFIG_LTO=n 00:31:39.335 05:59:54 reap_unregistered_poller -- common/build_config.sh@21 -- # CONFIG_ISCSI_INITIATOR=y 00:31:39.335 05:59:54 reap_unregistered_poller -- common/build_config.sh@22 -- # CONFIG_CET=n 00:31:39.335 05:59:54 reap_unregistered_poller -- common/build_config.sh@23 -- # CONFIG_VBDEV_COMPRESS_MLX5=y 00:31:39.335 05:59:54 reap_unregistered_poller -- common/build_config.sh@24 -- # CONFIG_OCF_PATH= 00:31:39.335 05:59:54 reap_unregistered_poller -- common/build_config.sh@25 -- # CONFIG_RDMA_SET_TOS=y 00:31:39.335 05:59:54 reap_unregistered_poller -- common/build_config.sh@26 -- # CONFIG_HAVE_ARC4RANDOM=y 00:31:39.335 05:59:54 reap_unregistered_poller -- common/build_config.sh@27 -- # CONFIG_HAVE_LIBARCHIVE=n 00:31:39.335 05:59:54 reap_unregistered_poller -- common/build_config.sh@28 -- # CONFIG_UBLK=y 00:31:39.335 05:59:54 reap_unregistered_poller -- common/build_config.sh@29 -- # CONFIG_ISAL_CRYPTO=y 00:31:39.335 05:59:54 reap_unregistered_poller -- common/build_config.sh@30 -- # CONFIG_OPENSSL_PATH= 00:31:39.335 05:59:54 reap_unregistered_poller -- common/build_config.sh@31 -- # CONFIG_OCF=n 00:31:39.335 05:59:54 reap_unregistered_poller -- common/build_config.sh@32 -- # CONFIG_FUSE=n 00:31:39.335 05:59:54 reap_unregistered_poller -- common/build_config.sh@33 -- # CONFIG_VTUNE_DIR= 00:31:39.335 05:59:54 reap_unregistered_poller -- common/build_config.sh@34 -- # CONFIG_FUZZER_LIB= 00:31:39.335 05:59:54 reap_unregistered_poller -- common/build_config.sh@35 -- # CONFIG_FUZZER=n 00:31:39.335 05:59:54 reap_unregistered_poller -- common/build_config.sh@36 -- # 
CONFIG_DPDK_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build 00:31:39.335 05:59:54 reap_unregistered_poller -- common/build_config.sh@37 -- # CONFIG_CRYPTO=y 00:31:39.335 05:59:54 reap_unregistered_poller -- common/build_config.sh@38 -- # CONFIG_PGO_USE=n 00:31:39.335 05:59:54 reap_unregistered_poller -- common/build_config.sh@39 -- # CONFIG_VHOST=y 00:31:39.335 05:59:54 reap_unregistered_poller -- common/build_config.sh@40 -- # CONFIG_DAOS=n 00:31:39.335 05:59:54 reap_unregistered_poller -- common/build_config.sh@41 -- # CONFIG_DPDK_INC_DIR= 00:31:39.335 05:59:54 reap_unregistered_poller -- common/build_config.sh@42 -- # CONFIG_DAOS_DIR= 00:31:39.335 05:59:54 reap_unregistered_poller -- common/build_config.sh@43 -- # CONFIG_UNIT_TESTS=n 00:31:39.335 05:59:54 reap_unregistered_poller -- common/build_config.sh@44 -- # CONFIG_RDMA_SET_ACK_TIMEOUT=y 00:31:39.335 05:59:54 reap_unregistered_poller -- common/build_config.sh@45 -- # CONFIG_VIRTIO=y 00:31:39.335 05:59:54 reap_unregistered_poller -- common/build_config.sh@46 -- # CONFIG_DPDK_UADK=n 00:31:39.335 05:59:54 reap_unregistered_poller -- common/build_config.sh@47 -- # CONFIG_COVERAGE=y 00:31:39.335 05:59:54 reap_unregistered_poller -- common/build_config.sh@48 -- # CONFIG_RDMA=y 00:31:39.335 05:59:54 reap_unregistered_poller -- common/build_config.sh@49 -- # CONFIG_FIO_SOURCE_DIR=/usr/src/fio 00:31:39.335 05:59:54 reap_unregistered_poller -- common/build_config.sh@50 -- # CONFIG_URING_PATH= 00:31:39.335 05:59:54 reap_unregistered_poller -- common/build_config.sh@51 -- # CONFIG_XNVME=n 00:31:39.336 05:59:54 reap_unregistered_poller -- common/build_config.sh@52 -- # CONFIG_VFIO_USER=n 00:31:39.336 05:59:54 reap_unregistered_poller -- common/build_config.sh@53 -- # CONFIG_ARCH=native 00:31:39.336 05:59:54 reap_unregistered_poller -- common/build_config.sh@54 -- # CONFIG_HAVE_EVP_MAC=y 00:31:39.336 05:59:54 reap_unregistered_poller -- common/build_config.sh@55 -- # CONFIG_URING_ZNS=n 00:31:39.336 
05:59:54 reap_unregistered_poller -- common/build_config.sh@56 -- # CONFIG_WERROR=y 00:31:39.336 05:59:54 reap_unregistered_poller -- common/build_config.sh@57 -- # CONFIG_HAVE_LIBBSD=n 00:31:39.336 05:59:54 reap_unregistered_poller -- common/build_config.sh@58 -- # CONFIG_UBSAN=y 00:31:39.336 05:59:54 reap_unregistered_poller -- common/build_config.sh@59 -- # CONFIG_IPSEC_MB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib 00:31:39.336 05:59:54 reap_unregistered_poller -- common/build_config.sh@60 -- # CONFIG_GOLANG=n 00:31:39.336 05:59:54 reap_unregistered_poller -- common/build_config.sh@61 -- # CONFIG_ISAL=y 00:31:39.336 05:59:54 reap_unregistered_poller -- common/build_config.sh@62 -- # CONFIG_IDXD_KERNEL=y 00:31:39.336 05:59:54 reap_unregistered_poller -- common/build_config.sh@63 -- # CONFIG_DPDK_LIB_DIR= 00:31:39.336 05:59:54 reap_unregistered_poller -- common/build_config.sh@64 -- # CONFIG_RDMA_PROV=verbs 00:31:39.336 05:59:54 reap_unregistered_poller -- common/build_config.sh@65 -- # CONFIG_APPS=y 00:31:39.336 05:59:54 reap_unregistered_poller -- common/build_config.sh@66 -- # CONFIG_SHARED=y 00:31:39.336 05:59:54 reap_unregistered_poller -- common/build_config.sh@67 -- # CONFIG_HAVE_KEYUTILS=y 00:31:39.336 05:59:54 reap_unregistered_poller -- common/build_config.sh@68 -- # CONFIG_FC_PATH= 00:31:39.336 05:59:54 reap_unregistered_poller -- common/build_config.sh@69 -- # CONFIG_DPDK_PKG_CONFIG=n 00:31:39.336 05:59:54 reap_unregistered_poller -- common/build_config.sh@70 -- # CONFIG_FC=n 00:31:39.336 05:59:54 reap_unregistered_poller -- common/build_config.sh@71 -- # CONFIG_AVAHI=n 00:31:39.336 05:59:54 reap_unregistered_poller -- common/build_config.sh@72 -- # CONFIG_FIO_PLUGIN=y 00:31:39.336 05:59:54 reap_unregistered_poller -- common/build_config.sh@73 -- # CONFIG_RAID5F=n 00:31:39.336 05:59:54 reap_unregistered_poller -- common/build_config.sh@74 -- # CONFIG_EXAMPLES=y 00:31:39.336 05:59:54 reap_unregistered_poller -- 
common/build_config.sh@75 -- # CONFIG_TESTS=y 00:31:39.336 05:59:54 reap_unregistered_poller -- common/build_config.sh@76 -- # CONFIG_CRYPTO_MLX5=y 00:31:39.336 05:59:54 reap_unregistered_poller -- common/build_config.sh@77 -- # CONFIG_MAX_LCORES=128 00:31:39.336 05:59:54 reap_unregistered_poller -- common/build_config.sh@78 -- # CONFIG_IPSEC_MB=y 00:31:39.336 05:59:54 reap_unregistered_poller -- common/build_config.sh@79 -- # CONFIG_PGO_DIR= 00:31:39.336 05:59:54 reap_unregistered_poller -- common/build_config.sh@80 -- # CONFIG_DEBUG=y 00:31:39.336 05:59:54 reap_unregistered_poller -- common/build_config.sh@81 -- # CONFIG_DPDK_COMPRESSDEV=y 00:31:39.336 05:59:54 reap_unregistered_poller -- common/build_config.sh@82 -- # CONFIG_CROSS_PREFIX= 00:31:39.336 05:59:54 reap_unregistered_poller -- common/build_config.sh@83 -- # CONFIG_URING=n 00:31:39.336 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@54 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/applications.sh 00:31:39.336 05:59:54 reap_unregistered_poller -- common/applications.sh@8 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/applications.sh 00:31:39.336 05:59:54 reap_unregistered_poller -- common/applications.sh@8 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common 00:31:39.336 05:59:54 reap_unregistered_poller -- common/applications.sh@8 -- # _root=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/common 00:31:39.336 05:59:54 reap_unregistered_poller -- common/applications.sh@9 -- # _root=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:31:39.336 05:59:54 reap_unregistered_poller -- common/applications.sh@10 -- # _app_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:31:39.336 05:59:54 reap_unregistered_poller -- common/applications.sh@11 -- # _test_app_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/app 00:31:39.336 05:59:54 reap_unregistered_poller -- common/applications.sh@12 -- # 
_examples_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:31:39.336 05:59:54 reap_unregistered_poller -- common/applications.sh@14 -- # VHOST_FUZZ_APP=("$_test_app_dir/fuzz/vhost_fuzz/vhost_fuzz") 00:31:39.336 05:59:54 reap_unregistered_poller -- common/applications.sh@15 -- # ISCSI_APP=("$_app_dir/iscsi_tgt") 00:31:39.336 05:59:54 reap_unregistered_poller -- common/applications.sh@16 -- # NVMF_APP=("$_app_dir/nvmf_tgt") 00:31:39.336 05:59:54 reap_unregistered_poller -- common/applications.sh@17 -- # VHOST_APP=("$_app_dir/vhost") 00:31:39.336 05:59:54 reap_unregistered_poller -- common/applications.sh@18 -- # DD_APP=("$_app_dir/spdk_dd") 00:31:39.336 05:59:54 reap_unregistered_poller -- common/applications.sh@19 -- # SPDK_APP=("$_app_dir/spdk_tgt") 00:31:39.336 05:59:54 reap_unregistered_poller -- common/applications.sh@22 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/config.h ]] 00:31:39.336 05:59:54 reap_unregistered_poller -- common/applications.sh@23 -- # [[ #ifndef SPDK_CONFIG_H 00:31:39.336 #define SPDK_CONFIG_H 00:31:39.336 #define SPDK_CONFIG_APPS 1 00:31:39.336 #define SPDK_CONFIG_ARCH native 00:31:39.336 #undef SPDK_CONFIG_ASAN 00:31:39.336 #undef SPDK_CONFIG_AVAHI 00:31:39.336 #undef SPDK_CONFIG_CET 00:31:39.336 #define SPDK_CONFIG_COVERAGE 1 00:31:39.336 #define SPDK_CONFIG_CROSS_PREFIX 00:31:39.336 #define SPDK_CONFIG_CRYPTO 1 00:31:39.336 #define SPDK_CONFIG_CRYPTO_MLX5 1 00:31:39.336 #undef SPDK_CONFIG_CUSTOMOCF 00:31:39.336 #undef SPDK_CONFIG_DAOS 00:31:39.336 #define SPDK_CONFIG_DAOS_DIR 00:31:39.336 #define SPDK_CONFIG_DEBUG 1 00:31:39.336 #define SPDK_CONFIG_DPDK_COMPRESSDEV 1 00:31:39.336 #define SPDK_CONFIG_DPDK_DIR /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build 00:31:39.336 #define SPDK_CONFIG_DPDK_INC_DIR 00:31:39.336 #define SPDK_CONFIG_DPDK_LIB_DIR 00:31:39.336 #undef SPDK_CONFIG_DPDK_PKG_CONFIG 00:31:39.336 #undef SPDK_CONFIG_DPDK_UADK 00:31:39.336 #define SPDK_CONFIG_ENV 
/var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk 00:31:39.336 #define SPDK_CONFIG_EXAMPLES 1 00:31:39.336 #undef SPDK_CONFIG_FC 00:31:39.336 #define SPDK_CONFIG_FC_PATH 00:31:39.336 #define SPDK_CONFIG_FIO_PLUGIN 1 00:31:39.336 #define SPDK_CONFIG_FIO_SOURCE_DIR /usr/src/fio 00:31:39.336 #undef SPDK_CONFIG_FUSE 00:31:39.336 #undef SPDK_CONFIG_FUZZER 00:31:39.336 #define SPDK_CONFIG_FUZZER_LIB 00:31:39.336 #undef SPDK_CONFIG_GOLANG 00:31:39.336 #define SPDK_CONFIG_HAVE_ARC4RANDOM 1 00:31:39.336 #define SPDK_CONFIG_HAVE_EVP_MAC 1 00:31:39.336 #define SPDK_CONFIG_HAVE_EXECINFO_H 1 00:31:39.336 #define SPDK_CONFIG_HAVE_KEYUTILS 1 00:31:39.336 #undef SPDK_CONFIG_HAVE_LIBARCHIVE 00:31:39.336 #undef SPDK_CONFIG_HAVE_LIBBSD 00:31:39.336 #define SPDK_CONFIG_HAVE_UUID_GENERATE_SHA1 1 00:31:39.336 #define SPDK_CONFIG_IDXD 1 00:31:39.336 #define SPDK_CONFIG_IDXD_KERNEL 1 00:31:39.336 #define SPDK_CONFIG_IPSEC_MB 1 00:31:39.336 #define SPDK_CONFIG_IPSEC_MB_DIR /var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib 00:31:39.336 #define SPDK_CONFIG_ISAL 1 00:31:39.336 #define SPDK_CONFIG_ISAL_CRYPTO 1 00:31:39.336 #define SPDK_CONFIG_ISCSI_INITIATOR 1 00:31:39.336 #define SPDK_CONFIG_LIBDIR 00:31:39.336 #undef SPDK_CONFIG_LTO 00:31:39.336 #define SPDK_CONFIG_MAX_LCORES 128 00:31:39.336 #define SPDK_CONFIG_NVME_CUSE 1 00:31:39.336 #undef SPDK_CONFIG_OCF 00:31:39.336 #define SPDK_CONFIG_OCF_PATH 00:31:39.336 #define SPDK_CONFIG_OPENSSL_PATH 00:31:39.336 #undef SPDK_CONFIG_PGO_CAPTURE 00:31:39.336 #define SPDK_CONFIG_PGO_DIR 00:31:39.336 #undef SPDK_CONFIG_PGO_USE 00:31:39.336 #define SPDK_CONFIG_PREFIX /usr/local 00:31:39.336 #undef SPDK_CONFIG_RAID5F 00:31:39.336 #undef SPDK_CONFIG_RBD 00:31:39.336 #define SPDK_CONFIG_RDMA 1 00:31:39.336 #define SPDK_CONFIG_RDMA_PROV verbs 00:31:39.336 #define SPDK_CONFIG_RDMA_SEND_WITH_INVAL 1 00:31:39.336 #define SPDK_CONFIG_RDMA_SET_ACK_TIMEOUT 1 00:31:39.336 #define SPDK_CONFIG_RDMA_SET_TOS 1 00:31:39.336 #define 
SPDK_CONFIG_SHARED 1 00:31:39.336 #undef SPDK_CONFIG_SMA 00:31:39.336 #define SPDK_CONFIG_TESTS 1 00:31:39.336 #undef SPDK_CONFIG_TSAN 00:31:39.336 #define SPDK_CONFIG_UBLK 1 00:31:39.336 #define SPDK_CONFIG_UBSAN 1 00:31:39.336 #undef SPDK_CONFIG_UNIT_TESTS 00:31:39.336 #undef SPDK_CONFIG_URING 00:31:39.336 #define SPDK_CONFIG_URING_PATH 00:31:39.336 #undef SPDK_CONFIG_URING_ZNS 00:31:39.336 #undef SPDK_CONFIG_USDT 00:31:39.336 #define SPDK_CONFIG_VBDEV_COMPRESS 1 00:31:39.336 #define SPDK_CONFIG_VBDEV_COMPRESS_MLX5 1 00:31:39.336 #undef SPDK_CONFIG_VFIO_USER 00:31:39.336 #define SPDK_CONFIG_VFIO_USER_DIR 00:31:39.336 #define SPDK_CONFIG_VHOST 1 00:31:39.336 #define SPDK_CONFIG_VIRTIO 1 00:31:39.336 #undef SPDK_CONFIG_VTUNE 00:31:39.336 #define SPDK_CONFIG_VTUNE_DIR 00:31:39.336 #define SPDK_CONFIG_WERROR 1 00:31:39.336 #define SPDK_CONFIG_WPDK_DIR 00:31:39.336 #undef SPDK_CONFIG_XNVME 00:31:39.336 #endif /* SPDK_CONFIG_H */ == *\#\d\e\f\i\n\e\ \S\P\D\K\_\C\O\N\F\I\G\_\D\E\B\U\G* ]] 00:31:39.336 05:59:54 reap_unregistered_poller -- common/applications.sh@24 -- # (( SPDK_AUTOTEST_DEBUG_APPS )) 00:31:39.336 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@55 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:31:39.336 05:59:54 reap_unregistered_poller -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:31:39.336 05:59:54 reap_unregistered_poller -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:31:39.336 05:59:54 reap_unregistered_poller -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:31:39.336 05:59:54 reap_unregistered_poller -- paths/export.sh@2 -- # 
PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:31:39.336 05:59:54 reap_unregistered_poller -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:31:39.336 05:59:54 reap_unregistered_poller -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:31:39.336 05:59:54 reap_unregistered_poller -- paths/export.sh@5 -- # export PATH 00:31:39.337 05:59:54 reap_unregistered_poller -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:31:39.337 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@56 -- # source 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/common 00:31:39.337 05:59:54 reap_unregistered_poller -- pm/common@6 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/common 00:31:39.337 05:59:54 reap_unregistered_poller -- pm/common@6 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm 00:31:39.337 05:59:54 reap_unregistered_poller -- pm/common@6 -- # _pmdir=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm 00:31:39.337 05:59:54 reap_unregistered_poller -- pm/common@7 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/../../../ 00:31:39.337 05:59:54 reap_unregistered_poller -- pm/common@7 -- # _pmrootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:31:39.337 05:59:54 reap_unregistered_poller -- pm/common@64 -- # TEST_TAG=N/A 00:31:39.337 05:59:54 reap_unregistered_poller -- pm/common@65 -- # TEST_TAG_FILE=/var/jenkins/workspace/crypto-phy-autotest/spdk/.run_test_name 00:31:39.337 05:59:54 reap_unregistered_poller -- pm/common@67 -- # PM_OUTPUTDIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power 00:31:39.337 05:59:54 reap_unregistered_poller -- pm/common@68 -- # uname -s 00:31:39.337 05:59:54 reap_unregistered_poller -- pm/common@68 -- # PM_OS=Linux 00:31:39.337 05:59:54 reap_unregistered_poller -- pm/common@70 -- # MONITOR_RESOURCES_SUDO=() 00:31:39.337 05:59:54 reap_unregistered_poller -- pm/common@70 -- # declare -A MONITOR_RESOURCES_SUDO 00:31:39.337 05:59:54 reap_unregistered_poller -- pm/common@71 -- # MONITOR_RESOURCES_SUDO["collect-bmc-pm"]=1 00:31:39.337 05:59:54 reap_unregistered_poller -- pm/common@72 -- # MONITOR_RESOURCES_SUDO["collect-cpu-load"]=0 00:31:39.337 05:59:54 reap_unregistered_poller -- pm/common@73 -- # MONITOR_RESOURCES_SUDO["collect-cpu-temp"]=0 00:31:39.337 05:59:54 reap_unregistered_poller -- pm/common@74 -- # MONITOR_RESOURCES_SUDO["collect-vmstat"]=0 00:31:39.337 05:59:54 reap_unregistered_poller -- 
pm/common@76 -- # SUDO[0]= 00:31:39.337 05:59:54 reap_unregistered_poller -- pm/common@76 -- # SUDO[1]='sudo -E' 00:31:39.337 05:59:54 reap_unregistered_poller -- pm/common@78 -- # MONITOR_RESOURCES=(collect-cpu-load collect-vmstat) 00:31:39.337 05:59:54 reap_unregistered_poller -- pm/common@79 -- # [[ Linux == FreeBSD ]] 00:31:39.337 05:59:54 reap_unregistered_poller -- pm/common@81 -- # [[ Linux == Linux ]] 00:31:39.337 05:59:54 reap_unregistered_poller -- pm/common@81 -- # [[ ............................... != QEMU ]] 00:31:39.337 05:59:54 reap_unregistered_poller -- pm/common@81 -- # [[ ! -e /.dockerenv ]] 00:31:39.337 05:59:54 reap_unregistered_poller -- pm/common@84 -- # MONITOR_RESOURCES+=(collect-cpu-temp) 00:31:39.337 05:59:54 reap_unregistered_poller -- pm/common@85 -- # MONITOR_RESOURCES+=(collect-bmc-pm) 00:31:39.337 05:59:54 reap_unregistered_poller -- pm/common@88 -- # [[ ! -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power ]] 00:31:39.337 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@58 -- # : 0 00:31:39.337 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@59 -- # export RUN_NIGHTLY 00:31:39.337 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@62 -- # : 0 00:31:39.337 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@63 -- # export SPDK_AUTOTEST_DEBUG_APPS 00:31:39.337 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@64 -- # : 0 00:31:39.337 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@65 -- # export SPDK_RUN_VALGRIND 00:31:39.337 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@66 -- # : 1 00:31:39.337 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@67 -- # export SPDK_RUN_FUNCTIONAL_TEST 00:31:39.337 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@68 -- # : 0 00:31:39.337 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@69 -- # export SPDK_TEST_UNITTEST 00:31:39.337 05:59:54 
reap_unregistered_poller -- common/autotest_common.sh@70 -- # : 00:31:39.337 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@71 -- # export SPDK_TEST_AUTOBUILD 00:31:39.337 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@72 -- # : 0 00:31:39.337 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@73 -- # export SPDK_TEST_RELEASE_BUILD 00:31:39.337 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@74 -- # : 1 00:31:39.337 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@75 -- # export SPDK_TEST_ISAL 00:31:39.337 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@76 -- # : 0 00:31:39.337 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@77 -- # export SPDK_TEST_ISCSI 00:31:39.337 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@78 -- # : 0 00:31:39.337 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@79 -- # export SPDK_TEST_ISCSI_INITIATOR 00:31:39.337 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@80 -- # : 0 00:31:39.337 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@81 -- # export SPDK_TEST_NVME 00:31:39.337 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@82 -- # : 0 00:31:39.337 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@83 -- # export SPDK_TEST_NVME_PMR 00:31:39.337 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@84 -- # : 0 00:31:39.337 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@85 -- # export SPDK_TEST_NVME_BP 00:31:39.337 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@86 -- # : 0 00:31:39.337 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@87 -- # export SPDK_TEST_NVME_CLI 00:31:39.337 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@88 -- # : 0 00:31:39.337 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@89 -- # export SPDK_TEST_NVME_CUSE 00:31:39.337 05:59:54 
reap_unregistered_poller -- common/autotest_common.sh@90 -- # : 0 00:31:39.337 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@91 -- # export SPDK_TEST_NVME_FDP 00:31:39.337 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@92 -- # : 0 00:31:39.337 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@93 -- # export SPDK_TEST_NVMF 00:31:39.337 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@94 -- # : 0 00:31:39.337 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@95 -- # export SPDK_TEST_VFIOUSER 00:31:39.337 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@96 -- # : 0 00:31:39.337 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@97 -- # export SPDK_TEST_VFIOUSER_QEMU 00:31:39.337 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@98 -- # : 0 00:31:39.337 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@99 -- # export SPDK_TEST_FUZZER 00:31:39.337 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@100 -- # : 0 00:31:39.337 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@101 -- # export SPDK_TEST_FUZZER_SHORT 00:31:39.337 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@102 -- # : rdma 00:31:39.337 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@103 -- # export SPDK_TEST_NVMF_TRANSPORT 00:31:39.337 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@104 -- # : 0 00:31:39.337 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@105 -- # export SPDK_TEST_RBD 00:31:39.337 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@106 -- # : 0 00:31:39.337 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@107 -- # export SPDK_TEST_VHOST 00:31:39.337 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@108 -- # : 1 00:31:39.337 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@109 -- # export SPDK_TEST_BLOCKDEV 00:31:39.337 05:59:54 
reap_unregistered_poller -- common/autotest_common.sh@110 -- # : 0 00:31:39.337 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@111 -- # export SPDK_TEST_IOAT 00:31:39.337 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@112 -- # : 0 00:31:39.337 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@113 -- # export SPDK_TEST_BLOBFS 00:31:39.337 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@114 -- # : 0 00:31:39.337 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@115 -- # export SPDK_TEST_VHOST_INIT 00:31:39.337 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@116 -- # : 0 00:31:39.337 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@117 -- # export SPDK_TEST_LVOL 00:31:39.337 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@118 -- # : 1 00:31:39.337 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@119 -- # export SPDK_TEST_VBDEV_COMPRESS 00:31:39.337 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@120 -- # : 0 00:31:39.337 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@121 -- # export SPDK_RUN_ASAN 00:31:39.337 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@122 -- # : 1 00:31:39.337 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@123 -- # export SPDK_RUN_UBSAN 00:31:39.337 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@124 -- # : 00:31:39.337 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@125 -- # export SPDK_RUN_EXTERNAL_DPDK 00:31:39.337 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@126 -- # : 0 00:31:39.337 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@127 -- # export SPDK_RUN_NON_ROOT 00:31:39.337 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@128 -- # : 1 00:31:39.337 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@129 -- # export SPDK_TEST_CRYPTO 00:31:39.337 05:59:54 
reap_unregistered_poller -- common/autotest_common.sh@130 -- # : 0 00:31:39.337 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@131 -- # export SPDK_TEST_FTL 00:31:39.337 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@132 -- # : 0 00:31:39.337 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@133 -- # export SPDK_TEST_OCF 00:31:39.337 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@134 -- # : 0 00:31:39.337 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@135 -- # export SPDK_TEST_VMD 00:31:39.337 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@136 -- # : 0 00:31:39.337 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@137 -- # export SPDK_TEST_OPAL 00:31:39.337 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@138 -- # : 00:31:39.337 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@139 -- # export SPDK_TEST_NATIVE_DPDK 00:31:39.337 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@140 -- # : true 00:31:39.337 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@141 -- # export SPDK_AUTOTEST_X 00:31:39.337 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@142 -- # : 0 00:31:39.337 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@143 -- # export SPDK_TEST_RAID5 00:31:39.337 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@144 -- # : 0 00:31:39.337 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@145 -- # export SPDK_TEST_URING 00:31:39.337 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@146 -- # : 0 00:31:39.337 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@147 -- # export SPDK_TEST_USDT 00:31:39.337 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@148 -- # : 0 00:31:39.337 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@149 -- # export SPDK_TEST_USE_IGB_UIO 00:31:39.337 05:59:54 reap_unregistered_poller 
-- common/autotest_common.sh@150 -- # : 0 00:31:39.337 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@151 -- # export SPDK_TEST_SCHEDULER 00:31:39.338 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@152 -- # : 0 00:31:39.338 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@153 -- # export SPDK_TEST_SCANBUILD 00:31:39.338 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@154 -- # : 00:31:39.338 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@155 -- # export SPDK_TEST_NVMF_NICS 00:31:39.338 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@156 -- # : 0 00:31:39.338 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@157 -- # export SPDK_TEST_SMA 00:31:39.338 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@158 -- # : 0 00:31:39.338 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@159 -- # export SPDK_TEST_DAOS 00:31:39.338 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@160 -- # : 0 00:31:39.338 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@161 -- # export SPDK_TEST_XNVME 00:31:39.338 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@162 -- # : 0 00:31:39.338 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@163 -- # export SPDK_TEST_ACCEL_DSA 00:31:39.338 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@164 -- # : 0 00:31:39.338 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@165 -- # export SPDK_TEST_ACCEL_IAA 00:31:39.338 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@167 -- # : 00:31:39.338 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@168 -- # export SPDK_TEST_FUZZER_TARGET 00:31:39.338 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@169 -- # : 0 00:31:39.338 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@170 -- # export SPDK_TEST_NVMF_MDNS 00:31:39.338 05:59:54 reap_unregistered_poller -- 
common/autotest_common.sh@171 -- # : 0 00:31:39.338 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@172 -- # export SPDK_JSONRPC_GO_CLIENT 00:31:39.338 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@175 -- # export SPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib 00:31:39.338 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@175 -- # SPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib 00:31:39.338 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@176 -- # export DPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib 00:31:39.338 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@176 -- # DPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib 00:31:39.338 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@177 -- # export VFIO_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:31:39.338 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@177 -- # VFIO_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:31:39.338 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@178 -- # export LD_LIBRARY_PATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:31:39.338 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@178 -- # 
LD_LIBRARY_PATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:31:39.338 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@181 -- # export PCI_BLOCK_SYNC_ON_RESET=yes 00:31:39.338 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@181 -- # PCI_BLOCK_SYNC_ON_RESET=yes 00:31:39.338 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@185 -- # export PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python 00:31:39.338 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@185 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python 00:31:39.338 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@189 -- # export PYTHONDONTWRITEBYTECODE=1 00:31:39.338 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@189 -- # PYTHONDONTWRITEBYTECODE=1 00:31:39.338 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@193 -- # export 
ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:31:39.338 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@193 -- # ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:31:39.338 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@194 -- # export UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:31:39.338 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@194 -- # UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:31:39.338 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@198 -- # asan_suppression_file=/var/tmp/asan_suppression_file 00:31:39.338 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@199 -- # rm -rf /var/tmp/asan_suppression_file 00:31:39.338 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@200 -- # cat 00:31:39.338 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@236 -- # echo leak:libfuse3.so 00:31:39.338 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@238 -- # export LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:31:39.338 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@238 -- # LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:31:39.338 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@240 -- # export DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:31:39.338 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@240 -- # DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:31:39.338 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@242 -- # '[' -z /var/spdk/dependencies ']' 00:31:39.338 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@245 -- # export DEPENDENCY_DIR 00:31:39.338 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@249 -- # export 
SPDK_BIN_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:31:39.338 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@249 -- # SPDK_BIN_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:31:39.338 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@250 -- # export SPDK_EXAMPLE_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:31:39.338 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@250 -- # SPDK_EXAMPLE_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:31:39.338 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@253 -- # export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:31:39.338 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@253 -- # QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:31:39.338 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@254 -- # export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:31:39.338 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@254 -- # VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:31:39.338 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@256 -- # export AR_TOOL=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:31:39.338 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@256 -- # AR_TOOL=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:31:39.338 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@259 -- # export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:31:39.338 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@259 -- # UNBIND_ENTIRE_IOMMU_GROUP=yes 00:31:39.338 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@262 -- # '[' 0 -eq 0 ']' 00:31:39.338 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@263 -- # export valgrind= 00:31:39.338 05:59:54 
reap_unregistered_poller -- common/autotest_common.sh@263 -- # valgrind= 00:31:39.338 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@269 -- # uname -s 00:31:39.338 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@269 -- # '[' Linux = Linux ']' 00:31:39.338 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@270 -- # HUGEMEM=4096 00:31:39.338 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@271 -- # export CLEAR_HUGE=yes 00:31:39.338 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@271 -- # CLEAR_HUGE=yes 00:31:39.338 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@272 -- # [[ 1 -eq 1 ]] 00:31:39.338 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@276 -- # export HUGE_EVEN_ALLOC=yes 00:31:39.338 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@276 -- # HUGE_EVEN_ALLOC=yes 00:31:39.338 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@279 -- # MAKE=make 00:31:39.338 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@280 -- # MAKEFLAGS=-j72 00:31:39.338 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@296 -- # export HUGEMEM=4096 00:31:39.338 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@296 -- # HUGEMEM=4096 00:31:39.338 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@298 -- # NO_HUGE=() 00:31:39.338 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@299 -- # TEST_MODE= 00:31:39.338 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@318 -- # [[ -z 1293517 ]] 00:31:39.338 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@318 -- # kill -0 1293517 00:31:39.338 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@1680 -- # set_test_storage 2147483648 00:31:39.338 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@328 -- # [[ -v testdir ]] 00:31:39.338 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@330 -- # local 
requested_size=2147483648 00:31:39.338 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@331 -- # local mount target_dir 00:31:39.338 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@333 -- # local -A mounts fss sizes avails uses 00:31:39.338 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@334 -- # local source fs size avail mount use 00:31:39.338 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@336 -- # local storage_fallback storage_candidates 00:31:39.338 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@338 -- # mktemp -udt spdk.XXXXXX 00:31:39.338 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@338 -- # storage_fallback=/tmp/spdk.tvl5pc 00:31:39.338 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@343 -- # storage_candidates=("$testdir" "$storage_fallback/tests/${testdir##*/}" "$storage_fallback") 00:31:39.338 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@345 -- # [[ -n '' ]] 00:31:39.338 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@350 -- # [[ -n '' ]] 00:31:39.338 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@355 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt /tmp/spdk.tvl5pc/tests/interrupt /tmp/spdk.tvl5pc 00:31:39.338 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@358 -- # requested_size=2214592512 00:31:39.339 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:31:39.339 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@327 -- # df -T 00:31:39.339 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@327 -- # grep -v Filesystem 00:31:39.339 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=spdk_devtmpfs 00:31:39.339 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@361 -- # fss["$mount"]=devtmpfs 00:31:39.339 05:59:54 
reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=67108864 00:31:39.339 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=67108864 00:31:39.339 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=0 00:31:39.339 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:31:39.339 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=/dev/pmem0 00:31:39.339 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@361 -- # fss["$mount"]=ext2 00:31:39.339 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=946290688 00:31:39.339 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=5284429824 00:31:39.339 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=4338139136 00:31:39.339 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:31:39.339 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=spdk_root 00:31:39.339 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@361 -- # fss["$mount"]=overlay 00:31:39.339 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=88891568128 00:31:39.339 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=94508527616 00:31:39.339 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=5616959488 00:31:39.339 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:31:39.339 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:31:39.339 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@361 -- # 
fss["$mount"]=tmpfs 00:31:39.339 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=47249551360 00:31:39.339 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=47254261760 00:31:39.339 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=4710400 00:31:39.339 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:31:39.339 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:31:39.339 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:31:39.339 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=18890014720 00:31:39.339 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=18901708800 00:31:39.339 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=11694080 00:31:39.339 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:31:39.339 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:31:39.339 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:31:39.339 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=47253553152 00:31:39.339 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=47254265856 00:31:39.339 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=712704 00:31:39.339 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:31:39.339 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:31:39.339 05:59:54 reap_unregistered_poller -- 
common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:31:39.339 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=9450848256 00:31:39.339 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=9450852352 00:31:39.339 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=4096 00:31:39.339 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:31:39.339 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:31:39.339 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:31:39.339 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=9450848256 00:31:39.339 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=9450852352 00:31:39.339 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=4096 00:31:39.339 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:31:39.339 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@366 -- # printf '* Looking for test storage...\n' 00:31:39.339 * Looking for test storage... 
00:31:39.339 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@368 -- # local target_space new_size 00:31:39.339 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@369 -- # for target_dir in "${storage_candidates[@]}" 00:31:39.339 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@372 -- # df /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:31:39.339 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@372 -- # awk '$1 !~ /Filesystem/{print $6}' 00:31:39.339 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@372 -- # mount=/ 00:31:39.339 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@374 -- # target_space=88891568128 00:31:39.339 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@375 -- # (( target_space == 0 || target_space < requested_size )) 00:31:39.339 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@378 -- # (( target_space >= requested_size )) 00:31:39.339 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@380 -- # [[ overlay == tmpfs ]] 00:31:39.339 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@380 -- # [[ overlay == ramfs ]] 00:31:39.339 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@380 -- # [[ / == / ]] 00:31:39.339 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@381 -- # new_size=7831552000 00:31:39.339 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@382 -- # (( new_size * 100 / sizes[/] > 95 )) 00:31:39.339 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@387 -- # export SPDK_TEST_STORAGE=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:31:39.339 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@387 -- # SPDK_TEST_STORAGE=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:31:39.339 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@388 -- # printf '* Found test storage at %s\n' 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:31:39.339 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:31:39.339 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@389 -- # return 0 00:31:39.339 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@1682 -- # set -o errtrace 00:31:39.339 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@1683 -- # shopt -s extdebug 00:31:39.339 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@1684 -- # trap 'trap - ERR; print_backtrace >&2' ERR 00:31:39.339 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@1686 -- # PS4=' \t ${test_domain:-} -- ${BASH_SOURCE#${BASH_SOURCE%/*/*}/}@${LINENO} -- \$ ' 00:31:39.339 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@1687 -- # true 00:31:39.339 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@1689 -- # xtrace_fd 00:31:39.339 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@25 -- # [[ -n 13 ]] 00:31:39.339 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@25 -- # [[ -e /proc/self/fd/13 ]] 00:31:39.339 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@27 -- # exec 00:31:39.339 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@29 -- # exec 00:31:39.339 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@31 -- # xtrace_restore 00:31:39.339 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@16 -- # unset -v 'X_STACK[0 - 1 < 0 ? 
0 : 0 - 1]' 00:31:39.339 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@17 -- # (( 0 == 0 )) 00:31:39.339 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@18 -- # set -x 00:31:39.339 05:59:54 reap_unregistered_poller -- interrupt/interrupt_common.sh@8 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/common.sh 00:31:39.339 05:59:54 reap_unregistered_poller -- interrupt/interrupt_common.sh@10 -- # rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:31:39.339 05:59:54 reap_unregistered_poller -- interrupt/interrupt_common.sh@12 -- # r0_mask=0x1 00:31:39.339 05:59:54 reap_unregistered_poller -- interrupt/interrupt_common.sh@13 -- # r1_mask=0x2 00:31:39.339 05:59:54 reap_unregistered_poller -- interrupt/interrupt_common.sh@14 -- # r2_mask=0x4 00:31:39.339 05:59:54 reap_unregistered_poller -- interrupt/interrupt_common.sh@16 -- # cpu_server_mask=0x07 00:31:39.339 05:59:54 reap_unregistered_poller -- interrupt/interrupt_common.sh@17 -- # rpc_server_addr=/var/tmp/spdk.sock 00:31:39.339 05:59:54 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@14 -- # export PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/examples/interrupt_tgt 00:31:39.339 05:59:54 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@14 -- # 
PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/examples/interrupt_tgt 00:31:39.339 05:59:54 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@17 -- # start_intr_tgt 00:31:39.339 05:59:54 reap_unregistered_poller -- interrupt/interrupt_common.sh@20 -- # local rpc_addr=/var/tmp/spdk.sock 00:31:39.339 05:59:54 reap_unregistered_poller -- interrupt/interrupt_common.sh@21 -- # local cpu_mask=0x07 00:31:39.339 05:59:54 reap_unregistered_poller -- interrupt/interrupt_common.sh@24 -- # intr_tgt_pid=1293586 00:31:39.339 05:59:54 reap_unregistered_poller -- interrupt/interrupt_common.sh@25 -- # trap 'killprocess "$intr_tgt_pid"; cleanup; exit 1' SIGINT SIGTERM EXIT 00:31:39.340 05:59:54 reap_unregistered_poller -- interrupt/interrupt_common.sh@26 -- # waitforlisten 1293586 /var/tmp/spdk.sock 00:31:39.340 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@829 -- # '[' -z 1293586 ']' 00:31:39.340 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:31:39.340 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@834 -- # local max_retries=100 00:31:39.340 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:31:39.340 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
00:31:39.340 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@838 -- # xtrace_disable 00:31:39.340 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@10 -- # set +x 00:31:39.340 05:59:54 reap_unregistered_poller -- interrupt/interrupt_common.sh@23 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/interrupt_tgt -m 0x07 -r /var/tmp/spdk.sock -E -g 00:31:39.340 [2024-07-26 05:59:54.225201] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 00:31:39.340 [2024-07-26 05:59:54.225263] [ DPDK EAL parameters: interrupt_tgt --no-shconf -c 0x07 --single-file-segments --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1293586 ] 00:31:39.599 [2024-07-26 05:59:54.354158] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:31:39.599 [2024-07-26 05:59:54.466492] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:31:39.599 [2024-07-26 05:59:54.466578] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:31:39.599 [2024-07-26 05:59:54.466583] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:31:39.859 [2024-07-26 05:59:54.547956] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from intr mode. 
00:31:39.859 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:31:39.859 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@862 -- # return 0 00:31:39.859 05:59:54 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@20 -- # rpc_cmd thread_get_pollers 00:31:39.859 05:59:54 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@20 -- # jq -r '.threads[0]' 00:31:39.859 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@559 -- # xtrace_disable 00:31:39.859 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@10 -- # set +x 00:31:39.859 05:59:54 reap_unregistered_poller -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:31:39.859 05:59:54 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@20 -- # app_thread='{ 00:31:39.859 "name": "app_thread", 00:31:39.859 "id": 1, 00:31:39.859 "active_pollers": [], 00:31:39.859 "timed_pollers": [ 00:31:39.859 { 00:31:39.859 "name": "rpc_subsystem_poll_servers", 00:31:39.859 "id": 1, 00:31:39.859 "state": "waiting", 00:31:39.859 "run_count": 0, 00:31:39.859 "busy_count": 0, 00:31:39.859 "period_ticks": 9200000 00:31:39.859 } 00:31:39.859 ], 00:31:39.859 "paused_pollers": [] 00:31:39.859 }' 00:31:39.859 05:59:54 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@21 -- # jq -r '.active_pollers[].name' 00:31:40.118 05:59:54 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@21 -- # native_pollers= 00:31:40.118 05:59:54 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@22 -- # native_pollers+=' ' 00:31:40.118 05:59:54 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@23 -- # jq -r '.timed_pollers[].name' 00:31:40.118 05:59:54 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@23 -- # native_pollers+=rpc_subsystem_poll_servers 00:31:40.118 05:59:54 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@28 -- # setup_bdev_aio 00:31:40.118 
05:59:54 reap_unregistered_poller -- interrupt/common.sh@75 -- # uname -s 00:31:40.118 05:59:54 reap_unregistered_poller -- interrupt/common.sh@75 -- # [[ Linux != \F\r\e\e\B\S\D ]] 00:31:40.118 05:59:54 reap_unregistered_poller -- interrupt/common.sh@76 -- # dd if=/dev/zero of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile bs=2048 count=5000 00:31:40.118 5000+0 records in 00:31:40.118 5000+0 records out 00:31:40.118 10240000 bytes (10 MB, 9.8 MiB) copied, 0.0257205 s, 398 MB/s 00:31:40.118 05:59:54 reap_unregistered_poller -- interrupt/common.sh@77 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_aio_create /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile AIO0 2048 00:31:40.688 AIO0 00:31:40.688 05:59:55 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:31:40.946 05:59:55 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@34 -- # sleep 0.1 00:31:40.946 05:59:55 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@37 -- # rpc_cmd thread_get_pollers 00:31:40.947 05:59:55 reap_unregistered_poller -- common/autotest_common.sh@559 -- # xtrace_disable 00:31:40.947 05:59:55 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@37 -- # jq -r '.threads[0]' 00:31:40.947 05:59:55 reap_unregistered_poller -- common/autotest_common.sh@10 -- # set +x 00:31:40.947 05:59:55 reap_unregistered_poller -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:31:40.947 05:59:55 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@37 -- # app_thread='{ 00:31:40.947 "name": "app_thread", 00:31:40.947 "id": 1, 00:31:40.947 "active_pollers": [], 00:31:40.947 "timed_pollers": [ 00:31:40.947 { 00:31:40.947 "name": "rpc_subsystem_poll_servers", 00:31:40.947 "id": 1, 00:31:40.947 "state": "waiting", 00:31:40.947 "run_count": 0, 00:31:40.947 "busy_count": 0, 
00:31:40.947 "period_ticks": 9200000 00:31:40.947 } 00:31:40.947 ], 00:31:40.947 "paused_pollers": [] 00:31:40.947 }' 00:31:40.947 05:59:55 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@38 -- # jq -r '.active_pollers[].name' 00:31:40.947 05:59:55 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@38 -- # remaining_pollers= 00:31:40.947 05:59:55 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@39 -- # remaining_pollers+=' ' 00:31:40.947 05:59:55 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@40 -- # jq -r '.timed_pollers[].name' 00:31:41.205 05:59:55 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@40 -- # remaining_pollers+=rpc_subsystem_poll_servers 00:31:41.205 05:59:55 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@44 -- # [[ rpc_subsystem_poll_servers == \ \r\p\c\_\s\u\b\s\y\s\t\e\m\_\p\o\l\l\_\s\e\r\v\e\r\s ]] 00:31:41.205 05:59:55 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@46 -- # trap - SIGINT SIGTERM EXIT 00:31:41.205 05:59:55 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@47 -- # killprocess 1293586 00:31:41.205 05:59:55 reap_unregistered_poller -- common/autotest_common.sh@948 -- # '[' -z 1293586 ']' 00:31:41.205 05:59:55 reap_unregistered_poller -- common/autotest_common.sh@952 -- # kill -0 1293586 00:31:41.205 05:59:55 reap_unregistered_poller -- common/autotest_common.sh@953 -- # uname 00:31:41.205 05:59:55 reap_unregistered_poller -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:31:41.205 05:59:55 reap_unregistered_poller -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1293586 00:31:41.205 05:59:55 reap_unregistered_poller -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:31:41.205 05:59:55 reap_unregistered_poller -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:31:41.205 05:59:55 reap_unregistered_poller -- common/autotest_common.sh@966 -- # 
echo 'killing process with pid 1293586' 00:31:41.205 killing process with pid 1293586 00:31:41.205 05:59:55 reap_unregistered_poller -- common/autotest_common.sh@967 -- # kill 1293586 00:31:41.206 05:59:55 reap_unregistered_poller -- common/autotest_common.sh@972 -- # wait 1293586 00:31:41.464 05:59:56 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@48 -- # cleanup 00:31:41.464 05:59:56 reap_unregistered_poller -- interrupt/common.sh@6 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile 00:31:41.464 00:31:41.464 real 0m2.168s 00:31:41.464 user 0m1.753s 00:31:41.464 sys 0m0.691s 00:31:41.464 05:59:56 reap_unregistered_poller -- common/autotest_common.sh@1124 -- # xtrace_disable 00:31:41.464 05:59:56 reap_unregistered_poller -- common/autotest_common.sh@10 -- # set +x 00:31:41.464 ************************************ 00:31:41.464 END TEST reap_unregistered_poller 00:31:41.464 ************************************ 00:31:41.464 05:59:56 -- common/autotest_common.sh@1142 -- # return 0 00:31:41.464 05:59:56 -- spdk/autotest.sh@198 -- # uname -s 00:31:41.464 05:59:56 -- spdk/autotest.sh@198 -- # [[ Linux == Linux ]] 00:31:41.464 05:59:56 -- spdk/autotest.sh@199 -- # [[ 1 -eq 1 ]] 00:31:41.464 05:59:56 -- spdk/autotest.sh@205 -- # [[ 1 -eq 0 ]] 00:31:41.464 05:59:56 -- spdk/autotest.sh@211 -- # '[' 0 -eq 1 ']' 00:31:41.464 05:59:56 -- spdk/autotest.sh@256 -- # '[' 0 -eq 1 ']' 00:31:41.464 05:59:56 -- spdk/autotest.sh@260 -- # timing_exit lib 00:31:41.464 05:59:56 -- common/autotest_common.sh@728 -- # xtrace_disable 00:31:41.464 05:59:56 -- common/autotest_common.sh@10 -- # set +x 00:31:41.464 05:59:56 -- spdk/autotest.sh@262 -- # '[' 0 -eq 1 ']' 00:31:41.464 05:59:56 -- spdk/autotest.sh@270 -- # '[' 0 -eq 1 ']' 00:31:41.464 05:59:56 -- spdk/autotest.sh@279 -- # '[' 0 -eq 1 ']' 00:31:41.464 05:59:56 -- spdk/autotest.sh@308 -- # '[' 0 -eq 1 ']' 00:31:41.464 05:59:56 -- spdk/autotest.sh@312 -- # '[' 0 -eq 1 ']' 00:31:41.464 
05:59:56 -- spdk/autotest.sh@316 -- # '[' 0 -eq 1 ']' 00:31:41.464 05:59:56 -- spdk/autotest.sh@321 -- # '[' 0 -eq 1 ']' 00:31:41.464 05:59:56 -- spdk/autotest.sh@330 -- # '[' 0 -eq 1 ']' 00:31:41.464 05:59:56 -- spdk/autotest.sh@335 -- # '[' 0 -eq 1 ']' 00:31:41.464 05:59:56 -- spdk/autotest.sh@339 -- # '[' 0 -eq 1 ']' 00:31:41.464 05:59:56 -- spdk/autotest.sh@343 -- # '[' 0 -eq 1 ']' 00:31:41.464 05:59:56 -- spdk/autotest.sh@347 -- # '[' 1 -eq 1 ']' 00:31:41.464 05:59:56 -- spdk/autotest.sh@348 -- # run_test compress_compdev /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/compress.sh compdev 00:31:41.464 05:59:56 -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:31:41.464 05:59:56 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:31:41.464 05:59:56 -- common/autotest_common.sh@10 -- # set +x 00:31:41.464 ************************************ 00:31:41.464 START TEST compress_compdev 00:31:41.464 ************************************ 00:31:41.464 05:59:56 compress_compdev -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/compress.sh compdev 00:31:41.464 * Looking for test storage... 
00:31:41.724 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress 00:31:41.724 05:59:56 compress_compdev -- compress/compress.sh@13 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:31:41.724 05:59:56 compress_compdev -- nvmf/common.sh@7 -- # uname -s 00:31:41.724 05:59:56 compress_compdev -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:31:41.724 05:59:56 compress_compdev -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:31:41.724 05:59:56 compress_compdev -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:31:41.724 05:59:56 compress_compdev -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:31:41.724 05:59:56 compress_compdev -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:31:41.724 05:59:56 compress_compdev -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:31:41.724 05:59:56 compress_compdev -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:31:41.724 05:59:56 compress_compdev -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:31:41.724 05:59:56 compress_compdev -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:31:41.724 05:59:56 compress_compdev -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:31:41.724 05:59:56 compress_compdev -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809e2efd-7f71-e711-906e-0017a4403562 00:31:41.725 05:59:56 compress_compdev -- nvmf/common.sh@18 -- # NVME_HOSTID=809e2efd-7f71-e711-906e-0017a4403562 00:31:41.725 05:59:56 compress_compdev -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:31:41.725 05:59:56 compress_compdev -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:31:41.725 05:59:56 compress_compdev -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:31:41.725 05:59:56 compress_compdev -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:31:41.725 05:59:56 compress_compdev -- nvmf/common.sh@45 -- # source 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:31:41.725 05:59:56 compress_compdev -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:31:41.725 05:59:56 compress_compdev -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:31:41.725 05:59:56 compress_compdev -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:31:41.725 05:59:56 compress_compdev -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:31:41.725 05:59:56 compress_compdev -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:31:41.725 05:59:56 compress_compdev -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:31:41.725 05:59:56 compress_compdev -- 
paths/export.sh@5 -- # export PATH 00:31:41.725 05:59:56 compress_compdev -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:31:41.725 05:59:56 compress_compdev -- nvmf/common.sh@47 -- # : 0 00:31:41.725 05:59:56 compress_compdev -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:31:41.725 05:59:56 compress_compdev -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:31:41.725 05:59:56 compress_compdev -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:31:41.725 05:59:56 compress_compdev -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:31:41.725 05:59:56 compress_compdev -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:31:41.725 05:59:56 compress_compdev -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:31:41.725 05:59:56 compress_compdev -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:31:41.725 05:59:56 compress_compdev -- nvmf/common.sh@51 -- # have_pci_nics=0 00:31:41.725 05:59:56 compress_compdev -- compress/compress.sh@17 -- # rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:31:41.725 05:59:56 compress_compdev -- compress/compress.sh@81 -- # mkdir -p /tmp/pmem 00:31:41.725 05:59:56 compress_compdev -- compress/compress.sh@82 -- # test_type=compdev 00:31:41.725 05:59:56 compress_compdev -- compress/compress.sh@86 -- # run_bdevperf 32 4096 3 00:31:41.725 05:59:56 compress_compdev -- compress/compress.sh@66 -- # [[ compdev == \c\o\m\p\d\e\v ]] 00:31:41.725 05:59:56 compress_compdev -- compress/compress.sh@71 -- # bdevperf_pid=1293994 00:31:41.725 05:59:56 compress_compdev -- 
compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:31:41.725 05:59:56 compress_compdev -- compress/compress.sh@73 -- # waitforlisten 1293994 00:31:41.725 05:59:56 compress_compdev -- common/autotest_common.sh@829 -- # '[' -z 1293994 ']' 00:31:41.725 05:59:56 compress_compdev -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:31:41.725 05:59:56 compress_compdev -- common/autotest_common.sh@834 -- # local max_retries=100 00:31:41.725 05:59:56 compress_compdev -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:31:41.725 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:31:41.725 05:59:56 compress_compdev -- common/autotest_common.sh@838 -- # xtrace_disable 00:31:41.725 05:59:56 compress_compdev -- compress/compress.sh@67 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 -c /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/dpdk.json 00:31:41.725 05:59:56 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:31:41.725 [2024-07-26 05:59:56.462208] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 
00:31:41.725 [2024-07-26 05:59:56.462277] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1293994 ] 00:31:41.725 [2024-07-26 05:59:56.582833] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:31:41.985 [2024-07-26 05:59:56.686963] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:31:41.985 [2024-07-26 05:59:56.686969] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:31:42.553 [2024-07-26 05:59:57.432920] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:31:42.812 05:59:57 compress_compdev -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:31:42.812 05:59:57 compress_compdev -- common/autotest_common.sh@862 -- # return 0 00:31:42.812 05:59:57 compress_compdev -- compress/compress.sh@74 -- # create_vols 00:31:42.812 05:59:57 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:31:42.812 05:59:57 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:31:43.380 [2024-07-26 05:59:58.010683] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1dfc3c0 PMD being used: compress_qat 00:31:43.380 05:59:58 compress_compdev -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:31:43.380 05:59:58 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:31:43.380 05:59:58 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:31:43.380 05:59:58 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:31:43.380 05:59:58 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:31:43.380 05:59:58 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 
00:31:43.380 05:59:58 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:31:43.380 05:59:58 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:31:43.640 [ 00:31:43.640 { 00:31:43.640 "name": "Nvme0n1", 00:31:43.640 "aliases": [ 00:31:43.640 "01000000-0000-0000-5cd2-e43197705251" 00:31:43.640 ], 00:31:43.640 "product_name": "NVMe disk", 00:31:43.640 "block_size": 512, 00:31:43.640 "num_blocks": 15002931888, 00:31:43.640 "uuid": "01000000-0000-0000-5cd2-e43197705251", 00:31:43.640 "assigned_rate_limits": { 00:31:43.640 "rw_ios_per_sec": 0, 00:31:43.640 "rw_mbytes_per_sec": 0, 00:31:43.640 "r_mbytes_per_sec": 0, 00:31:43.640 "w_mbytes_per_sec": 0 00:31:43.640 }, 00:31:43.640 "claimed": false, 00:31:43.640 "zoned": false, 00:31:43.640 "supported_io_types": { 00:31:43.640 "read": true, 00:31:43.640 "write": true, 00:31:43.640 "unmap": true, 00:31:43.640 "flush": true, 00:31:43.640 "reset": true, 00:31:43.640 "nvme_admin": true, 00:31:43.640 "nvme_io": true, 00:31:43.640 "nvme_io_md": false, 00:31:43.640 "write_zeroes": true, 00:31:43.640 "zcopy": false, 00:31:43.640 "get_zone_info": false, 00:31:43.640 "zone_management": false, 00:31:43.640 "zone_append": false, 00:31:43.640 "compare": false, 00:31:43.640 "compare_and_write": false, 00:31:43.640 "abort": true, 00:31:43.640 "seek_hole": false, 00:31:43.640 "seek_data": false, 00:31:43.640 "copy": false, 00:31:43.640 "nvme_iov_md": false 00:31:43.640 }, 00:31:43.640 "driver_specific": { 00:31:43.640 "nvme": [ 00:31:43.640 { 00:31:43.640 "pci_address": "0000:5e:00.0", 00:31:43.640 "trid": { 00:31:43.640 "trtype": "PCIe", 00:31:43.640 "traddr": "0000:5e:00.0" 00:31:43.640 }, 00:31:43.640 "ctrlr_data": { 00:31:43.640 "cntlid": 0, 00:31:43.640 "vendor_id": "0x8086", 00:31:43.640 "model_number": "INTEL SSDPF2KX076TZO", 00:31:43.640 
"serial_number": "PHAC0301002G7P6CGN", 00:31:43.640 "firmware_revision": "JCV10200", 00:31:43.640 "subnqn": "nqn.2020-07.com.intel:PHAC0301002G7P6CGN ", 00:31:43.640 "oacs": { 00:31:43.640 "security": 1, 00:31:43.640 "format": 1, 00:31:43.640 "firmware": 1, 00:31:43.640 "ns_manage": 1 00:31:43.640 }, 00:31:43.640 "multi_ctrlr": false, 00:31:43.640 "ana_reporting": false 00:31:43.640 }, 00:31:43.640 "vs": { 00:31:43.640 "nvme_version": "1.3" 00:31:43.640 }, 00:31:43.640 "ns_data": { 00:31:43.640 "id": 1, 00:31:43.640 "can_share": false 00:31:43.640 }, 00:31:43.640 "security": { 00:31:43.640 "opal": true 00:31:43.640 } 00:31:43.640 } 00:31:43.640 ], 00:31:43.640 "mp_policy": "active_passive" 00:31:43.640 } 00:31:43.640 } 00:31:43.640 ] 00:31:43.640 05:59:58 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:31:43.640 05:59:58 compress_compdev -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:31:43.899 [2024-07-26 05:59:58.684099] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1dfea80 PMD being used: compress_qat 00:31:46.433 b1822b2b-a13c-4d34-a27e-1fe04494d4ed 00:31:46.433 06:00:00 compress_compdev -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:31:46.433 5d3fa040-4b74-4780-8321-87ff6195a4ce 00:31:46.433 06:00:01 compress_compdev -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:31:46.433 06:00:01 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=lvs0/lv0 00:31:46.433 06:00:01 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:31:46.433 06:00:01 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:31:46.433 06:00:01 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:31:46.433 06:00:01 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 
00:31:46.433 06:00:01 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:31:46.692 06:00:01 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:31:46.950 [ 00:31:46.950 { 00:31:46.950 "name": "5d3fa040-4b74-4780-8321-87ff6195a4ce", 00:31:46.950 "aliases": [ 00:31:46.950 "lvs0/lv0" 00:31:46.950 ], 00:31:46.950 "product_name": "Logical Volume", 00:31:46.950 "block_size": 512, 00:31:46.950 "num_blocks": 204800, 00:31:46.950 "uuid": "5d3fa040-4b74-4780-8321-87ff6195a4ce", 00:31:46.950 "assigned_rate_limits": { 00:31:46.950 "rw_ios_per_sec": 0, 00:31:46.950 "rw_mbytes_per_sec": 0, 00:31:46.950 "r_mbytes_per_sec": 0, 00:31:46.950 "w_mbytes_per_sec": 0 00:31:46.950 }, 00:31:46.950 "claimed": false, 00:31:46.950 "zoned": false, 00:31:46.950 "supported_io_types": { 00:31:46.950 "read": true, 00:31:46.950 "write": true, 00:31:46.950 "unmap": true, 00:31:46.950 "flush": false, 00:31:46.950 "reset": true, 00:31:46.950 "nvme_admin": false, 00:31:46.950 "nvme_io": false, 00:31:46.950 "nvme_io_md": false, 00:31:46.950 "write_zeroes": true, 00:31:46.950 "zcopy": false, 00:31:46.950 "get_zone_info": false, 00:31:46.950 "zone_management": false, 00:31:46.950 "zone_append": false, 00:31:46.950 "compare": false, 00:31:46.950 "compare_and_write": false, 00:31:46.950 "abort": false, 00:31:46.950 "seek_hole": true, 00:31:46.950 "seek_data": true, 00:31:46.950 "copy": false, 00:31:46.950 "nvme_iov_md": false 00:31:46.950 }, 00:31:46.950 "driver_specific": { 00:31:46.950 "lvol": { 00:31:46.950 "lvol_store_uuid": "b1822b2b-a13c-4d34-a27e-1fe04494d4ed", 00:31:46.950 "base_bdev": "Nvme0n1", 00:31:46.950 "thin_provision": true, 00:31:46.950 "num_allocated_clusters": 0, 00:31:46.950 "snapshot": false, 00:31:46.950 "clone": false, 00:31:46.950 "esnap_clone": false 00:31:46.950 } 00:31:46.950 } 
00:31:46.950 } 00:31:46.950 ] 00:31:46.950 06:00:01 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:31:46.950 06:00:01 compress_compdev -- compress/compress.sh@41 -- # '[' -z '' ']' 00:31:46.950 06:00:01 compress_compdev -- compress/compress.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem 00:31:46.950 [2024-07-26 06:00:01.814380] vbdev_compress.c:1034:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:31:46.950 COMP_lvs0/lv0 00:31:46.950 06:00:01 compress_compdev -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:31:46.950 06:00:01 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:31:46.950 06:00:01 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:31:46.950 06:00:01 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:31:46.950 06:00:01 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:31:46.950 06:00:01 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:31:46.950 06:00:01 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:31:47.219 06:00:02 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:31:47.477 [ 00:31:47.477 { 00:31:47.477 "name": "COMP_lvs0/lv0", 00:31:47.477 "aliases": [ 00:31:47.477 "e1e07cb2-8464-57c2-bbc5-cd02be370426" 00:31:47.477 ], 00:31:47.477 "product_name": "compress", 00:31:47.477 "block_size": 512, 00:31:47.477 "num_blocks": 200704, 00:31:47.477 "uuid": "e1e07cb2-8464-57c2-bbc5-cd02be370426", 00:31:47.477 "assigned_rate_limits": { 00:31:47.477 "rw_ios_per_sec": 0, 00:31:47.477 "rw_mbytes_per_sec": 0, 00:31:47.477 "r_mbytes_per_sec": 0, 00:31:47.477 "w_mbytes_per_sec": 0 00:31:47.477 
}, 00:31:47.477 "claimed": false, 00:31:47.477 "zoned": false, 00:31:47.477 "supported_io_types": { 00:31:47.477 "read": true, 00:31:47.477 "write": true, 00:31:47.477 "unmap": false, 00:31:47.477 "flush": false, 00:31:47.477 "reset": false, 00:31:47.477 "nvme_admin": false, 00:31:47.477 "nvme_io": false, 00:31:47.477 "nvme_io_md": false, 00:31:47.477 "write_zeroes": true, 00:31:47.477 "zcopy": false, 00:31:47.477 "get_zone_info": false, 00:31:47.477 "zone_management": false, 00:31:47.477 "zone_append": false, 00:31:47.477 "compare": false, 00:31:47.477 "compare_and_write": false, 00:31:47.477 "abort": false, 00:31:47.477 "seek_hole": false, 00:31:47.477 "seek_data": false, 00:31:47.477 "copy": false, 00:31:47.477 "nvme_iov_md": false 00:31:47.477 }, 00:31:47.477 "driver_specific": { 00:31:47.477 "compress": { 00:31:47.477 "name": "COMP_lvs0/lv0", 00:31:47.477 "base_bdev_name": "5d3fa040-4b74-4780-8321-87ff6195a4ce", 00:31:47.477 "pm_path": "/tmp/pmem/87fc986b-5e17-4e7e-bbf8-403a8cbbd8fe" 00:31:47.477 } 00:31:47.477 } 00:31:47.477 } 00:31:47.477 ] 00:31:47.477 06:00:02 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:31:47.477 06:00:02 compress_compdev -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:31:47.477 [2024-07-26 06:00:02.380708] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7fafc41b15c0 PMD being used: compress_qat 00:31:47.477 [2024-07-26 06:00:02.382936] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1df8820 PMD being used: compress_qat 00:31:47.477 Running I/O for 3 seconds... 
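A quick way to sanity-check the bdevperf result tables in this log: the MiB/s column is just IOPS multiplied by the I/O size (4096 bytes in these runs) divided by 1 MiB. A small awk helper (the function name is illustrative, not from the log):

```shell
# Convert an IOPS figure at a given I/O size (bytes) to MiB/s,
# the way bdevperf's summary table reports it.
iops_to_mibs() {
    awk -v iops="$1" -v size="$2" 'BEGIN { printf "%.2f\n", iops * size / 1048576 }'
}

iops_to_mibs 5040.50 4096   # the first job's 5040.50 IOPS at 4 KiB -> 19.69 MiB/s
```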
00:31:50.770
00:31:50.770 Latency(us)
00:31:50.770 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:31:50.770 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096)
00:31:50.770 Verification LBA range: start 0x0 length 0x3100
00:31:50.770 COMP_lvs0/lv0 : 3.00 5040.50 19.69 0.00 0.00 6297.31 530.70 7579.38
00:31:50.770 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096)
00:31:50.770 Verification LBA range: start 0x3100 length 0x3100
00:31:50.770 COMP_lvs0/lv0 : 3.00 5301.18 20.71 0.00 0.00 6000.37 398.91 7465.41
00:31:50.770 ===================================================================================================================
00:31:50.770 Total : 10341.68 40.40 0.00 0.00 6145.12 398.91 7579.38
00:31:50.770 0
00:31:50.770 06:00:05 compress_compdev -- compress/compress.sh@76 -- # destroy_vols
00:31:50.770 06:00:05 compress_compdev -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0
00:31:50.770 06:00:05 compress_compdev -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0
00:31:51.028 06:00:05 compress_compdev -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT
00:31:51.028 06:00:05 compress_compdev -- compress/compress.sh@78 -- # killprocess 1293994
00:31:51.028 06:00:05 compress_compdev -- common/autotest_common.sh@948 -- # '[' -z 1293994 ']'
00:31:51.028 06:00:05 compress_compdev -- common/autotest_common.sh@952 -- # kill -0 1293994
00:31:51.028 06:00:05 compress_compdev -- common/autotest_common.sh@953 -- # uname
00:31:51.028 06:00:05 compress_compdev -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']'
00:31:51.028 06:00:05 compress_compdev -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1293994
00:31:51.287 06:00:05 compress_compdev -- common/autotest_common.sh@954 -- # process_name=reactor_1
00:31:51.287 06:00:05 compress_compdev -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:31:51.287 06:00:05 compress_compdev -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1293994' 00:31:51.287 killing process with pid 1293994 00:31:51.287 06:00:05 compress_compdev -- common/autotest_common.sh@967 -- # kill 1293994 00:31:51.287 Received shutdown signal, test time was about 3.000000 seconds 00:31:51.287 00:31:51.287 Latency(us) 00:31:51.287 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:31:51.287 =================================================================================================================== 00:31:51.287 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:31:51.287 06:00:05 compress_compdev -- common/autotest_common.sh@972 -- # wait 1293994 00:31:53.844 06:00:08 compress_compdev -- compress/compress.sh@87 -- # run_bdevperf 32 4096 3 512 00:31:53.844 06:00:08 compress_compdev -- compress/compress.sh@66 -- # [[ compdev == \c\o\m\p\d\e\v ]] 00:31:53.844 06:00:08 compress_compdev -- compress/compress.sh@71 -- # bdevperf_pid=1296093 00:31:53.844 06:00:08 compress_compdev -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:31:53.844 06:00:08 compress_compdev -- compress/compress.sh@67 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 -c /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/dpdk.json 00:31:53.844 06:00:08 compress_compdev -- compress/compress.sh@73 -- # waitforlisten 1296093 00:31:53.844 06:00:08 compress_compdev -- common/autotest_common.sh@829 -- # '[' -z 1296093 ']' 00:31:53.844 06:00:08 compress_compdev -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:31:53.844 06:00:08 compress_compdev -- common/autotest_common.sh@834 -- # local max_retries=100 00:31:53.844 06:00:08 compress_compdev -- common/autotest_common.sh@836 
-- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:31:53.844 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:31:53.844 06:00:08 compress_compdev -- common/autotest_common.sh@838 -- # xtrace_disable 00:31:53.844 06:00:08 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:31:54.103 [2024-07-26 06:00:08.771929] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 00:31:54.103 [2024-07-26 06:00:08.772006] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1296093 ] 00:31:54.103 [2024-07-26 06:00:08.893488] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:31:54.103 [2024-07-26 06:00:08.999774] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:31:54.103 [2024-07-26 06:00:08.999781] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:31:55.039 [2024-07-26 06:00:09.748885] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:31:55.039 06:00:09 compress_compdev -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:31:55.039 06:00:09 compress_compdev -- common/autotest_common.sh@862 -- # return 0 00:31:55.039 06:00:09 compress_compdev -- compress/compress.sh@74 -- # create_vols 512 00:31:55.039 06:00:09 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:31:55.039 06:00:09 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:31:55.607 [2024-07-26 06:00:10.388776] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0xd8a3c0 PMD being used: compress_qat 00:31:55.607 06:00:10 compress_compdev -- 
compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:31:55.607 06:00:10 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:31:55.607 06:00:10 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:31:55.607 06:00:10 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:31:55.607 06:00:10 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:31:55.607 06:00:10 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:31:55.607 06:00:10 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:31:55.864 06:00:10 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:31:56.123 [ 00:31:56.123 { 00:31:56.123 "name": "Nvme0n1", 00:31:56.123 "aliases": [ 00:31:56.123 "01000000-0000-0000-5cd2-e43197705251" 00:31:56.123 ], 00:31:56.123 "product_name": "NVMe disk", 00:31:56.123 "block_size": 512, 00:31:56.123 "num_blocks": 15002931888, 00:31:56.123 "uuid": "01000000-0000-0000-5cd2-e43197705251", 00:31:56.123 "assigned_rate_limits": { 00:31:56.123 "rw_ios_per_sec": 0, 00:31:56.123 "rw_mbytes_per_sec": 0, 00:31:56.123 "r_mbytes_per_sec": 0, 00:31:56.123 "w_mbytes_per_sec": 0 00:31:56.123 }, 00:31:56.123 "claimed": false, 00:31:56.123 "zoned": false, 00:31:56.123 "supported_io_types": { 00:31:56.123 "read": true, 00:31:56.123 "write": true, 00:31:56.123 "unmap": true, 00:31:56.123 "flush": true, 00:31:56.123 "reset": true, 00:31:56.123 "nvme_admin": true, 00:31:56.123 "nvme_io": true, 00:31:56.123 "nvme_io_md": false, 00:31:56.123 "write_zeroes": true, 00:31:56.123 "zcopy": false, 00:31:56.123 "get_zone_info": false, 00:31:56.123 "zone_management": false, 00:31:56.123 "zone_append": false, 00:31:56.123 "compare": false, 00:31:56.123 "compare_and_write": false, 00:31:56.123 "abort": true, 
00:31:56.123 "seek_hole": false, 00:31:56.123 "seek_data": false, 00:31:56.123 "copy": false, 00:31:56.123 "nvme_iov_md": false 00:31:56.123 }, 00:31:56.123 "driver_specific": { 00:31:56.123 "nvme": [ 00:31:56.123 { 00:31:56.123 "pci_address": "0000:5e:00.0", 00:31:56.123 "trid": { 00:31:56.123 "trtype": "PCIe", 00:31:56.123 "traddr": "0000:5e:00.0" 00:31:56.123 }, 00:31:56.123 "ctrlr_data": { 00:31:56.123 "cntlid": 0, 00:31:56.123 "vendor_id": "0x8086", 00:31:56.123 "model_number": "INTEL SSDPF2KX076TZO", 00:31:56.123 "serial_number": "PHAC0301002G7P6CGN", 00:31:56.123 "firmware_revision": "JCV10200", 00:31:56.123 "subnqn": "nqn.2020-07.com.intel:PHAC0301002G7P6CGN ", 00:31:56.123 "oacs": { 00:31:56.123 "security": 1, 00:31:56.123 "format": 1, 00:31:56.123 "firmware": 1, 00:31:56.123 "ns_manage": 1 00:31:56.123 }, 00:31:56.123 "multi_ctrlr": false, 00:31:56.123 "ana_reporting": false 00:31:56.123 }, 00:31:56.123 "vs": { 00:31:56.123 "nvme_version": "1.3" 00:31:56.123 }, 00:31:56.123 "ns_data": { 00:31:56.123 "id": 1, 00:31:56.123 "can_share": false 00:31:56.123 }, 00:31:56.123 "security": { 00:31:56.123 "opal": true 00:31:56.123 } 00:31:56.123 } 00:31:56.123 ], 00:31:56.123 "mp_policy": "active_passive" 00:31:56.123 } 00:31:56.123 } 00:31:56.123 ] 00:31:56.123 06:00:10 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:31:56.123 06:00:10 compress_compdev -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:31:56.387 [2024-07-26 06:00:11.066161] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0xd8ca80 PMD being used: compress_qat 00:31:58.953 ad06727b-9e49-4077-a927-2f432f6dee9a 00:31:58.953 06:00:13 compress_compdev -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:31:58.953 2ef270fd-f26e-428a-b641-3c4b7f8de44b 00:31:58.953 06:00:13 compress_compdev -- 
compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:31:58.953 06:00:13 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=lvs0/lv0 00:31:58.953 06:00:13 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:31:58.953 06:00:13 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:31:58.953 06:00:13 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:31:58.953 06:00:13 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:31:58.953 06:00:13 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:31:58.953 06:00:13 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:31:59.212 [ 00:31:59.212 { 00:31:59.212 "name": "2ef270fd-f26e-428a-b641-3c4b7f8de44b", 00:31:59.212 "aliases": [ 00:31:59.212 "lvs0/lv0" 00:31:59.212 ], 00:31:59.212 "product_name": "Logical Volume", 00:31:59.212 "block_size": 512, 00:31:59.212 "num_blocks": 204800, 00:31:59.212 "uuid": "2ef270fd-f26e-428a-b641-3c4b7f8de44b", 00:31:59.212 "assigned_rate_limits": { 00:31:59.212 "rw_ios_per_sec": 0, 00:31:59.212 "rw_mbytes_per_sec": 0, 00:31:59.212 "r_mbytes_per_sec": 0, 00:31:59.212 "w_mbytes_per_sec": 0 00:31:59.212 }, 00:31:59.212 "claimed": false, 00:31:59.212 "zoned": false, 00:31:59.212 "supported_io_types": { 00:31:59.212 "read": true, 00:31:59.212 "write": true, 00:31:59.212 "unmap": true, 00:31:59.212 "flush": false, 00:31:59.212 "reset": true, 00:31:59.212 "nvme_admin": false, 00:31:59.212 "nvme_io": false, 00:31:59.212 "nvme_io_md": false, 00:31:59.212 "write_zeroes": true, 00:31:59.212 "zcopy": false, 00:31:59.212 "get_zone_info": false, 00:31:59.212 "zone_management": false, 00:31:59.212 "zone_append": false, 00:31:59.212 "compare": false, 00:31:59.212 "compare_and_write": false, 00:31:59.212 "abort": false, 
00:31:59.212 "seek_hole": true, 00:31:59.212 "seek_data": true, 00:31:59.212 "copy": false, 00:31:59.212 "nvme_iov_md": false 00:31:59.212 }, 00:31:59.212 "driver_specific": { 00:31:59.212 "lvol": { 00:31:59.212 "lvol_store_uuid": "ad06727b-9e49-4077-a927-2f432f6dee9a", 00:31:59.212 "base_bdev": "Nvme0n1", 00:31:59.212 "thin_provision": true, 00:31:59.212 "num_allocated_clusters": 0, 00:31:59.212 "snapshot": false, 00:31:59.212 "clone": false, 00:31:59.212 "esnap_clone": false 00:31:59.212 } 00:31:59.212 } 00:31:59.212 } 00:31:59.212 ] 00:31:59.212 06:00:13 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:31:59.212 06:00:13 compress_compdev -- compress/compress.sh@41 -- # '[' -z 512 ']' 00:31:59.212 06:00:13 compress_compdev -- compress/compress.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem -l 512 00:31:59.471 [2024-07-26 06:00:14.200692] vbdev_compress.c:1034:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:31:59.471 COMP_lvs0/lv0 00:31:59.471 06:00:14 compress_compdev -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:31:59.471 06:00:14 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:31:59.471 06:00:14 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:31:59.471 06:00:14 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:31:59.471 06:00:14 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:31:59.471 06:00:14 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:31:59.471 06:00:14 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:31:59.730 06:00:14 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 
00:31:59.989 [ 00:31:59.989 { 00:31:59.989 "name": "COMP_lvs0/lv0", 00:31:59.989 "aliases": [ 00:31:59.989 "2ef3a1dc-4dad-5f62-ac01-1ab842137c61" 00:31:59.989 ], 00:31:59.989 "product_name": "compress", 00:31:59.989 "block_size": 512, 00:31:59.989 "num_blocks": 200704, 00:31:59.989 "uuid": "2ef3a1dc-4dad-5f62-ac01-1ab842137c61", 00:31:59.989 "assigned_rate_limits": { 00:31:59.989 "rw_ios_per_sec": 0, 00:31:59.989 "rw_mbytes_per_sec": 0, 00:31:59.989 "r_mbytes_per_sec": 0, 00:31:59.989 "w_mbytes_per_sec": 0 00:31:59.989 }, 00:31:59.989 "claimed": false, 00:31:59.989 "zoned": false, 00:31:59.989 "supported_io_types": { 00:31:59.989 "read": true, 00:31:59.989 "write": true, 00:31:59.989 "unmap": false, 00:31:59.989 "flush": false, 00:31:59.989 "reset": false, 00:31:59.989 "nvme_admin": false, 00:31:59.989 "nvme_io": false, 00:31:59.989 "nvme_io_md": false, 00:31:59.989 "write_zeroes": true, 00:31:59.989 "zcopy": false, 00:31:59.989 "get_zone_info": false, 00:31:59.989 "zone_management": false, 00:31:59.989 "zone_append": false, 00:31:59.989 "compare": false, 00:31:59.989 "compare_and_write": false, 00:31:59.989 "abort": false, 00:31:59.989 "seek_hole": false, 00:31:59.989 "seek_data": false, 00:31:59.989 "copy": false, 00:31:59.989 "nvme_iov_md": false 00:31:59.989 }, 00:31:59.989 "driver_specific": { 00:31:59.989 "compress": { 00:31:59.989 "name": "COMP_lvs0/lv0", 00:31:59.989 "base_bdev_name": "2ef270fd-f26e-428a-b641-3c4b7f8de44b", 00:31:59.989 "pm_path": "/tmp/pmem/adf449f7-9105-43bc-996c-8e75837111c3" 00:31:59.989 } 00:31:59.989 } 00:31:59.989 } 00:31:59.989 ] 00:31:59.989 06:00:14 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:31:59.990 06:00:14 compress_compdev -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:31:59.990 [2024-07-26 06:00:14.834995] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f21681b15c0 PMD being used: compress_qat 00:31:59.990 
[2024-07-26 06:00:14.837237] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0xd869b0 PMD being used: compress_qat
00:31:59.990 Running I/O for 3 seconds...
00:32:03.272
00:32:03.273 Latency(us)
00:32:03.273 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:32:03.273 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096)
00:32:03.273 Verification LBA range: start 0x0 length 0x3100
00:32:03.273 COMP_lvs0/lv0 : 3.00 5147.85 20.11 0.00 0.00 6166.08 523.58 5727.28
00:32:03.273 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096)
00:32:03.273 Verification LBA range: start 0x3100 length 0x3100
00:32:03.273 COMP_lvs0/lv0 : 3.00 5415.81 21.16 0.00 0.00 5873.53 413.16 5499.33
00:32:03.273 ===================================================================================================================
00:32:03.273 Total : 10563.67 41.26 0.00 0.00 6016.09 413.16 5727.28
00:32:03.273 0
00:32:03.273 06:00:17 compress_compdev -- compress/compress.sh@76 -- # destroy_vols
00:32:03.273 06:00:17 compress_compdev -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0
00:32:03.273 06:00:18 compress_compdev -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0
00:32:03.531 06:00:18 compress_compdev -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT
00:32:03.531 06:00:18 compress_compdev -- compress/compress.sh@78 -- # killprocess 1296093
00:32:03.531 06:00:18 compress_compdev -- common/autotest_common.sh@948 -- # '[' -z 1296093 ']'
00:32:03.531 06:00:18 compress_compdev -- common/autotest_common.sh@952 -- # kill -0 1296093
00:32:03.531 06:00:18 compress_compdev -- common/autotest_common.sh@953 -- # uname
00:32:03.531 06:00:18 compress_compdev -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']'
00:32:03.531 06:00:18 compress_compdev --
common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1296093 00:32:03.531 06:00:18 compress_compdev -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:32:03.531 06:00:18 compress_compdev -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:32:03.531 06:00:18 compress_compdev -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1296093' 00:32:03.531 killing process with pid 1296093 00:32:03.531 06:00:18 compress_compdev -- common/autotest_common.sh@967 -- # kill 1296093 00:32:03.531 Received shutdown signal, test time was about 3.000000 seconds 00:32:03.531 00:32:03.531 Latency(us) 00:32:03.531 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:32:03.531 =================================================================================================================== 00:32:03.531 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:32:03.531 06:00:18 compress_compdev -- common/autotest_common.sh@972 -- # wait 1296093 00:32:06.820 06:00:21 compress_compdev -- compress/compress.sh@88 -- # run_bdevperf 32 4096 3 4096 00:32:06.820 06:00:21 compress_compdev -- compress/compress.sh@66 -- # [[ compdev == \c\o\m\p\d\e\v ]] 00:32:06.820 06:00:21 compress_compdev -- compress/compress.sh@71 -- # bdevperf_pid=1297699 00:32:06.820 06:00:21 compress_compdev -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:32:06.820 06:00:21 compress_compdev -- compress/compress.sh@67 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 -c /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/dpdk.json 00:32:06.820 06:00:21 compress_compdev -- compress/compress.sh@73 -- # waitforlisten 1297699 00:32:06.820 06:00:21 compress_compdev -- common/autotest_common.sh@829 -- # '[' -z 1297699 ']' 00:32:06.820 06:00:21 compress_compdev -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 
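The killprocess helper traced above (kill -0 check, uname guard, ps comm= lookup, kill, then wait) follows the autotest_common.sh shape. A self-contained sketch of that flow, exercised here against a throwaway sleep rather than the real bdevperf pid:

```shell
# Sketch of the killprocess flow seen in the trace: confirm the pid is alive,
# refuse to signal a sudo wrapper, send the kill, then reap with wait.
killprocess() {
    local pid=$1
    [ -n "$pid" ] || return 1
    kill -0 "$pid" 2>/dev/null || return 1          # still alive?
    if [ "$(uname)" = Linux ]; then
        local process_name
        process_name=$(ps --no-headers -o comm= "$pid")
        [ "$process_name" = sudo ] && return 1      # never kill the sudo wrapper itself
    fi
    echo "killing process with pid $pid"
    kill "$pid"
    wait "$pid" 2>/dev/null || true                 # reap; signaled children exit nonzero
    return 0
}

sleep 30 &
killprocess $!
```

The `wait` at the end is why the log shows a final zero-row latency table: bdevperf prints its shutdown summary between the kill and the reap.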
00:32:06.820 06:00:21 compress_compdev -- common/autotest_common.sh@834 -- # local max_retries=100 00:32:06.820 06:00:21 compress_compdev -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:32:06.820 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:32:06.820 06:00:21 compress_compdev -- common/autotest_common.sh@838 -- # xtrace_disable 00:32:06.820 06:00:21 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:32:06.820 [2024-07-26 06:00:21.207418] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 00:32:06.820 [2024-07-26 06:00:21.207491] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1297699 ] 00:32:06.820 [2024-07-26 06:00:21.326505] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:32:06.820 [2024-07-26 06:00:21.429901] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:32:06.820 [2024-07-26 06:00:21.429909] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:32:07.386 [2024-07-26 06:00:22.187233] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:32:07.386 06:00:22 compress_compdev -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:32:07.386 06:00:22 compress_compdev -- common/autotest_common.sh@862 -- # return 0 00:32:07.386 06:00:22 compress_compdev -- compress/compress.sh@74 -- # create_vols 4096 00:32:07.386 06:00:22 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:32:07.386 06:00:22 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:32:07.953 [2024-07-26 
06:00:22.788596] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x171e3c0 PMD being used: compress_qat 00:32:07.953 06:00:22 compress_compdev -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:32:07.953 06:00:22 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:32:07.953 06:00:22 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:32:07.953 06:00:22 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:32:07.953 06:00:22 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:32:07.953 06:00:22 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:32:07.953 06:00:22 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:32:08.210 06:00:23 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:32:08.468 [ 00:32:08.468 { 00:32:08.468 "name": "Nvme0n1", 00:32:08.468 "aliases": [ 00:32:08.468 "01000000-0000-0000-5cd2-e43197705251" 00:32:08.468 ], 00:32:08.468 "product_name": "NVMe disk", 00:32:08.468 "block_size": 512, 00:32:08.468 "num_blocks": 15002931888, 00:32:08.468 "uuid": "01000000-0000-0000-5cd2-e43197705251", 00:32:08.468 "assigned_rate_limits": { 00:32:08.468 "rw_ios_per_sec": 0, 00:32:08.468 "rw_mbytes_per_sec": 0, 00:32:08.468 "r_mbytes_per_sec": 0, 00:32:08.468 "w_mbytes_per_sec": 0 00:32:08.468 }, 00:32:08.468 "claimed": false, 00:32:08.468 "zoned": false, 00:32:08.468 "supported_io_types": { 00:32:08.468 "read": true, 00:32:08.468 "write": true, 00:32:08.468 "unmap": true, 00:32:08.468 "flush": true, 00:32:08.468 "reset": true, 00:32:08.468 "nvme_admin": true, 00:32:08.468 "nvme_io": true, 00:32:08.468 "nvme_io_md": false, 00:32:08.468 "write_zeroes": true, 00:32:08.468 "zcopy": false, 00:32:08.468 "get_zone_info": false, 00:32:08.468 
"zone_management": false, 00:32:08.468 "zone_append": false, 00:32:08.468 "compare": false, 00:32:08.468 "compare_and_write": false, 00:32:08.468 "abort": true, 00:32:08.468 "seek_hole": false, 00:32:08.468 "seek_data": false, 00:32:08.468 "copy": false, 00:32:08.468 "nvme_iov_md": false 00:32:08.468 }, 00:32:08.468 "driver_specific": { 00:32:08.468 "nvme": [ 00:32:08.468 { 00:32:08.468 "pci_address": "0000:5e:00.0", 00:32:08.468 "trid": { 00:32:08.468 "trtype": "PCIe", 00:32:08.468 "traddr": "0000:5e:00.0" 00:32:08.468 }, 00:32:08.468 "ctrlr_data": { 00:32:08.468 "cntlid": 0, 00:32:08.468 "vendor_id": "0x8086", 00:32:08.468 "model_number": "INTEL SSDPF2KX076TZO", 00:32:08.468 "serial_number": "PHAC0301002G7P6CGN", 00:32:08.468 "firmware_revision": "JCV10200", 00:32:08.468 "subnqn": "nqn.2020-07.com.intel:PHAC0301002G7P6CGN ", 00:32:08.468 "oacs": { 00:32:08.468 "security": 1, 00:32:08.468 "format": 1, 00:32:08.468 "firmware": 1, 00:32:08.468 "ns_manage": 1 00:32:08.468 }, 00:32:08.468 "multi_ctrlr": false, 00:32:08.468 "ana_reporting": false 00:32:08.468 }, 00:32:08.468 "vs": { 00:32:08.468 "nvme_version": "1.3" 00:32:08.468 }, 00:32:08.468 "ns_data": { 00:32:08.468 "id": 1, 00:32:08.468 "can_share": false 00:32:08.468 }, 00:32:08.468 "security": { 00:32:08.468 "opal": true 00:32:08.468 } 00:32:08.468 } 00:32:08.468 ], 00:32:08.468 "mp_policy": "active_passive" 00:32:08.468 } 00:32:08.468 } 00:32:08.468 ] 00:32:08.468 06:00:23 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:32:08.468 06:00:23 compress_compdev -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:32:08.726 [2024-07-26 06:00:23.470070] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1720a80 PMD being used: compress_qat 00:32:11.256 80993cc8-8603-4429-96a2-05a65c11e2db 00:32:11.256 06:00:25 compress_compdev -- compress/compress.sh@38 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:32:11.256 d55fd613-a815-42cf-95df-ab17d3601743 00:32:11.256 06:00:25 compress_compdev -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:32:11.256 06:00:25 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=lvs0/lv0 00:32:11.256 06:00:25 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:32:11.256 06:00:25 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:32:11.256 06:00:25 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:32:11.256 06:00:25 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:32:11.256 06:00:25 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:32:11.514 06:00:26 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:32:11.514 [ 00:32:11.514 { 00:32:11.514 "name": "d55fd613-a815-42cf-95df-ab17d3601743", 00:32:11.514 "aliases": [ 00:32:11.514 "lvs0/lv0" 00:32:11.514 ], 00:32:11.514 "product_name": "Logical Volume", 00:32:11.514 "block_size": 512, 00:32:11.514 "num_blocks": 204800, 00:32:11.514 "uuid": "d55fd613-a815-42cf-95df-ab17d3601743", 00:32:11.514 "assigned_rate_limits": { 00:32:11.514 "rw_ios_per_sec": 0, 00:32:11.514 "rw_mbytes_per_sec": 0, 00:32:11.514 "r_mbytes_per_sec": 0, 00:32:11.514 "w_mbytes_per_sec": 0 00:32:11.514 }, 00:32:11.514 "claimed": false, 00:32:11.514 "zoned": false, 00:32:11.514 "supported_io_types": { 00:32:11.514 "read": true, 00:32:11.514 "write": true, 00:32:11.514 "unmap": true, 00:32:11.514 "flush": false, 00:32:11.514 "reset": true, 00:32:11.514 "nvme_admin": false, 00:32:11.514 "nvme_io": false, 00:32:11.514 "nvme_io_md": false, 00:32:11.514 "write_zeroes": true, 00:32:11.514 "zcopy": false, 00:32:11.514 
"get_zone_info": false, 00:32:11.514 "zone_management": false, 00:32:11.514 "zone_append": false, 00:32:11.514 "compare": false, 00:32:11.514 "compare_and_write": false, 00:32:11.514 "abort": false, 00:32:11.514 "seek_hole": true, 00:32:11.514 "seek_data": true, 00:32:11.514 "copy": false, 00:32:11.514 "nvme_iov_md": false 00:32:11.514 }, 00:32:11.514 "driver_specific": { 00:32:11.514 "lvol": { 00:32:11.514 "lvol_store_uuid": "80993cc8-8603-4429-96a2-05a65c11e2db", 00:32:11.514 "base_bdev": "Nvme0n1", 00:32:11.514 "thin_provision": true, 00:32:11.514 "num_allocated_clusters": 0, 00:32:11.514 "snapshot": false, 00:32:11.514 "clone": false, 00:32:11.514 "esnap_clone": false 00:32:11.514 } 00:32:11.514 } 00:32:11.514 } 00:32:11.514 ] 00:32:11.514 06:00:26 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:32:11.514 06:00:26 compress_compdev -- compress/compress.sh@41 -- # '[' -z 4096 ']' 00:32:11.514 06:00:26 compress_compdev -- compress/compress.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem -l 4096 00:32:11.772 [2024-07-26 06:00:26.600473] vbdev_compress.c:1034:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:32:11.772 COMP_lvs0/lv0 00:32:11.772 06:00:26 compress_compdev -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:32:11.772 06:00:26 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:32:11.772 06:00:26 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:32:11.772 06:00:26 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:32:11.772 06:00:26 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:32:11.772 06:00:26 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:32:11.772 06:00:26 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
bdev_wait_for_examine 00:32:12.030 06:00:26 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:32:12.288 [ 00:32:12.288 { 00:32:12.288 "name": "COMP_lvs0/lv0", 00:32:12.288 "aliases": [ 00:32:12.288 "8597e7e9-e8a9-5ca6-9819-9e93029692a3" 00:32:12.288 ], 00:32:12.288 "product_name": "compress", 00:32:12.288 "block_size": 4096, 00:32:12.288 "num_blocks": 25088, 00:32:12.288 "uuid": "8597e7e9-e8a9-5ca6-9819-9e93029692a3", 00:32:12.288 "assigned_rate_limits": { 00:32:12.288 "rw_ios_per_sec": 0, 00:32:12.288 "rw_mbytes_per_sec": 0, 00:32:12.288 "r_mbytes_per_sec": 0, 00:32:12.288 "w_mbytes_per_sec": 0 00:32:12.288 }, 00:32:12.288 "claimed": false, 00:32:12.288 "zoned": false, 00:32:12.288 "supported_io_types": { 00:32:12.288 "read": true, 00:32:12.288 "write": true, 00:32:12.288 "unmap": false, 00:32:12.288 "flush": false, 00:32:12.288 "reset": false, 00:32:12.288 "nvme_admin": false, 00:32:12.288 "nvme_io": false, 00:32:12.288 "nvme_io_md": false, 00:32:12.288 "write_zeroes": true, 00:32:12.288 "zcopy": false, 00:32:12.288 "get_zone_info": false, 00:32:12.288 "zone_management": false, 00:32:12.288 "zone_append": false, 00:32:12.288 "compare": false, 00:32:12.288 "compare_and_write": false, 00:32:12.288 "abort": false, 00:32:12.288 "seek_hole": false, 00:32:12.288 "seek_data": false, 00:32:12.288 "copy": false, 00:32:12.288 "nvme_iov_md": false 00:32:12.288 }, 00:32:12.288 "driver_specific": { 00:32:12.288 "compress": { 00:32:12.288 "name": "COMP_lvs0/lv0", 00:32:12.288 "base_bdev_name": "d55fd613-a815-42cf-95df-ab17d3601743", 00:32:12.288 "pm_path": "/tmp/pmem/e4086e51-9a10-4fed-8b5c-911b09e1c7d8" 00:32:12.288 } 00:32:12.288 } 00:32:12.288 } 00:32:12.288 ] 00:32:12.288 06:00:27 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:32:12.288 06:00:27 compress_compdev -- compress/compress.sh@75 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:32:12.288 [2024-07-26 06:00:27.194650] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7fa2dc1b15c0 PMD being used: compress_qat 00:32:12.546 [2024-07-26 06:00:27.196865] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x171aad0 PMD being used: compress_qat 00:32:12.546 Running I/O for 3 seconds... 00:32:15.830 00:32:15.830 Latency(us) 00:32:15.830 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:32:15.830 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:32:15.830 Verification LBA range: start 0x0 length 0x3100 00:32:15.830 COMP_lvs0/lv0 : 3.00 5132.91 20.05 0.00 0.00 6183.99 566.32 5698.78 00:32:15.830 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:32:15.830 Verification LBA range: start 0x3100 length 0x3100 00:32:15.830 COMP_lvs0/lv0 : 3.00 5374.43 20.99 0.00 0.00 5917.79 365.08 5470.83 00:32:15.830 =================================================================================================================== 00:32:15.830 Total : 10507.34 41.04 0.00 0.00 6047.83 365.08 5698.78 00:32:15.830 0 00:32:15.830 06:00:30 compress_compdev -- compress/compress.sh@76 -- # destroy_vols 00:32:15.830 06:00:30 compress_compdev -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:32:15.830 06:00:30 compress_compdev -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:32:15.830 06:00:30 compress_compdev -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:32:15.830 06:00:30 compress_compdev -- compress/compress.sh@78 -- # killprocess 1297699 00:32:15.830 06:00:30 compress_compdev -- common/autotest_common.sh@948 -- # '[' -z 1297699 ']' 00:32:15.830 06:00:30 compress_compdev -- 
common/autotest_common.sh@952 -- # kill -0 1297699 00:32:15.830 06:00:30 compress_compdev -- common/autotest_common.sh@953 -- # uname 00:32:15.830 06:00:30 compress_compdev -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:32:15.831 06:00:30 compress_compdev -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1297699 00:32:15.831 06:00:30 compress_compdev -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:32:15.831 06:00:30 compress_compdev -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:32:15.831 06:00:30 compress_compdev -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1297699' 00:32:15.831 killing process with pid 1297699 00:32:15.831 06:00:30 compress_compdev -- common/autotest_common.sh@967 -- # kill 1297699 00:32:15.831 Received shutdown signal, test time was about 3.000000 seconds 00:32:15.831 00:32:15.831 Latency(us) 00:32:15.831 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:32:15.831 =================================================================================================================== 00:32:15.831 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:32:15.831 06:00:30 compress_compdev -- common/autotest_common.sh@972 -- # wait 1297699 00:32:19.159 06:00:33 compress_compdev -- compress/compress.sh@89 -- # run_bdevio 00:32:19.159 06:00:33 compress_compdev -- compress/compress.sh@50 -- # [[ compdev == \c\o\m\p\d\e\v ]] 00:32:19.159 06:00:33 compress_compdev -- compress/compress.sh@55 -- # bdevio_pid=1299299 00:32:19.159 06:00:33 compress_compdev -- compress/compress.sh@56 -- # trap 'killprocess $bdevio_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:32:19.159 06:00:33 compress_compdev -- compress/compress.sh@51 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -c /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/dpdk.json -w 00:32:19.159 06:00:33 compress_compdev -- compress/compress.sh@57 -- # waitforlisten 
1299299 00:32:19.159 06:00:33 compress_compdev -- common/autotest_common.sh@829 -- # '[' -z 1299299 ']' 00:32:19.159 06:00:33 compress_compdev -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:32:19.159 06:00:33 compress_compdev -- common/autotest_common.sh@834 -- # local max_retries=100 00:32:19.159 06:00:33 compress_compdev -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:32:19.159 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:32:19.159 06:00:33 compress_compdev -- common/autotest_common.sh@838 -- # xtrace_disable 00:32:19.159 06:00:33 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:32:19.159 [2024-07-26 06:00:33.495647] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 00:32:19.159 [2024-07-26 06:00:33.495733] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1299299 ] 00:32:19.159 [2024-07-26 06:00:33.627498] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:32:19.159 [2024-07-26 06:00:33.731085] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:32:19.159 [2024-07-26 06:00:33.731169] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:32:19.159 [2024-07-26 06:00:33.731173] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:32:19.727 [2024-07-26 06:00:34.477355] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:32:19.727 06:00:34 compress_compdev -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:32:19.727 06:00:34 compress_compdev -- common/autotest_common.sh@862 -- # return 0 00:32:19.727 06:00:34 compress_compdev -- compress/compress.sh@58 -- # create_vols 
00:32:19.727 06:00:34 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:32:19.727 06:00:34 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:32:20.294 [2024-07-26 06:00:35.053350] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1afdf20 PMD being used: compress_qat 00:32:20.295 06:00:35 compress_compdev -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:32:20.295 06:00:35 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:32:20.295 06:00:35 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:32:20.295 06:00:35 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:32:20.295 06:00:35 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:32:20.295 06:00:35 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:32:20.295 06:00:35 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:32:20.553 06:00:35 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:32:20.823 [ 00:32:20.823 { 00:32:20.823 "name": "Nvme0n1", 00:32:20.823 "aliases": [ 00:32:20.823 "01000000-0000-0000-5cd2-e43197705251" 00:32:20.823 ], 00:32:20.823 "product_name": "NVMe disk", 00:32:20.823 "block_size": 512, 00:32:20.823 "num_blocks": 15002931888, 00:32:20.823 "uuid": "01000000-0000-0000-5cd2-e43197705251", 00:32:20.823 "assigned_rate_limits": { 00:32:20.823 "rw_ios_per_sec": 0, 00:32:20.823 "rw_mbytes_per_sec": 0, 00:32:20.823 "r_mbytes_per_sec": 0, 00:32:20.823 "w_mbytes_per_sec": 0 00:32:20.823 }, 00:32:20.823 "claimed": false, 00:32:20.823 "zoned": false, 00:32:20.823 "supported_io_types": { 00:32:20.823 "read": true, 00:32:20.823 
"write": true, 00:32:20.823 "unmap": true, 00:32:20.823 "flush": true, 00:32:20.823 "reset": true, 00:32:20.823 "nvme_admin": true, 00:32:20.823 "nvme_io": true, 00:32:20.823 "nvme_io_md": false, 00:32:20.823 "write_zeroes": true, 00:32:20.823 "zcopy": false, 00:32:20.823 "get_zone_info": false, 00:32:20.824 "zone_management": false, 00:32:20.824 "zone_append": false, 00:32:20.824 "compare": false, 00:32:20.824 "compare_and_write": false, 00:32:20.824 "abort": true, 00:32:20.824 "seek_hole": false, 00:32:20.824 "seek_data": false, 00:32:20.824 "copy": false, 00:32:20.824 "nvme_iov_md": false 00:32:20.824 }, 00:32:20.824 "driver_specific": { 00:32:20.824 "nvme": [ 00:32:20.824 { 00:32:20.824 "pci_address": "0000:5e:00.0", 00:32:20.824 "trid": { 00:32:20.824 "trtype": "PCIe", 00:32:20.824 "traddr": "0000:5e:00.0" 00:32:20.824 }, 00:32:20.824 "ctrlr_data": { 00:32:20.824 "cntlid": 0, 00:32:20.824 "vendor_id": "0x8086", 00:32:20.824 "model_number": "INTEL SSDPF2KX076TZO", 00:32:20.824 "serial_number": "PHAC0301002G7P6CGN", 00:32:20.824 "firmware_revision": "JCV10200", 00:32:20.824 "subnqn": "nqn.2020-07.com.intel:PHAC0301002G7P6CGN ", 00:32:20.824 "oacs": { 00:32:20.824 "security": 1, 00:32:20.824 "format": 1, 00:32:20.824 "firmware": 1, 00:32:20.824 "ns_manage": 1 00:32:20.824 }, 00:32:20.824 "multi_ctrlr": false, 00:32:20.824 "ana_reporting": false 00:32:20.824 }, 00:32:20.824 "vs": { 00:32:20.824 "nvme_version": "1.3" 00:32:20.824 }, 00:32:20.824 "ns_data": { 00:32:20.824 "id": 1, 00:32:20.824 "can_share": false 00:32:20.824 }, 00:32:20.824 "security": { 00:32:20.824 "opal": true 00:32:20.824 } 00:32:20.824 } 00:32:20.824 ], 00:32:20.824 "mp_policy": "active_passive" 00:32:20.824 } 00:32:20.824 } 00:32:20.824 ] 00:32:20.825 06:00:35 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:32:20.825 06:00:35 compress_compdev -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method 
none Nvme0n1 lvs0 00:32:21.090 [2024-07-26 06:00:35.798696] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1afdec0 PMD being used: compress_qat 00:32:23.620 4ee0ade1-36c7-477d-98f9-92882455fe7d 00:32:23.620 06:00:38 compress_compdev -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:32:23.620 d61621e9-04d6-4ac8-93c7-8cc5a0650b60 00:32:23.620 06:00:38 compress_compdev -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:32:23.620 06:00:38 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=lvs0/lv0 00:32:23.620 06:00:38 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:32:23.620 06:00:38 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:32:23.620 06:00:38 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:32:23.620 06:00:38 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:32:23.620 06:00:38 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:32:23.620 06:00:38 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:32:23.878 [ 00:32:23.878 { 00:32:23.878 "name": "d61621e9-04d6-4ac8-93c7-8cc5a0650b60", 00:32:23.878 "aliases": [ 00:32:23.878 "lvs0/lv0" 00:32:23.878 ], 00:32:23.878 "product_name": "Logical Volume", 00:32:23.878 "block_size": 512, 00:32:23.878 "num_blocks": 204800, 00:32:23.878 "uuid": "d61621e9-04d6-4ac8-93c7-8cc5a0650b60", 00:32:23.878 "assigned_rate_limits": { 00:32:23.878 "rw_ios_per_sec": 0, 00:32:23.878 "rw_mbytes_per_sec": 0, 00:32:23.878 "r_mbytes_per_sec": 0, 00:32:23.878 "w_mbytes_per_sec": 0 00:32:23.878 }, 00:32:23.878 "claimed": false, 00:32:23.878 "zoned": false, 00:32:23.878 "supported_io_types": { 00:32:23.878 "read": true, 00:32:23.878 "write": 
true, 00:32:23.878 "unmap": true, 00:32:23.878 "flush": false, 00:32:23.878 "reset": true, 00:32:23.878 "nvme_admin": false, 00:32:23.878 "nvme_io": false, 00:32:23.878 "nvme_io_md": false, 00:32:23.878 "write_zeroes": true, 00:32:23.878 "zcopy": false, 00:32:23.878 "get_zone_info": false, 00:32:23.878 "zone_management": false, 00:32:23.878 "zone_append": false, 00:32:23.878 "compare": false, 00:32:23.878 "compare_and_write": false, 00:32:23.878 "abort": false, 00:32:23.878 "seek_hole": true, 00:32:23.878 "seek_data": true, 00:32:23.878 "copy": false, 00:32:23.878 "nvme_iov_md": false 00:32:23.878 }, 00:32:23.878 "driver_specific": { 00:32:23.878 "lvol": { 00:32:23.878 "lvol_store_uuid": "4ee0ade1-36c7-477d-98f9-92882455fe7d", 00:32:23.878 "base_bdev": "Nvme0n1", 00:32:23.878 "thin_provision": true, 00:32:23.878 "num_allocated_clusters": 0, 00:32:23.878 "snapshot": false, 00:32:23.878 "clone": false, 00:32:23.878 "esnap_clone": false 00:32:23.878 } 00:32:23.878 } 00:32:23.878 } 00:32:23.878 ] 00:32:23.878 06:00:38 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:32:23.878 06:00:38 compress_compdev -- compress/compress.sh@41 -- # '[' -z '' ']' 00:32:23.878 06:00:38 compress_compdev -- compress/compress.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem 00:32:24.137 [2024-07-26 06:00:38.986421] vbdev_compress.c:1034:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:32:24.137 COMP_lvs0/lv0 00:32:24.137 06:00:39 compress_compdev -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:32:24.137 06:00:39 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:32:24.137 06:00:39 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:32:24.137 06:00:39 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:32:24.137 06:00:39 compress_compdev -- common/autotest_common.sh@900 -- 
# [[ -z '' ]] 00:32:24.137 06:00:39 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:32:24.137 06:00:39 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:32:24.395 06:00:39 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:32:24.653 [ 00:32:24.653 { 00:32:24.653 "name": "COMP_lvs0/lv0", 00:32:24.653 "aliases": [ 00:32:24.653 "5644eee5-1b9d-5d25-9a9c-1777b710816c" 00:32:24.653 ], 00:32:24.653 "product_name": "compress", 00:32:24.653 "block_size": 512, 00:32:24.653 "num_blocks": 200704, 00:32:24.653 "uuid": "5644eee5-1b9d-5d25-9a9c-1777b710816c", 00:32:24.653 "assigned_rate_limits": { 00:32:24.653 "rw_ios_per_sec": 0, 00:32:24.653 "rw_mbytes_per_sec": 0, 00:32:24.653 "r_mbytes_per_sec": 0, 00:32:24.653 "w_mbytes_per_sec": 0 00:32:24.653 }, 00:32:24.653 "claimed": false, 00:32:24.653 "zoned": false, 00:32:24.653 "supported_io_types": { 00:32:24.653 "read": true, 00:32:24.653 "write": true, 00:32:24.653 "unmap": false, 00:32:24.653 "flush": false, 00:32:24.653 "reset": false, 00:32:24.653 "nvme_admin": false, 00:32:24.653 "nvme_io": false, 00:32:24.653 "nvme_io_md": false, 00:32:24.653 "write_zeroes": true, 00:32:24.653 "zcopy": false, 00:32:24.653 "get_zone_info": false, 00:32:24.653 "zone_management": false, 00:32:24.653 "zone_append": false, 00:32:24.653 "compare": false, 00:32:24.653 "compare_and_write": false, 00:32:24.653 "abort": false, 00:32:24.653 "seek_hole": false, 00:32:24.653 "seek_data": false, 00:32:24.653 "copy": false, 00:32:24.653 "nvme_iov_md": false 00:32:24.653 }, 00:32:24.653 "driver_specific": { 00:32:24.653 "compress": { 00:32:24.653 "name": "COMP_lvs0/lv0", 00:32:24.653 "base_bdev_name": "d61621e9-04d6-4ac8-93c7-8cc5a0650b60", 00:32:24.653 "pm_path": "/tmp/pmem/c5444070-cc45-465d-94b5-4bcfdc36b0b6" 00:32:24.653 } 
00:32:24.653 } 00:32:24.653 } 00:32:24.653 ] 00:32:24.653 06:00:39 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:32:24.653 06:00:39 compress_compdev -- compress/compress.sh@59 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:32:24.911 [2024-07-26 06:00:39.715800] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f8cd41b1350 PMD being used: compress_qat 00:32:24.911 I/O targets: 00:32:24.911 COMP_lvs0/lv0: 200704 blocks of 512 bytes (98 MiB) 00:32:24.911 00:32:24.911 00:32:24.911 CUnit - A unit testing framework for C - Version 2.1-3 00:32:24.911 http://cunit.sourceforge.net/ 00:32:24.911 00:32:24.911 00:32:24.911 Suite: bdevio tests on: COMP_lvs0/lv0 00:32:24.911 Test: blockdev write read block ...passed 00:32:24.911 Test: blockdev write zeroes read block ...passed 00:32:24.911 Test: blockdev write zeroes read no split ...passed 00:32:24.911 Test: blockdev write zeroes read split ...passed 00:32:24.911 Test: blockdev write zeroes read split partial ...passed 00:32:24.911 Test: blockdev reset ...[2024-07-26 06:00:39.753074] vbdev_compress.c: 252:vbdev_compress_submit_request: *ERROR*: Unknown I/O type 5 00:32:24.911 passed 00:32:24.911 Test: blockdev write read 8 blocks ...passed 00:32:24.911 Test: blockdev write read size > 128k ...passed 00:32:24.911 Test: blockdev write read invalid size ...passed 00:32:24.911 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:32:24.911 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:32:24.911 Test: blockdev write read max offset ...passed 00:32:24.911 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:32:24.911 Test: blockdev writev readv 8 blocks ...passed 00:32:24.911 Test: blockdev writev readv 30 x 1block ...passed 00:32:24.911 Test: blockdev writev readv block ...passed 00:32:24.911 Test: blockdev writev readv size > 128k ...passed 00:32:24.911 Test: blockdev writev readv 
size > 128k in two iovs ...passed 00:32:24.911 Test: blockdev comparev and writev ...passed 00:32:24.911 Test: blockdev nvme passthru rw ...passed 00:32:24.911 Test: blockdev nvme passthru vendor specific ...passed 00:32:24.911 Test: blockdev nvme admin passthru ...passed 00:32:24.911 Test: blockdev copy ...passed 00:32:24.911 00:32:24.911 Run Summary: Type Total Ran Passed Failed Inactive 00:32:24.911 suites 1 1 n/a 0 0 00:32:24.911 tests 23 23 23 0 0 00:32:24.911 asserts 130 130 130 0 n/a 00:32:24.911 00:32:24.911 Elapsed time = 0.091 seconds 00:32:24.911 0 00:32:24.911 06:00:39 compress_compdev -- compress/compress.sh@60 -- # destroy_vols 00:32:24.911 06:00:39 compress_compdev -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:32:25.169 06:00:40 compress_compdev -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:32:25.429 06:00:40 compress_compdev -- compress/compress.sh@61 -- # trap - SIGINT SIGTERM EXIT 00:32:25.429 06:00:40 compress_compdev -- compress/compress.sh@62 -- # killprocess 1299299 00:32:25.429 06:00:40 compress_compdev -- common/autotest_common.sh@948 -- # '[' -z 1299299 ']' 00:32:25.429 06:00:40 compress_compdev -- common/autotest_common.sh@952 -- # kill -0 1299299 00:32:25.429 06:00:40 compress_compdev -- common/autotest_common.sh@953 -- # uname 00:32:25.429 06:00:40 compress_compdev -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:32:25.429 06:00:40 compress_compdev -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1299299 00:32:25.688 06:00:40 compress_compdev -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:32:25.688 06:00:40 compress_compdev -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:32:25.688 06:00:40 compress_compdev -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1299299' 00:32:25.688 killing 
process with pid 1299299 00:32:25.688 06:00:40 compress_compdev -- common/autotest_common.sh@967 -- # kill 1299299 00:32:25.688 06:00:40 compress_compdev -- common/autotest_common.sh@972 -- # wait 1299299 00:32:28.218 06:00:43 compress_compdev -- compress/compress.sh@91 -- # '[' 0 -eq 1 ']' 00:32:28.218 06:00:43 compress_compdev -- compress/compress.sh@120 -- # rm -rf /tmp/pmem 00:32:28.218 00:32:28.218 real 0m46.860s 00:32:28.218 user 1m48.165s 00:32:28.218 sys 0m5.726s 00:32:28.218 06:00:43 compress_compdev -- common/autotest_common.sh@1124 -- # xtrace_disable 00:32:28.218 06:00:43 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:32:28.218 ************************************ 00:32:28.218 END TEST compress_compdev 00:32:28.218 ************************************ 00:32:28.477 06:00:43 -- common/autotest_common.sh@1142 -- # return 0 00:32:28.477 06:00:43 -- spdk/autotest.sh@349 -- # run_test compress_isal /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/compress.sh isal 00:32:28.477 06:00:43 -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:32:28.477 06:00:43 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:32:28.477 06:00:43 -- common/autotest_common.sh@10 -- # set +x 00:32:28.477 ************************************ 00:32:28.477 START TEST compress_isal 00:32:28.477 ************************************ 00:32:28.477 06:00:43 compress_isal -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/compress.sh isal 00:32:28.477 * Looking for test storage... 
00:32:28.477 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress 00:32:28.477 06:00:43 compress_isal -- compress/compress.sh@13 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:32:28.477 06:00:43 compress_isal -- nvmf/common.sh@7 -- # uname -s 00:32:28.477 06:00:43 compress_isal -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:32:28.477 06:00:43 compress_isal -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:32:28.477 06:00:43 compress_isal -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:32:28.477 06:00:43 compress_isal -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:32:28.477 06:00:43 compress_isal -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:32:28.477 06:00:43 compress_isal -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:32:28.477 06:00:43 compress_isal -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:32:28.477 06:00:43 compress_isal -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:32:28.477 06:00:43 compress_isal -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:32:28.477 06:00:43 compress_isal -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:32:28.477 06:00:43 compress_isal -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809e2efd-7f71-e711-906e-0017a4403562 00:32:28.477 06:00:43 compress_isal -- nvmf/common.sh@18 -- # NVME_HOSTID=809e2efd-7f71-e711-906e-0017a4403562 00:32:28.477 06:00:43 compress_isal -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:32:28.477 06:00:43 compress_isal -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:32:28.477 06:00:43 compress_isal -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:32:28.477 06:00:43 compress_isal -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:32:28.477 06:00:43 compress_isal -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:32:28.477 06:00:43 compress_isal -- 
scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:32:28.477 06:00:43 compress_isal -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:32:28.477 06:00:43 compress_isal -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:32:28.477 06:00:43 compress_isal -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:32:28.477 06:00:43 compress_isal -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:32:28.477 06:00:43 compress_isal -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:32:28.477 06:00:43 compress_isal -- paths/export.sh@5 -- # export PATH 00:32:28.477 06:00:43 compress_isal -- paths/export.sh@6 -- # echo 
/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:32:28.477 06:00:43 compress_isal -- nvmf/common.sh@47 -- # : 0 00:32:28.477 06:00:43 compress_isal -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:32:28.477 06:00:43 compress_isal -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:32:28.477 06:00:43 compress_isal -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:32:28.477 06:00:43 compress_isal -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:32:28.477 06:00:43 compress_isal -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:32:28.478 06:00:43 compress_isal -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:32:28.478 06:00:43 compress_isal -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:32:28.478 06:00:43 compress_isal -- nvmf/common.sh@51 -- # have_pci_nics=0 00:32:28.478 06:00:43 compress_isal -- compress/compress.sh@17 -- # rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:32:28.478 06:00:43 compress_isal -- compress/compress.sh@81 -- # mkdir -p /tmp/pmem 00:32:28.478 06:00:43 compress_isal -- compress/compress.sh@82 -- # test_type=isal 00:32:28.478 06:00:43 compress_isal -- compress/compress.sh@86 -- # run_bdevperf 32 4096 3 00:32:28.478 06:00:43 compress_isal -- compress/compress.sh@66 -- # [[ isal == \c\o\m\p\d\e\v ]] 00:32:28.478 06:00:43 compress_isal -- compress/compress.sh@71 -- # bdevperf_pid=1300606 00:32:28.478 06:00:43 compress_isal -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:32:28.478 06:00:43 compress_isal -- 
compress/compress.sh@73 -- # waitforlisten 1300606 00:32:28.478 06:00:43 compress_isal -- common/autotest_common.sh@829 -- # '[' -z 1300606 ']' 00:32:28.478 06:00:43 compress_isal -- compress/compress.sh@69 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 00:32:28.478 06:00:43 compress_isal -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:32:28.478 06:00:43 compress_isal -- common/autotest_common.sh@834 -- # local max_retries=100 00:32:28.478 06:00:43 compress_isal -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:32:28.478 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:32:28.478 06:00:43 compress_isal -- common/autotest_common.sh@838 -- # xtrace_disable 00:32:28.478 06:00:43 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:32:28.736 [2024-07-26 06:00:43.401201] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 
00:32:28.736 [2024-07-26 06:00:43.401277] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1300606 ] 00:32:28.736 [2024-07-26 06:00:43.521780] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:32:28.736 [2024-07-26 06:00:43.625529] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:32:28.736 [2024-07-26 06:00:43.625534] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:32:29.672 06:00:44 compress_isal -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:32:29.672 06:00:44 compress_isal -- common/autotest_common.sh@862 -- # return 0 00:32:29.672 06:00:44 compress_isal -- compress/compress.sh@74 -- # create_vols 00:32:29.672 06:00:44 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:32:29.672 06:00:44 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:32:30.239 06:00:44 compress_isal -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:32:30.239 06:00:44 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:32:30.239 06:00:44 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:32:30.239 06:00:44 compress_isal -- common/autotest_common.sh@899 -- # local i 00:32:30.239 06:00:44 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:32:30.239 06:00:44 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:32:30.239 06:00:44 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:32:30.505 06:00:45 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
bdev_get_bdevs -b Nvme0n1 -t 2000 00:32:30.763 [ 00:32:30.763 { 00:32:30.763 "name": "Nvme0n1", 00:32:30.763 "aliases": [ 00:32:30.763 "01000000-0000-0000-5cd2-e43197705251" 00:32:30.764 ], 00:32:30.764 "product_name": "NVMe disk", 00:32:30.764 "block_size": 512, 00:32:30.764 "num_blocks": 15002931888, 00:32:30.764 "uuid": "01000000-0000-0000-5cd2-e43197705251", 00:32:30.764 "assigned_rate_limits": { 00:32:30.764 "rw_ios_per_sec": 0, 00:32:30.764 "rw_mbytes_per_sec": 0, 00:32:30.764 "r_mbytes_per_sec": 0, 00:32:30.764 "w_mbytes_per_sec": 0 00:32:30.764 }, 00:32:30.764 "claimed": false, 00:32:30.764 "zoned": false, 00:32:30.764 "supported_io_types": { 00:32:30.764 "read": true, 00:32:30.764 "write": true, 00:32:30.764 "unmap": true, 00:32:30.764 "flush": true, 00:32:30.764 "reset": true, 00:32:30.764 "nvme_admin": true, 00:32:30.764 "nvme_io": true, 00:32:30.764 "nvme_io_md": false, 00:32:30.764 "write_zeroes": true, 00:32:30.764 "zcopy": false, 00:32:30.764 "get_zone_info": false, 00:32:30.764 "zone_management": false, 00:32:30.764 "zone_append": false, 00:32:30.764 "compare": false, 00:32:30.764 "compare_and_write": false, 00:32:30.764 "abort": true, 00:32:30.764 "seek_hole": false, 00:32:30.764 "seek_data": false, 00:32:30.764 "copy": false, 00:32:30.764 "nvme_iov_md": false 00:32:30.764 }, 00:32:30.764 "driver_specific": { 00:32:30.764 "nvme": [ 00:32:30.764 { 00:32:30.764 "pci_address": "0000:5e:00.0", 00:32:30.764 "trid": { 00:32:30.764 "trtype": "PCIe", 00:32:30.764 "traddr": "0000:5e:00.0" 00:32:30.764 }, 00:32:30.764 "ctrlr_data": { 00:32:30.764 "cntlid": 0, 00:32:30.764 "vendor_id": "0x8086", 00:32:30.764 "model_number": "INTEL SSDPF2KX076TZO", 00:32:30.764 "serial_number": "PHAC0301002G7P6CGN", 00:32:30.764 "firmware_revision": "JCV10200", 00:32:30.764 "subnqn": "nqn.2020-07.com.intel:PHAC0301002G7P6CGN ", 00:32:30.764 "oacs": { 00:32:30.764 "security": 1, 00:32:30.764 "format": 1, 00:32:30.764 "firmware": 1, 00:32:30.764 "ns_manage": 1 00:32:30.764 }, 
00:32:30.764 "multi_ctrlr": false, 00:32:30.764 "ana_reporting": false 00:32:30.764 }, 00:32:30.764 "vs": { 00:32:30.764 "nvme_version": "1.3" 00:32:30.764 }, 00:32:30.764 "ns_data": { 00:32:30.764 "id": 1, 00:32:30.764 "can_share": false 00:32:30.764 }, 00:32:30.764 "security": { 00:32:30.764 "opal": true 00:32:30.764 } 00:32:30.764 } 00:32:30.764 ], 00:32:30.764 "mp_policy": "active_passive" 00:32:30.764 } 00:32:30.764 } 00:32:30.764 ] 00:32:30.764 06:00:45 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:32:30.764 06:00:45 compress_isal -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:32:33.296 545b00de-391a-4935-9823-99c6fd2c3670 00:32:33.296 06:00:47 compress_isal -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:32:33.296 90a461dd-0e2f-4a19-a909-f4f93bdebd13 00:32:33.296 06:00:48 compress_isal -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:32:33.296 06:00:48 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=lvs0/lv0 00:32:33.296 06:00:48 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:32:33.296 06:00:48 compress_isal -- common/autotest_common.sh@899 -- # local i 00:32:33.296 06:00:48 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:32:33.296 06:00:48 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:32:33.296 06:00:48 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:32:33.554 06:00:48 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:32:33.812 [ 00:32:33.812 { 00:32:33.812 "name": "90a461dd-0e2f-4a19-a909-f4f93bdebd13", 00:32:33.812 "aliases": [ 00:32:33.812 "lvs0/lv0" 
00:32:33.812 ], 00:32:33.812 "product_name": "Logical Volume", 00:32:33.812 "block_size": 512, 00:32:33.812 "num_blocks": 204800, 00:32:33.812 "uuid": "90a461dd-0e2f-4a19-a909-f4f93bdebd13", 00:32:33.812 "assigned_rate_limits": { 00:32:33.812 "rw_ios_per_sec": 0, 00:32:33.812 "rw_mbytes_per_sec": 0, 00:32:33.812 "r_mbytes_per_sec": 0, 00:32:33.812 "w_mbytes_per_sec": 0 00:32:33.812 }, 00:32:33.812 "claimed": false, 00:32:33.812 "zoned": false, 00:32:33.812 "supported_io_types": { 00:32:33.812 "read": true, 00:32:33.812 "write": true, 00:32:33.812 "unmap": true, 00:32:33.812 "flush": false, 00:32:33.812 "reset": true, 00:32:33.812 "nvme_admin": false, 00:32:33.812 "nvme_io": false, 00:32:33.812 "nvme_io_md": false, 00:32:33.812 "write_zeroes": true, 00:32:33.812 "zcopy": false, 00:32:33.812 "get_zone_info": false, 00:32:33.812 "zone_management": false, 00:32:33.812 "zone_append": false, 00:32:33.812 "compare": false, 00:32:33.812 "compare_and_write": false, 00:32:33.812 "abort": false, 00:32:33.812 "seek_hole": true, 00:32:33.812 "seek_data": true, 00:32:33.812 "copy": false, 00:32:33.812 "nvme_iov_md": false 00:32:33.812 }, 00:32:33.812 "driver_specific": { 00:32:33.812 "lvol": { 00:32:33.812 "lvol_store_uuid": "545b00de-391a-4935-9823-99c6fd2c3670", 00:32:33.812 "base_bdev": "Nvme0n1", 00:32:33.812 "thin_provision": true, 00:32:33.812 "num_allocated_clusters": 0, 00:32:33.812 "snapshot": false, 00:32:33.812 "clone": false, 00:32:33.813 "esnap_clone": false 00:32:33.813 } 00:32:33.813 } 00:32:33.813 } 00:32:33.813 ] 00:32:33.813 06:00:48 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:32:33.813 06:00:48 compress_isal -- compress/compress.sh@41 -- # '[' -z '' ']' 00:32:33.813 06:00:48 compress_isal -- compress/compress.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem 00:32:34.071 [2024-07-26 06:00:48.860899] vbdev_compress.c:1034:vbdev_compress_claim: *NOTICE*: registered 
io_device and virtual bdev for: COMP_lvs0/lv0 00:32:34.071 COMP_lvs0/lv0 00:32:34.071 06:00:48 compress_isal -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:32:34.071 06:00:48 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:32:34.071 06:00:48 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:32:34.071 06:00:48 compress_isal -- common/autotest_common.sh@899 -- # local i 00:32:34.071 06:00:48 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:32:34.071 06:00:48 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:32:34.071 06:00:48 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:32:34.342 06:00:49 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:32:34.608 [ 00:32:34.608 { 00:32:34.608 "name": "COMP_lvs0/lv0", 00:32:34.608 "aliases": [ 00:32:34.608 "84d6654e-fe83-576f-93d4-97c092f22ae8" 00:32:34.608 ], 00:32:34.608 "product_name": "compress", 00:32:34.608 "block_size": 512, 00:32:34.608 "num_blocks": 200704, 00:32:34.608 "uuid": "84d6654e-fe83-576f-93d4-97c092f22ae8", 00:32:34.608 "assigned_rate_limits": { 00:32:34.608 "rw_ios_per_sec": 0, 00:32:34.608 "rw_mbytes_per_sec": 0, 00:32:34.608 "r_mbytes_per_sec": 0, 00:32:34.608 "w_mbytes_per_sec": 0 00:32:34.608 }, 00:32:34.608 "claimed": false, 00:32:34.608 "zoned": false, 00:32:34.608 "supported_io_types": { 00:32:34.608 "read": true, 00:32:34.608 "write": true, 00:32:34.608 "unmap": false, 00:32:34.608 "flush": false, 00:32:34.608 "reset": false, 00:32:34.608 "nvme_admin": false, 00:32:34.608 "nvme_io": false, 00:32:34.608 "nvme_io_md": false, 00:32:34.608 "write_zeroes": true, 00:32:34.608 "zcopy": false, 00:32:34.608 "get_zone_info": false, 00:32:34.608 "zone_management": false, 00:32:34.608 "zone_append": 
false, 00:32:34.608 "compare": false, 00:32:34.608 "compare_and_write": false, 00:32:34.608 "abort": false, 00:32:34.608 "seek_hole": false, 00:32:34.608 "seek_data": false, 00:32:34.608 "copy": false, 00:32:34.608 "nvme_iov_md": false 00:32:34.608 }, 00:32:34.608 "driver_specific": { 00:32:34.608 "compress": { 00:32:34.608 "name": "COMP_lvs0/lv0", 00:32:34.608 "base_bdev_name": "90a461dd-0e2f-4a19-a909-f4f93bdebd13", 00:32:34.608 "pm_path": "/tmp/pmem/8a5543bf-2fd1-4f63-b08d-a07f70a4556d" 00:32:34.608 } 00:32:34.608 } 00:32:34.608 } 00:32:34.608 ] 00:32:34.608 06:00:49 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:32:34.608 06:00:49 compress_isal -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:32:34.608 Running I/O for 3 seconds... 00:32:37.951 00:32:37.951 Latency(us) 00:32:37.951 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:32:37.951 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:32:37.951 Verification LBA range: start 0x0 length 0x3100 00:32:37.951 COMP_lvs0/lv0 : 3.00 3923.40 15.33 0.00 0.00 8101.42 669.61 8605.16 00:32:37.951 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:32:37.951 Verification LBA range: start 0x3100 length 0x3100 00:32:37.951 COMP_lvs0/lv0 : 3.00 3929.55 15.35 0.00 0.00 8101.47 466.59 8149.26 00:32:37.951 =================================================================================================================== 00:32:37.951 Total : 7852.95 30.68 0.00 0.00 8101.44 466.59 8605.16 00:32:37.951 0 00:32:37.951 06:00:52 compress_isal -- compress/compress.sh@76 -- # destroy_vols 00:32:37.951 06:00:52 compress_isal -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:32:37.951 06:00:52 compress_isal -- compress/compress.sh@30 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:32:38.208 06:00:53 compress_isal -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:32:38.208 06:00:53 compress_isal -- compress/compress.sh@78 -- # killprocess 1300606 00:32:38.208 06:00:53 compress_isal -- common/autotest_common.sh@948 -- # '[' -z 1300606 ']' 00:32:38.208 06:00:53 compress_isal -- common/autotest_common.sh@952 -- # kill -0 1300606 00:32:38.208 06:00:53 compress_isal -- common/autotest_common.sh@953 -- # uname 00:32:38.208 06:00:53 compress_isal -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:32:38.208 06:00:53 compress_isal -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1300606 00:32:38.208 06:00:53 compress_isal -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:32:38.208 06:00:53 compress_isal -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:32:38.208 06:00:53 compress_isal -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1300606' 00:32:38.208 killing process with pid 1300606 00:32:38.208 06:00:53 compress_isal -- common/autotest_common.sh@967 -- # kill 1300606 00:32:38.208 Received shutdown signal, test time was about 3.000000 seconds 00:32:38.208 00:32:38.208 Latency(us) 00:32:38.208 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:32:38.208 =================================================================================================================== 00:32:38.208 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:32:38.208 06:00:53 compress_isal -- common/autotest_common.sh@972 -- # wait 1300606 00:32:41.492 06:00:55 compress_isal -- compress/compress.sh@87 -- # run_bdevperf 32 4096 3 512 00:32:41.492 06:00:55 compress_isal -- compress/compress.sh@66 -- # [[ isal == \c\o\m\p\d\e\v ]] 00:32:41.492 06:00:55 compress_isal -- compress/compress.sh@71 -- # bdevperf_pid=1302216 00:32:41.492 06:00:55 compress_isal -- 
compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:32:41.492 06:00:55 compress_isal -- compress/compress.sh@69 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 00:32:41.492 06:00:55 compress_isal -- compress/compress.sh@73 -- # waitforlisten 1302216 00:32:41.492 06:00:55 compress_isal -- common/autotest_common.sh@829 -- # '[' -z 1302216 ']' 00:32:41.492 06:00:55 compress_isal -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:32:41.492 06:00:55 compress_isal -- common/autotest_common.sh@834 -- # local max_retries=100 00:32:41.492 06:00:55 compress_isal -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:32:41.492 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:32:41.492 06:00:55 compress_isal -- common/autotest_common.sh@838 -- # xtrace_disable 00:32:41.492 06:00:55 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:32:41.492 [2024-07-26 06:00:55.852362] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 
00:32:41.492 [2024-07-26 06:00:55.852435] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1302216 ] 00:32:41.492 [2024-07-26 06:00:55.970826] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:32:41.492 [2024-07-26 06:00:56.072675] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:32:41.492 [2024-07-26 06:00:56.072692] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:32:42.060 06:00:56 compress_isal -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:32:42.060 06:00:56 compress_isal -- common/autotest_common.sh@862 -- # return 0 00:32:42.060 06:00:56 compress_isal -- compress/compress.sh@74 -- # create_vols 512 00:32:42.060 06:00:56 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:32:42.060 06:00:56 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:32:42.626 06:00:57 compress_isal -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:32:42.626 06:00:57 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:32:42.626 06:00:57 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:32:42.626 06:00:57 compress_isal -- common/autotest_common.sh@899 -- # local i 00:32:42.626 06:00:57 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:32:42.626 06:00:57 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:32:42.626 06:00:57 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:32:42.886 06:00:57 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
bdev_get_bdevs -b Nvme0n1 -t 2000 00:32:43.145 [ 00:32:43.145 { 00:32:43.145 "name": "Nvme0n1", 00:32:43.145 "aliases": [ 00:32:43.145 "01000000-0000-0000-5cd2-e43197705251" 00:32:43.145 ], 00:32:43.145 "product_name": "NVMe disk", 00:32:43.145 "block_size": 512, 00:32:43.145 "num_blocks": 15002931888, 00:32:43.146 "uuid": "01000000-0000-0000-5cd2-e43197705251", 00:32:43.146 "assigned_rate_limits": { 00:32:43.146 "rw_ios_per_sec": 0, 00:32:43.146 "rw_mbytes_per_sec": 0, 00:32:43.146 "r_mbytes_per_sec": 0, 00:32:43.146 "w_mbytes_per_sec": 0 00:32:43.146 }, 00:32:43.146 "claimed": false, 00:32:43.146 "zoned": false, 00:32:43.146 "supported_io_types": { 00:32:43.146 "read": true, 00:32:43.146 "write": true, 00:32:43.146 "unmap": true, 00:32:43.146 "flush": true, 00:32:43.146 "reset": true, 00:32:43.146 "nvme_admin": true, 00:32:43.146 "nvme_io": true, 00:32:43.146 "nvme_io_md": false, 00:32:43.146 "write_zeroes": true, 00:32:43.146 "zcopy": false, 00:32:43.146 "get_zone_info": false, 00:32:43.146 "zone_management": false, 00:32:43.146 "zone_append": false, 00:32:43.146 "compare": false, 00:32:43.146 "compare_and_write": false, 00:32:43.146 "abort": true, 00:32:43.146 "seek_hole": false, 00:32:43.146 "seek_data": false, 00:32:43.146 "copy": false, 00:32:43.146 "nvme_iov_md": false 00:32:43.146 }, 00:32:43.146 "driver_specific": { 00:32:43.146 "nvme": [ 00:32:43.146 { 00:32:43.146 "pci_address": "0000:5e:00.0", 00:32:43.146 "trid": { 00:32:43.146 "trtype": "PCIe", 00:32:43.146 "traddr": "0000:5e:00.0" 00:32:43.146 }, 00:32:43.146 "ctrlr_data": { 00:32:43.146 "cntlid": 0, 00:32:43.146 "vendor_id": "0x8086", 00:32:43.146 "model_number": "INTEL SSDPF2KX076TZO", 00:32:43.146 "serial_number": "PHAC0301002G7P6CGN", 00:32:43.146 "firmware_revision": "JCV10200", 00:32:43.146 "subnqn": "nqn.2020-07.com.intel:PHAC0301002G7P6CGN ", 00:32:43.146 "oacs": { 00:32:43.146 "security": 1, 00:32:43.146 "format": 1, 00:32:43.146 "firmware": 1, 00:32:43.146 "ns_manage": 1 00:32:43.146 }, 
00:32:43.146 "multi_ctrlr": false, 00:32:43.146 "ana_reporting": false 00:32:43.146 }, 00:32:43.146 "vs": { 00:32:43.146 "nvme_version": "1.3" 00:32:43.146 }, 00:32:43.146 "ns_data": { 00:32:43.146 "id": 1, 00:32:43.146 "can_share": false 00:32:43.146 }, 00:32:43.146 "security": { 00:32:43.146 "opal": true 00:32:43.146 } 00:32:43.146 } 00:32:43.146 ], 00:32:43.146 "mp_policy": "active_passive" 00:32:43.146 } 00:32:43.146 } 00:32:43.146 ] 00:32:43.146 06:00:57 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:32:43.146 06:00:57 compress_isal -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:32:45.680 982493ad-2299-4bc1-8f4a-9f689d17fa60 00:32:45.680 06:01:00 compress_isal -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:32:45.680 3b7107a0-2771-4f37-9238-df2804470e00 00:32:45.680 06:01:00 compress_isal -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:32:45.680 06:01:00 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=lvs0/lv0 00:32:45.680 06:01:00 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:32:45.680 06:01:00 compress_isal -- common/autotest_common.sh@899 -- # local i 00:32:45.680 06:01:00 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:32:45.680 06:01:00 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:32:45.680 06:01:00 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:32:45.939 06:01:00 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:32:46.198 [ 00:32:46.198 { 00:32:46.198 "name": "3b7107a0-2771-4f37-9238-df2804470e00", 00:32:46.198 "aliases": [ 00:32:46.198 "lvs0/lv0" 
00:32:46.198 ], 00:32:46.198 "product_name": "Logical Volume", 00:32:46.198 "block_size": 512, 00:32:46.198 "num_blocks": 204800, 00:32:46.198 "uuid": "3b7107a0-2771-4f37-9238-df2804470e00", 00:32:46.198 "assigned_rate_limits": { 00:32:46.198 "rw_ios_per_sec": 0, 00:32:46.198 "rw_mbytes_per_sec": 0, 00:32:46.198 "r_mbytes_per_sec": 0, 00:32:46.198 "w_mbytes_per_sec": 0 00:32:46.198 }, 00:32:46.198 "claimed": false, 00:32:46.198 "zoned": false, 00:32:46.198 "supported_io_types": { 00:32:46.198 "read": true, 00:32:46.198 "write": true, 00:32:46.198 "unmap": true, 00:32:46.198 "flush": false, 00:32:46.198 "reset": true, 00:32:46.198 "nvme_admin": false, 00:32:46.198 "nvme_io": false, 00:32:46.198 "nvme_io_md": false, 00:32:46.198 "write_zeroes": true, 00:32:46.198 "zcopy": false, 00:32:46.198 "get_zone_info": false, 00:32:46.198 "zone_management": false, 00:32:46.198 "zone_append": false, 00:32:46.198 "compare": false, 00:32:46.198 "compare_and_write": false, 00:32:46.198 "abort": false, 00:32:46.198 "seek_hole": true, 00:32:46.198 "seek_data": true, 00:32:46.198 "copy": false, 00:32:46.198 "nvme_iov_md": false 00:32:46.198 }, 00:32:46.198 "driver_specific": { 00:32:46.198 "lvol": { 00:32:46.198 "lvol_store_uuid": "982493ad-2299-4bc1-8f4a-9f689d17fa60", 00:32:46.198 "base_bdev": "Nvme0n1", 00:32:46.198 "thin_provision": true, 00:32:46.198 "num_allocated_clusters": 0, 00:32:46.198 "snapshot": false, 00:32:46.198 "clone": false, 00:32:46.198 "esnap_clone": false 00:32:46.198 } 00:32:46.198 } 00:32:46.198 } 00:32:46.198 ] 00:32:46.198 06:01:01 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:32:46.198 06:01:01 compress_isal -- compress/compress.sh@41 -- # '[' -z 512 ']' 00:32:46.198 06:01:01 compress_isal -- compress/compress.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem -l 512 00:32:46.457 [2024-07-26 06:01:01.168385] vbdev_compress.c:1034:vbdev_compress_claim: *NOTICE*: registered 
io_device and virtual bdev for: COMP_lvs0/lv0 00:32:46.457 COMP_lvs0/lv0 00:32:46.457 06:01:01 compress_isal -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:32:46.457 06:01:01 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:32:46.457 06:01:01 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:32:46.457 06:01:01 compress_isal -- common/autotest_common.sh@899 -- # local i 00:32:46.457 06:01:01 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:32:46.457 06:01:01 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:32:46.457 06:01:01 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:32:46.716 06:01:01 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:32:46.974 [ 00:32:46.974 { 00:32:46.974 "name": "COMP_lvs0/lv0", 00:32:46.974 "aliases": [ 00:32:46.974 "b6e49ddb-be28-5515-bc45-025ac0bb40f1" 00:32:46.974 ], 00:32:46.974 "product_name": "compress", 00:32:46.974 "block_size": 512, 00:32:46.974 "num_blocks": 200704, 00:32:46.974 "uuid": "b6e49ddb-be28-5515-bc45-025ac0bb40f1", 00:32:46.974 "assigned_rate_limits": { 00:32:46.974 "rw_ios_per_sec": 0, 00:32:46.974 "rw_mbytes_per_sec": 0, 00:32:46.974 "r_mbytes_per_sec": 0, 00:32:46.974 "w_mbytes_per_sec": 0 00:32:46.974 }, 00:32:46.974 "claimed": false, 00:32:46.974 "zoned": false, 00:32:46.974 "supported_io_types": { 00:32:46.974 "read": true, 00:32:46.974 "write": true, 00:32:46.974 "unmap": false, 00:32:46.974 "flush": false, 00:32:46.974 "reset": false, 00:32:46.974 "nvme_admin": false, 00:32:46.974 "nvme_io": false, 00:32:46.974 "nvme_io_md": false, 00:32:46.974 "write_zeroes": true, 00:32:46.974 "zcopy": false, 00:32:46.974 "get_zone_info": false, 00:32:46.974 "zone_management": false, 00:32:46.974 "zone_append": 
false, 00:32:46.974 "compare": false, 00:32:46.974 "compare_and_write": false, 00:32:46.974 "abort": false, 00:32:46.974 "seek_hole": false, 00:32:46.974 "seek_data": false, 00:32:46.974 "copy": false, 00:32:46.974 "nvme_iov_md": false 00:32:46.974 }, 00:32:46.974 "driver_specific": { 00:32:46.974 "compress": { 00:32:46.974 "name": "COMP_lvs0/lv0", 00:32:46.975 "base_bdev_name": "3b7107a0-2771-4f37-9238-df2804470e00", 00:32:46.975 "pm_path": "/tmp/pmem/f41692d7-8f99-4229-b1f6-0ac5ef803794" 00:32:46.975 } 00:32:46.975 } 00:32:46.975 } 00:32:46.975 ] 00:32:46.975 06:01:01 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:32:46.975 06:01:01 compress_isal -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:32:46.975 Running I/O for 3 seconds... 00:32:50.261 00:32:50.261 Latency(us) 00:32:50.261 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:32:50.261 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:32:50.261 Verification LBA range: start 0x0 length 0x3100 00:32:50.261 COMP_lvs0/lv0 : 3.00 3962.36 15.48 0.00 0.00 8020.36 637.55 7009.50 00:32:50.261 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:32:50.261 Verification LBA range: start 0x3100 length 0x3100 00:32:50.261 COMP_lvs0/lv0 : 3.00 3966.27 15.49 0.00 0.00 8026.51 495.08 7009.50 00:32:50.261 =================================================================================================================== 00:32:50.261 Total : 7928.63 30.97 0.00 0.00 8023.44 495.08 7009.50 00:32:50.261 0 00:32:50.261 06:01:04 compress_isal -- compress/compress.sh@76 -- # destroy_vols 00:32:50.261 06:01:04 compress_isal -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:32:50.261 06:01:05 compress_isal -- compress/compress.sh@30 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:32:50.520 06:01:05 compress_isal -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:32:50.520 06:01:05 compress_isal -- compress/compress.sh@78 -- # killprocess 1302216 00:32:50.520 06:01:05 compress_isal -- common/autotest_common.sh@948 -- # '[' -z 1302216 ']' 00:32:50.520 06:01:05 compress_isal -- common/autotest_common.sh@952 -- # kill -0 1302216 00:32:50.520 06:01:05 compress_isal -- common/autotest_common.sh@953 -- # uname 00:32:50.520 06:01:05 compress_isal -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:32:50.520 06:01:05 compress_isal -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1302216 00:32:50.520 06:01:05 compress_isal -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:32:50.520 06:01:05 compress_isal -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:32:50.520 06:01:05 compress_isal -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1302216' 00:32:50.520 killing process with pid 1302216 00:32:50.520 06:01:05 compress_isal -- common/autotest_common.sh@967 -- # kill 1302216 00:32:50.520 Received shutdown signal, test time was about 3.000000 seconds 00:32:50.520 00:32:50.520 Latency(us) 00:32:50.520 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:32:50.520 =================================================================================================================== 00:32:50.520 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:32:50.520 06:01:05 compress_isal -- common/autotest_common.sh@972 -- # wait 1302216 00:32:53.807 06:01:08 compress_isal -- compress/compress.sh@88 -- # run_bdevperf 32 4096 3 4096 00:32:53.807 06:01:08 compress_isal -- compress/compress.sh@66 -- # [[ isal == \c\o\m\p\d\e\v ]] 00:32:53.807 06:01:08 compress_isal -- compress/compress.sh@71 -- # bdevperf_pid=1303810 00:32:53.807 06:01:08 compress_isal -- 
compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:32:53.807 06:01:08 compress_isal -- compress/compress.sh@69 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 00:32:53.807 06:01:08 compress_isal -- compress/compress.sh@73 -- # waitforlisten 1303810 00:32:53.807 06:01:08 compress_isal -- common/autotest_common.sh@829 -- # '[' -z 1303810 ']' 00:32:53.807 06:01:08 compress_isal -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:32:53.807 06:01:08 compress_isal -- common/autotest_common.sh@834 -- # local max_retries=100 00:32:53.807 06:01:08 compress_isal -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:32:53.807 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:32:53.807 06:01:08 compress_isal -- common/autotest_common.sh@838 -- # xtrace_disable 00:32:53.807 06:01:08 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:32:53.808 [2024-07-26 06:01:08.164222] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 
00:32:53.808 [2024-07-26 06:01:08.164302] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1303810 ] 00:32:53.808 [2024-07-26 06:01:08.285820] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:32:53.808 [2024-07-26 06:01:08.383879] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:32:53.808 [2024-07-26 06:01:08.383886] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:32:54.375 06:01:09 compress_isal -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:32:54.375 06:01:09 compress_isal -- common/autotest_common.sh@862 -- # return 0 00:32:54.375 06:01:09 compress_isal -- compress/compress.sh@74 -- # create_vols 4096 00:32:54.375 06:01:09 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:32:54.375 06:01:09 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:32:54.942 06:01:09 compress_isal -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:32:54.942 06:01:09 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:32:54.942 06:01:09 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:32:54.942 06:01:09 compress_isal -- common/autotest_common.sh@899 -- # local i 00:32:54.942 06:01:09 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:32:54.942 06:01:09 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:32:54.942 06:01:09 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:32:54.942 06:01:09 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
bdev_get_bdevs -b Nvme0n1 -t 2000 00:32:55.200 [ 00:32:55.200 { 00:32:55.200 "name": "Nvme0n1", 00:32:55.200 "aliases": [ 00:32:55.200 "01000000-0000-0000-5cd2-e43197705251" 00:32:55.200 ], 00:32:55.200 "product_name": "NVMe disk", 00:32:55.200 "block_size": 512, 00:32:55.200 "num_blocks": 15002931888, 00:32:55.200 "uuid": "01000000-0000-0000-5cd2-e43197705251", 00:32:55.200 "assigned_rate_limits": { 00:32:55.200 "rw_ios_per_sec": 0, 00:32:55.200 "rw_mbytes_per_sec": 0, 00:32:55.200 "r_mbytes_per_sec": 0, 00:32:55.200 "w_mbytes_per_sec": 0 00:32:55.200 }, 00:32:55.201 "claimed": false, 00:32:55.201 "zoned": false, 00:32:55.201 "supported_io_types": { 00:32:55.201 "read": true, 00:32:55.201 "write": true, 00:32:55.201 "unmap": true, 00:32:55.201 "flush": true, 00:32:55.201 "reset": true, 00:32:55.201 "nvme_admin": true, 00:32:55.201 "nvme_io": true, 00:32:55.201 "nvme_io_md": false, 00:32:55.201 "write_zeroes": true, 00:32:55.201 "zcopy": false, 00:32:55.201 "get_zone_info": false, 00:32:55.201 "zone_management": false, 00:32:55.201 "zone_append": false, 00:32:55.201 "compare": false, 00:32:55.201 "compare_and_write": false, 00:32:55.201 "abort": true, 00:32:55.201 "seek_hole": false, 00:32:55.201 "seek_data": false, 00:32:55.201 "copy": false, 00:32:55.201 "nvme_iov_md": false 00:32:55.201 }, 00:32:55.201 "driver_specific": { 00:32:55.201 "nvme": [ 00:32:55.201 { 00:32:55.201 "pci_address": "0000:5e:00.0", 00:32:55.201 "trid": { 00:32:55.201 "trtype": "PCIe", 00:32:55.201 "traddr": "0000:5e:00.0" 00:32:55.201 }, 00:32:55.201 "ctrlr_data": { 00:32:55.201 "cntlid": 0, 00:32:55.201 "vendor_id": "0x8086", 00:32:55.201 "model_number": "INTEL SSDPF2KX076TZO", 00:32:55.201 "serial_number": "PHAC0301002G7P6CGN", 00:32:55.201 "firmware_revision": "JCV10200", 00:32:55.201 "subnqn": "nqn.2020-07.com.intel:PHAC0301002G7P6CGN ", 00:32:55.201 "oacs": { 00:32:55.201 "security": 1, 00:32:55.201 "format": 1, 00:32:55.201 "firmware": 1, 00:32:55.201 "ns_manage": 1 00:32:55.201 }, 
00:32:55.201 "multi_ctrlr": false, 00:32:55.201 "ana_reporting": false 00:32:55.201 }, 00:32:55.201 "vs": { 00:32:55.201 "nvme_version": "1.3" 00:32:55.201 }, 00:32:55.201 "ns_data": { 00:32:55.201 "id": 1, 00:32:55.201 "can_share": false 00:32:55.201 }, 00:32:55.201 "security": { 00:32:55.201 "opal": true 00:32:55.201 } 00:32:55.201 } 00:32:55.201 ], 00:32:55.201 "mp_policy": "active_passive" 00:32:55.201 } 00:32:55.201 } 00:32:55.201 ] 00:32:55.201 06:01:10 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:32:55.201 06:01:10 compress_isal -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:32:57.735 f1e6d66b-fe1e-45ea-a7fd-65027f98e71f 00:32:57.735 06:01:12 compress_isal -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:32:58.060 01e3d845-d201-47e1-a53d-42912d9515f0 00:32:58.060 06:01:12 compress_isal -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:32:58.060 06:01:12 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=lvs0/lv0 00:32:58.060 06:01:12 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:32:58.060 06:01:12 compress_isal -- common/autotest_common.sh@899 -- # local i 00:32:58.060 06:01:12 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:32:58.060 06:01:12 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:32:58.060 06:01:12 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:32:58.060 06:01:12 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:32:58.319 [ 00:32:58.319 { 00:32:58.319 "name": "01e3d845-d201-47e1-a53d-42912d9515f0", 00:32:58.319 "aliases": [ 00:32:58.319 "lvs0/lv0" 
00:32:58.319 ], 00:32:58.319 "product_name": "Logical Volume", 00:32:58.319 "block_size": 512, 00:32:58.319 "num_blocks": 204800, 00:32:58.319 "uuid": "01e3d845-d201-47e1-a53d-42912d9515f0", 00:32:58.319 "assigned_rate_limits": { 00:32:58.319 "rw_ios_per_sec": 0, 00:32:58.319 "rw_mbytes_per_sec": 0, 00:32:58.319 "r_mbytes_per_sec": 0, 00:32:58.319 "w_mbytes_per_sec": 0 00:32:58.319 }, 00:32:58.319 "claimed": false, 00:32:58.319 "zoned": false, 00:32:58.319 "supported_io_types": { 00:32:58.319 "read": true, 00:32:58.319 "write": true, 00:32:58.319 "unmap": true, 00:32:58.319 "flush": false, 00:32:58.319 "reset": true, 00:32:58.319 "nvme_admin": false, 00:32:58.319 "nvme_io": false, 00:32:58.319 "nvme_io_md": false, 00:32:58.319 "write_zeroes": true, 00:32:58.319 "zcopy": false, 00:32:58.319 "get_zone_info": false, 00:32:58.319 "zone_management": false, 00:32:58.319 "zone_append": false, 00:32:58.319 "compare": false, 00:32:58.319 "compare_and_write": false, 00:32:58.319 "abort": false, 00:32:58.319 "seek_hole": true, 00:32:58.319 "seek_data": true, 00:32:58.319 "copy": false, 00:32:58.319 "nvme_iov_md": false 00:32:58.319 }, 00:32:58.319 "driver_specific": { 00:32:58.319 "lvol": { 00:32:58.319 "lvol_store_uuid": "f1e6d66b-fe1e-45ea-a7fd-65027f98e71f", 00:32:58.319 "base_bdev": "Nvme0n1", 00:32:58.319 "thin_provision": true, 00:32:58.319 "num_allocated_clusters": 0, 00:32:58.319 "snapshot": false, 00:32:58.319 "clone": false, 00:32:58.319 "esnap_clone": false 00:32:58.319 } 00:32:58.319 } 00:32:58.319 } 00:32:58.319 ] 00:32:58.319 06:01:13 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:32:58.319 06:01:13 compress_isal -- compress/compress.sh@41 -- # '[' -z 4096 ']' 00:32:58.319 06:01:13 compress_isal -- compress/compress.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem -l 4096 00:32:58.577 [2024-07-26 06:01:13.332116] vbdev_compress.c:1034:vbdev_compress_claim: *NOTICE*: 
registered io_device and virtual bdev for: COMP_lvs0/lv0 00:32:58.577 COMP_lvs0/lv0 00:32:58.577 06:01:13 compress_isal -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:32:58.577 06:01:13 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:32:58.577 06:01:13 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:32:58.577 06:01:13 compress_isal -- common/autotest_common.sh@899 -- # local i 00:32:58.577 06:01:13 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:32:58.577 06:01:13 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:32:58.577 06:01:13 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:32:58.835 06:01:13 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:32:59.094 [ 00:32:59.094 { 00:32:59.094 "name": "COMP_lvs0/lv0", 00:32:59.094 "aliases": [ 00:32:59.094 "57405d90-9956-56e2-8ea3-f449c3e702fc" 00:32:59.094 ], 00:32:59.094 "product_name": "compress", 00:32:59.094 "block_size": 4096, 00:32:59.094 "num_blocks": 25088, 00:32:59.094 "uuid": "57405d90-9956-56e2-8ea3-f449c3e702fc", 00:32:59.094 "assigned_rate_limits": { 00:32:59.094 "rw_ios_per_sec": 0, 00:32:59.094 "rw_mbytes_per_sec": 0, 00:32:59.094 "r_mbytes_per_sec": 0, 00:32:59.094 "w_mbytes_per_sec": 0 00:32:59.094 }, 00:32:59.094 "claimed": false, 00:32:59.094 "zoned": false, 00:32:59.094 "supported_io_types": { 00:32:59.094 "read": true, 00:32:59.094 "write": true, 00:32:59.094 "unmap": false, 00:32:59.094 "flush": false, 00:32:59.094 "reset": false, 00:32:59.094 "nvme_admin": false, 00:32:59.094 "nvme_io": false, 00:32:59.094 "nvme_io_md": false, 00:32:59.094 "write_zeroes": true, 00:32:59.094 "zcopy": false, 00:32:59.094 "get_zone_info": false, 00:32:59.094 "zone_management": false, 00:32:59.094 
"zone_append": false, 00:32:59.094 "compare": false, 00:32:59.094 "compare_and_write": false, 00:32:59.094 "abort": false, 00:32:59.094 "seek_hole": false, 00:32:59.094 "seek_data": false, 00:32:59.094 "copy": false, 00:32:59.094 "nvme_iov_md": false 00:32:59.094 }, 00:32:59.094 "driver_specific": { 00:32:59.094 "compress": { 00:32:59.094 "name": "COMP_lvs0/lv0", 00:32:59.094 "base_bdev_name": "01e3d845-d201-47e1-a53d-42912d9515f0", 00:32:59.094 "pm_path": "/tmp/pmem/fec9683b-051b-4b36-b149-c3122961f789" 00:32:59.094 } 00:32:59.094 } 00:32:59.094 } 00:32:59.094 ] 00:32:59.094 06:01:13 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:32:59.094 06:01:13 compress_isal -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:32:59.094 Running I/O for 3 seconds... 00:33:02.379 00:33:02.379 Latency(us) 00:33:02.380 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:33:02.380 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:33:02.380 Verification LBA range: start 0x0 length 0x3100 00:33:02.380 COMP_lvs0/lv0 : 3.00 3964.37 15.49 0.00 0.00 8019.12 666.05 7465.41 00:33:02.380 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:33:02.380 Verification LBA range: start 0x3100 length 0x3100 00:33:02.380 COMP_lvs0/lv0 : 3.00 3969.70 15.51 0.00 0.00 8020.08 502.21 7579.38 00:33:02.380 =================================================================================================================== 00:33:02.380 Total : 7934.08 30.99 0.00 0.00 8019.60 502.21 7579.38 00:33:02.380 0 00:33:02.380 06:01:16 compress_isal -- compress/compress.sh@76 -- # destroy_vols 00:33:02.380 06:01:16 compress_isal -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:33:02.380 06:01:17 compress_isal -- compress/compress.sh@30 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:33:02.638 06:01:17 compress_isal -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:33:02.638 06:01:17 compress_isal -- compress/compress.sh@78 -- # killprocess 1303810 00:33:02.638 06:01:17 compress_isal -- common/autotest_common.sh@948 -- # '[' -z 1303810 ']' 00:33:02.638 06:01:17 compress_isal -- common/autotest_common.sh@952 -- # kill -0 1303810 00:33:02.638 06:01:17 compress_isal -- common/autotest_common.sh@953 -- # uname 00:33:02.638 06:01:17 compress_isal -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:33:02.638 06:01:17 compress_isal -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1303810 00:33:02.638 06:01:17 compress_isal -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:33:02.638 06:01:17 compress_isal -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:33:02.638 06:01:17 compress_isal -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1303810' 00:33:02.638 killing process with pid 1303810 00:33:02.638 06:01:17 compress_isal -- common/autotest_common.sh@967 -- # kill 1303810 00:33:02.638 Received shutdown signal, test time was about 3.000000 seconds 00:33:02.638 00:33:02.639 Latency(us) 00:33:02.639 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:33:02.639 =================================================================================================================== 00:33:02.639 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:33:02.639 06:01:17 compress_isal -- common/autotest_common.sh@972 -- # wait 1303810 00:33:05.926 06:01:20 compress_isal -- compress/compress.sh@89 -- # run_bdevio 00:33:05.926 06:01:20 compress_isal -- compress/compress.sh@50 -- # [[ isal == \c\o\m\p\d\e\v ]] 00:33:05.926 06:01:20 compress_isal -- compress/compress.sh@55 -- # bdevio_pid=1305348 00:33:05.926 06:01:20 compress_isal -- compress/compress.sh@56 -- # trap 
'killprocess $bdevio_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:33:05.926 06:01:20 compress_isal -- compress/compress.sh@53 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -w 00:33:05.926 06:01:20 compress_isal -- compress/compress.sh@57 -- # waitforlisten 1305348 00:33:05.926 06:01:20 compress_isal -- common/autotest_common.sh@829 -- # '[' -z 1305348 ']' 00:33:05.926 06:01:20 compress_isal -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:33:05.926 06:01:20 compress_isal -- common/autotest_common.sh@834 -- # local max_retries=100 00:33:05.926 06:01:20 compress_isal -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:33:05.926 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:33:05.926 06:01:20 compress_isal -- common/autotest_common.sh@838 -- # xtrace_disable 00:33:05.926 06:01:20 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:33:05.926 [2024-07-26 06:01:20.265843] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 
00:33:05.926 [2024-07-26 06:01:20.265911] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1305348 ] 00:33:05.926 [2024-07-26 06:01:20.395717] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:33:05.926 [2024-07-26 06:01:20.495165] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:33:05.926 [2024-07-26 06:01:20.495249] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:33:05.926 [2024-07-26 06:01:20.495255] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:33:06.493 06:01:21 compress_isal -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:33:06.493 06:01:21 compress_isal -- common/autotest_common.sh@862 -- # return 0 00:33:06.493 06:01:21 compress_isal -- compress/compress.sh@58 -- # create_vols 00:33:06.493 06:01:21 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:33:06.493 06:01:21 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:33:07.061 06:01:21 compress_isal -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:33:07.061 06:01:21 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:33:07.061 06:01:21 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:33:07.061 06:01:21 compress_isal -- common/autotest_common.sh@899 -- # local i 00:33:07.061 06:01:21 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:33:07.061 06:01:21 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:33:07.061 06:01:21 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:33:07.319 06:01:22 compress_isal -- 
common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:33:07.578 [ 00:33:07.578 { 00:33:07.578 "name": "Nvme0n1", 00:33:07.578 "aliases": [ 00:33:07.578 "01000000-0000-0000-5cd2-e43197705251" 00:33:07.578 ], 00:33:07.578 "product_name": "NVMe disk", 00:33:07.578 "block_size": 512, 00:33:07.578 "num_blocks": 15002931888, 00:33:07.578 "uuid": "01000000-0000-0000-5cd2-e43197705251", 00:33:07.578 "assigned_rate_limits": { 00:33:07.578 "rw_ios_per_sec": 0, 00:33:07.578 "rw_mbytes_per_sec": 0, 00:33:07.578 "r_mbytes_per_sec": 0, 00:33:07.578 "w_mbytes_per_sec": 0 00:33:07.578 }, 00:33:07.578 "claimed": false, 00:33:07.578 "zoned": false, 00:33:07.578 "supported_io_types": { 00:33:07.578 "read": true, 00:33:07.578 "write": true, 00:33:07.578 "unmap": true, 00:33:07.578 "flush": true, 00:33:07.578 "reset": true, 00:33:07.578 "nvme_admin": true, 00:33:07.578 "nvme_io": true, 00:33:07.578 "nvme_io_md": false, 00:33:07.578 "write_zeroes": true, 00:33:07.578 "zcopy": false, 00:33:07.578 "get_zone_info": false, 00:33:07.578 "zone_management": false, 00:33:07.578 "zone_append": false, 00:33:07.578 "compare": false, 00:33:07.578 "compare_and_write": false, 00:33:07.578 "abort": true, 00:33:07.578 "seek_hole": false, 00:33:07.578 "seek_data": false, 00:33:07.578 "copy": false, 00:33:07.578 "nvme_iov_md": false 00:33:07.578 }, 00:33:07.578 "driver_specific": { 00:33:07.578 "nvme": [ 00:33:07.578 { 00:33:07.578 "pci_address": "0000:5e:00.0", 00:33:07.578 "trid": { 00:33:07.578 "trtype": "PCIe", 00:33:07.578 "traddr": "0000:5e:00.0" 00:33:07.578 }, 00:33:07.578 "ctrlr_data": { 00:33:07.578 "cntlid": 0, 00:33:07.578 "vendor_id": "0x8086", 00:33:07.578 "model_number": "INTEL SSDPF2KX076TZO", 00:33:07.578 "serial_number": "PHAC0301002G7P6CGN", 00:33:07.578 "firmware_revision": "JCV10200", 00:33:07.578 "subnqn": "nqn.2020-07.com.intel:PHAC0301002G7P6CGN ", 00:33:07.578 "oacs": { 00:33:07.578 "security": 1, 
00:33:07.578 "format": 1, 00:33:07.578 "firmware": 1, 00:33:07.578 "ns_manage": 1 00:33:07.578 }, 00:33:07.578 "multi_ctrlr": false, 00:33:07.578 "ana_reporting": false 00:33:07.578 }, 00:33:07.578 "vs": { 00:33:07.578 "nvme_version": "1.3" 00:33:07.578 }, 00:33:07.578 "ns_data": { 00:33:07.578 "id": 1, 00:33:07.578 "can_share": false 00:33:07.578 }, 00:33:07.579 "security": { 00:33:07.579 "opal": true 00:33:07.579 } 00:33:07.579 } 00:33:07.579 ], 00:33:07.579 "mp_policy": "active_passive" 00:33:07.579 } 00:33:07.579 } 00:33:07.579 ] 00:33:07.579 06:01:22 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:33:07.579 06:01:22 compress_isal -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:33:10.110 65feba9d-d694-4fee-8755-db7c5e4d45a7 00:33:10.110 06:01:24 compress_isal -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:33:10.110 1b7ab4ba-8d7c-451d-9023-088edf696978 00:33:10.110 06:01:24 compress_isal -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:33:10.110 06:01:24 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=lvs0/lv0 00:33:10.110 06:01:24 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:33:10.110 06:01:24 compress_isal -- common/autotest_common.sh@899 -- # local i 00:33:10.110 06:01:24 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:33:10.110 06:01:24 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:33:10.110 06:01:24 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:33:10.368 06:01:25 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:33:10.627 [ 00:33:10.627 { 00:33:10.627 
"name": "1b7ab4ba-8d7c-451d-9023-088edf696978", 00:33:10.627 "aliases": [ 00:33:10.627 "lvs0/lv0" 00:33:10.627 ], 00:33:10.627 "product_name": "Logical Volume", 00:33:10.627 "block_size": 512, 00:33:10.627 "num_blocks": 204800, 00:33:10.627 "uuid": "1b7ab4ba-8d7c-451d-9023-088edf696978", 00:33:10.627 "assigned_rate_limits": { 00:33:10.627 "rw_ios_per_sec": 0, 00:33:10.627 "rw_mbytes_per_sec": 0, 00:33:10.627 "r_mbytes_per_sec": 0, 00:33:10.627 "w_mbytes_per_sec": 0 00:33:10.627 }, 00:33:10.627 "claimed": false, 00:33:10.627 "zoned": false, 00:33:10.627 "supported_io_types": { 00:33:10.627 "read": true, 00:33:10.627 "write": true, 00:33:10.627 "unmap": true, 00:33:10.627 "flush": false, 00:33:10.627 "reset": true, 00:33:10.627 "nvme_admin": false, 00:33:10.627 "nvme_io": false, 00:33:10.627 "nvme_io_md": false, 00:33:10.627 "write_zeroes": true, 00:33:10.627 "zcopy": false, 00:33:10.627 "get_zone_info": false, 00:33:10.627 "zone_management": false, 00:33:10.627 "zone_append": false, 00:33:10.627 "compare": false, 00:33:10.627 "compare_and_write": false, 00:33:10.627 "abort": false, 00:33:10.627 "seek_hole": true, 00:33:10.627 "seek_data": true, 00:33:10.627 "copy": false, 00:33:10.627 "nvme_iov_md": false 00:33:10.627 }, 00:33:10.627 "driver_specific": { 00:33:10.627 "lvol": { 00:33:10.627 "lvol_store_uuid": "65feba9d-d694-4fee-8755-db7c5e4d45a7", 00:33:10.627 "base_bdev": "Nvme0n1", 00:33:10.627 "thin_provision": true, 00:33:10.627 "num_allocated_clusters": 0, 00:33:10.627 "snapshot": false, 00:33:10.627 "clone": false, 00:33:10.627 "esnap_clone": false 00:33:10.627 } 00:33:10.627 } 00:33:10.627 } 00:33:10.627 ] 00:33:10.627 06:01:25 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:33:10.627 06:01:25 compress_isal -- compress/compress.sh@41 -- # '[' -z '' ']' 00:33:10.627 06:01:25 compress_isal -- compress/compress.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem 00:33:10.886 
[2024-07-26 06:01:25.700134] vbdev_compress.c:1034:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:33:10.886 COMP_lvs0/lv0 00:33:10.886 06:01:25 compress_isal -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:33:10.886 06:01:25 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:33:10.886 06:01:25 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:33:10.886 06:01:25 compress_isal -- common/autotest_common.sh@899 -- # local i 00:33:10.886 06:01:25 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:33:10.886 06:01:25 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:33:10.886 06:01:25 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:33:11.143 06:01:25 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:33:11.400 [ 00:33:11.400 { 00:33:11.400 "name": "COMP_lvs0/lv0", 00:33:11.400 "aliases": [ 00:33:11.401 "5b355d50-f740-510d-a17e-e74fc3d92d1f" 00:33:11.401 ], 00:33:11.401 "product_name": "compress", 00:33:11.401 "block_size": 512, 00:33:11.401 "num_blocks": 200704, 00:33:11.401 "uuid": "5b355d50-f740-510d-a17e-e74fc3d92d1f", 00:33:11.401 "assigned_rate_limits": { 00:33:11.401 "rw_ios_per_sec": 0, 00:33:11.401 "rw_mbytes_per_sec": 0, 00:33:11.401 "r_mbytes_per_sec": 0, 00:33:11.401 "w_mbytes_per_sec": 0 00:33:11.401 }, 00:33:11.401 "claimed": false, 00:33:11.401 "zoned": false, 00:33:11.401 "supported_io_types": { 00:33:11.401 "read": true, 00:33:11.401 "write": true, 00:33:11.401 "unmap": false, 00:33:11.401 "flush": false, 00:33:11.401 "reset": false, 00:33:11.401 "nvme_admin": false, 00:33:11.401 "nvme_io": false, 00:33:11.401 "nvme_io_md": false, 00:33:11.401 "write_zeroes": true, 00:33:11.401 "zcopy": false, 00:33:11.401 
"get_zone_info": false, 00:33:11.401 "zone_management": false, 00:33:11.401 "zone_append": false, 00:33:11.401 "compare": false, 00:33:11.401 "compare_and_write": false, 00:33:11.401 "abort": false, 00:33:11.401 "seek_hole": false, 00:33:11.401 "seek_data": false, 00:33:11.401 "copy": false, 00:33:11.401 "nvme_iov_md": false 00:33:11.401 }, 00:33:11.401 "driver_specific": { 00:33:11.401 "compress": { 00:33:11.401 "name": "COMP_lvs0/lv0", 00:33:11.401 "base_bdev_name": "1b7ab4ba-8d7c-451d-9023-088edf696978", 00:33:11.401 "pm_path": "/tmp/pmem/3304cfc9-8590-4187-bb38-652fee8f6ba0" 00:33:11.401 } 00:33:11.401 } 00:33:11.401 } 00:33:11.401 ] 00:33:11.401 06:01:26 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:33:11.401 06:01:26 compress_isal -- compress/compress.sh@59 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:33:11.659 I/O targets: 00:33:11.659 COMP_lvs0/lv0: 200704 blocks of 512 bytes (98 MiB) 00:33:11.659 00:33:11.659 00:33:11.659 CUnit - A unit testing framework for C - Version 2.1-3 00:33:11.659 http://cunit.sourceforge.net/ 00:33:11.659 00:33:11.659 00:33:11.659 Suite: bdevio tests on: COMP_lvs0/lv0 00:33:11.659 Test: blockdev write read block ...passed 00:33:11.659 Test: blockdev write zeroes read block ...passed 00:33:11.659 Test: blockdev write zeroes read no split ...passed 00:33:11.659 Test: blockdev write zeroes read split ...passed 00:33:11.659 Test: blockdev write zeroes read split partial ...passed 00:33:11.659 Test: blockdev reset ...[2024-07-26 06:01:26.375031] vbdev_compress.c: 252:vbdev_compress_submit_request: *ERROR*: Unknown I/O type 5 00:33:11.659 passed 00:33:11.659 Test: blockdev write read 8 blocks ...passed 00:33:11.659 Test: blockdev write read size > 128k ...passed 00:33:11.659 Test: blockdev write read invalid size ...passed 00:33:11.659 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:33:11.659 Test: blockdev write read offset + nbytes > size 
of blockdev ...passed 00:33:11.659 Test: blockdev write read max offset ...passed 00:33:11.659 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:33:11.659 Test: blockdev writev readv 8 blocks ...passed 00:33:11.659 Test: blockdev writev readv 30 x 1block ...passed 00:33:11.659 Test: blockdev writev readv block ...passed 00:33:11.659 Test: blockdev writev readv size > 128k ...passed 00:33:11.659 Test: blockdev writev readv size > 128k in two iovs ...passed 00:33:11.659 Test: blockdev comparev and writev ...passed 00:33:11.659 Test: blockdev nvme passthru rw ...passed 00:33:11.659 Test: blockdev nvme passthru vendor specific ...passed 00:33:11.659 Test: blockdev nvme admin passthru ...passed 00:33:11.659 Test: blockdev copy ...passed 00:33:11.659 00:33:11.659 Run Summary: Type Total Ran Passed Failed Inactive 00:33:11.659 suites 1 1 n/a 0 0 00:33:11.659 tests 23 23 23 0 0 00:33:11.659 asserts 130 130 130 0 n/a 00:33:11.659 00:33:11.659 Elapsed time = 0.108 seconds 00:33:11.659 0 00:33:11.659 06:01:26 compress_isal -- compress/compress.sh@60 -- # destroy_vols 00:33:11.659 06:01:26 compress_isal -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:33:11.918 06:01:26 compress_isal -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:33:12.179 06:01:26 compress_isal -- compress/compress.sh@61 -- # trap - SIGINT SIGTERM EXIT 00:33:12.179 06:01:26 compress_isal -- compress/compress.sh@62 -- # killprocess 1305348 00:33:12.179 06:01:26 compress_isal -- common/autotest_common.sh@948 -- # '[' -z 1305348 ']' 00:33:12.179 06:01:26 compress_isal -- common/autotest_common.sh@952 -- # kill -0 1305348 00:33:12.179 06:01:26 compress_isal -- common/autotest_common.sh@953 -- # uname 00:33:12.179 06:01:26 compress_isal -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:33:12.179 06:01:26 
compress_isal -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1305348 00:33:12.179 06:01:26 compress_isal -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:33:12.179 06:01:26 compress_isal -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:33:12.179 06:01:26 compress_isal -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1305348' 00:33:12.179 killing process with pid 1305348 00:33:12.179 06:01:26 compress_isal -- common/autotest_common.sh@967 -- # kill 1305348 00:33:12.179 06:01:26 compress_isal -- common/autotest_common.sh@972 -- # wait 1305348 00:33:15.464 06:01:29 compress_isal -- compress/compress.sh@91 -- # '[' 0 -eq 1 ']' 00:33:15.464 06:01:29 compress_isal -- compress/compress.sh@120 -- # rm -rf /tmp/pmem 00:33:15.464 00:33:15.464 real 0m46.460s 00:33:15.464 user 1m48.711s 00:33:15.464 sys 0m4.268s 00:33:15.464 06:01:29 compress_isal -- common/autotest_common.sh@1124 -- # xtrace_disable 00:33:15.464 06:01:29 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:33:15.464 ************************************ 00:33:15.464 END TEST compress_isal 00:33:15.464 ************************************ 00:33:15.464 06:01:29 -- common/autotest_common.sh@1142 -- # return 0 00:33:15.464 06:01:29 -- spdk/autotest.sh@352 -- # '[' 0 -eq 1 ']' 00:33:15.464 06:01:29 -- spdk/autotest.sh@356 -- # '[' 1 -eq 1 ']' 00:33:15.464 06:01:29 -- spdk/autotest.sh@357 -- # run_test blockdev_crypto_aesni /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_aesni 00:33:15.464 06:01:29 -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:33:15.464 06:01:29 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:33:15.464 06:01:29 -- common/autotest_common.sh@10 -- # set +x 00:33:15.464 ************************************ 00:33:15.464 START TEST blockdev_crypto_aesni 00:33:15.464 ************************************ 00:33:15.464 06:01:29 blockdev_crypto_aesni -- common/autotest_common.sh@1123 
-- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_aesni 00:33:15.464 * Looking for test storage... 00:33:15.464 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:33:15.464 06:01:29 blockdev_crypto_aesni -- bdev/blockdev.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:33:15.464 06:01:29 blockdev_crypto_aesni -- bdev/nbd_common.sh@6 -- # set -e 00:33:15.464 06:01:29 blockdev_crypto_aesni -- bdev/blockdev.sh@12 -- # rpc_py=rpc_cmd 00:33:15.464 06:01:29 blockdev_crypto_aesni -- bdev/blockdev.sh@13 -- # conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:33:15.464 06:01:29 blockdev_crypto_aesni -- bdev/blockdev.sh@14 -- # nonenclosed_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json 00:33:15.464 06:01:29 blockdev_crypto_aesni -- bdev/blockdev.sh@15 -- # nonarray_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json 00:33:15.464 06:01:29 blockdev_crypto_aesni -- bdev/blockdev.sh@17 -- # export RPC_PIPE_TIMEOUT=30 00:33:15.464 06:01:29 blockdev_crypto_aesni -- bdev/blockdev.sh@17 -- # RPC_PIPE_TIMEOUT=30 00:33:15.464 06:01:29 blockdev_crypto_aesni -- bdev/blockdev.sh@20 -- # : 00:33:15.464 06:01:29 blockdev_crypto_aesni -- bdev/blockdev.sh@669 -- # QOS_DEV_1=Malloc_0 00:33:15.464 06:01:29 blockdev_crypto_aesni -- bdev/blockdev.sh@670 -- # QOS_DEV_2=Null_1 00:33:15.464 06:01:29 blockdev_crypto_aesni -- bdev/blockdev.sh@671 -- # QOS_RUN_TIME=5 00:33:15.464 06:01:29 blockdev_crypto_aesni -- bdev/blockdev.sh@673 -- # uname -s 00:33:15.464 06:01:29 blockdev_crypto_aesni -- bdev/blockdev.sh@673 -- # '[' Linux = Linux ']' 00:33:15.464 06:01:29 blockdev_crypto_aesni -- bdev/blockdev.sh@675 -- # PRE_RESERVED_MEM=0 00:33:15.464 06:01:29 blockdev_crypto_aesni -- bdev/blockdev.sh@681 -- # test_type=crypto_aesni 00:33:15.464 06:01:29 blockdev_crypto_aesni -- bdev/blockdev.sh@682 -- # 
crypto_device= 00:33:15.464 06:01:29 blockdev_crypto_aesni -- bdev/blockdev.sh@683 -- # dek= 00:33:15.464 06:01:29 blockdev_crypto_aesni -- bdev/blockdev.sh@684 -- # env_ctx= 00:33:15.464 06:01:29 blockdev_crypto_aesni -- bdev/blockdev.sh@685 -- # wait_for_rpc= 00:33:15.464 06:01:29 blockdev_crypto_aesni -- bdev/blockdev.sh@686 -- # '[' -n '' ']' 00:33:15.464 06:01:29 blockdev_crypto_aesni -- bdev/blockdev.sh@689 -- # [[ crypto_aesni == bdev ]] 00:33:15.464 06:01:29 blockdev_crypto_aesni -- bdev/blockdev.sh@689 -- # [[ crypto_aesni == crypto_* ]] 00:33:15.464 06:01:29 blockdev_crypto_aesni -- bdev/blockdev.sh@690 -- # wait_for_rpc=--wait-for-rpc 00:33:15.464 06:01:29 blockdev_crypto_aesni -- bdev/blockdev.sh@692 -- # start_spdk_tgt 00:33:15.464 06:01:29 blockdev_crypto_aesni -- bdev/blockdev.sh@47 -- # spdk_tgt_pid=1306605 00:33:15.464 06:01:29 blockdev_crypto_aesni -- bdev/blockdev.sh@48 -- # trap 'killprocess "$spdk_tgt_pid"; exit 1' SIGINT SIGTERM EXIT 00:33:15.464 06:01:29 blockdev_crypto_aesni -- bdev/blockdev.sh@46 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt '' --wait-for-rpc 00:33:15.464 06:01:29 blockdev_crypto_aesni -- bdev/blockdev.sh@49 -- # waitforlisten 1306605 00:33:15.464 06:01:29 blockdev_crypto_aesni -- common/autotest_common.sh@829 -- # '[' -z 1306605 ']' 00:33:15.464 06:01:29 blockdev_crypto_aesni -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:33:15.464 06:01:29 blockdev_crypto_aesni -- common/autotest_common.sh@834 -- # local max_retries=100 00:33:15.464 06:01:29 blockdev_crypto_aesni -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:33:15.464 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
00:33:15.464 06:01:29 blockdev_crypto_aesni -- common/autotest_common.sh@838 -- # xtrace_disable 00:33:15.464 06:01:29 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:33:15.464 [2024-07-26 06:01:29.917445] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 00:33:15.464 [2024-07-26 06:01:29.917520] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1306605 ] 00:33:15.464 [2024-07-26 06:01:30.046292] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:33:15.464 [2024-07-26 06:01:30.151029] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:33:16.030 06:01:30 blockdev_crypto_aesni -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:33:16.030 06:01:30 blockdev_crypto_aesni -- common/autotest_common.sh@862 -- # return 0 00:33:16.030 06:01:30 blockdev_crypto_aesni -- bdev/blockdev.sh@693 -- # case "$test_type" in 00:33:16.030 06:01:30 blockdev_crypto_aesni -- bdev/blockdev.sh@704 -- # setup_crypto_aesni_conf 00:33:16.030 06:01:30 blockdev_crypto_aesni -- bdev/blockdev.sh@145 -- # rpc_cmd 00:33:16.030 06:01:30 blockdev_crypto_aesni -- common/autotest_common.sh@559 -- # xtrace_disable 00:33:16.030 06:01:30 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:33:16.030 [2024-07-26 06:01:30.865312] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:33:16.030 [2024-07-26 06:01:30.873346] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:33:16.030 [2024-07-26 06:01:30.881363] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:33:16.289 [2024-07-26 06:01:30.944779] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: 
*NOTICE*: Found crypto devices: 97 00:33:18.852 true 00:33:18.852 true 00:33:18.852 true 00:33:18.852 true 00:33:18.852 Malloc0 00:33:18.852 Malloc1 00:33:18.852 Malloc2 00:33:18.852 Malloc3 00:33:18.852 [2024-07-26 06:01:33.319710] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:33:18.852 crypto_ram 00:33:18.852 [2024-07-26 06:01:33.327744] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:33:18.852 crypto_ram2 00:33:18.852 [2024-07-26 06:01:33.335751] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:33:18.852 crypto_ram3 00:33:18.852 [2024-07-26 06:01:33.343775] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:33:18.852 crypto_ram4 00:33:18.852 06:01:33 blockdev_crypto_aesni -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:33:18.852 06:01:33 blockdev_crypto_aesni -- bdev/blockdev.sh@736 -- # rpc_cmd bdev_wait_for_examine 00:33:18.852 06:01:33 blockdev_crypto_aesni -- common/autotest_common.sh@559 -- # xtrace_disable 00:33:18.852 06:01:33 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:33:18.852 06:01:33 blockdev_crypto_aesni -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:33:18.852 06:01:33 blockdev_crypto_aesni -- bdev/blockdev.sh@739 -- # cat 00:33:18.852 06:01:33 blockdev_crypto_aesni -- bdev/blockdev.sh@739 -- # rpc_cmd save_subsystem_config -n accel 00:33:18.852 06:01:33 blockdev_crypto_aesni -- common/autotest_common.sh@559 -- # xtrace_disable 00:33:18.852 06:01:33 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:33:18.852 06:01:33 blockdev_crypto_aesni -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:33:18.852 06:01:33 blockdev_crypto_aesni -- bdev/blockdev.sh@739 -- # rpc_cmd save_subsystem_config -n bdev 00:33:18.852 06:01:33 blockdev_crypto_aesni -- common/autotest_common.sh@559 -- # xtrace_disable 00:33:18.852 06:01:33 
blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:33:18.852 06:01:33 blockdev_crypto_aesni -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:33:18.852 06:01:33 blockdev_crypto_aesni -- bdev/blockdev.sh@739 -- # rpc_cmd save_subsystem_config -n iobuf 00:33:18.852 06:01:33 blockdev_crypto_aesni -- common/autotest_common.sh@559 -- # xtrace_disable 00:33:18.852 06:01:33 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:33:18.852 06:01:33 blockdev_crypto_aesni -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:33:18.852 06:01:33 blockdev_crypto_aesni -- bdev/blockdev.sh@747 -- # mapfile -t bdevs 00:33:18.852 06:01:33 blockdev_crypto_aesni -- bdev/blockdev.sh@747 -- # rpc_cmd bdev_get_bdevs 00:33:18.852 06:01:33 blockdev_crypto_aesni -- common/autotest_common.sh@559 -- # xtrace_disable 00:33:18.852 06:01:33 blockdev_crypto_aesni -- bdev/blockdev.sh@747 -- # jq -r '.[] | select(.claimed == false)' 00:33:18.852 06:01:33 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:33:18.852 06:01:33 blockdev_crypto_aesni -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:33:18.852 06:01:33 blockdev_crypto_aesni -- bdev/blockdev.sh@748 -- # mapfile -t bdevs_name 00:33:18.852 06:01:33 blockdev_crypto_aesni -- bdev/blockdev.sh@748 -- # jq -r .name 00:33:18.853 06:01:33 blockdev_crypto_aesni -- bdev/blockdev.sh@748 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "ab4f4e5c-6a8a-5db9-806a-cc7e61433b3f"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "ab4f4e5c-6a8a-5db9-806a-cc7e61433b3f",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": 
true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_aesni_cbc_1"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "9ba5e599-eb35-5e52-86e9-d17d684e0aff"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "9ba5e599-eb35-5e52-86e9-d17d684e0aff",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_aesni_cbc_2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "585a4c3f-aca0-570e-ae8a-49c47aa1b6cb"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "585a4c3f-aca0-570e-ae8a-49c47aa1b6cb",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' 
"w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_aesni_cbc_3"' ' }' ' }' '}' '{' ' "name": "crypto_ram4",' ' "aliases": [' ' "d1645500-2f0d-58fb-b05b-031748e9450d"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "d1645500-2f0d-58fb-b05b-031748e9450d",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram4",' ' "key_name": "test_dek_aesni_cbc_4"' ' }' ' }' '}' 00:33:18.853 06:01:33 blockdev_crypto_aesni -- bdev/blockdev.sh@749 
-- # bdev_list=("${bdevs_name[@]}") 00:33:18.853 06:01:33 blockdev_crypto_aesni -- bdev/blockdev.sh@751 -- # hello_world_bdev=crypto_ram 00:33:18.853 06:01:33 blockdev_crypto_aesni -- bdev/blockdev.sh@752 -- # trap - SIGINT SIGTERM EXIT 00:33:18.853 06:01:33 blockdev_crypto_aesni -- bdev/blockdev.sh@753 -- # killprocess 1306605 00:33:18.853 06:01:33 blockdev_crypto_aesni -- common/autotest_common.sh@948 -- # '[' -z 1306605 ']' 00:33:18.853 06:01:33 blockdev_crypto_aesni -- common/autotest_common.sh@952 -- # kill -0 1306605 00:33:18.853 06:01:33 blockdev_crypto_aesni -- common/autotest_common.sh@953 -- # uname 00:33:18.853 06:01:33 blockdev_crypto_aesni -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:33:18.853 06:01:33 blockdev_crypto_aesni -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1306605 00:33:18.853 06:01:33 blockdev_crypto_aesni -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:33:18.853 06:01:33 blockdev_crypto_aesni -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:33:18.853 06:01:33 blockdev_crypto_aesni -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1306605' 00:33:18.853 killing process with pid 1306605 00:33:18.853 06:01:33 blockdev_crypto_aesni -- common/autotest_common.sh@967 -- # kill 1306605 00:33:18.853 06:01:33 blockdev_crypto_aesni -- common/autotest_common.sh@972 -- # wait 1306605 00:33:19.420 06:01:34 blockdev_crypto_aesni -- bdev/blockdev.sh@757 -- # trap cleanup SIGINT SIGTERM EXIT 00:33:19.420 06:01:34 blockdev_crypto_aesni -- bdev/blockdev.sh@759 -- # run_test bdev_hello_world /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:33:19.420 06:01:34 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:33:19.420 06:01:34 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable 00:33:19.420 06:01:34 
blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:33:19.420 ************************************ 00:33:19.420 START TEST bdev_hello_world 00:33:19.420 ************************************ 00:33:19.420 06:01:34 blockdev_crypto_aesni.bdev_hello_world -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:33:19.420 [2024-07-26 06:01:34.312430] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 00:33:19.420 [2024-07-26 06:01:34.312482] [ DPDK EAL parameters: hello_bdev --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1307253 ] 00:33:19.678 [2024-07-26 06:01:34.421866] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:33:19.678 [2024-07-26 06:01:34.529624] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:33:19.678 [2024-07-26 06:01:34.550975] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:33:19.678 [2024-07-26 06:01:34.559002] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:33:19.678 [2024-07-26 06:01:34.567020] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:33:19.935 [2024-07-26 06:01:34.671770] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:33:22.467 [2024-07-26 06:01:36.908580] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:33:22.467 [2024-07-26 06:01:36.908656] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:33:22.467 [2024-07-26 06:01:36.908672] vbdev_crypto.c: 
617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:22.467 [2024-07-26 06:01:36.916602] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:33:22.467 [2024-07-26 06:01:36.916621] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:33:22.467 [2024-07-26 06:01:36.916633] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:22.467 [2024-07-26 06:01:36.924622] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:33:22.467 [2024-07-26 06:01:36.924645] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:33:22.467 [2024-07-26 06:01:36.924657] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:22.467 [2024-07-26 06:01:36.932648] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:33:22.467 [2024-07-26 06:01:36.932665] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:33:22.467 [2024-07-26 06:01:36.932677] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:22.467 [2024-07-26 06:01:37.005515] hello_bdev.c: 222:hello_start: *NOTICE*: Successfully started the application 00:33:22.467 [2024-07-26 06:01:37.005564] hello_bdev.c: 231:hello_start: *NOTICE*: Opening the bdev crypto_ram 00:33:22.467 [2024-07-26 06:01:37.005583] hello_bdev.c: 244:hello_start: *NOTICE*: Opening io channel 00:33:22.467 [2024-07-26 06:01:37.006861] hello_bdev.c: 138:hello_write: *NOTICE*: Writing to the bdev 00:33:22.467 [2024-07-26 06:01:37.006932] hello_bdev.c: 117:write_complete: *NOTICE*: bdev io write completed successfully 00:33:22.467 [2024-07-26 06:01:37.006948] hello_bdev.c: 84:hello_read: *NOTICE*: Reading io 00:33:22.467 [2024-07-26 06:01:37.006993] hello_bdev.c: 
65:read_complete: *NOTICE*: Read string from bdev : Hello World! 00:33:22.467 00:33:22.467 [2024-07-26 06:01:37.007012] hello_bdev.c: 74:read_complete: *NOTICE*: Stopping app 00:33:22.729 00:33:22.729 real 0m3.128s 00:33:22.729 user 0m2.731s 00:33:22.729 sys 0m0.364s 00:33:22.729 06:01:37 blockdev_crypto_aesni.bdev_hello_world -- common/autotest_common.sh@1124 -- # xtrace_disable 00:33:22.729 06:01:37 blockdev_crypto_aesni.bdev_hello_world -- common/autotest_common.sh@10 -- # set +x 00:33:22.729 ************************************ 00:33:22.729 END TEST bdev_hello_world 00:33:22.729 ************************************ 00:33:22.729 06:01:37 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 0 00:33:22.729 06:01:37 blockdev_crypto_aesni -- bdev/blockdev.sh@760 -- # run_test bdev_bounds bdev_bounds '' 00:33:22.729 06:01:37 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:33:22.729 06:01:37 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable 00:33:22.729 06:01:37 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:33:22.729 ************************************ 00:33:22.729 START TEST bdev_bounds 00:33:22.729 ************************************ 00:33:22.729 06:01:37 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@1123 -- # bdev_bounds '' 00:33:22.729 06:01:37 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@289 -- # bdevio_pid=1307622 00:33:22.729 06:01:37 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@290 -- # trap 'cleanup; killprocess $bdevio_pid; exit 1' SIGINT SIGTERM EXIT 00:33:22.729 06:01:37 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@291 -- # echo 'Process bdevio pid: 1307622' 00:33:22.729 Process bdevio pid: 1307622 00:33:22.729 06:01:37 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@292 -- # waitforlisten 1307622 00:33:22.729 06:01:37 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@829 -- # '[' -z 1307622 ']' 
00:33:22.729 06:01:37 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:33:22.729 06:01:37 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@834 -- # local max_retries=100 00:33:22.729 06:01:37 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:33:22.729 06:01:37 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@288 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -w -s 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:33:22.729 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:33:22.729 06:01:37 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@838 -- # xtrace_disable 00:33:22.729 06:01:37 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:33:22.729 [2024-07-26 06:01:37.563724] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 
00:33:22.729 [2024-07-26 06:01:37.563861] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1307622 ] 00:33:22.988 [2024-07-26 06:01:37.760210] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:33:22.988 [2024-07-26 06:01:37.860204] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:33:22.988 [2024-07-26 06:01:37.860289] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:33:22.988 [2024-07-26 06:01:37.860295] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:33:22.988 [2024-07-26 06:01:37.881676] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:33:22.988 [2024-07-26 06:01:37.889700] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:33:23.247 [2024-07-26 06:01:37.897720] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:33:23.247 [2024-07-26 06:01:37.998433] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:33:25.783 [2024-07-26 06:01:40.212697] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:33:25.783 [2024-07-26 06:01:40.212775] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:33:25.783 [2024-07-26 06:01:40.212791] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:25.783 [2024-07-26 06:01:40.220712] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:33:25.783 [2024-07-26 06:01:40.220731] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:33:25.783 [2024-07-26 
06:01:40.220742] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:25.783 [2024-07-26 06:01:40.228734] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:33:25.783 [2024-07-26 06:01:40.228751] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:33:25.784 [2024-07-26 06:01:40.228762] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:25.784 [2024-07-26 06:01:40.236758] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:33:25.784 [2024-07-26 06:01:40.236774] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:33:25.784 [2024-07-26 06:01:40.236786] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:25.784 06:01:40 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:33:25.784 06:01:40 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@862 -- # return 0 00:33:25.784 06:01:40 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:33:25.784 I/O targets: 00:33:25.784 crypto_ram: 65536 blocks of 512 bytes (32 MiB) 00:33:25.784 crypto_ram2: 65536 blocks of 512 bytes (32 MiB) 00:33:25.784 crypto_ram3: 8192 blocks of 4096 bytes (32 MiB) 00:33:25.784 crypto_ram4: 8192 blocks of 4096 bytes (32 MiB) 00:33:25.784 00:33:25.784 00:33:25.784 CUnit - A unit testing framework for C - Version 2.1-3 00:33:25.784 http://cunit.sourceforge.net/ 00:33:25.784 00:33:25.784 00:33:25.784 Suite: bdevio tests on: crypto_ram4 00:33:25.784 Test: blockdev write read block ...passed 00:33:25.784 Test: blockdev write zeroes read block ...passed 00:33:25.784 Test: blockdev write zeroes read no split ...passed 00:33:25.784 Test: blockdev 
write zeroes read split ...passed 00:33:25.784 Test: blockdev write zeroes read split partial ...passed 00:33:25.784 Test: blockdev reset ...passed 00:33:25.784 Test: blockdev write read 8 blocks ...passed 00:33:25.784 Test: blockdev write read size > 128k ...passed 00:33:25.784 Test: blockdev write read invalid size ...passed 00:33:25.784 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:33:25.784 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:33:25.784 Test: blockdev write read max offset ...passed 00:33:25.784 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:33:25.784 Test: blockdev writev readv 8 blocks ...passed 00:33:25.784 Test: blockdev writev readv 30 x 1block ...passed 00:33:25.784 Test: blockdev writev readv block ...passed 00:33:25.784 Test: blockdev writev readv size > 128k ...passed 00:33:25.784 Test: blockdev writev readv size > 128k in two iovs ...passed 00:33:25.784 Test: blockdev comparev and writev ...passed 00:33:25.784 Test: blockdev nvme passthru rw ...passed 00:33:25.784 Test: blockdev nvme passthru vendor specific ...passed 00:33:25.784 Test: blockdev nvme admin passthru ...passed 00:33:25.784 Test: blockdev copy ...passed 00:33:25.784 Suite: bdevio tests on: crypto_ram3 00:33:25.784 Test: blockdev write read block ...passed 00:33:25.784 Test: blockdev write zeroes read block ...passed 00:33:25.784 Test: blockdev write zeroes read no split ...passed 00:33:25.784 Test: blockdev write zeroes read split ...passed 00:33:25.784 Test: blockdev write zeroes read split partial ...passed 00:33:25.784 Test: blockdev reset ...passed 00:33:25.784 Test: blockdev write read 8 blocks ...passed 00:33:25.784 Test: blockdev write read size > 128k ...passed 00:33:25.784 Test: blockdev write read invalid size ...passed 00:33:25.784 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:33:25.784 Test: blockdev write read offset + nbytes > size of blockdev 
...passed 00:33:25.784 Test: blockdev write read max offset ...passed 00:33:25.784 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:33:25.784 Test: blockdev writev readv 8 blocks ...passed 00:33:25.784 Test: blockdev writev readv 30 x 1block ...passed 00:33:25.784 Test: blockdev writev readv block ...passed 00:33:25.784 Test: blockdev writev readv size > 128k ...passed 00:33:25.784 Test: blockdev writev readv size > 128k in two iovs ...passed 00:33:25.784 Test: blockdev comparev and writev ...passed 00:33:25.784 Test: blockdev nvme passthru rw ...passed 00:33:25.784 Test: blockdev nvme passthru vendor specific ...passed 00:33:25.784 Test: blockdev nvme admin passthru ...passed 00:33:25.784 Test: blockdev copy ...passed 00:33:25.784 Suite: bdevio tests on: crypto_ram2 00:33:25.784 Test: blockdev write read block ...passed 00:33:25.784 Test: blockdev write zeroes read block ...passed 00:33:25.784 Test: blockdev write zeroes read no split ...passed 00:33:25.784 Test: blockdev write zeroes read split ...passed 00:33:25.784 Test: blockdev write zeroes read split partial ...passed 00:33:25.784 Test: blockdev reset ...passed 00:33:25.784 Test: blockdev write read 8 blocks ...passed 00:33:25.784 Test: blockdev write read size > 128k ...passed 00:33:25.784 Test: blockdev write read invalid size ...passed 00:33:25.784 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:33:25.784 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:33:25.784 Test: blockdev write read max offset ...passed 00:33:25.784 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:33:25.784 Test: blockdev writev readv 8 blocks ...passed 00:33:25.784 Test: blockdev writev readv 30 x 1block ...passed 00:33:25.784 Test: blockdev writev readv block ...passed 00:33:25.784 Test: blockdev writev readv size > 128k ...passed 00:33:25.784 Test: blockdev writev readv size > 128k in two iovs ...passed 00:33:25.784 Test: 
blockdev comparev and writev ...passed 00:33:25.784 Test: blockdev nvme passthru rw ...passed 00:33:25.784 Test: blockdev nvme passthru vendor specific ...passed 00:33:25.784 Test: blockdev nvme admin passthru ...passed 00:33:25.784 Test: blockdev copy ...passed 00:33:25.784 Suite: bdevio tests on: crypto_ram 00:33:25.784 Test: blockdev write read block ...passed 00:33:25.784 Test: blockdev write zeroes read block ...passed 00:33:25.784 Test: blockdev write zeroes read no split ...passed 00:33:25.784 Test: blockdev write zeroes read split ...passed 00:33:25.784 Test: blockdev write zeroes read split partial ...passed 00:33:25.784 Test: blockdev reset ...passed 00:33:25.784 Test: blockdev write read 8 blocks ...passed 00:33:25.784 Test: blockdev write read size > 128k ...passed 00:33:25.784 Test: blockdev write read invalid size ...passed 00:33:25.784 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:33:25.784 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:33:25.784 Test: blockdev write read max offset ...passed 00:33:25.784 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:33:25.784 Test: blockdev writev readv 8 blocks ...passed 00:33:25.784 Test: blockdev writev readv 30 x 1block ...passed 00:33:25.784 Test: blockdev writev readv block ...passed 00:33:25.784 Test: blockdev writev readv size > 128k ...passed 00:33:25.784 Test: blockdev writev readv size > 128k in two iovs ...passed 00:33:25.784 Test: blockdev comparev and writev ...passed 00:33:25.784 Test: blockdev nvme passthru rw ...passed 00:33:25.784 Test: blockdev nvme passthru vendor specific ...passed 00:33:25.784 Test: blockdev nvme admin passthru ...passed 00:33:25.784 Test: blockdev copy ...passed 00:33:25.784 00:33:25.784 Run Summary: Type Total Ran Passed Failed Inactive 00:33:25.784 suites 4 4 n/a 0 0 00:33:25.784 tests 92 92 92 0 0 00:33:25.784 asserts 520 520 520 0 n/a 00:33:25.784 00:33:25.784 Elapsed time = 0.546 
seconds 00:33:25.784 0 00:33:26.044 06:01:40 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@294 -- # killprocess 1307622 00:33:26.044 06:01:40 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@948 -- # '[' -z 1307622 ']' 00:33:26.044 06:01:40 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@952 -- # kill -0 1307622 00:33:26.044 06:01:40 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@953 -- # uname 00:33:26.044 06:01:40 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:33:26.044 06:01:40 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1307622 00:33:26.044 06:01:40 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:33:26.044 06:01:40 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:33:26.044 06:01:40 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1307622' 00:33:26.044 killing process with pid 1307622 00:33:26.044 06:01:40 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@967 -- # kill 1307622 00:33:26.044 06:01:40 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@972 -- # wait 1307622 00:33:26.303 06:01:41 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@295 -- # trap - SIGINT SIGTERM EXIT 00:33:26.303 00:33:26.303 real 0m3.734s 00:33:26.303 user 0m10.011s 00:33:26.303 sys 0m0.612s 00:33:26.303 06:01:41 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@1124 -- # xtrace_disable 00:33:26.303 06:01:41 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:33:26.303 ************************************ 00:33:26.303 END TEST bdev_bounds 00:33:26.303 ************************************ 00:33:26.562 06:01:41 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 0 00:33:26.562 06:01:41 
blockdev_crypto_aesni -- bdev/blockdev.sh@761 -- # run_test bdev_nbd nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' '' 00:33:26.562 06:01:41 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:33:26.562 06:01:41 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable 00:33:26.562 06:01:41 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:33:26.562 ************************************ 00:33:26.562 START TEST bdev_nbd 00:33:26.562 ************************************ 00:33:26.562 06:01:41 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@1123 -- # nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' '' 00:33:26.562 06:01:41 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@299 -- # uname -s 00:33:26.562 06:01:41 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@299 -- # [[ Linux == Linux ]] 00:33:26.562 06:01:41 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@301 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:33:26.562 06:01:41 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@302 -- # local conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:33:26.562 06:01:41 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@303 -- # bdev_all=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:33:26.562 06:01:41 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@303 -- # local bdev_all 00:33:26.562 06:01:41 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@304 -- # local bdev_num=4 00:33:26.562 06:01:41 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@308 -- # [[ -e /sys/module/nbd ]] 00:33:26.562 06:01:41 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@310 -- # nbd_all=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' 
'/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:33:26.562 06:01:41 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@310 -- # local nbd_all 00:33:26.562 06:01:41 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@311 -- # bdev_num=4 00:33:26.562 06:01:41 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@313 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:33:26.562 06:01:41 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@313 -- # local nbd_list 00:33:26.562 06:01:41 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@314 -- # bdev_list=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:33:26.562 06:01:41 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@314 -- # local bdev_list 00:33:26.563 06:01:41 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@317 -- # nbd_pid=1308170 00:33:26.563 06:01:41 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@316 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-nbd.sock -i 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:33:26.563 06:01:41 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@318 -- # trap 'cleanup; killprocess $nbd_pid' SIGINT SIGTERM EXIT 00:33:26.563 06:01:41 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@319 -- # waitforlisten 1308170 /var/tmp/spdk-nbd.sock 00:33:26.563 06:01:41 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@829 -- # '[' -z 1308170 ']' 00:33:26.563 06:01:41 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:33:26.563 06:01:41 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@834 -- # local max_retries=100 00:33:26.563 06:01:41 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 
00:33:26.563 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:33:26.563 06:01:41 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@838 -- # xtrace_disable 00:33:26.563 06:01:41 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:33:26.563 [2024-07-26 06:01:41.313525] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 00:33:26.563 [2024-07-26 06:01:41.313568] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:33:26.563 [2024-07-26 06:01:41.428152] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:33:26.822 [2024-07-26 06:01:41.536128] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:33:26.822 [2024-07-26 06:01:41.557454] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:33:26.822 [2024-07-26 06:01:41.565474] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:33:26.822 [2024-07-26 06:01:41.573493] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:33:26.822 [2024-07-26 06:01:41.677889] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:33:29.355 [2024-07-26 06:01:43.906038] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:33:29.355 [2024-07-26 06:01:43.906095] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:33:29.355 [2024-07-26 06:01:43.906110] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:29.355 [2024-07-26 06:01:43.914059] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: 
Found key "test_dek_aesni_cbc_2" 00:33:29.355 [2024-07-26 06:01:43.914078] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:33:29.355 [2024-07-26 06:01:43.914090] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:29.355 [2024-07-26 06:01:43.922078] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:33:29.355 [2024-07-26 06:01:43.922096] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:33:29.355 [2024-07-26 06:01:43.922107] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:29.355 [2024-07-26 06:01:43.930099] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:33:29.355 [2024-07-26 06:01:43.930118] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:33:29.355 [2024-07-26 06:01:43.930129] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:29.648 06:01:44 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:33:29.648 06:01:44 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@862 -- # return 0 00:33:29.648 06:01:44 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@321 -- # nbd_rpc_start_stop_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' 00:33:29.648 06:01:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@113 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:33:29.649 06:01:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@114 -- # bdev_list=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:33:29.649 06:01:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@114 -- # local bdev_list 00:33:29.649 06:01:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@116 -- # nbd_start_disks_without_nbd_idx 
/var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' 00:33:29.649 06:01:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@22 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:33:29.649 06:01:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@23 -- # bdev_list=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:33:29.649 06:01:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@23 -- # local bdev_list 00:33:29.649 06:01:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@24 -- # local i 00:33:29.649 06:01:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@25 -- # local nbd_device 00:33:29.649 06:01:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i = 0 )) 00:33:29.649 06:01:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:33:29.649 06:01:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram 00:33:29.649 06:01:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd0 00:33:29.649 06:01:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd0 00:33:29.649 06:01:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd0 00:33:29.649 06:01:44 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:33:29.649 06:01:44 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:33:29.649 06:01:44 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:33:29.649 06:01:44 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:33:29.649 06:01:44 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:33:29.649 06:01:44 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:33:29.649 06:01:44 
blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:33:29.649 06:01:44 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:33:29.649 06:01:44 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:33:29.649 1+0 records in 00:33:29.649 1+0 records out 00:33:29.649 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000292871 s, 14.0 MB/s 00:33:29.649 06:01:44 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:29.649 06:01:44 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:33:29.649 06:01:44 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:29.649 06:01:44 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:33:29.649 06:01:44 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:33:29.649 06:01:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:33:29.649 06:01:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:33:29.908 06:01:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram2 00:33:29.908 06:01:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd1 00:33:29.908 06:01:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd1 00:33:29.908 06:01:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd1 00:33:29.908 06:01:44 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:33:29.908 06:01:44 
blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:33:29.908 06:01:44 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:33:29.908 06:01:44 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:33:29.908 06:01:44 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:33:29.908 06:01:44 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:33:29.908 06:01:44 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:33:29.908 06:01:44 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:33:29.908 06:01:44 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:33:29.908 1+0 records in 00:33:29.908 1+0 records out 00:33:29.908 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000312892 s, 13.1 MB/s 00:33:29.908 06:01:44 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:29.908 06:01:44 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:33:29.908 06:01:44 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:29.908 06:01:44 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:33:29.908 06:01:44 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:33:29.908 06:01:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:33:29.908 06:01:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:33:29.908 06:01:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 00:33:30.167 06:01:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd2 00:33:30.167 06:01:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd2 00:33:30.167 06:01:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd2 00:33:30.167 06:01:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd2 00:33:30.167 06:01:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:33:30.167 06:01:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:33:30.167 06:01:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:33:30.167 06:01:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd2 /proc/partitions 00:33:30.167 06:01:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:33:30.167 06:01:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:33:30.167 06:01:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:33:30.167 06:01:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd2 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:33:30.167 1+0 records in 00:33:30.167 1+0 records out 00:33:30.167 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000306015 s, 13.4 MB/s 00:33:30.167 06:01:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:30.167 06:01:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:33:30.167 06:01:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:30.167 06:01:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:33:30.167 06:01:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:33:30.167 06:01:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:33:30.167 06:01:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:33:30.167 06:01:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram4 00:33:30.427 06:01:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd3 00:33:30.427 06:01:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd3 00:33:30.427 06:01:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd3 00:33:30.427 06:01:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd3 00:33:30.427 06:01:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:33:30.427 06:01:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:33:30.427 06:01:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:33:30.427 06:01:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd3 /proc/partitions 00:33:30.427 06:01:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:33:30.427 06:01:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:33:30.427 06:01:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:33:30.427 06:01:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd3 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 
count=1 iflag=direct 00:33:30.427 1+0 records in 00:33:30.427 1+0 records out 00:33:30.427 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000321261 s, 12.7 MB/s 00:33:30.427 06:01:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:30.427 06:01:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:33:30.427 06:01:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:30.427 06:01:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:33:30.427 06:01:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:33:30.427 06:01:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:33:30.427 06:01:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:33:30.427 06:01:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@118 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:33:30.686 06:01:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@118 -- # nbd_disks_json='[ 00:33:30.686 { 00:33:30.686 "nbd_device": "/dev/nbd0", 00:33:30.686 "bdev_name": "crypto_ram" 00:33:30.686 }, 00:33:30.686 { 00:33:30.686 "nbd_device": "/dev/nbd1", 00:33:30.686 "bdev_name": "crypto_ram2" 00:33:30.686 }, 00:33:30.686 { 00:33:30.686 "nbd_device": "/dev/nbd2", 00:33:30.686 "bdev_name": "crypto_ram3" 00:33:30.686 }, 00:33:30.686 { 00:33:30.686 "nbd_device": "/dev/nbd3", 00:33:30.686 "bdev_name": "crypto_ram4" 00:33:30.686 } 00:33:30.686 ]' 00:33:30.686 06:01:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@119 -- # nbd_disks_name=($(echo "${nbd_disks_json}" | jq -r '.[] | .nbd_device')) 00:33:30.686 06:01:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@119 -- # echo '[ 00:33:30.686 { 
00:33:30.686 "nbd_device": "/dev/nbd0", 00:33:30.686 "bdev_name": "crypto_ram" 00:33:30.686 }, 00:33:30.686 { 00:33:30.686 "nbd_device": "/dev/nbd1", 00:33:30.686 "bdev_name": "crypto_ram2" 00:33:30.686 }, 00:33:30.686 { 00:33:30.686 "nbd_device": "/dev/nbd2", 00:33:30.686 "bdev_name": "crypto_ram3" 00:33:30.686 }, 00:33:30.686 { 00:33:30.686 "nbd_device": "/dev/nbd3", 00:33:30.686 "bdev_name": "crypto_ram4" 00:33:30.686 } 00:33:30.686 ]' 00:33:30.686 06:01:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@119 -- # jq -r '.[] | .nbd_device' 00:33:30.944 06:01:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@120 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd2 /dev/nbd3' 00:33:30.944 06:01:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:33:30.944 06:01:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd2' '/dev/nbd3') 00:33:30.944 06:01:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:33:30.944 06:01:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:33:30.944 06:01:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:33:30.944 06:01:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:33:31.203 06:01:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:33:31.203 06:01:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:33:31.203 06:01:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:33:31.203 06:01:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:33:31.203 06:01:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:33:31.203 06:01:45 
blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:33:31.203 06:01:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:33:31.203 06:01:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:33:31.203 06:01:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:33:31.203 06:01:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:33:31.462 06:01:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:33:31.462 06:01:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:33:31.462 06:01:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:33:31.462 06:01:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:33:31.462 06:01:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:33:31.462 06:01:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:33:31.462 06:01:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:33:31.462 06:01:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:33:31.462 06:01:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:33:31.462 06:01:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd2 00:33:31.720 06:01:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd2 00:33:31.720 06:01:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd2 00:33:31.721 06:01:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd2 
00:33:31.721 06:01:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:33:31.721 06:01:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:33:31.721 06:01:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd2 /proc/partitions 00:33:31.721 06:01:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:33:31.721 06:01:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:33:31.721 06:01:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:33:31.721 06:01:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd3 00:33:31.979 06:01:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd3 00:33:31.979 06:01:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd3 00:33:31.979 06:01:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd3 00:33:31.979 06:01:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:33:31.979 06:01:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:33:31.979 06:01:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd3 /proc/partitions 00:33:31.979 06:01:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:33:31.979 06:01:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:33:31.979 06:01:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@122 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:33:31.979 06:01:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:33:31.979 06:01:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-nbd.sock nbd_get_disks 00:33:31.979 06:01:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:33:31.979 06:01:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:33:31.979 06:01:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:33:32.237 06:01:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:33:32.237 06:01:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:33:32.237 06:01:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:33:32.237 06:01:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:33:32.237 06:01:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:33:32.237 06:01:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:33:32.237 06:01:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@122 -- # count=0 00:33:32.237 06:01:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@123 -- # '[' 0 -ne 0 ']' 00:33:32.237 06:01:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@127 -- # return 0 00:33:32.237 06:01:46 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@322 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:33:32.237 06:01:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:33:32.237 06:01:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@91 -- # bdev_list=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:33:32.237 06:01:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@91 -- # local bdev_list 00:33:32.237 06:01:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:33:32.237 06:01:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@92 
-- # local nbd_list 00:33:32.237 06:01:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:33:32.237 06:01:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:33:32.237 06:01:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@10 -- # bdev_list=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:33:32.237 06:01:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@10 -- # local bdev_list 00:33:32.237 06:01:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:33:32.237 06:01:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@11 -- # local nbd_list 00:33:32.237 06:01:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@12 -- # local i 00:33:32.237 06:01:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:33:32.237 06:01:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:33:32.237 06:01:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram /dev/nbd0 00:33:32.496 /dev/nbd0 00:33:32.496 06:01:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:33:32.496 06:01:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:33:32.496 06:01:47 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:33:32.496 06:01:47 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:33:32.496 06:01:47 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:33:32.496 06:01:47 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:33:32.496 
06:01:47 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:33:32.496 06:01:47 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:33:32.496 06:01:47 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:33:32.496 06:01:47 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:33:32.496 06:01:47 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:33:32.496 1+0 records in 00:33:32.496 1+0 records out 00:33:32.496 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000272524 s, 15.0 MB/s 00:33:32.496 06:01:47 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:32.496 06:01:47 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:33:32.496 06:01:47 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:32.496 06:01:47 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:33:32.496 06:01:47 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:33:32.496 06:01:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:33:32.496 06:01:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:33:32.496 06:01:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram2 /dev/nbd1 00:33:32.755 /dev/nbd1 00:33:32.755 06:01:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:33:32.755 06:01:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- 
# waitfornbd nbd1 00:33:32.755 06:01:47 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:33:32.755 06:01:47 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:33:32.755 06:01:47 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:33:32.755 06:01:47 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:33:32.755 06:01:47 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:33:32.755 06:01:47 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:33:32.755 06:01:47 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:33:32.755 06:01:47 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:33:32.755 06:01:47 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:33:32.755 1+0 records in 00:33:32.755 1+0 records out 00:33:32.755 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000289606 s, 14.1 MB/s 00:33:32.755 06:01:47 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:32.755 06:01:47 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:33:32.755 06:01:47 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:32.755 06:01:47 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:33:32.755 06:01:47 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:33:32.755 06:01:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:33:32.755 06:01:47 blockdev_crypto_aesni.bdev_nbd -- 
bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:33:32.755 06:01:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 /dev/nbd10 00:33:33.014 /dev/nbd10 00:33:33.014 06:01:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd10 00:33:33.014 06:01:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd10 00:33:33.014 06:01:47 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd10 00:33:33.014 06:01:47 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:33:33.014 06:01:47 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:33:33.014 06:01:47 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:33:33.014 06:01:47 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd10 /proc/partitions 00:33:33.014 06:01:47 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:33:33.014 06:01:47 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:33:33.014 06:01:47 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:33:33.014 06:01:47 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd10 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:33:33.014 1+0 records in 00:33:33.014 1+0 records out 00:33:33.014 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000339079 s, 12.1 MB/s 00:33:33.014 06:01:47 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:33.014 06:01:47 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:33:33.014 06:01:47 blockdev_crypto_aesni.bdev_nbd -- 
common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:33.014 06:01:47 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:33:33.014 06:01:47 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:33:33.014 06:01:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:33:33.014 06:01:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:33:33.014 06:01:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram4 /dev/nbd11 00:33:33.273 /dev/nbd11 00:33:33.273 06:01:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd11 00:33:33.273 06:01:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd11 00:33:33.273 06:01:47 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd11 00:33:33.273 06:01:48 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:33:33.273 06:01:48 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:33:33.274 06:01:48 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:33:33.274 06:01:48 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd11 /proc/partitions 00:33:33.274 06:01:48 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:33:33.274 06:01:48 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:33:33.274 06:01:48 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:33:33.274 06:01:48 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd11 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 
00:33:33.274 1+0 records in 00:33:33.274 1+0 records out 00:33:33.274 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00029471 s, 13.9 MB/s 00:33:33.274 06:01:48 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:33.274 06:01:48 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:33:33.274 06:01:48 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:33.274 06:01:48 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:33:33.274 06:01:48 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:33:33.274 06:01:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:33:33.274 06:01:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:33:33.274 06:01:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:33:33.274 06:01:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:33:33.274 06:01:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:33:33.533 06:01:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:33:33.533 { 00:33:33.533 "nbd_device": "/dev/nbd0", 00:33:33.533 "bdev_name": "crypto_ram" 00:33:33.533 }, 00:33:33.533 { 00:33:33.533 "nbd_device": "/dev/nbd1", 00:33:33.533 "bdev_name": "crypto_ram2" 00:33:33.533 }, 00:33:33.533 { 00:33:33.533 "nbd_device": "/dev/nbd10", 00:33:33.533 "bdev_name": "crypto_ram3" 00:33:33.533 }, 00:33:33.533 { 00:33:33.533 "nbd_device": "/dev/nbd11", 00:33:33.533 "bdev_name": "crypto_ram4" 00:33:33.533 } 00:33:33.533 ]' 00:33:33.533 06:01:48 
blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[ 00:33:33.533 { 00:33:33.533 "nbd_device": "/dev/nbd0", 00:33:33.533 "bdev_name": "crypto_ram" 00:33:33.533 }, 00:33:33.533 { 00:33:33.533 "nbd_device": "/dev/nbd1", 00:33:33.533 "bdev_name": "crypto_ram2" 00:33:33.533 }, 00:33:33.533 { 00:33:33.533 "nbd_device": "/dev/nbd10", 00:33:33.533 "bdev_name": "crypto_ram3" 00:33:33.533 }, 00:33:33.533 { 00:33:33.533 "nbd_device": "/dev/nbd11", 00:33:33.533 "bdev_name": "crypto_ram4" 00:33:33.533 } 00:33:33.533 ]' 00:33:33.533 06:01:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:33:33.533 06:01:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:33:33.533 /dev/nbd1 00:33:33.533 /dev/nbd10 00:33:33.533 /dev/nbd11' 00:33:33.533 06:01:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:33:33.533 /dev/nbd1 00:33:33.533 /dev/nbd10 00:33:33.533 /dev/nbd11' 00:33:33.533 06:01:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:33:33.533 06:01:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=4 00:33:33.533 06:01:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 4 00:33:33.533 06:01:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@95 -- # count=4 00:33:33.533 06:01:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@96 -- # '[' 4 -ne 4 ']' 00:33:33.533 06:01:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' write 00:33:33.533 06:01:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:33:33.533 06:01:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:33:33.533 06:01:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=write 00:33:33.533 06:01:48 
blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:33:33.533 06:01:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:33:33.533 06:01:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest bs=4096 count=256 00:33:33.533 256+0 records in 00:33:33.533 256+0 records out 00:33:33.533 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0110568 s, 94.8 MB/s 00:33:33.533 06:01:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:33:33.533 06:01:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:33:33.533 256+0 records in 00:33:33.533 256+0 records out 00:33:33.533 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0559604 s, 18.7 MB/s 00:33:33.533 06:01:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:33:33.533 06:01:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:33:33.792 256+0 records in 00:33:33.792 256+0 records out 00:33:33.792 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0504647 s, 20.8 MB/s 00:33:33.792 06:01:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:33:33.792 06:01:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd10 bs=4096 count=256 oflag=direct 00:33:33.792 256+0 records in 00:33:33.792 256+0 records out 00:33:33.792 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0454221 s, 23.1 MB/s 00:33:33.792 06:01:48 blockdev_crypto_aesni.bdev_nbd -- 
bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:33:33.792 06:01:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd11 bs=4096 count=256 oflag=direct 00:33:33.792 256+0 records in 00:33:33.792 256+0 records out 00:33:33.792 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0410403 s, 25.5 MB/s 00:33:33.792 06:01:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' verify 00:33:33.792 06:01:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:33:33.792 06:01:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:33:33.792 06:01:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=verify 00:33:33.793 06:01:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:33:33.793 06:01:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:33:33.793 06:01:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:33:33.793 06:01:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:33:33.793 06:01:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd0 00:33:33.793 06:01:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:33:33.793 06:01:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd1 00:33:33.793 06:01:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:33:33.793 06:01:48 
blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd10 00:33:33.793 06:01:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:33:33.793 06:01:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd11 00:33:33.793 06:01:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:33:33.793 06:01:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:33:33.793 06:01:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:33:33.793 06:01:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:33:33.793 06:01:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:33:33.793 06:01:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:33:33.793 06:01:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:33:33.793 06:01:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:33:34.052 06:01:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:33:34.052 06:01:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:33:34.052 06:01:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:33:34.052 06:01:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:33:34.052 06:01:48 blockdev_crypto_aesni.bdev_nbd -- 
bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:33:34.052 06:01:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:33:34.052 06:01:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:33:34.052 06:01:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:33:34.052 06:01:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:33:34.052 06:01:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:33:34.310 06:01:49 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:33:34.310 06:01:49 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:33:34.310 06:01:49 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:33:34.310 06:01:49 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:33:34.310 06:01:49 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:33:34.310 06:01:49 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:33:34.310 06:01:49 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:33:34.310 06:01:49 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:33:34.310 06:01:49 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:33:34.310 06:01:49 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd10 00:33:34.569 06:01:49 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd10 00:33:34.569 06:01:49 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd10 00:33:34.569 06:01:49 
blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd10 00:33:34.569 06:01:49 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:33:34.569 06:01:49 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:33:34.569 06:01:49 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd10 /proc/partitions 00:33:34.569 06:01:49 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:33:34.569 06:01:49 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:33:34.569 06:01:49 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:33:34.569 06:01:49 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd11 00:33:34.828 06:01:49 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd11 00:33:34.828 06:01:49 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd11 00:33:34.828 06:01:49 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd11 00:33:34.828 06:01:49 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:33:34.828 06:01:49 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:33:34.828 06:01:49 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd11 /proc/partitions 00:33:34.828 06:01:49 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:33:34.828 06:01:49 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:33:34.828 06:01:49 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:33:34.828 06:01:49 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:33:34.828 06:01:49 blockdev_crypto_aesni.bdev_nbd -- 
bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:33:35.117 06:01:49 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:33:35.117 06:01:49 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:33:35.117 06:01:49 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:33:35.117 06:01:49 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:33:35.117 06:01:49 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:33:35.117 06:01:49 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:33:35.117 06:01:49 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:33:35.117 06:01:49 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:33:35.117 06:01:49 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:33:35.117 06:01:49 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@104 -- # count=0 00:33:35.117 06:01:49 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:33:35.117 06:01:49 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@109 -- # return 0 00:33:35.117 06:01:49 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@323 -- # nbd_with_lvol_verify /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:33:35.117 06:01:49 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@131 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:33:35.117 06:01:49 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@132 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:33:35.117 06:01:49 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@132 -- # local nbd_list 00:33:35.117 06:01:49 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@133 -- # local mkfs_ret 00:33:35.117 06:01:49 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@135 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create -b malloc_lvol_verify 16 512 00:33:35.376 malloc_lvol_verify 00:33:35.376 06:01:50 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@136 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create_lvstore malloc_lvol_verify lvs 00:33:35.634 62c7d962-ef08-4f8d-9056-bf0a629892ba 00:33:35.634 06:01:50 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@137 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create lvol 4 -l lvs 00:33:35.634 75eb7460-2727-4477-8230-001579323370 00:33:35.894 06:01:50 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@138 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk lvs/lvol /dev/nbd0 00:33:35.894 /dev/nbd0 00:33:35.894 06:01:50 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@140 -- # mkfs.ext4 /dev/nbd0 00:33:35.894 mke2fs 1.46.5 (30-Dec-2021) 00:33:35.894 Discarding device blocks: 0/4096 done 00:33:35.894 Creating filesystem with 4096 1k blocks and 1024 inodes 00:33:35.894 00:33:35.894 Allocating group tables: 0/1 done 00:33:35.894 Writing inode tables: 0/1 done 00:33:35.894 Creating journal (1024 blocks): done 00:33:35.894 Writing superblocks and filesystem accounting information: 0/1 done 00:33:35.894 00:33:35.894 06:01:50 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@141 -- # mkfs_ret=0 00:33:35.894 06:01:50 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@142 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock /dev/nbd0 00:33:35.894 06:01:50 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:33:35.894 06:01:50 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:33:35.894 06:01:50 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 
00:33:35.894 06:01:50 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:33:35.894 06:01:50 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:33:35.894 06:01:50 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:33:36.151 06:01:51 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:33:36.151 06:01:51 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:33:36.151 06:01:51 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:33:36.151 06:01:51 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:33:36.151 06:01:51 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:33:36.151 06:01:51 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:33:36.151 06:01:51 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:33:36.151 06:01:51 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:33:36.151 06:01:51 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@143 -- # '[' 0 -ne 0 ']' 00:33:36.151 06:01:51 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@147 -- # return 0 00:33:36.151 06:01:51 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@325 -- # killprocess 1308170 00:33:36.151 06:01:51 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@948 -- # '[' -z 1308170 ']' 00:33:36.151 06:01:51 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@952 -- # kill -0 1308170 00:33:36.151 06:01:51 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@953 -- # uname 00:33:36.151 06:01:51 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:33:36.151 06:01:51 blockdev_crypto_aesni.bdev_nbd -- 
common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1308170 00:33:36.410 06:01:51 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:33:36.410 06:01:51 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:33:36.410 06:01:51 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1308170' 00:33:36.410 killing process with pid 1308170 00:33:36.410 06:01:51 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@967 -- # kill 1308170 00:33:36.410 06:01:51 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@972 -- # wait 1308170 00:33:36.668 06:01:51 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@326 -- # trap - SIGINT SIGTERM EXIT 00:33:36.668 00:33:36.668 real 0m10.249s 00:33:36.668 user 0m13.498s 00:33:36.668 sys 0m3.977s 00:33:36.668 06:01:51 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@1124 -- # xtrace_disable 00:33:36.668 06:01:51 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:33:36.668 ************************************ 00:33:36.668 END TEST bdev_nbd 00:33:36.668 ************************************ 00:33:36.668 06:01:51 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 0 00:33:36.668 06:01:51 blockdev_crypto_aesni -- bdev/blockdev.sh@762 -- # [[ y == y ]] 00:33:36.668 06:01:51 blockdev_crypto_aesni -- bdev/blockdev.sh@763 -- # '[' crypto_aesni = nvme ']' 00:33:36.668 06:01:51 blockdev_crypto_aesni -- bdev/blockdev.sh@763 -- # '[' crypto_aesni = gpt ']' 00:33:36.668 06:01:51 blockdev_crypto_aesni -- bdev/blockdev.sh@767 -- # run_test bdev_fio fio_test_suite '' 00:33:36.668 06:01:51 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:33:36.668 06:01:51 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable 00:33:36.668 06:01:51 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 
00:33:36.927 ************************************ 00:33:36.927 START TEST bdev_fio 00:33:36.927 ************************************ 00:33:36.927 06:01:51 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1123 -- # fio_test_suite '' 00:33:36.927 06:01:51 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@330 -- # local env_context 00:33:36.927 06:01:51 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@334 -- # pushd /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:33:36.927 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev /var/jenkins/workspace/crypto-phy-autotest/spdk 00:33:36.927 06:01:51 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@335 -- # trap 'rm -f ./*.state; popd; exit 1' SIGINT SIGTERM EXIT 00:33:36.927 06:01:51 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@338 -- # echo '' 00:33:36.927 06:01:51 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@338 -- # sed s/--env-context=// 00:33:36.927 06:01:51 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@338 -- # env_context= 00:33:36.927 06:01:51 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@339 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio verify AIO '' 00:33:36.927 06:01:51 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:33:36.927 06:01:51 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=verify 00:33:36.927 06:01:51 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type=AIO 00:33:36.927 06:01:51 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:33:36.927 06:01:51 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:33:36.927 06:01:51 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:33:36.927 06:01:51 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z verify ']' 00:33:36.927 06:01:51 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:33:36.927 06:01:51 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:33:36.927 06:01:51 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:33:36.927 06:01:51 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1313 -- # '[' verify == verify ']' 00:33:36.927 06:01:51 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1314 -- # cat 00:33:36.927 06:01:51 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1323 -- # '[' AIO == AIO ']' 00:33:36.927 06:01:51 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1324 -- # /usr/src/fio/fio --version 00:33:36.927 06:01:51 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1324 -- # [[ fio-3.35 == *\f\i\o\-\3* ]] 00:33:36.927 06:01:51 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1325 -- # echo serialize_overlap=1 00:33:36.927 06:01:51 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:33:36.927 06:01:51 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_crypto_ram]' 00:33:36.927 06:01:51 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=crypto_ram 00:33:36.927 06:01:51 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:33:36.927 06:01:51 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_crypto_ram2]' 00:33:36.927 06:01:51 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=crypto_ram2 00:33:36.927 06:01:51 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 
00:33:36.927 06:01:51 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_crypto_ram3]' 00:33:36.927 06:01:51 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=crypto_ram3 00:33:36.927 06:01:51 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:33:36.927 06:01:51 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_crypto_ram4]' 00:33:36.927 06:01:51 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=crypto_ram4 00:33:36.927 06:01:51 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@346 -- # local 'fio_params=--ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json' 00:33:36.927 06:01:51 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@348 -- # run_test bdev_fio_rw_verify fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:33:36.927 06:01:51 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:33:36.927 06:01:51 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1105 -- # xtrace_disable 00:33:36.927 06:01:51 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:33:36.927 ************************************ 00:33:36.927 START TEST bdev_fio_rw_verify 00:33:36.927 ************************************ 00:33:36.927 06:01:51 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1123 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 
--verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:33:36.928 06:01:51 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:33:36.928 06:01:51 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:33:36.928 06:01:51 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:33:36.928 06:01:51 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # local sanitizers 00:33:36.928 06:01:51 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:33:36.928 06:01:51 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # shift 00:33:36.928 06:01:51 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1343 -- # local asan_lib= 00:33:36.928 06:01:51 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:33:36.928 06:01:51 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:33:36.928 06:01:51 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libasan 
00:33:36.928 06:01:51 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:33:36.928 06:01:51 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:33:36.928 06:01:51 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:33:36.928 06:01:51 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:33:36.928 06:01:51 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:33:36.928 06:01:51 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:33:36.928 06:01:51 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:33:36.928 06:01:51 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:33:36.928 06:01:51 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:33:36.928 06:01:51 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:33:36.928 06:01:51 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:33:37.187 job_crypto_ram: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:33:37.187 
job_crypto_ram2: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:33:37.187 job_crypto_ram3: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:33:37.187 job_crypto_ram4: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:33:37.187 fio-3.35 00:33:37.187 Starting 4 threads 00:33:52.070 00:33:52.070 job_crypto_ram: (groupid=0, jobs=4): err= 0: pid=1310118: Fri Jul 26 06:02:04 2024 00:33:52.070 read: IOPS=19.2k, BW=75.0MiB/s (78.6MB/s)(750MiB/10001msec) 00:33:52.070 slat (usec): min=17, max=2084, avg=72.80, stdev=28.57 00:33:52.070 clat (usec): min=13, max=3464, avg=384.01, stdev=203.92 00:33:52.070 lat (usec): min=59, max=3654, avg=456.81, stdev=215.13 00:33:52.070 clat percentiles (usec): 00:33:52.070 | 50.000th=[ 359], 99.000th=[ 898], 99.900th=[ 1287], 99.990th=[ 1582], 00:33:52.070 | 99.999th=[ 3326] 00:33:52.070 write: IOPS=21.0k, BW=82.1MiB/s (86.1MB/s)(800MiB/9746msec); 0 zone resets 00:33:52.070 slat (usec): min=25, max=438, avg=83.72, stdev=27.98 00:33:52.070 clat (usec): min=36, max=1923, avg=448.10, stdev=232.97 00:33:52.070 lat (usec): min=66, max=2107, avg=531.83, stdev=243.24 00:33:52.070 clat percentiles (usec): 00:33:52.070 | 50.000th=[ 420], 99.000th=[ 1074], 99.900th=[ 1713], 99.990th=[ 1827], 00:33:52.070 | 99.999th=[ 1893] 00:33:52.070 bw ( KiB/s): min=53552, max=121824, per=98.20%, avg=82588.63, stdev=4393.99, samples=76 00:33:52.070 iops : min=13388, max=30456, avg=20647.16, stdev=1098.50, samples=76 00:33:52.070 lat (usec) : 20=0.01%, 50=0.01%, 100=2.83%, 250=23.99%, 500=39.93% 00:33:52.070 lat (usec) : 750=25.32%, 1000=6.80% 00:33:52.070 lat (msec) : 2=1.12%, 4=0.01% 00:33:52.070 cpu : usr=99.61%, sys=0.01%, ctx=75, majf=0, minf=254 00:33:52.070 IO depths : 1=9.4%, 2=25.8%, 4=51.6%, 8=13.2%, 16=0.0%, 32=0.0%, >=64=0.0% 00:33:52.070 submit : 0=0.0%, 4=100.0%, 8=0.0%, 
16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:33:52.070 complete : 0=0.0%, 4=88.6%, 8=11.4%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:33:52.070 issued rwts: total=192024,204904,0,0 short=0,0,0,0 dropped=0,0,0,0 00:33:52.070 latency : target=0, window=0, percentile=100.00%, depth=8 00:33:52.070 00:33:52.070 Run status group 0 (all jobs): 00:33:52.070 READ: bw=75.0MiB/s (78.6MB/s), 75.0MiB/s-75.0MiB/s (78.6MB/s-78.6MB/s), io=750MiB (787MB), run=10001-10001msec 00:33:52.070 WRITE: bw=82.1MiB/s (86.1MB/s), 82.1MiB/s-82.1MiB/s (86.1MB/s-86.1MB/s), io=800MiB (839MB), run=9746-9746msec 00:33:52.070 00:33:52.070 real 0m13.537s 00:33:52.070 user 0m45.713s 00:33:52.070 sys 0m0.541s 00:33:52.070 06:02:05 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:33:52.070 06:02:05 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@10 -- # set +x 00:33:52.070 ************************************ 00:33:52.070 END TEST bdev_fio_rw_verify 00:33:52.070 ************************************ 00:33:52.070 06:02:05 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1142 -- # return 0 00:33:52.070 06:02:05 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@349 -- # rm -f 00:33:52.070 06:02:05 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@350 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:33:52.070 06:02:05 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@353 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio trim '' '' 00:33:52.070 06:02:05 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:33:52.070 06:02:05 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=trim 00:33:52.070 06:02:05 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type= 00:33:52.070 06:02:05 
blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:33:52.070 06:02:05 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:33:52.070 06:02:05 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:33:52.070 06:02:05 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z trim ']' 00:33:52.070 06:02:05 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:33:52.071 06:02:05 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:33:52.071 06:02:05 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:33:52.071 06:02:05 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1313 -- # '[' trim == verify ']' 00:33:52.071 06:02:05 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1328 -- # '[' trim == trim ']' 00:33:52.071 06:02:05 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1329 -- # echo rw=trimwrite 00:33:52.071 06:02:05 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@354 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:33:52.071 06:02:05 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@354 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "ab4f4e5c-6a8a-5db9-806a-cc7e61433b3f"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "ab4f4e5c-6a8a-5db9-806a-cc7e61433b3f",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": 
false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_aesni_cbc_1"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "9ba5e599-eb35-5e52-86e9-d17d684e0aff"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "9ba5e599-eb35-5e52-86e9-d17d684e0aff",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_aesni_cbc_2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "585a4c3f-aca0-570e-ae8a-49c47aa1b6cb"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "585a4c3f-aca0-570e-ae8a-49c47aa1b6cb",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' 
"r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_aesni_cbc_3"' ' }' ' }' '}' '{' ' "name": "crypto_ram4",' ' "aliases": [' ' "d1645500-2f0d-58fb-b05b-031748e9450d"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "d1645500-2f0d-58fb-b05b-031748e9450d",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram4",' ' "key_name": "test_dek_aesni_cbc_4"' ' }' ' }' '}' 00:33:52.071 06:02:05 
blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@354 -- # [[ -n crypto_ram 00:33:52.071 crypto_ram2 00:33:52.071 crypto_ram3 00:33:52.071 crypto_ram4 ]] 00:33:52.071 06:02:05 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@355 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:33:52.071 06:02:05 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@355 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "ab4f4e5c-6a8a-5db9-806a-cc7e61433b3f"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "ab4f4e5c-6a8a-5db9-806a-cc7e61433b3f",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_aesni_cbc_1"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "9ba5e599-eb35-5e52-86e9-d17d684e0aff"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "9ba5e599-eb35-5e52-86e9-d17d684e0aff",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' 
"flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_aesni_cbc_2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "585a4c3f-aca0-570e-ae8a-49c47aa1b6cb"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "585a4c3f-aca0-570e-ae8a-49c47aa1b6cb",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_aesni_cbc_3"' ' }' ' }' '}' '{' ' "name": "crypto_ram4",' ' "aliases": [' ' "d1645500-2f0d-58fb-b05b-031748e9450d"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "d1645500-2f0d-58fb-b05b-031748e9450d",' 
' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram4",' ' "key_name": "test_dek_aesni_cbc_4"' ' }' ' }' '}' 00:33:52.071 06:02:05 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:33:52.071 06:02:05 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_crypto_ram]' 00:33:52.071 06:02:05 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=crypto_ram 00:33:52.071 06:02:05 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:33:52.071 06:02:05 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_crypto_ram2]' 00:33:52.071 06:02:05 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=crypto_ram2 00:33:52.071 06:02:05 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:33:52.071 06:02:05 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_crypto_ram3]' 
00:33:52.071 06:02:05 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=crypto_ram3 00:33:52.071 06:02:05 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:33:52.071 06:02:05 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_crypto_ram4]' 00:33:52.071 06:02:05 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=crypto_ram4 00:33:52.071 06:02:05 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@366 -- # run_test bdev_fio_trim fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:33:52.071 06:02:05 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:33:52.071 06:02:05 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1105 -- # xtrace_disable 00:33:52.071 06:02:05 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:33:52.071 ************************************ 00:33:52.071 START TEST bdev_fio_trim 00:33:52.071 ************************************ 00:33:52.071 06:02:05 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1123 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:33:52.072 06:02:05 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1356 -- # fio_plugin 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:33:52.072 06:02:05 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:33:52.072 06:02:05 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:33:52.072 06:02:05 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # local sanitizers 00:33:52.072 06:02:05 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:33:52.072 06:02:05 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # shift 00:33:52.072 06:02:05 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1343 -- # local asan_lib= 00:33:52.072 06:02:05 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:33:52.072 06:02:05 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:33:52.072 06:02:05 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libasan 00:33:52.072 06:02:05 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:33:52.072 06:02:05 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:33:52.072 06:02:05 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ 
-n '' ]] 00:33:52.072 06:02:05 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:33:52.072 06:02:05 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:33:52.072 06:02:05 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:33:52.072 06:02:05 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:33:52.072 06:02:05 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:33:52.072 06:02:05 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:33:52.072 06:02:05 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:33:52.072 06:02:05 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:33:52.072 job_crypto_ram: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:33:52.072 job_crypto_ram2: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:33:52.072 job_crypto_ram3: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:33:52.072 job_crypto_ram4: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:33:52.072 
fio-3.35 00:33:52.072 Starting 4 threads 00:34:04.308 00:34:04.308 job_crypto_ram: (groupid=0, jobs=4): err= 0: pid=1311930: Fri Jul 26 06:02:18 2024 00:34:04.308 write: IOPS=32.7k, BW=128MiB/s (134MB/s)(1279MiB/10001msec); 0 zone resets 00:34:04.308 slat (usec): min=17, max=1432, avg=69.34, stdev=34.98 00:34:04.308 clat (usec): min=36, max=2076, avg=313.45, stdev=177.69 00:34:04.308 lat (usec): min=57, max=2211, avg=382.79, stdev=196.70 00:34:04.308 clat percentiles (usec): 00:34:04.308 | 50.000th=[ 273], 99.000th=[ 889], 99.900th=[ 1012], 99.990th=[ 1172], 00:34:04.308 | 99.999th=[ 1713] 00:34:04.308 bw ( KiB/s): min=115344, max=175727, per=100.00%, avg=131644.58, stdev=3562.35, samples=76 00:34:04.308 iops : min=28836, max=43931, avg=32911.11, stdev=890.55, samples=76 00:34:04.308 trim: IOPS=32.7k, BW=128MiB/s (134MB/s)(1279MiB/10001msec); 0 zone resets 00:34:04.308 slat (usec): min=4, max=401, avg=18.91, stdev= 7.66 00:34:04.308 clat (usec): min=15, max=1916, avg=295.59, stdev=132.77 00:34:04.308 lat (usec): min=29, max=1951, avg=314.50, stdev=135.48 00:34:04.308 clat percentiles (usec): 00:34:04.308 | 50.000th=[ 277], 99.000th=[ 635], 99.900th=[ 709], 99.990th=[ 832], 00:34:04.308 | 99.999th=[ 1123] 00:34:04.308 bw ( KiB/s): min=115352, max=175743, per=100.00%, avg=131645.84, stdev=3563.35, samples=76 00:34:04.308 iops : min=28838, max=43935, avg=32911.42, stdev=890.80, samples=76 00:34:04.308 lat (usec) : 20=0.01%, 50=0.02%, 100=4.25%, 250=38.99%, 500=45.69% 00:34:04.308 lat (usec) : 750=9.40%, 1000=1.57% 00:34:04.308 lat (msec) : 2=0.07%, 4=0.01% 00:34:04.308 cpu : usr=99.52%, sys=0.01%, ctx=72, majf=0, minf=110 00:34:04.308 IO depths : 1=7.3%, 2=26.5%, 4=53.0%, 8=13.2%, 16=0.0%, 32=0.0%, >=64=0.0% 00:34:04.308 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:34:04.308 complete : 0=0.0%, 4=88.3%, 8=11.7%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:34:04.308 issued rwts: total=0,327337,327339,0 short=0,0,0,0 dropped=0,0,0,0 00:34:04.308 
latency : target=0, window=0, percentile=100.00%, depth=8 00:34:04.308 00:34:04.308 Run status group 0 (all jobs): 00:34:04.308 WRITE: bw=128MiB/s (134MB/s), 128MiB/s-128MiB/s (134MB/s-134MB/s), io=1279MiB (1341MB), run=10001-10001msec 00:34:04.308 TRIM: bw=128MiB/s (134MB/s), 128MiB/s-128MiB/s (134MB/s-134MB/s), io=1279MiB (1341MB), run=10001-10001msec 00:34:04.308 00:34:04.308 real 0m13.552s 00:34:04.308 user 0m45.720s 00:34:04.308 sys 0m0.525s 00:34:04.308 06:02:18 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1124 -- # xtrace_disable 00:34:04.308 06:02:18 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@10 -- # set +x 00:34:04.308 ************************************ 00:34:04.308 END TEST bdev_fio_trim 00:34:04.308 ************************************ 00:34:04.308 06:02:19 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1142 -- # return 0 00:34:04.308 06:02:19 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@367 -- # rm -f 00:34:04.308 06:02:19 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@368 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:34:04.308 06:02:19 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@369 -- # popd 00:34:04.308 /var/jenkins/workspace/crypto-phy-autotest/spdk 00:34:04.308 06:02:19 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@370 -- # trap - SIGINT SIGTERM EXIT 00:34:04.308 00:34:04.308 real 0m27.436s 00:34:04.308 user 1m31.610s 00:34:04.308 sys 0m1.258s 00:34:04.308 06:02:19 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1124 -- # xtrace_disable 00:34:04.308 06:02:19 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:34:04.308 ************************************ 00:34:04.308 END TEST bdev_fio 00:34:04.308 ************************************ 00:34:04.308 06:02:19 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 0 00:34:04.308 06:02:19 blockdev_crypto_aesni -- 
bdev/blockdev.sh@774 -- # trap cleanup SIGINT SIGTERM EXIT 00:34:04.308 06:02:19 blockdev_crypto_aesni -- bdev/blockdev.sh@776 -- # run_test bdev_verify /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:34:04.308 06:02:19 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 16 -le 1 ']' 00:34:04.308 06:02:19 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable 00:34:04.308 06:02:19 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:34:04.308 ************************************ 00:34:04.308 START TEST bdev_verify 00:34:04.308 ************************************ 00:34:04.308 06:02:19 blockdev_crypto_aesni.bdev_verify -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:34:04.308 [2024-07-26 06:02:19.181215] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 
00:34:04.308 [2024-07-26 06:02:19.181279] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1313329 ] 00:34:04.567 [2024-07-26 06:02:19.309159] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:34:04.567 [2024-07-26 06:02:19.416933] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:34:04.567 [2024-07-26 06:02:19.416939] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:34:04.567 [2024-07-26 06:02:19.438375] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:34:04.567 [2024-07-26 06:02:19.446404] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:34:04.568 [2024-07-26 06:02:19.454424] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:34:04.827 [2024-07-26 06:02:19.556211] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:34:07.362 [2024-07-26 06:02:21.775531] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:34:07.362 [2024-07-26 06:02:21.775617] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:34:07.362 [2024-07-26 06:02:21.775633] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:07.362 [2024-07-26 06:02:21.783550] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:34:07.362 [2024-07-26 06:02:21.783570] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:34:07.362 [2024-07-26 06:02:21.783582] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 
00:34:07.362 [2024-07-26 06:02:21.791573] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3"
00:34:07.362 [2024-07-26 06:02:21.791591] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2
00:34:07.362 [2024-07-26 06:02:21.791602] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:34:07.362 [2024-07-26 06:02:21.799593] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4"
00:34:07.362 [2024-07-26 06:02:21.799610] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3
00:34:07.362 [2024-07-26 06:02:21.799621] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:34:07.362 Running I/O for 5 seconds...
00:34:12.632 Latency(us)
00:34:12.632 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:34:12.632 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:34:12.632 Verification LBA range: start 0x0 length 0x1000
00:34:12.632 crypto_ram : 5.08 494.53 1.93 0.00 0.00 257311.47 4245.59 175978.41
00:34:12.632 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:34:12.632 Verification LBA range: start 0x1000 length 0x1000
00:34:12.632 crypto_ram : 5.08 498.88 1.95 0.00 0.00 255271.43 4616.01 175066.60
00:34:12.632 Job: crypto_ram2 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:34:12.632 Verification LBA range: start 0x0 length 0x1000
00:34:12.632 crypto_ram2 : 5.08 499.02 1.95 0.00 0.00 254646.08 4160.11 165948.55
00:34:12.632 Job: crypto_ram2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:34:12.632 Verification LBA range: start 0x1000 length 0x1000
00:34:12.632 crypto_ram2 : 5.08 503.59 1.97 0.00 0.00 252691.17 4587.52 165036.74
00:34:12.632 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:34:12.632 Verification LBA range: start 0x0 length 0x1000
00:34:12.633 crypto_ram3 : 5.05 3849.27 15.04 0.00 0.00 32880.90 8377.21 27582.11
00:34:12.633 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:34:12.633 Verification LBA range: start 0x1000 length 0x1000
00:34:12.633 crypto_ram3 : 5.06 3870.47 15.12 0.00 0.00 32707.18 10143.83 27354.16
00:34:12.633 Job: crypto_ram4 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:34:12.633 Verification LBA range: start 0x0 length 0x1000
00:34:12.633 crypto_ram4 : 5.07 3865.38 15.10 0.00 0.00 32681.42 3960.65 26214.40
00:34:12.633 Job: crypto_ram4 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:34:12.633 Verification LBA range: start 0x1000 length 0x1000
00:34:12.633 crypto_ram4 : 5.07 3885.95 15.18 0.00 0.00 32507.89 3618.73 25758.50
00:34:12.633 ===================================================================================================================
00:34:12.633 Total : 17467.09 68.23 0.00 0.00 58172.36 3618.73 175978.41
00:34:12.633 real 0m8.300s
00:34:12.633 user 0m15.726s
00:34:12.633 sys 0m0.377s
00:34:12.633 06:02:27 blockdev_crypto_aesni.bdev_verify -- common/autotest_common.sh@1124 -- # xtrace_disable
00:34:12.633 06:02:27 blockdev_crypto_aesni.bdev_verify -- common/autotest_common.sh@10 -- # set +x
00:34:12.633 ************************************
00:34:12.633 END TEST bdev_verify
00:34:12.633 ************************************
00:34:12.633 06:02:27 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 0
00:34:12.633 06:02:27 blockdev_crypto_aesni -- bdev/blockdev.sh@777 -- # run_test bdev_verify_big_io /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 ''
00:34:12.633 06:02:27 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 16 -le 1 ']'
00:34:12.633 06:02:27 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable
00:34:12.633 06:02:27 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x
00:34:12.633 ************************************
00:34:12.633 START TEST bdev_verify_big_io
00:34:12.633 ************************************
00:34:12.633 06:02:27 blockdev_crypto_aesni.bdev_verify_big_io -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 ''
00:34:12.932 [2024-07-26 06:02:27.564629] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization...
00:34:12.932 [2024-07-26 06:02:27.564691] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1314386 ]
00:34:12.932 [2024-07-26 06:02:27.676528] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2
00:34:12.932 [2024-07-26 06:02:27.783492] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1
00:34:12.932 [2024-07-26 06:02:27.783495] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:34:13.192 [2024-07-26 06:02:27.804954] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb
00:34:13.192 [2024-07-26 06:02:27.812984] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev
00:34:13.192 [2024-07-26 06:02:27.821006] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev
00:34:13.192 [2024-07-26 06:02:27.933865] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97
00:34:15.721 [2024-07-26 06:02:30.164180] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1"
00:34:15.721 [2024-07-26 06:02:30.164264] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0
00:34:15.721 [2024-07-26 06:02:30.164279] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:34:15.721 [2024-07-26 06:02:30.172193] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2"
00:34:15.721 [2024-07-26 06:02:30.172212] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1
00:34:15.721 [2024-07-26 06:02:30.172224] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:34:15.721 [2024-07-26 06:02:30.180213] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3"
00:34:15.721 [2024-07-26 06:02:30.180231] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2
00:34:15.721 [2024-07-26 06:02:30.180242] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:34:15.721 [2024-07-26 06:02:30.188238] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4"
00:34:15.721 [2024-07-26 06:02:30.188255] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3
00:34:15.721 [2024-07-26 06:02:30.188267] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:34:15.721 Running I/O for 5 seconds...
00:34:18.258 [2024-07-26 06:02:32.888978] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:18.258 [2024-07-26 06:02:32.889452] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:18.260 [2024-07-26 06:02:33.129581] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.260 [2024-07-26 06:02:33.130428] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.260 [2024-07-26 06:02:33.130492] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.260 [2024-07-26 06:02:33.130890] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.260 [2024-07-26 06:02:33.130941] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.260 [2024-07-26 06:02:33.132852] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.260 [2024-07-26 06:02:33.132911] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.260 [2024-07-26 06:02:33.133310] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.260 [2024-07-26 06:02:33.133354] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.260 [2024-07-26 06:02:33.134293] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.261 [2024-07-26 06:02:33.134351] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.261 [2024-07-26 06:02:33.134752] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.261 [2024-07-26 06:02:33.134816] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:18.261 [2024-07-26 06:02:33.136844] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.261 [2024-07-26 06:02:33.136905] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.261 [2024-07-26 06:02:33.137290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.261 [2024-07-26 06:02:33.137332] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.261 [2024-07-26 06:02:33.137353] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.261 [2024-07-26 06:02:33.137814] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.261 [2024-07-26 06:02:33.138311] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.261 [2024-07-26 06:02:33.138367] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.261 [2024-07-26 06:02:33.138763] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.261 [2024-07-26 06:02:33.138813] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.261 [2024-07-26 06:02:33.138834] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.261 [2024-07-26 06:02:33.139310] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.261 [2024-07-26 06:02:33.140532] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:18.261 [2024-07-26 06:02:33.140945] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.261 [2024-07-26 06:02:33.141000] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.261 [2024-07-26 06:02:33.141391] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.261 [2024-07-26 06:02:33.141850] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.261 [2024-07-26 06:02:33.142008] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.261 [2024-07-26 06:02:33.142407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.261 [2024-07-26 06:02:33.142455] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.261 [2024-07-26 06:02:33.142863] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.261 [2024-07-26 06:02:33.143299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.261 [2024-07-26 06:02:33.144415] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.261 [2024-07-26 06:02:33.144468] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.261 [2024-07-26 06:02:33.144509] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.261 [2024-07-26 06:02:33.144550] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:18.261 [2024-07-26 06:02:33.144946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.261 [2024-07-26 06:02:33.145115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.261 [2024-07-26 06:02:33.145163] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.261 [2024-07-26 06:02:33.145204] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.261 [2024-07-26 06:02:33.145246] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.261 [2024-07-26 06:02:33.145567] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.261 [2024-07-26 06:02:33.146670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.261 [2024-07-26 06:02:33.146724] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.261 [2024-07-26 06:02:33.146780] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.261 [2024-07-26 06:02:33.146836] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.261 [2024-07-26 06:02:33.147190] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.261 [2024-07-26 06:02:33.147345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.261 [2024-07-26 06:02:33.147392] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:18.261 [2024-07-26 06:02:33.147434] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.261 [2024-07-26 06:02:33.147475] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.261 [2024-07-26 06:02:33.147944] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.261 [2024-07-26 06:02:33.148931] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.261 [2024-07-26 06:02:33.148990] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.261 [2024-07-26 06:02:33.149031] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.261 [2024-07-26 06:02:33.149072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.261 [2024-07-26 06:02:33.149533] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.261 [2024-07-26 06:02:33.149696] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.261 [2024-07-26 06:02:33.149745] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.261 [2024-07-26 06:02:33.149786] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.261 [2024-07-26 06:02:33.149827] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.261 [2024-07-26 06:02:33.150100] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:18.261 [2024-07-26 06:02:33.151298] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.261 [2024-07-26 06:02:33.151350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.261 [2024-07-26 06:02:33.151391] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.261 [2024-07-26 06:02:33.151432] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.261 [2024-07-26 06:02:33.151778] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.261 [2024-07-26 06:02:33.151932] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.261 [2024-07-26 06:02:33.151978] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.261 [2024-07-26 06:02:33.152021] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.261 [2024-07-26 06:02:33.152061] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.261 [2024-07-26 06:02:33.152450] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.261 [2024-07-26 06:02:33.153491] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.261 [2024-07-26 06:02:33.153544] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.261 [2024-07-26 06:02:33.153587] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:18.261 [2024-07-26 06:02:33.153629] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.261 [2024-07-26 06:02:33.154040] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.261 [2024-07-26 06:02:33.154203] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.261 [2024-07-26 06:02:33.154263] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.261 [2024-07-26 06:02:33.154319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.261 [2024-07-26 06:02:33.154362] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.261 [2024-07-26 06:02:33.154714] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.262 [2024-07-26 06:02:33.155927] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.262 [2024-07-26 06:02:33.155983] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.262 [2024-07-26 06:02:33.156026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.262 [2024-07-26 06:02:33.156067] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.262 [2024-07-26 06:02:33.156395] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.262 [2024-07-26 06:02:33.156554] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:18.262 [2024-07-26 06:02:33.156600] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.262 [2024-07-26 06:02:33.156649] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.262 [2024-07-26 06:02:33.156690] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.262 [2024-07-26 06:02:33.157061] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.262 [2024-07-26 06:02:33.158130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.262 [2024-07-26 06:02:33.158182] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.262 [2024-07-26 06:02:33.158225] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.262 [2024-07-26 06:02:33.158278] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.262 [2024-07-26 06:02:33.158627] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.262 [2024-07-26 06:02:33.158791] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.262 [2024-07-26 06:02:33.158851] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.262 [2024-07-26 06:02:33.158902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.262 [2024-07-26 06:02:33.158958] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:18.262 [2024-07-26 06:02:33.159329] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.262 [2024-07-26 06:02:33.160579] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.262 [2024-07-26 06:02:33.160667] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.262 [2024-07-26 06:02:33.160724] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.262 [2024-07-26 06:02:33.160777] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.262 [2024-07-26 06:02:33.161169] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.262 [2024-07-26 06:02:33.161330] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.262 [2024-07-26 06:02:33.161389] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.262 [2024-07-26 06:02:33.161431] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.262 [2024-07-26 06:02:33.161472] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.262 [2024-07-26 06:02:33.161875] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.262 [2024-07-26 06:02:33.162780] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.262 [2024-07-26 06:02:33.162832] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:18.262 [2024-07-26 06:02:33.162877] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.262 [2024-07-26 06:02:33.162918] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.523 [2024-07-26 06:02:33.163177] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.523 [2024-07-26 06:02:33.163333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.523 [2024-07-26 06:02:33.163379] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.523 [2024-07-26 06:02:33.163420] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.523 [2024-07-26 06:02:33.163469] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.523 [2024-07-26 06:02:33.163831] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.523 [2024-07-26 06:02:33.164714] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.523 [2024-07-26 06:02:33.164773] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.523 [2024-07-26 06:02:33.164817] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.523 [2024-07-26 06:02:33.164859] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.523 [2024-07-26 06:02:33.165248] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:18.523 [2024-07-26 06:02:33.165403] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.523 [2024-07-26 06:02:33.165451] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.523 [2024-07-26 06:02:33.165493] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.523 [2024-07-26 06:02:33.165549] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.523 [2024-07-26 06:02:33.165999] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.523 [2024-07-26 06:02:33.166933] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.523 [2024-07-26 06:02:33.166986] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.523 [2024-07-26 06:02:33.167027] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.523 [2024-07-26 06:02:33.167075] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.523 [2024-07-26 06:02:33.167439] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.523 [2024-07-26 06:02:33.167604] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.523 [2024-07-26 06:02:33.167660] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.523 [2024-07-26 06:02:33.167702] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:18.523 [2024-07-26 06:02:33.167743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.523 [2024-07-26 06:02:33.168002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.523 [2024-07-26 06:02:33.168964] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.523 [2024-07-26 06:02:33.169017] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.523 [2024-07-26 06:02:33.169075] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.523 [2024-07-26 06:02:33.169117] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.523 [2024-07-26 06:02:33.169514] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.523 [2024-07-26 06:02:33.169681] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.523 [2024-07-26 06:02:33.169735] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.523 [2024-07-26 06:02:33.169776] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.523 [2024-07-26 06:02:33.169816] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.523 [2024-07-26 06:02:33.170078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.523 [2024-07-26 06:02:33.321077] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:18.523 [2024-07-26 06:02:33.322359] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.523 [2024-07-26 06:02:33.323667] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.523 [2024-07-26 06:02:33.324991] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.523 [2024-07-26 06:02:33.327369] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.523 [2024-07-26 06:02:33.328702] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.523 [2024-07-26 06:02:33.330263] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.523 [2024-07-26 06:02:33.331057] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.523 [2024-07-26 06:02:33.331993] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.523 [2024-07-26 06:02:33.333390] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.523 [2024-07-26 06:02:33.334927] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.523 [2024-07-26 06:02:33.336614] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.523 [2024-07-26 06:02:33.339326] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.523 [2024-07-26 06:02:33.340854] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:18.523 [2024-07-26 06:02:33.342335] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
[last message repeated for subsequent allocation attempts through 06:02:33.535672; duplicates omitted]
00:34:18.786 [2024-07-26 06:02:33.525963] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
[last message repeated through 06:02:33.528161; duplicates omitted]
00:34:18.787 [2024-07-26 06:02:33.535714] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.787 [2024-07-26 06:02:33.535976] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.787 [2024-07-26 06:02:33.536126] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.787 [2024-07-26 06:02:33.536171] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.787 [2024-07-26 06:02:33.536220] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.787 [2024-07-26 06:02:33.536262] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.787 [2024-07-26 06:02:33.537912] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.787 [2024-07-26 06:02:33.537969] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.787 [2024-07-26 06:02:33.538011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.787 [2024-07-26 06:02:33.538052] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.787 [2024-07-26 06:02:33.538317] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.787 [2024-07-26 06:02:33.538478] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.787 [2024-07-26 06:02:33.538528] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:18.787 [2024-07-26 06:02:33.538569] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.787 [2024-07-26 06:02:33.538618] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.787 [2024-07-26 06:02:33.539817] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.787 [2024-07-26 06:02:33.539875] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.787 [2024-07-26 06:02:33.539917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.787 [2024-07-26 06:02:33.539971] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.787 [2024-07-26 06:02:33.540235] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.787 [2024-07-26 06:02:33.540395] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.787 [2024-07-26 06:02:33.540441] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.787 [2024-07-26 06:02:33.540483] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.787 [2024-07-26 06:02:33.540525] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.787 [2024-07-26 06:02:33.541975] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.787 [2024-07-26 06:02:33.542035] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:18.787 [2024-07-26 06:02:33.542077] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.787 [2024-07-26 06:02:33.542133] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.787 [2024-07-26 06:02:33.542398] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.787 [2024-07-26 06:02:33.542552] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.787 [2024-07-26 06:02:33.542600] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.787 [2024-07-26 06:02:33.542652] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.787 [2024-07-26 06:02:33.542694] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.787 [2024-07-26 06:02:33.543865] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.787 [2024-07-26 06:02:33.543918] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.787 [2024-07-26 06:02:33.543960] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.787 [2024-07-26 06:02:33.544017] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.787 [2024-07-26 06:02:33.544458] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.787 [2024-07-26 06:02:33.544610] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:18.787 [2024-07-26 06:02:33.544665] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.787 [2024-07-26 06:02:33.544708] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.787 [2024-07-26 06:02:33.544749] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.787 [2024-07-26 06:02:33.545910] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.787 [2024-07-26 06:02:33.546478] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.787 [2024-07-26 06:02:33.546526] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.787 [2024-07-26 06:02:33.547971] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.787 [2024-07-26 06:02:33.548238] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.787 [2024-07-26 06:02:33.548394] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.787 [2024-07-26 06:02:33.550007] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.787 [2024-07-26 06:02:33.550062] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.787 [2024-07-26 06:02:33.550554] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.787 [2024-07-26 06:02:33.552182] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:18.787 [2024-07-26 06:02:33.553658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.787 [2024-07-26 06:02:33.553709] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.787 [2024-07-26 06:02:33.554101] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.787 [2024-07-26 06:02:33.554371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.787 [2024-07-26 06:02:33.554548] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.787 [2024-07-26 06:02:33.556145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.787 [2024-07-26 06:02:33.556193] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.787 [2024-07-26 06:02:33.556578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.787 [2024-07-26 06:02:33.557825] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.787 [2024-07-26 06:02:33.559028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.787 [2024-07-26 06:02:33.559077] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.787 [2024-07-26 06:02:33.560089] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.787 [2024-07-26 06:02:33.560354] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:18.787 [2024-07-26 06:02:33.560508] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.787 [2024-07-26 06:02:33.561295] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.787 [2024-07-26 06:02:33.561344] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.787 [2024-07-26 06:02:33.561736] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.787 [2024-07-26 06:02:33.562993] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.787 [2024-07-26 06:02:33.563394] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.787 [2024-07-26 06:02:33.563444] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.787 [2024-07-26 06:02:33.565061] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.787 [2024-07-26 06:02:33.565331] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.787 [2024-07-26 06:02:33.565487] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.787 [2024-07-26 06:02:33.566663] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.787 [2024-07-26 06:02:33.566712] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.787 [2024-07-26 06:02:33.567248] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:18.787 [2024-07-26 06:02:33.568448] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.787 [2024-07-26 06:02:33.569901] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.787 [2024-07-26 06:02:33.569953] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.787 [2024-07-26 06:02:33.570737] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.787 [2024-07-26 06:02:33.571047] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.787 [2024-07-26 06:02:33.571201] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.787 [2024-07-26 06:02:33.572697] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.787 [2024-07-26 06:02:33.572748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.787 [2024-07-26 06:02:33.573534] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.787 [2024-07-26 06:02:33.574790] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.787 [2024-07-26 06:02:33.575366] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.787 [2024-07-26 06:02:33.575414] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.787 [2024-07-26 06:02:33.576860] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:18.787 [2024-07-26 06:02:33.577127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.787 [2024-07-26 06:02:33.577281] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.787 [2024-07-26 06:02:33.578255] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.787 [2024-07-26 06:02:33.578306] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.787 [2024-07-26 06:02:33.579741] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.787 [2024-07-26 06:02:33.581082] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.787 [2024-07-26 06:02:33.582561] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.787 [2024-07-26 06:02:33.582607] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.787 [2024-07-26 06:02:33.583001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.787 [2024-07-26 06:02:33.583267] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.787 [2024-07-26 06:02:33.583424] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.787 [2024-07-26 06:02:33.584974] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.787 [2024-07-26 06:02:33.585027] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:18.787 [2024-07-26 06:02:33.585717] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.787 [2024-07-26 06:02:33.586970] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.787 [2024-07-26 06:02:33.587632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.787 [2024-07-26 06:02:33.587690] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.787 [2024-07-26 06:02:33.589350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.787 [2024-07-26 06:02:33.589845] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.787 [2024-07-26 06:02:33.590003] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.787 [2024-07-26 06:02:33.591449] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.787 [2024-07-26 06:02:33.591499] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.787 [2024-07-26 06:02:33.593106] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.787 [2024-07-26 06:02:33.594433] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.787 [2024-07-26 06:02:33.595124] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.787 [2024-07-26 06:02:33.595173] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:18.787 [2024-07-26 06:02:33.596409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.787 [2024-07-26 06:02:33.596791] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.787 [2024-07-26 06:02:33.596950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.787 [2024-07-26 06:02:33.597808] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.787 [2024-07-26 06:02:33.597858] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.787 [2024-07-26 06:02:33.598701] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.787 [2024-07-26 06:02:33.599901] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.787 [2024-07-26 06:02:33.601351] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.787 [2024-07-26 06:02:33.601404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.787 [2024-07-26 06:02:33.602398] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.787 [2024-07-26 06:02:33.602673] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.787 [2024-07-26 06:02:33.602831] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.787 [2024-07-26 06:02:33.603285] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:18.787 [2024-07-26 06:02:33.603331] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.787 [2024-07-26 06:02:33.604598] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.787 [2024-07-26 06:02:33.607900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.787 [2024-07-26 06:02:33.609666] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.787 [2024-07-26 06:02:33.609725] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.787 [2024-07-26 06:02:33.610726] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.788 [2024-07-26 06:02:33.610998] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.788 [2024-07-26 06:02:33.611151] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.788 [2024-07-26 06:02:33.612528] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.788 [2024-07-26 06:02:33.612576] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.788 [2024-07-26 06:02:33.612973] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.788 [2024-07-26 06:02:33.614144] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.788 [2024-07-26 06:02:33.615822] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:18.788 [2024-07-26 06:02:33.615868] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.788 [2024-07-26 06:02:33.617551] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.788 [2024-07-26 06:02:33.617960] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.788 [2024-07-26 06:02:33.618116] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.788 [2024-07-26 06:02:33.618776] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.788 [2024-07-26 06:02:33.618826] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.788 [2024-07-26 06:02:33.620515] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.788 [2024-07-26 06:02:33.621852] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.788 [2024-07-26 06:02:33.622583] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.788 [2024-07-26 06:02:33.622632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.788 [2024-07-26 06:02:33.624201] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.788 [2024-07-26 06:02:33.624517] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.788 [2024-07-26 06:02:33.624680] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:18.788 [2024-07-26 06:02:33.626095] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.788 [2024-07-26 06:02:33.626143] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.788 [2024-07-26 06:02:33.627064] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.788 [2024-07-26 06:02:33.630310] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.788 [2024-07-26 06:02:33.631801] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.788 [2024-07-26 06:02:33.631850] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.788 [2024-07-26 06:02:33.633151] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.788 [2024-07-26 06:02:33.633492] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.788 [2024-07-26 06:02:33.633657] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.788 [2024-07-26 06:02:33.635064] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.788 [2024-07-26 06:02:33.635112] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.788 [2024-07-26 06:02:33.636677] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:18.788 [2024-07-26 06:02:33.640917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:18.788 [2024-07-26 06:02:33.642226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
[… same "Failed to get src_mbufs!" error repeated several hundred times, timestamps 06:02:33.642–06:02:33.934 (log time 00:34:18.788–00:34:19.051) …]
00:34:19.051 [2024-07-26 06:02:33.934744] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.051 [2024-07-26 06:02:33.937566] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.051 [2024-07-26 06:02:33.937627] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.051 [2024-07-26 06:02:33.939088] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.051 [2024-07-26 06:02:33.939144] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.051 [2024-07-26 06:02:33.939536] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.051 [2024-07-26 06:02:33.941205] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.051 [2024-07-26 06:02:33.941268] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.051 [2024-07-26 06:02:33.941789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.051 [2024-07-26 06:02:33.941836] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.051 [2024-07-26 06:02:33.944361] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.051 [2024-07-26 06:02:33.944423] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.051 [2024-07-26 06:02:33.945911] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:19.051 [2024-07-26 06:02:33.945956] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.051 [2024-07-26 06:02:33.946303] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.051 [2024-07-26 06:02:33.947587] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.051 [2024-07-26 06:02:33.947646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.051 [2024-07-26 06:02:33.948032] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.051 [2024-07-26 06:02:33.948089] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.051 [2024-07-26 06:02:33.950599] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.051 [2024-07-26 06:02:33.950668] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.051 [2024-07-26 06:02:33.951064] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.051 [2024-07-26 06:02:33.951118] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.051 [2024-07-26 06:02:33.951587] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.051 [2024-07-26 06:02:33.953208] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.051 [2024-07-26 06:02:33.953262] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:19.051 [2024-07-26 06:02:33.954011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.051 [2024-07-26 06:02:33.954058] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.313 [2024-07-26 06:02:33.956453] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.313 [2024-07-26 06:02:33.956513] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.313 [2024-07-26 06:02:33.957943] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.313 [2024-07-26 06:02:33.958001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.313 [2024-07-26 06:02:33.958338] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.313 [2024-07-26 06:02:33.958840] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.313 [2024-07-26 06:02:33.958897] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.313 [2024-07-26 06:02:33.959423] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.313 [2024-07-26 06:02:33.959470] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.313 [2024-07-26 06:02:33.962040] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.313 [2024-07-26 06:02:33.962105] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:19.313 [2024-07-26 06:02:33.963777] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.313 [2024-07-26 06:02:33.963821] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.313 [2024-07-26 06:02:33.964215] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.313 [2024-07-26 06:02:33.965676] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.313 [2024-07-26 06:02:33.965731] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.313 [2024-07-26 06:02:33.966119] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.313 [2024-07-26 06:02:33.966166] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.313 [2024-07-26 06:02:33.968478] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.313 [2024-07-26 06:02:33.968543] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.313 [2024-07-26 06:02:33.968941] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.313 [2024-07-26 06:02:33.969005] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.313 [2024-07-26 06:02:33.969439] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.313 [2024-07-26 06:02:33.971296] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:19.313 [2024-07-26 06:02:33.971356] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.313 [2024-07-26 06:02:33.971922] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.313 [2024-07-26 06:02:33.971968] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.313 [2024-07-26 06:02:33.974384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.313 [2024-07-26 06:02:33.974449] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.313 [2024-07-26 06:02:33.976118] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.313 [2024-07-26 06:02:33.976168] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.313 [2024-07-26 06:02:33.976554] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.313 [2024-07-26 06:02:33.977058] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.313 [2024-07-26 06:02:33.977129] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.313 [2024-07-26 06:02:33.977517] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.313 [2024-07-26 06:02:33.977565] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.313 [2024-07-26 06:02:33.979927] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:19.313 [2024-07-26 06:02:33.979986] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.313 [2024-07-26 06:02:33.981541] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.313 [2024-07-26 06:02:33.981588] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.313 [2024-07-26 06:02:33.982047] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.313 [2024-07-26 06:02:33.983795] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.313 [2024-07-26 06:02:33.983851] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.313 [2024-07-26 06:02:33.984240] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.313 [2024-07-26 06:02:33.984297] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.313 [2024-07-26 06:02:33.986735] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.313 [2024-07-26 06:02:33.986793] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.313 [2024-07-26 06:02:33.987186] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.313 [2024-07-26 06:02:33.987233] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.313 [2024-07-26 06:02:33.987624] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:19.313 [2024-07-26 06:02:33.989319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.313 [2024-07-26 06:02:33.989382] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.313 [2024-07-26 06:02:33.989873] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.313 [2024-07-26 06:02:33.989918] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.313 [2024-07-26 06:02:33.992372] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.313 [2024-07-26 06:02:33.992429] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.313 [2024-07-26 06:02:33.994126] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.313 [2024-07-26 06:02:33.994172] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.313 [2024-07-26 06:02:33.994600] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.313 [2024-07-26 06:02:33.995803] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.313 [2024-07-26 06:02:33.995859] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.313 [2024-07-26 06:02:33.996406] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.313 [2024-07-26 06:02:33.996455] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:19.313 [2024-07-26 06:02:33.999142] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.313 [2024-07-26 06:02:34.000758] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.313 [2024-07-26 06:02:34.002108] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.313 [2024-07-26 06:02:34.002155] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.314 [2024-07-26 06:02:34.002519] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.314 [2024-07-26 06:02:34.004308] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.314 [2024-07-26 06:02:34.005847] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.314 [2024-07-26 06:02:34.006717] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.314 [2024-07-26 06:02:34.006765] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.314 [2024-07-26 06:02:34.010927] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.314 [2024-07-26 06:02:34.010991] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.314 [2024-07-26 06:02:34.011033] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.314 [2024-07-26 06:02:34.011074] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:19.314 [2024-07-26 06:02:34.011391] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.314 [2024-07-26 06:02:34.012725] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.314 [2024-07-26 06:02:34.012781] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.314 [2024-07-26 06:02:34.012822] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.314 [2024-07-26 06:02:34.012863] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.314 [2024-07-26 06:02:34.017829] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.314 [2024-07-26 06:02:34.017900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.314 [2024-07-26 06:02:34.017945] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.314 [2024-07-26 06:02:34.017987] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.314 [2024-07-26 06:02:34.018260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.314 [2024-07-26 06:02:34.018415] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.314 [2024-07-26 06:02:34.018461] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.314 [2024-07-26 06:02:34.018501] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:19.314 [2024-07-26 06:02:34.018542] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.314 [2024-07-26 06:02:34.022105] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.314 [2024-07-26 06:02:34.022171] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.314 [2024-07-26 06:02:34.022212] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.314 [2024-07-26 06:02:34.022253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.314 [2024-07-26 06:02:34.022584] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.314 [2024-07-26 06:02:34.022747] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.314 [2024-07-26 06:02:34.022792] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.314 [2024-07-26 06:02:34.022834] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.314 [2024-07-26 06:02:34.022874] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.314 [2024-07-26 06:02:34.026621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.314 [2024-07-26 06:02:34.026680] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.314 [2024-07-26 06:02:34.026721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:19.314 [2024-07-26 06:02:34.026762] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.314 [2024-07-26 06:02:34.027095] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.314 [2024-07-26 06:02:34.027247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.314 [2024-07-26 06:02:34.027291] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.314 [2024-07-26 06:02:34.027343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.314 [2024-07-26 06:02:34.027386] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.314 [2024-07-26 06:02:34.030387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.314 [2024-07-26 06:02:34.030439] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.314 [2024-07-26 06:02:34.030480] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.314 [2024-07-26 06:02:34.030524] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.314 [2024-07-26 06:02:34.030847] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.314 [2024-07-26 06:02:34.031004] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.314 [2024-07-26 06:02:34.031065] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:19.314 [2024-07-26 06:02:34.031106] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.314 [2024-07-26 06:02:34.031166] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.314 [2024-07-26 06:02:34.033449] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.314 [2024-07-26 06:02:34.033502] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.314 [2024-07-26 06:02:34.033543] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.314 [2024-07-26 06:02:34.033584] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.314 [2024-07-26 06:02:34.033860] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.314 [2024-07-26 06:02:34.034015] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.314 [2024-07-26 06:02:34.034062] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.314 [2024-07-26 06:02:34.034105] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.314 [2024-07-26 06:02:34.034152] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.314 [2024-07-26 06:02:34.037124] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.314 [2024-07-26 06:02:34.037179] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:19.314 [2024-07-26 06:02:34.037220] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.314 [2024-07-26 06:02:34.037261] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.314 [2024-07-26 06:02:34.037533] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.314 [2024-07-26 06:02:34.037692] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.314 [2024-07-26 06:02:34.037740] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.314 [2024-07-26 06:02:34.037781] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.314 [2024-07-26 06:02:34.037822] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.314 [2024-07-26 06:02:34.040430] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.314 [2024-07-26 06:02:34.040483] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.314 [2024-07-26 06:02:34.040540] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.314 [2024-07-26 06:02:34.040581] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.314 [2024-07-26 06:02:34.040856] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.314 [2024-07-26 06:02:34.041011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:19.314 [2024-07-26 06:02:34.041058] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:19.314 ... [identical "Failed to get src_mbufs!" error repeated for every subsequent allocation attempt, timestamps 06:02:34.041104 through 06:02:34.332364; duplicate lines omitted] ... 00:34:19.579
00:34:19.579 [2024-07-26 06:02:34.337649] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.579 [2024-07-26 06:02:34.338054] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.579 [2024-07-26 06:02:34.339250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.579 [2024-07-26 06:02:34.340459] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.579 [2024-07-26 06:02:34.340790] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.579 [2024-07-26 06:02:34.342065] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.579 [2024-07-26 06:02:34.343282] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.579 [2024-07-26 06:02:34.344223] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.579 [2024-07-26 06:02:34.345887] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.579 [2024-07-26 06:02:34.351794] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.579 [2024-07-26 06:02:34.353368] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.579 [2024-07-26 06:02:34.354426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.579 [2024-07-26 06:02:34.355250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:19.579 [2024-07-26 06:02:34.355526] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.579 [2024-07-26 06:02:34.356030] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.580 [2024-07-26 06:02:34.357521] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.580 [2024-07-26 06:02:34.357924] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.580 [2024-07-26 06:02:34.359411] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.580 [2024-07-26 06:02:34.363984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.580 [2024-07-26 06:02:34.364469] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.580 [2024-07-26 06:02:34.365877] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.580 [2024-07-26 06:02:34.365925] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.580 [2024-07-26 06:02:34.366247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.580 [2024-07-26 06:02:34.367601] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.580 [2024-07-26 06:02:34.369325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.580 [2024-07-26 06:02:34.370054] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:19.580 [2024-07-26 06:02:34.370103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.580 [2024-07-26 06:02:34.375152] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.580 [2024-07-26 06:02:34.375223] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.580 [2024-07-26 06:02:34.376436] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.580 [2024-07-26 06:02:34.376481] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.580 [2024-07-26 06:02:34.376840] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.580 [2024-07-26 06:02:34.378008] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.580 [2024-07-26 06:02:34.378065] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.580 [2024-07-26 06:02:34.379719] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.580 [2024-07-26 06:02:34.379773] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.580 [2024-07-26 06:02:34.382535] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.580 [2024-07-26 06:02:34.382605] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.580 [2024-07-26 06:02:34.383987] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:19.580 [2024-07-26 06:02:34.384041] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.580 [2024-07-26 06:02:34.384323] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.580 [2024-07-26 06:02:34.385127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.580 [2024-07-26 06:02:34.385184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.580 [2024-07-26 06:02:34.386170] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.580 [2024-07-26 06:02:34.386227] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.580 [2024-07-26 06:02:34.390759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.580 [2024-07-26 06:02:34.390817] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.580 [2024-07-26 06:02:34.391786] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.580 [2024-07-26 06:02:34.391835] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.580 [2024-07-26 06:02:34.392173] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.580 [2024-07-26 06:02:34.393866] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.580 [2024-07-26 06:02:34.393928] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:19.580 [2024-07-26 06:02:34.395242] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.580 [2024-07-26 06:02:34.395290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.580 [2024-07-26 06:02:34.401429] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.580 [2024-07-26 06:02:34.401496] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.580 [2024-07-26 06:02:34.403123] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.580 [2024-07-26 06:02:34.403176] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.580 [2024-07-26 06:02:34.403539] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.580 [2024-07-26 06:02:34.404798] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.580 [2024-07-26 06:02:34.404855] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.580 [2024-07-26 06:02:34.406025] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.580 [2024-07-26 06:02:34.406076] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.580 [2024-07-26 06:02:34.409130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.580 [2024-07-26 06:02:34.409189] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:19.580 [2024-07-26 06:02:34.409621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.580 [2024-07-26 06:02:34.409676] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.580 [2024-07-26 06:02:34.409948] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.580 [2024-07-26 06:02:34.411625] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.580 [2024-07-26 06:02:34.411693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.580 [2024-07-26 06:02:34.412699] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.580 [2024-07-26 06:02:34.412746] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.580 [2024-07-26 06:02:34.417403] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.580 [2024-07-26 06:02:34.417461] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.580 [2024-07-26 06:02:34.418775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.580 [2024-07-26 06:02:34.418826] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.580 [2024-07-26 06:02:34.419178] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.580 [2024-07-26 06:02:34.420253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:19.580 [2024-07-26 06:02:34.420317] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.580 [2024-07-26 06:02:34.421858] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.580 [2024-07-26 06:02:34.421906] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.580 [2024-07-26 06:02:34.427030] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.580 [2024-07-26 06:02:34.427090] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.580 [2024-07-26 06:02:34.428481] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.580 [2024-07-26 06:02:34.428535] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.580 [2024-07-26 06:02:34.428820] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.580 [2024-07-26 06:02:34.429759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.581 [2024-07-26 06:02:34.429815] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.581 [2024-07-26 06:02:34.430662] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.581 [2024-07-26 06:02:34.430707] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.581 [2024-07-26 06:02:34.435586] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:19.581 [2024-07-26 06:02:34.435658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.581 [2024-07-26 06:02:34.436655] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.581 [2024-07-26 06:02:34.436702] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.581 [2024-07-26 06:02:34.437065] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.581 [2024-07-26 06:02:34.438489] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.581 [2024-07-26 06:02:34.438546] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.581 [2024-07-26 06:02:34.438945] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.581 [2024-07-26 06:02:34.438989] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.581 [2024-07-26 06:02:34.443053] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.581 [2024-07-26 06:02:34.443111] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.581 [2024-07-26 06:02:34.444675] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.581 [2024-07-26 06:02:34.444720] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.581 [2024-07-26 06:02:34.445157] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:19.581 [2024-07-26 06:02:34.446537] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.581 [2024-07-26 06:02:34.446596] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.581 [2024-07-26 06:02:34.447655] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.581 [2024-07-26 06:02:34.447702] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.581 [2024-07-26 06:02:34.452694] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.581 [2024-07-26 06:02:34.452757] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.581 [2024-07-26 06:02:34.454172] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.581 [2024-07-26 06:02:34.454217] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.581 [2024-07-26 06:02:34.454491] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.581 [2024-07-26 06:02:34.455153] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.581 [2024-07-26 06:02:34.455209] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.581 [2024-07-26 06:02:34.456938] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.581 [2024-07-26 06:02:34.456984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:19.581 [2024-07-26 06:02:34.461556] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.581 [2024-07-26 06:02:34.461616] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.581 [2024-07-26 06:02:34.462016] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.581 [2024-07-26 06:02:34.462081] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.581 [2024-07-26 06:02:34.462471] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.581 [2024-07-26 06:02:34.462980] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.581 [2024-07-26 06:02:34.463036] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.581 [2024-07-26 06:02:34.464758] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.581 [2024-07-26 06:02:34.464803] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.581 [2024-07-26 06:02:34.469774] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.581 [2024-07-26 06:02:34.469833] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.581 [2024-07-26 06:02:34.470224] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.581 [2024-07-26 06:02:34.470271] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:19.581 [2024-07-26 06:02:34.470542] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.581 [2024-07-26 06:02:34.471049] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.581 [2024-07-26 06:02:34.471110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.581 [2024-07-26 06:02:34.471505] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.581 [2024-07-26 06:02:34.471560] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.581 [2024-07-26 06:02:34.476882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.581 [2024-07-26 06:02:34.476949] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.581 [2024-07-26 06:02:34.477339] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.581 [2024-07-26 06:02:34.477388] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.581 [2024-07-26 06:02:34.477901] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.581 [2024-07-26 06:02:34.478412] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.581 [2024-07-26 06:02:34.478466] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.581 [2024-07-26 06:02:34.478869] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:19.581 [2024-07-26 06:02:34.478919] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.581 [2024-07-26 06:02:34.481428] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.581 [2024-07-26 06:02:34.481487] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.581 [2024-07-26 06:02:34.481884] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.581 [2024-07-26 06:02:34.481934] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.581 [2024-07-26 06:02:34.482271] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.581 [2024-07-26 06:02:34.482777] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.581 [2024-07-26 06:02:34.482839] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.581 [2024-07-26 06:02:34.483233] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.581 [2024-07-26 06:02:34.483281] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.844 [2024-07-26 06:02:34.485878] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.844 [2024-07-26 06:02:34.485938] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.844 [2024-07-26 06:02:34.486328] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:19.844 [2024-07-26 06:02:34.486377] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.844 [2024-07-26 06:02:34.486767] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.844 [2024-07-26 06:02:34.487263] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.844 [2024-07-26 06:02:34.487317] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.844 [2024-07-26 06:02:34.487715] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.844 [2024-07-26 06:02:34.487768] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.844 [2024-07-26 06:02:34.490459] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.844 [2024-07-26 06:02:34.490518] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.844 [2024-07-26 06:02:34.490918] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.844 [2024-07-26 06:02:34.490983] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.844 [2024-07-26 06:02:34.491477] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.844 [2024-07-26 06:02:34.491986] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.844 [2024-07-26 06:02:34.492044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:19.844 [2024-07-26 06:02:34.492436] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.847 (last message repeated continuously from 06:02:34.492436 through 06:02:34.676872)
00:34:19.847 [2024-07-26 06:02:34.677271] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.847 [2024-07-26 06:02:34.677317] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.847 [2024-07-26 06:02:34.678540] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.847 [2024-07-26 06:02:34.678865] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.847 [2024-07-26 06:02:34.679018] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.847 [2024-07-26 06:02:34.680358] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.847 [2024-07-26 06:02:34.680407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.847 [2024-07-26 06:02:34.681737] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.847 [2024-07-26 06:02:34.682930] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.848 [2024-07-26 06:02:34.684605] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.848 [2024-07-26 06:02:34.684678] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.848 [2024-07-26 06:02:34.686401] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.848 [2024-07-26 06:02:34.686824] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:19.848 [2024-07-26 06:02:34.686979] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.848 [2024-07-26 06:02:34.687373] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.848 [2024-07-26 06:02:34.687418] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.848 [2024-07-26 06:02:34.689070] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.848 [2024-07-26 06:02:34.690373] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.848 [2024-07-26 06:02:34.691228] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.848 [2024-07-26 06:02:34.691279] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.848 [2024-07-26 06:02:34.692586] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.848 [2024-07-26 06:02:34.692866] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.848 [2024-07-26 06:02:34.693019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.848 [2024-07-26 06:02:34.694369] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.848 [2024-07-26 06:02:34.694419] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.848 [2024-07-26 06:02:34.695710] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:19.848 [2024-07-26 06:02:34.697258] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.848 [2024-07-26 06:02:34.698718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.848 [2024-07-26 06:02:34.698766] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.848 [2024-07-26 06:02:34.700248] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.848 [2024-07-26 06:02:34.700575] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.848 [2024-07-26 06:02:34.700737] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.848 [2024-07-26 06:02:34.701649] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.848 [2024-07-26 06:02:34.701699] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.848 [2024-07-26 06:02:34.702998] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.848 [2024-07-26 06:02:34.704199] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.848 [2024-07-26 06:02:34.704599] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.848 [2024-07-26 06:02:34.704667] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.848 [2024-07-26 06:02:34.705049] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:19.848 [2024-07-26 06:02:34.705327] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.848 [2024-07-26 06:02:34.705481] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.848 [2024-07-26 06:02:34.707120] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.848 [2024-07-26 06:02:34.707182] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.848 [2024-07-26 06:02:34.708746] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.848 [2024-07-26 06:02:34.709943] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.848 [2024-07-26 06:02:34.711293] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.848 [2024-07-26 06:02:34.711345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.848 [2024-07-26 06:02:34.712688] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.848 [2024-07-26 06:02:34.712963] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.848 [2024-07-26 06:02:34.713120] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.848 [2024-07-26 06:02:34.713512] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.848 [2024-07-26 06:02:34.713556] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:19.848 [2024-07-26 06:02:34.714188] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.848 [2024-07-26 06:02:34.715342] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.848 [2024-07-26 06:02:34.716977] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.848 [2024-07-26 06:02:34.717045] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.848 [2024-07-26 06:02:34.718248] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.848 [2024-07-26 06:02:34.718574] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.848 [2024-07-26 06:02:34.718733] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.848 [2024-07-26 06:02:34.720071] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.848 [2024-07-26 06:02:34.720122] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.848 [2024-07-26 06:02:34.721462] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.848 [2024-07-26 06:02:34.723088] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.848 [2024-07-26 06:02:34.724408] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.848 [2024-07-26 06:02:34.724460] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:19.848 [2024-07-26 06:02:34.725991] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.848 [2024-07-26 06:02:34.726308] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.848 [2024-07-26 06:02:34.726458] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.848 [2024-07-26 06:02:34.727431] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.848 [2024-07-26 06:02:34.727482] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.848 [2024-07-26 06:02:34.728666] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.848 [2024-07-26 06:02:34.730001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.848 [2024-07-26 06:02:34.730061] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.848 [2024-07-26 06:02:34.730109] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.848 [2024-07-26 06:02:34.731361] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.848 [2024-07-26 06:02:34.731748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.848 [2024-07-26 06:02:34.731904] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.848 [2024-07-26 06:02:34.731956] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:19.848 [2024-07-26 06:02:34.731998] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.848 [2024-07-26 06:02:34.733079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.848 [2024-07-26 06:02:34.734388] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.848 [2024-07-26 06:02:34.735354] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.848 [2024-07-26 06:02:34.736339] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.848 [2024-07-26 06:02:34.737748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.848 [2024-07-26 06:02:34.738062] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.848 [2024-07-26 06:02:34.738212] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.848 [2024-07-26 06:02:34.739206] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.848 [2024-07-26 06:02:34.740372] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.848 [2024-07-26 06:02:34.741953] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.848 [2024-07-26 06:02:34.746974] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:19.848 [2024-07-26 06:02:34.747892] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:19.849 [2024-07-26 06:02:34.748838] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.111 [2024-07-26 06:02:34.750399] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.111 [2024-07-26 06:02:34.750732] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.111 [2024-07-26 06:02:34.751231] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.111 [2024-07-26 06:02:34.752171] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.111 [2024-07-26 06:02:34.753141] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.111 [2024-07-26 06:02:34.754572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.111 [2024-07-26 06:02:34.758532] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.111 [2024-07-26 06:02:34.758942] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.111 [2024-07-26 06:02:34.759607] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.111 [2024-07-26 06:02:34.760825] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.111 [2024-07-26 06:02:34.761107] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.111 [2024-07-26 06:02:34.761862] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:20.111 [2024-07-26 06:02:34.763051] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.111 [2024-07-26 06:02:34.764699] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.111 [2024-07-26 06:02:34.766038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.111 [2024-07-26 06:02:34.767820] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.111 [2024-07-26 06:02:34.769108] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.111 [2024-07-26 06:02:34.770811] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.111 [2024-07-26 06:02:34.771518] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.111 [2024-07-26 06:02:34.771805] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.111 [2024-07-26 06:02:34.773607] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.111 [2024-07-26 06:02:34.774018] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.111 [2024-07-26 06:02:34.774410] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.111 [2024-07-26 06:02:34.775366] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.111 [2024-07-26 06:02:34.777785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:20.111 [2024-07-26 06:02:34.779416] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.111 [2024-07-26 06:02:34.779818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.111 [2024-07-26 06:02:34.780207] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.111 [2024-07-26 06:02:34.780535] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.111 [2024-07-26 06:02:34.781629] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.111 [2024-07-26 06:02:34.782922] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.111 [2024-07-26 06:02:34.784131] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.111 [2024-07-26 06:02:34.785109] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.111 [2024-07-26 06:02:34.787660] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.111 [2024-07-26 06:02:34.788671] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.111 [2024-07-26 06:02:34.789788] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.111 [2024-07-26 06:02:34.791248] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.111 [2024-07-26 06:02:34.791633] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:20.111 [2024-07-26 06:02:34.792723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.111 [2024-07-26 06:02:34.793116] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.111 [2024-07-26 06:02:34.793506] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.111 [2024-07-26 06:02:34.795262] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.111 [2024-07-26 06:02:34.797479] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.111 [2024-07-26 06:02:34.798304] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.111 [2024-07-26 06:02:34.798698] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.111 [2024-07-26 06:02:34.799088] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.111 [2024-07-26 06:02:34.799363] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.111 [2024-07-26 06:02:34.800780] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.111 [2024-07-26 06:02:34.801318] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.111 [2024-07-26 06:02:34.802951] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.111 [2024-07-26 06:02:34.804270] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:20.111 [2024-07-26 06:02:34.807263] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.111 [2024-07-26 06:02:34.808649] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.111 [2024-07-26 06:02:34.809156] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.111 [2024-07-26 06:02:34.810682] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.111 [2024-07-26 06:02:34.811023] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.111 [2024-07-26 06:02:34.811654] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.111 [2024-07-26 06:02:34.812050] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.111 [2024-07-26 06:02:34.812657] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.111 [2024-07-26 06:02:34.813923] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.111 [2024-07-26 06:02:34.816577] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.111 [2024-07-26 06:02:34.816991] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.111 [2024-07-26 06:02:34.817382] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.111 [2024-07-26 06:02:34.817429] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:20.111 [2024-07-26 06:02:34.817766] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.111 [2024-07-26 06:02:34.818853] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.111 [2024-07-26 06:02:34.819703] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.111 [2024-07-26 06:02:34.820717] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.111 [2024-07-26 06:02:34.820769] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.111 [2024-07-26 06:02:34.822818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.111 [2024-07-26 06:02:34.822878] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.111 [2024-07-26 06:02:34.824136] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.111 [2024-07-26 06:02:34.824180] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.111 [2024-07-26 06:02:34.824455] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.111 [2024-07-26 06:02:34.825348] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.111 [2024-07-26 06:02:34.825405] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.111 [2024-07-26 06:02:34.826265] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:20.111 [2024-07-26 06:02:34.826311] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.114 [message repeated through 2024-07-26 06:02:34.971224; subsequent identical occurrences omitted]
00:34:20.114 [2024-07-26 06:02:34.971269] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.114 [2024-07-26 06:02:34.972731] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.114 [2024-07-26 06:02:34.973019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.114 [2024-07-26 06:02:34.973173] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.114 [2024-07-26 06:02:34.973224] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.115 [2024-07-26 06:02:34.973266] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.115 [2024-07-26 06:02:34.974697] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.115 [2024-07-26 06:02:34.975965] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.115 [2024-07-26 06:02:34.977387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.115 [2024-07-26 06:02:34.977435] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.115 [2024-07-26 06:02:34.978860] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.115 [2024-07-26 06:02:34.979181] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.115 [2024-07-26 06:02:34.979337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:20.115 [2024-07-26 06:02:34.980916] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.115 [2024-07-26 06:02:34.980970] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.115 [2024-07-26 06:02:34.981892] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.115 [2024-07-26 06:02:34.983102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.115 [2024-07-26 06:02:34.984384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.115 [2024-07-26 06:02:34.984433] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.115 [2024-07-26 06:02:34.984842] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.115 [2024-07-26 06:02:34.985310] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.115 [2024-07-26 06:02:34.985470] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.115 [2024-07-26 06:02:34.986959] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.115 [2024-07-26 06:02:34.987011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.115 [2024-07-26 06:02:34.988460] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.115 [2024-07-26 06:02:34.989745] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:20.115 [2024-07-26 06:02:34.991056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.115 [2024-07-26 06:02:34.991106] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.115 [2024-07-26 06:02:34.992444] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.115 [2024-07-26 06:02:34.992725] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.115 [2024-07-26 06:02:34.992880] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.115 [2024-07-26 06:02:34.993672] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.115 [2024-07-26 06:02:34.993722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.115 [2024-07-26 06:02:34.994102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.115 [2024-07-26 06:02:34.995287] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.115 [2024-07-26 06:02:34.996916] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.115 [2024-07-26 06:02:34.996966] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.115 [2024-07-26 06:02:34.998282] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.115 [2024-07-26 06:02:34.998561] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:20.115 [2024-07-26 06:02:34.998721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.115 [2024-07-26 06:02:35.000040] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.115 [2024-07-26 06:02:35.000088] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.115 [2024-07-26 06:02:35.001410] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.115 [2024-07-26 06:02:35.002926] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.115 [2024-07-26 06:02:35.003680] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.115 [2024-07-26 06:02:35.003728] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.115 [2024-07-26 06:02:35.005046] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.115 [2024-07-26 06:02:35.005361] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.115 [2024-07-26 06:02:35.005512] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.115 [2024-07-26 06:02:35.007097] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.115 [2024-07-26 06:02:35.007146] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.115 [2024-07-26 06:02:35.007950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:20.115 [2024-07-26 06:02:35.009198] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.115 [2024-07-26 06:02:35.010662] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.115 [2024-07-26 06:02:35.010712] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.115 [2024-07-26 06:02:35.011097] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.115 [2024-07-26 06:02:35.011449] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.115 [2024-07-26 06:02:35.011605] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.115 [2024-07-26 06:02:35.012620] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.115 [2024-07-26 06:02:35.012676] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.115 [2024-07-26 06:02:35.013891] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.115 [2024-07-26 06:02:35.015090] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.377 [2024-07-26 06:02:35.016391] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.377 [2024-07-26 06:02:35.016440] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.377 [2024-07-26 06:02:35.017594] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:20.377 [2024-07-26 06:02:35.017917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.377 [2024-07-26 06:02:35.018069] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.377 [2024-07-26 06:02:35.019658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.377 [2024-07-26 06:02:35.019709] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.377 [2024-07-26 06:02:35.020094] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.377 [2024-07-26 06:02:35.021366] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.377 [2024-07-26 06:02:35.022760] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.377 [2024-07-26 06:02:35.022811] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.377 [2024-07-26 06:02:35.024323] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.377 [2024-07-26 06:02:35.024598] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.377 [2024-07-26 06:02:35.024755] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.377 [2024-07-26 06:02:35.026048] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.377 [2024-07-26 06:02:35.026105] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:20.377 [2024-07-26 06:02:35.027426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.377 [2024-07-26 06:02:35.028637] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.377 [2024-07-26 06:02:35.029046] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.377 [2024-07-26 06:02:35.029090] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.377 [2024-07-26 06:02:35.029798] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.377 [2024-07-26 06:02:35.030074] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.377 [2024-07-26 06:02:35.030225] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.377 [2024-07-26 06:02:35.031572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.377 [2024-07-26 06:02:35.031620] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.377 [2024-07-26 06:02:35.033196] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.377 [2024-07-26 06:02:35.034413] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.378 [2024-07-26 06:02:35.035754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.378 [2024-07-26 06:02:35.035803] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:20.378 [2024-07-26 06:02:35.037369] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.378 [2024-07-26 06:02:35.037750] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.378 [2024-07-26 06:02:35.037906] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.378 [2024-07-26 06:02:35.038297] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.378 [2024-07-26 06:02:35.038342] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.378 [2024-07-26 06:02:35.039410] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.378 [2024-07-26 06:02:35.040627] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.378 [2024-07-26 06:02:35.041781] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.378 [2024-07-26 06:02:35.041829] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.378 [2024-07-26 06:02:35.043483] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.378 [2024-07-26 06:02:35.043793] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.378 [2024-07-26 06:02:35.043954] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.378 [2024-07-26 06:02:35.045494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:20.378 [2024-07-26 06:02:35.045550] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.378 [2024-07-26 06:02:35.047162] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.378 [2024-07-26 06:02:35.048433] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.378 [2024-07-26 06:02:35.049763] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.378 [2024-07-26 06:02:35.049816] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.378 [2024-07-26 06:02:35.051148] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.378 [2024-07-26 06:02:35.051424] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.378 [2024-07-26 06:02:35.051577] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.378 [2024-07-26 06:02:35.052210] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.378 [2024-07-26 06:02:35.052260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.378 [2024-07-26 06:02:35.053574] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.378 [2024-07-26 06:02:35.054724] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.378 [2024-07-26 06:02:35.055125] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:20.378 [2024-07-26 06:02:35.055171] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.378 [2024-07-26 06:02:35.055555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.378 [2024-07-26 06:02:35.055847] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.378 [2024-07-26 06:02:35.056000] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.378 [2024-07-26 06:02:35.057355] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.378 [2024-07-26 06:02:35.057404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.378 [2024-07-26 06:02:35.058795] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.378 [2024-07-26 06:02:35.060035] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.378 [2024-07-26 06:02:35.061708] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.378 [2024-07-26 06:02:35.061756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.378 [2024-07-26 06:02:35.063383] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.378 [2024-07-26 06:02:35.063667] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.378 [2024-07-26 06:02:35.063823] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:20.378 [2024-07-26 06:02:35.064217] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.378 [2024-07-26 06:02:35.064262] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.378 [2024-07-26 06:02:35.064672] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.378 [2024-07-26 06:02:35.065923] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.378 [2024-07-26 06:02:35.067504] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.378 [2024-07-26 06:02:35.067555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.378 [2024-07-26 06:02:35.068161] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.378 [2024-07-26 06:02:35.068438] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.378 [2024-07-26 06:02:35.068596] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.378 [2024-07-26 06:02:35.069941] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.378 [2024-07-26 06:02:35.069990] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.378 [2024-07-26 06:02:35.071548] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.378 [2024-07-26 06:02:35.072880] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:20.378 [2024-07-26 06:02:35.074447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.378 [2024-07-26 06:02:35.074505] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.378 [2024-07-26 06:02:35.075724] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.378 [2024-07-26 06:02:35.076083] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.378 [2024-07-26 06:02:35.076239] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.378 [2024-07-26 06:02:35.077729] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.378 [2024-07-26 06:02:35.077787] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.378 [2024-07-26 06:02:35.079455] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.378 [2024-07-26 06:02:35.080889] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.378 [2024-07-26 06:02:35.081995] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.378 [2024-07-26 06:02:35.082049] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.378 [2024-07-26 06:02:35.083429] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.378 [2024-07-26 06:02:35.083727] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:20.378 [2024-07-26 06:02:35.083881] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.378 [2024-07-26 06:02:35.084872] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.378 [2024-07-26 06:02:35.084923] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.378 [2024-07-26 06:02:35.085859] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.378 [2024-07-26 06:02:35.087422] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.378 [2024-07-26 06:02:35.088566] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.378 [2024-07-26 06:02:35.088617] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.378 [2024-07-26 06:02:35.089155] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.378 [2024-07-26 06:02:35.089428] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.378 [2024-07-26 06:02:35.089582] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.378 [2024-07-26 06:02:35.091131] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.378 [2024-07-26 06:02:35.091188] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.378 [2024-07-26 06:02:35.091580] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:20.378 [2024-07-26 06:02:35.092787] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
[... same *ERROR* line repeated ~270 times for subsequent tasks, 06:02:35.092 through 06:02:35.293 ...]
00:34:20.643 [2024-07-26 06:02:35.293831] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:20.643 [2024-07-26 06:02:35.293881] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.643 [2024-07-26 06:02:35.294212] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.643 [2024-07-26 06:02:35.295567] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.643 [2024-07-26 06:02:35.295643] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.643 [2024-07-26 06:02:35.297158] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.643 [2024-07-26 06:02:35.297206] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.643 [2024-07-26 06:02:35.298713] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.643 [2024-07-26 06:02:35.298771] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.643 [2024-07-26 06:02:35.299161] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.643 [2024-07-26 06:02:35.299210] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.643 [2024-07-26 06:02:35.299484] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.643 [2024-07-26 06:02:35.300548] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.643 [2024-07-26 06:02:35.300604] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:20.643 [2024-07-26 06:02:35.301488] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.643 [2024-07-26 06:02:35.301538] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.643 [2024-07-26 06:02:35.303178] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.643 [2024-07-26 06:02:35.303252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.643 [2024-07-26 06:02:35.303637] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.643 [2024-07-26 06:02:35.303690] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.643 [2024-07-26 06:02:35.303965] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.643 [2024-07-26 06:02:35.305754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.643 [2024-07-26 06:02:35.305814] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.643 [2024-07-26 06:02:35.307527] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.643 [2024-07-26 06:02:35.307575] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.643 [2024-07-26 06:02:35.310356] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.644 [2024-07-26 06:02:35.310427] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:20.644 [2024-07-26 06:02:35.310822] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.644 [2024-07-26 06:02:35.310871] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.644 [2024-07-26 06:02:35.311297] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.644 [2024-07-26 06:02:35.312826] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.644 [2024-07-26 06:02:35.312882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.644 [2024-07-26 06:02:35.314034] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.644 [2024-07-26 06:02:35.314086] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.644 [2024-07-26 06:02:35.316432] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.644 [2024-07-26 06:02:35.316492] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.644 [2024-07-26 06:02:35.316891] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.644 [2024-07-26 06:02:35.316958] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.644 [2024-07-26 06:02:35.317396] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.644 [2024-07-26 06:02:35.318569] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:20.644 [2024-07-26 06:02:35.318625] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.644 [2024-07-26 06:02:35.320187] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.644 [2024-07-26 06:02:35.320232] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.644 [2024-07-26 06:02:35.322142] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.644 [2024-07-26 06:02:35.322201] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.644 [2024-07-26 06:02:35.322593] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.644 [2024-07-26 06:02:35.322650] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.644 [2024-07-26 06:02:35.322940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.644 [2024-07-26 06:02:35.324093] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.644 [2024-07-26 06:02:35.324156] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.644 [2024-07-26 06:02:35.325102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.644 [2024-07-26 06:02:35.325148] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.644 [2024-07-26 06:02:35.326899] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:20.644 [2024-07-26 06:02:35.328449] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.644 [2024-07-26 06:02:35.328851] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.644 [2024-07-26 06:02:35.328908] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.644 [2024-07-26 06:02:35.329184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.644 [2024-07-26 06:02:35.329690] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.644 [2024-07-26 06:02:35.330089] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.644 [2024-07-26 06:02:35.330823] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.644 [2024-07-26 06:02:35.330871] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.644 [2024-07-26 06:02:35.332436] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.644 [2024-07-26 06:02:35.332493] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.644 [2024-07-26 06:02:35.332535] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.644 [2024-07-26 06:02:35.332580] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.644 [2024-07-26 06:02:35.332938] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:20.644 [2024-07-26 06:02:35.334405] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.644 [2024-07-26 06:02:35.334460] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.644 [2024-07-26 06:02:35.334501] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.644 [2024-07-26 06:02:35.334543] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.644 [2024-07-26 06:02:35.335849] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.644 [2024-07-26 06:02:35.335901] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.644 [2024-07-26 06:02:35.335942] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.644 [2024-07-26 06:02:35.335990] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.644 [2024-07-26 06:02:35.336416] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.644 [2024-07-26 06:02:35.336568] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.644 [2024-07-26 06:02:35.336612] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.644 [2024-07-26 06:02:35.336679] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.644 [2024-07-26 06:02:35.336723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:20.644 [2024-07-26 06:02:35.338043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.644 [2024-07-26 06:02:35.338102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.644 [2024-07-26 06:02:35.338145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.644 [2024-07-26 06:02:35.338190] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.644 [2024-07-26 06:02:35.338463] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.644 [2024-07-26 06:02:35.338616] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.644 [2024-07-26 06:02:35.338668] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.644 [2024-07-26 06:02:35.338709] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.644 [2024-07-26 06:02:35.338750] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.644 [2024-07-26 06:02:35.339983] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.644 [2024-07-26 06:02:35.340042] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.644 [2024-07-26 06:02:35.340084] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.644 [2024-07-26 06:02:35.340130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:20.644 [2024-07-26 06:02:35.340402] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.644 [2024-07-26 06:02:35.340553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.644 [2024-07-26 06:02:35.340598] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.644 [2024-07-26 06:02:35.340653] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.644 [2024-07-26 06:02:35.340694] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.644 [2024-07-26 06:02:35.342581] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.644 [2024-07-26 06:02:35.342633] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.644 [2024-07-26 06:02:35.342680] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.644 [2024-07-26 06:02:35.342726] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.644 [2024-07-26 06:02:35.343057] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.644 [2024-07-26 06:02:35.343210] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.644 [2024-07-26 06:02:35.343255] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.644 [2024-07-26 06:02:35.343295] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:20.644 [2024-07-26 06:02:35.343339] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.644 [2024-07-26 06:02:35.344587] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.644 [2024-07-26 06:02:35.344646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.644 [2024-07-26 06:02:35.344688] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.644 [2024-07-26 06:02:35.344728] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.644 [2024-07-26 06:02:35.345043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.644 [2024-07-26 06:02:35.345196] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.645 [2024-07-26 06:02:35.345241] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.645 [2024-07-26 06:02:35.345283] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.645 [2024-07-26 06:02:35.345324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.645 [2024-07-26 06:02:35.346526] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.645 [2024-07-26 06:02:35.346580] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.645 [2024-07-26 06:02:35.346621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:20.645 [2024-07-26 06:02:35.346670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.645 [2024-07-26 06:02:35.346967] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.645 [2024-07-26 06:02:35.347119] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.645 [2024-07-26 06:02:35.347166] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.645 [2024-07-26 06:02:35.347207] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.645 [2024-07-26 06:02:35.347249] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.645 [2024-07-26 06:02:35.348480] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.645 [2024-07-26 06:02:35.348536] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.645 [2024-07-26 06:02:35.348577] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.645 [2024-07-26 06:02:35.348617] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.645 [2024-07-26 06:02:35.348926] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.645 [2024-07-26 06:02:35.349079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.645 [2024-07-26 06:02:35.349131] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:20.645 [2024-07-26 06:02:35.349174] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.645 [2024-07-26 06:02:35.349215] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.645 [2024-07-26 06:02:35.350453] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.645 [2024-07-26 06:02:35.350504] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.645 [2024-07-26 06:02:35.350545] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.645 [2024-07-26 06:02:35.351881] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.645 [2024-07-26 06:02:35.352204] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.645 [2024-07-26 06:02:35.352357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.645 [2024-07-26 06:02:35.352403] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.645 [2024-07-26 06:02:35.352444] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.645 [2024-07-26 06:02:35.352847] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.645 [2024-07-26 06:02:35.354006] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.645 [2024-07-26 06:02:35.355346] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:20.645 [2024-07-26 06:02:35.355396] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.645 [2024-07-26 06:02:35.356518] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.645 [2024-07-26 06:02:35.356800] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.645 [2024-07-26 06:02:35.356955] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.645 [2024-07-26 06:02:35.358665] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.645 [2024-07-26 06:02:35.358721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.645 [2024-07-26 06:02:35.360422] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.645 [2024-07-26 06:02:35.361729] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.645 [2024-07-26 06:02:35.362653] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.645 [2024-07-26 06:02:35.362701] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.645 [2024-07-26 06:02:35.364009] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.645 [2024-07-26 06:02:35.364326] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.645 [2024-07-26 06:02:35.364484] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:20.645 [2024-07-26 06:02:35.365901] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.645 [2024-07-26 06:02:35.365952] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.645 [2024-07-26 06:02:35.366795] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.645 [2024-07-26 06:02:35.367948] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.645 [2024-07-26 06:02:35.369495] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.645 [2024-07-26 06:02:35.369550] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.645 [2024-07-26 06:02:35.369945] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.645 [2024-07-26 06:02:35.370313] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.645 [2024-07-26 06:02:35.370469] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.645 [2024-07-26 06:02:35.371864] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.645 [2024-07-26 06:02:35.371912] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.645 [2024-07-26 06:02:35.373314] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.645 [2024-07-26 06:02:35.374534] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:20.645 [2024-07-26 06:02:35.376142] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:20.910 [2024-07-26 06:02:35.577259] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.910 [2024-07-26 06:02:35.577763] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.910 [2024-07-26 06:02:35.577821] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.910 [2024-07-26 06:02:35.578210] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.910 [2024-07-26 06:02:35.578257] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.910 [2024-07-26 06:02:35.580343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.910 [2024-07-26 06:02:35.580401] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.910 [2024-07-26 06:02:35.581614] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.910 [2024-07-26 06:02:35.581667] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.910 [2024-07-26 06:02:35.582003] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.910 [2024-07-26 06:02:35.582501] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.910 [2024-07-26 06:02:35.582557] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.910 [2024-07-26 06:02:35.582957] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:20.910 [2024-07-26 06:02:35.583007] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.910 [2024-07-26 06:02:35.585683] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.910 [2024-07-26 06:02:35.585743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.910 [2024-07-26 06:02:35.587194] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.910 [2024-07-26 06:02:35.587240] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.910 [2024-07-26 06:02:35.587647] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.910 [2024-07-26 06:02:35.588145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.910 [2024-07-26 06:02:35.588200] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.911 [2024-07-26 06:02:35.588872] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.911 [2024-07-26 06:02:35.588920] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.911 [2024-07-26 06:02:35.591430] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.911 [2024-07-26 06:02:35.591488] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.911 [2024-07-26 06:02:35.592795] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:20.911 [2024-07-26 06:02:35.592840] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.911 [2024-07-26 06:02:35.593310] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.911 [2024-07-26 06:02:35.593813] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.911 [2024-07-26 06:02:35.593869] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.911 [2024-07-26 06:02:35.595228] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.911 [2024-07-26 06:02:35.595277] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.911 [2024-07-26 06:02:35.597098] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.911 [2024-07-26 06:02:35.597164] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.911 [2024-07-26 06:02:35.597551] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.911 [2024-07-26 06:02:35.597597] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.911 [2024-07-26 06:02:35.597880] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.911 [2024-07-26 06:02:35.599276] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.911 [2024-07-26 06:02:35.599332] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:20.911 [2024-07-26 06:02:35.599839] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.911 [2024-07-26 06:02:35.599889] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.911 [2024-07-26 06:02:35.601567] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.911 [2024-07-26 06:02:35.601626] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.911 [2024-07-26 06:02:35.602549] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.911 [2024-07-26 06:02:35.602593] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.911 [2024-07-26 06:02:35.602979] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.911 [2024-07-26 06:02:35.604452] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.911 [2024-07-26 06:02:35.604509] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.911 [2024-07-26 06:02:35.605860] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.911 [2024-07-26 06:02:35.605909] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.911 [2024-07-26 06:02:35.607571] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.911 [2024-07-26 06:02:35.607632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:20.911 [2024-07-26 06:02:35.609209] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.911 [2024-07-26 06:02:35.609261] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.911 [2024-07-26 06:02:35.609592] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.911 [2024-07-26 06:02:35.610422] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.911 [2024-07-26 06:02:35.610481] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.911 [2024-07-26 06:02:35.611971] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.911 [2024-07-26 06:02:35.612028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.911 [2024-07-26 06:02:35.613848] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.911 [2024-07-26 06:02:35.613908] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.911 [2024-07-26 06:02:35.615167] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.911 [2024-07-26 06:02:35.615218] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.911 [2024-07-26 06:02:35.615494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.911 [2024-07-26 06:02:35.616413] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:20.911 [2024-07-26 06:02:35.616469] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.911 [2024-07-26 06:02:35.617451] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.911 [2024-07-26 06:02:35.617500] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.911 [2024-07-26 06:02:35.619947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.911 [2024-07-26 06:02:35.620007] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.911 [2024-07-26 06:02:35.620974] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.911 [2024-07-26 06:02:35.621020] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.911 [2024-07-26 06:02:35.621338] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.911 [2024-07-26 06:02:35.623090] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.911 [2024-07-26 06:02:35.623162] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.911 [2024-07-26 06:02:35.624690] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.911 [2024-07-26 06:02:35.624741] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.911 [2024-07-26 06:02:35.627409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:20.911 [2024-07-26 06:02:35.627485] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.911 [2024-07-26 06:02:35.629016] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.911 [2024-07-26 06:02:35.629069] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.911 [2024-07-26 06:02:35.629348] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.911 [2024-07-26 06:02:35.630413] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.911 [2024-07-26 06:02:35.630468] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.911 [2024-07-26 06:02:35.631559] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.911 [2024-07-26 06:02:35.631607] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.911 [2024-07-26 06:02:35.634058] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.911 [2024-07-26 06:02:35.634116] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.911 [2024-07-26 06:02:35.634846] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.911 [2024-07-26 06:02:35.634912] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.911 [2024-07-26 06:02:35.635187] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:20.911 [2024-07-26 06:02:35.636653] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.911 [2024-07-26 06:02:35.636709] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.911 [2024-07-26 06:02:35.637100] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.911 [2024-07-26 06:02:35.637147] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.911 [2024-07-26 06:02:35.640096] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.911 [2024-07-26 06:02:35.640162] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.911 [2024-07-26 06:02:35.641135] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.911 [2024-07-26 06:02:35.641182] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.911 [2024-07-26 06:02:35.641552] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.911 [2024-07-26 06:02:35.642950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.911 [2024-07-26 06:02:35.643005] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.912 [2024-07-26 06:02:35.643391] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.912 [2024-07-26 06:02:35.643439] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:20.912 [2024-07-26 06:02:35.645819] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.912 [2024-07-26 06:02:35.645879] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.912 [2024-07-26 06:02:35.647552] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.912 [2024-07-26 06:02:35.647596] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.912 [2024-07-26 06:02:35.647943] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.912 [2024-07-26 06:02:35.648551] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.912 [2024-07-26 06:02:35.648614] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.912 [2024-07-26 06:02:35.649011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.912 [2024-07-26 06:02:35.649059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.912 [2024-07-26 06:02:35.651861] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.912 [2024-07-26 06:02:35.651921] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.912 [2024-07-26 06:02:35.653096] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.912 [2024-07-26 06:02:35.653142] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:20.912 [2024-07-26 06:02:35.653515] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.912 [2024-07-26 06:02:35.655205] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.912 [2024-07-26 06:02:35.655261] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.912 [2024-07-26 06:02:35.656010] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.912 [2024-07-26 06:02:35.656061] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.912 [2024-07-26 06:02:35.658937] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.912 [2024-07-26 06:02:35.658995] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.912 [2024-07-26 06:02:35.659397] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.912 [2024-07-26 06:02:35.659445] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.912 [2024-07-26 06:02:35.659723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.912 [2024-07-26 06:02:35.661493] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.912 [2024-07-26 06:02:35.661554] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.912 [2024-07-26 06:02:35.661950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:20.912 [2024-07-26 06:02:35.662002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.912 [2024-07-26 06:02:35.664266] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.912 [2024-07-26 06:02:35.664325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.912 [2024-07-26 06:02:35.665003] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.912 [2024-07-26 06:02:35.665053] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.912 [2024-07-26 06:02:35.665447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.912 [2024-07-26 06:02:35.665954] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.912 [2024-07-26 06:02:35.666011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.912 [2024-07-26 06:02:35.667511] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.912 [2024-07-26 06:02:35.667556] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.912 [2024-07-26 06:02:35.670596] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.912 [2024-07-26 06:02:35.670665] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.912 [2024-07-26 06:02:35.672295] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:20.912 [2024-07-26 06:02:35.672339] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.912 [2024-07-26 06:02:35.672613] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.912 [2024-07-26 06:02:35.673113] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.912 [2024-07-26 06:02:35.673170] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.912 [2024-07-26 06:02:35.673560] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.912 [2024-07-26 06:02:35.673609] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.912 [2024-07-26 06:02:35.675493] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.912 [2024-07-26 06:02:35.675552] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.912 [2024-07-26 06:02:35.675948] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.912 [2024-07-26 06:02:35.675998] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.912 [2024-07-26 06:02:35.676272] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.912 [2024-07-26 06:02:35.677945] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.912 [2024-07-26 06:02:35.678002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:20.912 [2024-07-26 06:02:35.679227] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.912 [2024-07-26 06:02:35.679272] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.912 [2024-07-26 06:02:35.681522] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.912 [2024-07-26 06:02:35.681586] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.912 [2024-07-26 06:02:35.681984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.912 [2024-07-26 06:02:35.682033] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.912 [2024-07-26 06:02:35.682425] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.912 [2024-07-26 06:02:35.682931] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.912 [2024-07-26 06:02:35.682993] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.912 [2024-07-26 06:02:35.683382] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.912 [2024-07-26 06:02:35.683430] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.912 [2024-07-26 06:02:35.686216] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:20.912 [2024-07-26 06:02:35.686276] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:20.912 [2024-07-26 06:02:35.686876] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:20.913 [2024-07-26 06:02:35.719663] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:21.481 
00:34:21.481 Latency(us)
00:34:21.481 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:34:21.481 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:34:21.481 Verification LBA range: start 0x0 length 0x100
00:34:21.481 crypto_ram : 5.83 43.92 2.74 0.00 0.00 2831956.59 65194.07 2669765.68
00:34:21.481 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:34:21.481 Verification LBA range: start 0x100 length 0x100
00:34:21.481 crypto_ram : 5.80 44.15 2.76 0.00 0.00 2811074.11 76135.74 2596821.26
00:34:21.481 Job: crypto_ram2 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:34:21.481 Verification LBA range: start 0x0 length 0x100
00:34:21.481 crypto_ram2 : 5.83 43.91 2.74 0.00 0.00 2734327.54 64738.17 2684354.56
00:34:21.481 Job: crypto_ram2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:34:21.481 Verification LBA range: start 0x100 length 0x100
00:34:21.481 crypto_ram2 : 5.80 44.14 2.76 0.00 0.00 2714643.59 75679.83 2582232.38
00:34:21.481 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:34:21.481 Verification LBA range: start 0x0 length 0x100
00:34:21.481 crypto_ram3 : 5.58 273.98 17.12 0.00 0.00 418195.38 65649.98 598144.22
00:34:21.481 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:34:21.481 Verification LBA range: start 0x100 length 0x100
00:34:21.481 crypto_ram3 : 5.56 284.34 17.77 0.00 0.00 403019.17 12081.42 594497.00
00:34:21.482 Job: crypto_ram4 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:34:21.482 Verification LBA range: start 0x0 length 0x100
00:34:21.482 crypto_ram4 : 5.68 291.04 18.19 0.00 0.00 382573.58 16070.57 536141.47
00:34:21.482 Job: crypto_ram4 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:34:21.482 Verification LBA range: start 0x100 length 0x100
00:34:21.482 crypto_ram4 : 5.66 300.10 18.76 0.00 0.00 370820.41 1189.62 532494.25
00:34:21.482 ===================================================================================================================
00:34:21.482 Total : 1325.57 82.85 0.00 0.00 718394.55 1189.62 2684354.56
00:34:21.741 
00:34:21.741 real 0m9.083s
00:34:21.741 user 0m17.239s
00:34:21.741 sys 0m0.438s
00:34:21.741 06:02:36 blockdev_crypto_aesni.bdev_verify_big_io -- common/autotest_common.sh@1124 -- # xtrace_disable
00:34:21.741 06:02:36 blockdev_crypto_aesni.bdev_verify_big_io -- common/autotest_common.sh@10 -- # set +x
00:34:21.741 ************************************
00:34:21.741 END TEST bdev_verify_big_io
00:34:21.741 ************************************
00:34:21.741 06:02:36 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 0
00:34:21.741 06:02:36 blockdev_crypto_aesni -- bdev/blockdev.sh@778 -- # run_test bdev_write_zeroes /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:34:21.741 06:02:36 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']'
00:34:21.741 06:02:36 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable
00:34:21.741 06:02:36 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x
00:34:22.000 ************************************
00:34:22.000 START TEST bdev_write_zeroes
00:34:22.000 ************************************
00:34:22.000 06:02:36 blockdev_crypto_aesni.bdev_write_zeroes -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:34:22.000 [2024-07-26 06:02:36.747747] Starting SPDK v24.09-pre git sha1
f6e944e96 / DPDK 24.03.0 initialization... 00:34:22.000 [2024-07-26 06:02:36.747812] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1315613 ] 00:34:22.000 [2024-07-26 06:02:36.877246] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:34:22.260 [2024-07-26 06:02:36.982326] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:34:22.260 [2024-07-26 06:02:37.003678] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:34:22.260 [2024-07-26 06:02:37.011695] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:34:22.260 [2024-07-26 06:02:37.019714] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:34:22.260 [2024-07-26 06:02:37.132704] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:34:24.796 [2024-07-26 06:02:39.353269] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:34:24.796 [2024-07-26 06:02:39.353340] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:34:24.796 [2024-07-26 06:02:39.353356] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:24.796 [2024-07-26 06:02:39.361287] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:34:24.796 [2024-07-26 06:02:39.361307] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:34:24.796 [2024-07-26 06:02:39.361319] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:24.796 [2024-07-26 06:02:39.369320] vbdev_crypto_rpc.c: 
115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3"
00:34:24.796 [2024-07-26 06:02:39.369339] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2
00:34:24.796 [2024-07-26 06:02:39.369350] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:34:24.796 [2024-07-26 06:02:39.377329] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4"
00:34:24.796 [2024-07-26 06:02:39.377347] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3
00:34:24.796 [2024-07-26 06:02:39.377362] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:34:24.796 Running I/O for 1 seconds...
00:34:25.734 
00:34:25.734 Latency(us)
00:34:25.734 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:34:25.734 Job: crypto_ram (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:34:25.734 crypto_ram : 1.03 1948.74 7.61 0.00 0.00 65181.79 5470.83 77959.35
00:34:25.734 Job: crypto_ram2 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:34:25.734 crypto_ram2 : 1.03 1962.01 7.66 0.00 0.00 64457.83 5413.84 72488.51
00:34:25.734 Job: crypto_ram3 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:34:25.734 crypto_ram3 : 1.02 14982.69 58.53 0.00 0.00 8416.75 2478.97 10884.67
00:34:25.734 Job: crypto_ram4 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:34:25.734 crypto_ram4 : 1.02 15019.93 58.67 0.00 0.00 8369.95 2478.97 8776.13
00:34:25.734 ===================================================================================================================
00:34:25.734 Total : 33913.38 132.47 0.00 0.00 14928.69 2478.97 77959.35
00:34:26.303 
00:34:26.303 real 0m4.247s
00:34:26.303 user 0m3.823s
00:34:26.303 sys 0m0.375s
00:34:26.303 06:02:40 blockdev_crypto_aesni.bdev_write_zeroes --
common/autotest_common.sh@1124 -- # xtrace_disable 00:34:26.303 06:02:40 blockdev_crypto_aesni.bdev_write_zeroes -- common/autotest_common.sh@10 -- # set +x 00:34:26.303 ************************************ 00:34:26.303 END TEST bdev_write_zeroes 00:34:26.303 ************************************ 00:34:26.303 06:02:40 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 0 00:34:26.303 06:02:40 blockdev_crypto_aesni -- bdev/blockdev.sh@781 -- # run_test bdev_json_nonenclosed /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:34:26.303 06:02:40 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:34:26.303 06:02:40 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable 00:34:26.303 06:02:40 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:34:26.303 ************************************ 00:34:26.303 START TEST bdev_json_nonenclosed 00:34:26.303 ************************************ 00:34:26.303 06:02:41 blockdev_crypto_aesni.bdev_json_nonenclosed -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:34:26.303 [2024-07-26 06:02:41.069590] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 
00:34:26.303 [2024-07-26 06:02:41.069654] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1316159 ] 00:34:26.303 [2024-07-26 06:02:41.196540] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:34:26.562 [2024-07-26 06:02:41.294644] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:34:26.562 [2024-07-26 06:02:41.294716] json_config.c: 608:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: not enclosed in {}. 00:34:26.562 [2024-07-26 06:02:41.294734] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:34:26.562 [2024-07-26 06:02:41.294746] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:34:26.562 00:34:26.562 real 0m0.388s 00:34:26.562 user 0m0.239s 00:34:26.562 sys 0m0.146s 00:34:26.562 06:02:41 blockdev_crypto_aesni.bdev_json_nonenclosed -- common/autotest_common.sh@1123 -- # es=234 00:34:26.562 06:02:41 blockdev_crypto_aesni.bdev_json_nonenclosed -- common/autotest_common.sh@1124 -- # xtrace_disable 00:34:26.562 06:02:41 blockdev_crypto_aesni.bdev_json_nonenclosed -- common/autotest_common.sh@10 -- # set +x 00:34:26.562 ************************************ 00:34:26.562 END TEST bdev_json_nonenclosed 00:34:26.562 ************************************ 00:34:26.562 06:02:41 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 234 00:34:26.562 06:02:41 blockdev_crypto_aesni -- bdev/blockdev.sh@781 -- # true 00:34:26.562 06:02:41 blockdev_crypto_aesni -- bdev/blockdev.sh@784 -- # run_test bdev_json_nonarray /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:34:26.562 06:02:41 blockdev_crypto_aesni -- 
common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:34:26.562 06:02:41 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable 00:34:26.562 06:02:41 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:34:26.562 ************************************ 00:34:26.562 START TEST bdev_json_nonarray 00:34:26.562 ************************************ 00:34:26.563 06:02:41 blockdev_crypto_aesni.bdev_json_nonarray -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:34:26.822 [2024-07-26 06:02:41.519678] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 00:34:26.822 [2024-07-26 06:02:41.519740] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1316189 ] 00:34:26.822 [2024-07-26 06:02:41.647659] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:34:27.081 [2024-07-26 06:02:41.745710] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:34:27.081 [2024-07-26 06:02:41.745785] json_config.c: 614:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: 'subsystems' should be an array. 
00:34:27.081 [2024-07-26 06:02:41.745803] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:34:27.081 [2024-07-26 06:02:41.745816] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:34:27.081 00:34:27.081 real 0m0.389s 00:34:27.081 user 0m0.240s 00:34:27.081 sys 0m0.146s 00:34:27.081 06:02:41 blockdev_crypto_aesni.bdev_json_nonarray -- common/autotest_common.sh@1123 -- # es=234 00:34:27.081 06:02:41 blockdev_crypto_aesni.bdev_json_nonarray -- common/autotest_common.sh@1124 -- # xtrace_disable 00:34:27.081 06:02:41 blockdev_crypto_aesni.bdev_json_nonarray -- common/autotest_common.sh@10 -- # set +x 00:34:27.081 ************************************ 00:34:27.081 END TEST bdev_json_nonarray 00:34:27.081 ************************************ 00:34:27.081 06:02:41 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 234 00:34:27.081 06:02:41 blockdev_crypto_aesni -- bdev/blockdev.sh@784 -- # true 00:34:27.081 06:02:41 blockdev_crypto_aesni -- bdev/blockdev.sh@786 -- # [[ crypto_aesni == bdev ]] 00:34:27.081 06:02:41 blockdev_crypto_aesni -- bdev/blockdev.sh@793 -- # [[ crypto_aesni == gpt ]] 00:34:27.081 06:02:41 blockdev_crypto_aesni -- bdev/blockdev.sh@797 -- # [[ crypto_aesni == crypto_sw ]] 00:34:27.081 06:02:41 blockdev_crypto_aesni -- bdev/blockdev.sh@809 -- # trap - SIGINT SIGTERM EXIT 00:34:27.081 06:02:41 blockdev_crypto_aesni -- bdev/blockdev.sh@810 -- # cleanup 00:34:27.081 06:02:41 blockdev_crypto_aesni -- bdev/blockdev.sh@23 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile 00:34:27.081 06:02:41 blockdev_crypto_aesni -- bdev/blockdev.sh@24 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:34:27.081 06:02:41 blockdev_crypto_aesni -- bdev/blockdev.sh@26 -- # [[ crypto_aesni == rbd ]] 00:34:27.081 06:02:41 blockdev_crypto_aesni -- bdev/blockdev.sh@30 -- # [[ crypto_aesni == daos ]] 00:34:27.081 06:02:41 blockdev_crypto_aesni -- 
bdev/blockdev.sh@34 -- # [[ crypto_aesni = \g\p\t ]] 00:34:27.081 06:02:41 blockdev_crypto_aesni -- bdev/blockdev.sh@40 -- # [[ crypto_aesni == xnvme ]] 00:34:27.081 00:34:27.081 real 1m12.171s 00:34:27.081 user 2m39.867s 00:34:27.081 sys 0m8.963s 00:34:27.081 06:02:41 blockdev_crypto_aesni -- common/autotest_common.sh@1124 -- # xtrace_disable 00:34:27.081 06:02:41 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:34:27.081 ************************************ 00:34:27.081 END TEST blockdev_crypto_aesni 00:34:27.081 ************************************ 00:34:27.081 06:02:41 -- common/autotest_common.sh@1142 -- # return 0 00:34:27.081 06:02:41 -- spdk/autotest.sh@358 -- # run_test blockdev_crypto_sw /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_sw 00:34:27.081 06:02:41 -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:34:27.081 06:02:41 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:34:27.081 06:02:41 -- common/autotest_common.sh@10 -- # set +x 00:34:27.081 ************************************ 00:34:27.081 START TEST blockdev_crypto_sw 00:34:27.081 ************************************ 00:34:27.081 06:02:41 blockdev_crypto_sw -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_sw 00:34:27.341 * Looking for test storage... 
00:34:27.341 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:34:27.341 06:02:42 blockdev_crypto_sw -- bdev/blockdev.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:34:27.341 06:02:42 blockdev_crypto_sw -- bdev/nbd_common.sh@6 -- # set -e 00:34:27.341 06:02:42 blockdev_crypto_sw -- bdev/blockdev.sh@12 -- # rpc_py=rpc_cmd 00:34:27.341 06:02:42 blockdev_crypto_sw -- bdev/blockdev.sh@13 -- # conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:34:27.341 06:02:42 blockdev_crypto_sw -- bdev/blockdev.sh@14 -- # nonenclosed_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json 00:34:27.341 06:02:42 blockdev_crypto_sw -- bdev/blockdev.sh@15 -- # nonarray_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json 00:34:27.341 06:02:42 blockdev_crypto_sw -- bdev/blockdev.sh@17 -- # export RPC_PIPE_TIMEOUT=30 00:34:27.341 06:02:42 blockdev_crypto_sw -- bdev/blockdev.sh@17 -- # RPC_PIPE_TIMEOUT=30 00:34:27.341 06:02:42 blockdev_crypto_sw -- bdev/blockdev.sh@20 -- # : 00:34:27.341 06:02:42 blockdev_crypto_sw -- bdev/blockdev.sh@669 -- # QOS_DEV_1=Malloc_0 00:34:27.341 06:02:42 blockdev_crypto_sw -- bdev/blockdev.sh@670 -- # QOS_DEV_2=Null_1 00:34:27.341 06:02:42 blockdev_crypto_sw -- bdev/blockdev.sh@671 -- # QOS_RUN_TIME=5 00:34:27.341 06:02:42 blockdev_crypto_sw -- bdev/blockdev.sh@673 -- # uname -s 00:34:27.341 06:02:42 blockdev_crypto_sw -- bdev/blockdev.sh@673 -- # '[' Linux = Linux ']' 00:34:27.341 06:02:42 blockdev_crypto_sw -- bdev/blockdev.sh@675 -- # PRE_RESERVED_MEM=0 00:34:27.341 06:02:42 blockdev_crypto_sw -- bdev/blockdev.sh@681 -- # test_type=crypto_sw 00:34:27.341 06:02:42 blockdev_crypto_sw -- bdev/blockdev.sh@682 -- # crypto_device= 00:34:27.341 06:02:42 blockdev_crypto_sw -- bdev/blockdev.sh@683 -- # dek= 00:34:27.341 06:02:42 blockdev_crypto_sw -- bdev/blockdev.sh@684 -- # env_ctx= 00:34:27.341 
06:02:42 blockdev_crypto_sw -- bdev/blockdev.sh@685 -- # wait_for_rpc= 00:34:27.341 06:02:42 blockdev_crypto_sw -- bdev/blockdev.sh@686 -- # '[' -n '' ']' 00:34:27.341 06:02:42 blockdev_crypto_sw -- bdev/blockdev.sh@689 -- # [[ crypto_sw == bdev ]] 00:34:27.341 06:02:42 blockdev_crypto_sw -- bdev/blockdev.sh@689 -- # [[ crypto_sw == crypto_* ]] 00:34:27.341 06:02:42 blockdev_crypto_sw -- bdev/blockdev.sh@690 -- # wait_for_rpc=--wait-for-rpc 00:34:27.341 06:02:42 blockdev_crypto_sw -- bdev/blockdev.sh@692 -- # start_spdk_tgt 00:34:27.341 06:02:42 blockdev_crypto_sw -- bdev/blockdev.sh@47 -- # spdk_tgt_pid=1316417 00:34:27.341 06:02:42 blockdev_crypto_sw -- bdev/blockdev.sh@48 -- # trap 'killprocess "$spdk_tgt_pid"; exit 1' SIGINT SIGTERM EXIT 00:34:27.341 06:02:42 blockdev_crypto_sw -- bdev/blockdev.sh@46 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt '' --wait-for-rpc 00:34:27.341 06:02:42 blockdev_crypto_sw -- bdev/blockdev.sh@49 -- # waitforlisten 1316417 00:34:27.341 06:02:42 blockdev_crypto_sw -- common/autotest_common.sh@829 -- # '[' -z 1316417 ']' 00:34:27.341 06:02:42 blockdev_crypto_sw -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:34:27.341 06:02:42 blockdev_crypto_sw -- common/autotest_common.sh@834 -- # local max_retries=100 00:34:27.341 06:02:42 blockdev_crypto_sw -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:34:27.341 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:34:27.341 06:02:42 blockdev_crypto_sw -- common/autotest_common.sh@838 -- # xtrace_disable 00:34:27.341 06:02:42 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:34:27.341 [2024-07-26 06:02:42.145200] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 
00:34:27.341 [2024-07-26 06:02:42.145259] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1316417 ] 00:34:27.600 [2024-07-26 06:02:42.254955] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:34:27.600 [2024-07-26 06:02:42.352274] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:34:28.170 06:02:43 blockdev_crypto_sw -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:34:28.170 06:02:43 blockdev_crypto_sw -- common/autotest_common.sh@862 -- # return 0 00:34:28.467 06:02:43 blockdev_crypto_sw -- bdev/blockdev.sh@693 -- # case "$test_type" in 00:34:28.467 06:02:43 blockdev_crypto_sw -- bdev/blockdev.sh@710 -- # setup_crypto_sw_conf 00:34:28.467 06:02:43 blockdev_crypto_sw -- bdev/blockdev.sh@192 -- # rpc_cmd 00:34:28.467 06:02:43 blockdev_crypto_sw -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:28.467 06:02:43 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:34:28.467 Malloc0 00:34:28.467 Malloc1 00:34:28.467 true 00:34:28.467 true 00:34:28.467 true 00:34:28.467 [2024-07-26 06:02:43.343649] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:34:28.467 crypto_ram 00:34:28.467 [2024-07-26 06:02:43.351679] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:34:28.467 crypto_ram2 00:34:28.467 [2024-07-26 06:02:43.359702] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:34:28.467 crypto_ram3 00:34:28.467 [ 00:34:28.467 { 00:34:28.467 "name": "Malloc1", 00:34:28.467 "aliases": [ 00:34:28.467 "4fefbf1a-53a1-46b7-9d37-5da12cc55171" 00:34:28.467 ], 00:34:28.467 "product_name": "Malloc disk", 00:34:28.467 "block_size": 4096, 00:34:28.467 "num_blocks": 4096, 00:34:28.467 "uuid": "4fefbf1a-53a1-46b7-9d37-5da12cc55171", 
00:34:28.726 "assigned_rate_limits": { 00:34:28.726 "rw_ios_per_sec": 0, 00:34:28.726 "rw_mbytes_per_sec": 0, 00:34:28.726 "r_mbytes_per_sec": 0, 00:34:28.726 "w_mbytes_per_sec": 0 00:34:28.726 }, 00:34:28.726 "claimed": true, 00:34:28.726 "claim_type": "exclusive_write", 00:34:28.726 "zoned": false, 00:34:28.726 "supported_io_types": { 00:34:28.726 "read": true, 00:34:28.726 "write": true, 00:34:28.726 "unmap": true, 00:34:28.726 "flush": true, 00:34:28.726 "reset": true, 00:34:28.726 "nvme_admin": false, 00:34:28.726 "nvme_io": false, 00:34:28.726 "nvme_io_md": false, 00:34:28.726 "write_zeroes": true, 00:34:28.726 "zcopy": true, 00:34:28.726 "get_zone_info": false, 00:34:28.726 "zone_management": false, 00:34:28.726 "zone_append": false, 00:34:28.726 "compare": false, 00:34:28.726 "compare_and_write": false, 00:34:28.726 "abort": true, 00:34:28.726 "seek_hole": false, 00:34:28.726 "seek_data": false, 00:34:28.726 "copy": true, 00:34:28.726 "nvme_iov_md": false 00:34:28.726 }, 00:34:28.726 "memory_domains": [ 00:34:28.726 { 00:34:28.726 "dma_device_id": "system", 00:34:28.726 "dma_device_type": 1 00:34:28.726 }, 00:34:28.726 { 00:34:28.726 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:34:28.726 "dma_device_type": 2 00:34:28.726 } 00:34:28.726 ], 00:34:28.726 "driver_specific": {} 00:34:28.726 } 00:34:28.726 ] 00:34:28.726 06:02:43 blockdev_crypto_sw -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:28.726 06:02:43 blockdev_crypto_sw -- bdev/blockdev.sh@736 -- # rpc_cmd bdev_wait_for_examine 00:34:28.726 06:02:43 blockdev_crypto_sw -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:28.726 06:02:43 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:34:28.726 06:02:43 blockdev_crypto_sw -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:28.726 06:02:43 blockdev_crypto_sw -- bdev/blockdev.sh@739 -- # cat 00:34:28.726 06:02:43 blockdev_crypto_sw -- bdev/blockdev.sh@739 -- # rpc_cmd save_subsystem_config -n accel 00:34:28.726 06:02:43 
blockdev_crypto_sw -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:28.726 06:02:43 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:34:28.726 06:02:43 blockdev_crypto_sw -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:28.726 06:02:43 blockdev_crypto_sw -- bdev/blockdev.sh@739 -- # rpc_cmd save_subsystem_config -n bdev 00:34:28.726 06:02:43 blockdev_crypto_sw -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:28.726 06:02:43 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:34:28.726 06:02:43 blockdev_crypto_sw -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:28.726 06:02:43 blockdev_crypto_sw -- bdev/blockdev.sh@739 -- # rpc_cmd save_subsystem_config -n iobuf 00:34:28.726 06:02:43 blockdev_crypto_sw -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:28.726 06:02:43 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:34:28.726 06:02:43 blockdev_crypto_sw -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:28.726 06:02:43 blockdev_crypto_sw -- bdev/blockdev.sh@747 -- # mapfile -t bdevs 00:34:28.726 06:02:43 blockdev_crypto_sw -- bdev/blockdev.sh@747 -- # rpc_cmd bdev_get_bdevs 00:34:28.726 06:02:43 blockdev_crypto_sw -- bdev/blockdev.sh@747 -- # jq -r '.[] | select(.claimed == false)' 00:34:28.726 06:02:43 blockdev_crypto_sw -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:28.726 06:02:43 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:34:28.726 06:02:43 blockdev_crypto_sw -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:28.726 06:02:43 blockdev_crypto_sw -- bdev/blockdev.sh@748 -- # mapfile -t bdevs_name 00:34:28.726 06:02:43 blockdev_crypto_sw -- bdev/blockdev.sh@748 -- # jq -r .name 00:34:28.726 06:02:43 blockdev_crypto_sw -- bdev/blockdev.sh@748 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "56ea7e59-6291-5c81-b13b-3523adb89b07"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 32768,' ' 
"uuid": "56ea7e59-6291-5c81-b13b-3523adb89b07",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_sw"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "39fe11cb-f8a7-595f-b3e7-5f1f222d4a75"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 4096,' ' "uuid": "39fe11cb-f8a7-595f-b3e7-5f1f222d4a75",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": 
"crypto_ram2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_sw3"' ' }' ' }' '}' 00:34:28.726 06:02:43 blockdev_crypto_sw -- bdev/blockdev.sh@749 -- # bdev_list=("${bdevs_name[@]}") 00:34:28.726 06:02:43 blockdev_crypto_sw -- bdev/blockdev.sh@751 -- # hello_world_bdev=crypto_ram 00:34:28.726 06:02:43 blockdev_crypto_sw -- bdev/blockdev.sh@752 -- # trap - SIGINT SIGTERM EXIT 00:34:28.726 06:02:43 blockdev_crypto_sw -- bdev/blockdev.sh@753 -- # killprocess 1316417 00:34:28.726 06:02:43 blockdev_crypto_sw -- common/autotest_common.sh@948 -- # '[' -z 1316417 ']' 00:34:28.726 06:02:43 blockdev_crypto_sw -- common/autotest_common.sh@952 -- # kill -0 1316417 00:34:28.726 06:02:43 blockdev_crypto_sw -- common/autotest_common.sh@953 -- # uname 00:34:28.726 06:02:43 blockdev_crypto_sw -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:34:28.726 06:02:43 blockdev_crypto_sw -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1316417 00:34:28.726 06:02:43 blockdev_crypto_sw -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:34:28.726 06:02:43 blockdev_crypto_sw -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:34:28.726 06:02:43 blockdev_crypto_sw -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1316417' 00:34:28.726 killing process with pid 1316417 00:34:28.726 06:02:43 blockdev_crypto_sw -- common/autotest_common.sh@967 -- # kill 1316417 00:34:28.726 06:02:43 blockdev_crypto_sw -- common/autotest_common.sh@972 -- # wait 1316417 00:34:29.294 06:02:43 blockdev_crypto_sw -- bdev/blockdev.sh@757 -- # trap cleanup SIGINT SIGTERM EXIT 00:34:29.294 06:02:43 blockdev_crypto_sw -- bdev/blockdev.sh@759 -- # run_test bdev_hello_world /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:34:29.294 06:02:43 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:34:29.294 
06:02:43 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:34:29.294 06:02:43 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:34:29.294 ************************************ 00:34:29.294 START TEST bdev_hello_world 00:34:29.294 ************************************ 00:34:29.294 06:02:44 blockdev_crypto_sw.bdev_hello_world -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:34:29.294 [2024-07-26 06:02:44.065892] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 00:34:29.294 [2024-07-26 06:02:44.065950] [ DPDK EAL parameters: hello_bdev --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1316618 ] 00:34:29.294 [2024-07-26 06:02:44.195941] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:34:29.553 [2024-07-26 06:02:44.297317] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:34:29.812 [2024-07-26 06:02:44.478932] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:34:29.812 [2024-07-26 06:02:44.478991] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:34:29.812 [2024-07-26 06:02:44.479006] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:29.812 [2024-07-26 06:02:44.486950] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:34:29.812 [2024-07-26 06:02:44.486970] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:34:29.812 [2024-07-26 06:02:44.486982] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 
00:34:29.812 [2024-07-26 06:02:44.494971] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:34:29.812 [2024-07-26 06:02:44.494989] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:34:29.812 [2024-07-26 06:02:44.495001] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:29.812 [2024-07-26 06:02:44.536364] hello_bdev.c: 222:hello_start: *NOTICE*: Successfully started the application 00:34:29.812 [2024-07-26 06:02:44.536402] hello_bdev.c: 231:hello_start: *NOTICE*: Opening the bdev crypto_ram 00:34:29.812 [2024-07-26 06:02:44.536419] hello_bdev.c: 244:hello_start: *NOTICE*: Opening io channel 00:34:29.812 [2024-07-26 06:02:44.537688] hello_bdev.c: 138:hello_write: *NOTICE*: Writing to the bdev 00:34:29.812 [2024-07-26 06:02:44.537756] hello_bdev.c: 117:write_complete: *NOTICE*: bdev io write completed successfully 00:34:29.812 [2024-07-26 06:02:44.537771] hello_bdev.c: 84:hello_read: *NOTICE*: Reading io 00:34:29.812 [2024-07-26 06:02:44.537805] hello_bdev.c: 65:read_complete: *NOTICE*: Read string from bdev : Hello World! 
00:34:29.812 00:34:29.812 [2024-07-26 06:02:44.537822] hello_bdev.c: 74:read_complete: *NOTICE*: Stopping app 00:34:30.072 00:34:30.072 real 0m0.747s 00:34:30.072 user 0m0.496s 00:34:30.072 sys 0m0.233s 00:34:30.072 06:02:44 blockdev_crypto_sw.bdev_hello_world -- common/autotest_common.sh@1124 -- # xtrace_disable 00:34:30.072 06:02:44 blockdev_crypto_sw.bdev_hello_world -- common/autotest_common.sh@10 -- # set +x 00:34:30.072 ************************************ 00:34:30.072 END TEST bdev_hello_world 00:34:30.072 ************************************ 00:34:30.072 06:02:44 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 0 00:34:30.072 06:02:44 blockdev_crypto_sw -- bdev/blockdev.sh@760 -- # run_test bdev_bounds bdev_bounds '' 00:34:30.072 06:02:44 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:34:30.072 06:02:44 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:34:30.072 06:02:44 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:34:30.072 ************************************ 00:34:30.072 START TEST bdev_bounds 00:34:30.072 ************************************ 00:34:30.072 06:02:44 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@1123 -- # bdev_bounds '' 00:34:30.072 06:02:44 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@289 -- # bdevio_pid=1316809 00:34:30.072 06:02:44 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@288 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -w -s 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:34:30.072 06:02:44 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@290 -- # trap 'cleanup; killprocess $bdevio_pid; exit 1' SIGINT SIGTERM EXIT 00:34:30.072 06:02:44 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@291 -- # echo 'Process bdevio pid: 1316809' 00:34:30.072 Process bdevio pid: 1316809 00:34:30.072 06:02:44 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@292 -- 
# waitforlisten 1316809 00:34:30.072 06:02:44 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@829 -- # '[' -z 1316809 ']' 00:34:30.072 06:02:44 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:34:30.072 06:02:44 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@834 -- # local max_retries=100 00:34:30.072 06:02:44 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:34:30.072 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:34:30.072 06:02:44 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@838 -- # xtrace_disable 00:34:30.072 06:02:44 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:34:30.072 [2024-07-26 06:02:44.893843] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 00:34:30.072 [2024-07-26 06:02:44.893904] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1316809 ] 00:34:30.331 [2024-07-26 06:02:45.022852] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:34:30.331 [2024-07-26 06:02:45.128581] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:34:30.331 [2024-07-26 06:02:45.128605] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:34:30.331 [2024-07-26 06:02:45.128610] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:34:30.590 [2024-07-26 06:02:45.294570] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:34:30.590 [2024-07-26 06:02:45.294628] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:34:30.590 [2024-07-26 06:02:45.294649] 
vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:30.590 [2024-07-26 06:02:45.302578] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:34:30.590 [2024-07-26 06:02:45.302597] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:34:30.590 [2024-07-26 06:02:45.302609] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:30.590 [2024-07-26 06:02:45.310602] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:34:30.590 [2024-07-26 06:02:45.310620] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:34:30.590 [2024-07-26 06:02:45.310632] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:31.159 06:02:45 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:34:31.159 06:02:45 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@862 -- # return 0 00:34:31.159 06:02:45 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:34:31.159 I/O targets: 00:34:31.159 crypto_ram: 32768 blocks of 512 bytes (16 MiB) 00:34:31.159 crypto_ram3: 4096 blocks of 4096 bytes (16 MiB) 00:34:31.159 00:34:31.159 00:34:31.159 CUnit - A unit testing framework for C - Version 2.1-3 00:34:31.159 http://cunit.sourceforge.net/ 00:34:31.159 00:34:31.159 00:34:31.159 Suite: bdevio tests on: crypto_ram3 00:34:31.159 Test: blockdev write read block ...passed 00:34:31.159 Test: blockdev write zeroes read block ...passed 00:34:31.159 Test: blockdev write zeroes read no split ...passed 00:34:31.159 Test: blockdev write zeroes read split ...passed 00:34:31.159 Test: blockdev write zeroes read split partial ...passed 00:34:31.159 Test: blockdev reset ...passed 00:34:31.159 
Test: blockdev write read 8 blocks ...passed 00:34:31.159 Test: blockdev write read size > 128k ...passed 00:34:31.159 Test: blockdev write read invalid size ...passed 00:34:31.159 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:34:31.159 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:34:31.159 Test: blockdev write read max offset ...passed 00:34:31.159 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:34:31.159 Test: blockdev writev readv 8 blocks ...passed 00:34:31.159 Test: blockdev writev readv 30 x 1block ...passed 00:34:31.159 Test: blockdev writev readv block ...passed 00:34:31.159 Test: blockdev writev readv size > 128k ...passed 00:34:31.159 Test: blockdev writev readv size > 128k in two iovs ...passed 00:34:31.159 Test: blockdev comparev and writev ...passed 00:34:31.159 Test: blockdev nvme passthru rw ...passed 00:34:31.159 Test: blockdev nvme passthru vendor specific ...passed 00:34:31.159 Test: blockdev nvme admin passthru ...passed 00:34:31.159 Test: blockdev copy ...passed 00:34:31.159 Suite: bdevio tests on: crypto_ram 00:34:31.159 Test: blockdev write read block ...passed 00:34:31.159 Test: blockdev write zeroes read block ...passed 00:34:31.159 Test: blockdev write zeroes read no split ...passed 00:34:31.159 Test: blockdev write zeroes read split ...passed 00:34:31.159 Test: blockdev write zeroes read split partial ...passed 00:34:31.159 Test: blockdev reset ...passed 00:34:31.159 Test: blockdev write read 8 blocks ...passed 00:34:31.159 Test: blockdev write read size > 128k ...passed 00:34:31.159 Test: blockdev write read invalid size ...passed 00:34:31.159 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:34:31.159 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:34:31.159 Test: blockdev write read max offset ...passed 00:34:31.159 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:34:31.159 
Test: blockdev writev readv 8 blocks ...passed 00:34:31.159 Test: blockdev writev readv 30 x 1block ...passed 00:34:31.159 Test: blockdev writev readv block ...passed 00:34:31.159 Test: blockdev writev readv size > 128k ...passed 00:34:31.159 Test: blockdev writev readv size > 128k in two iovs ...passed 00:34:31.159 Test: blockdev comparev and writev ...passed 00:34:31.159 Test: blockdev nvme passthru rw ...passed 00:34:31.159 Test: blockdev nvme passthru vendor specific ...passed 00:34:31.159 Test: blockdev nvme admin passthru ...passed 00:34:31.159 Test: blockdev copy ...passed 00:34:31.159 00:34:31.159 Run Summary: Type Total Ran Passed Failed Inactive 00:34:31.159 suites 2 2 n/a 0 0 00:34:31.159 tests 46 46 46 0 0 00:34:31.159 asserts 260 260 260 0 n/a 00:34:31.159 00:34:31.159 Elapsed time = 0.083 seconds 00:34:31.159 0 00:34:31.159 06:02:46 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@294 -- # killprocess 1316809 00:34:31.159 06:02:46 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@948 -- # '[' -z 1316809 ']' 00:34:31.159 06:02:46 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@952 -- # kill -0 1316809 00:34:31.159 06:02:46 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@953 -- # uname 00:34:31.160 06:02:46 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:34:31.160 06:02:46 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1316809 00:34:31.160 06:02:46 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:34:31.160 06:02:46 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:34:31.160 06:02:46 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1316809' 00:34:31.160 killing process with pid 1316809 00:34:31.160 06:02:46 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@967 -- # kill 
1316809 00:34:31.160 06:02:46 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@972 -- # wait 1316809 00:34:31.419 06:02:46 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@295 -- # trap - SIGINT SIGTERM EXIT 00:34:31.419 00:34:31.419 real 0m1.448s 00:34:31.419 user 0m3.770s 00:34:31.419 sys 0m0.386s 00:34:31.419 06:02:46 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@1124 -- # xtrace_disable 00:34:31.419 06:02:46 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:34:31.419 ************************************ 00:34:31.419 END TEST bdev_bounds 00:34:31.419 ************************************ 00:34:31.678 06:02:46 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 0 00:34:31.678 06:02:46 blockdev_crypto_sw -- bdev/blockdev.sh@761 -- # run_test bdev_nbd nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram3' '' 00:34:31.678 06:02:46 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:34:31.678 06:02:46 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:34:31.678 06:02:46 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:34:31.678 ************************************ 00:34:31.678 START TEST bdev_nbd 00:34:31.678 ************************************ 00:34:31.678 06:02:46 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@1123 -- # nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram3' '' 00:34:31.678 06:02:46 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@299 -- # uname -s 00:34:31.678 06:02:46 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@299 -- # [[ Linux == Linux ]] 00:34:31.678 06:02:46 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@301 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:34:31.678 06:02:46 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@302 -- # local 
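The `killprocess` helper, whose xtrace brackets the bdevio shutdown above, follows a fixed pattern: check the pid is given and alive, read its command name, refuse to signal a `sudo` wrapper, then kill and reap. A simplified re-creation (the real helper in autotest_common.sh has extra `uname`/platform branches; the trailing `|| true` on `wait` is an addition here so the helper returns 0 after a clean SIGTERM):

```shell
#!/usr/bin/env bash
# Simplified sketch of the killprocess pattern from autotest_common.sh.
killprocess() {
    local pid=$1
    [ -n "$pid" ] || return 1                 # the '[' -z "$pid" ']' guard
    kill -0 "$pid" 2>/dev/null || return 1    # kill -0: probe, don't signal
    local process_name
    process_name=$(ps --no-headers -o comm= "$pid")
    [ "$process_name" != "sudo" ] || return 1 # never kill the sudo wrapper
    echo "killing process with pid $pid"
    kill "$pid"
    wait "$pid" 2>/dev/null || true           # reap so no zombie remains
}

# Usage: start a throwaway background job and tear it down.
sleep 60 &
killprocess $!
```

The `kill -0` probe is what lets the log's `'[' -z ... ']'` / `kill -0` pair short-circuit cleanly when the target app already exited.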
conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:34:31.678 06:02:46 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@303 -- # bdev_all=('crypto_ram' 'crypto_ram3') 00:34:31.678 06:02:46 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@303 -- # local bdev_all 00:34:31.678 06:02:46 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@304 -- # local bdev_num=2 00:34:31.678 06:02:46 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@308 -- # [[ -e /sys/module/nbd ]] 00:34:31.678 06:02:46 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@310 -- # nbd_all=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:34:31.678 06:02:46 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@310 -- # local nbd_all 00:34:31.678 06:02:46 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@311 -- # bdev_num=2 00:34:31.678 06:02:46 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@313 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:34:31.678 06:02:46 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@313 -- # local nbd_list 00:34:31.678 06:02:46 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@314 -- # bdev_list=('crypto_ram' 'crypto_ram3') 00:34:31.678 06:02:46 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@314 -- # local bdev_list 00:34:31.678 06:02:46 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@317 -- # nbd_pid=1317017 00:34:31.678 06:02:46 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@318 -- # trap 'cleanup; killprocess $nbd_pid' SIGINT SIGTERM EXIT 00:34:31.679 06:02:46 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@316 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-nbd.sock -i 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:34:31.679 06:02:46 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@319 -- # waitforlisten 1317017 /var/tmp/spdk-nbd.sock 
00:34:31.679 06:02:46 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@829 -- # '[' -z 1317017 ']' 00:34:31.679 06:02:46 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:34:31.679 06:02:46 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@834 -- # local max_retries=100 00:34:31.679 06:02:46 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:34:31.679 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:34:31.679 06:02:46 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@838 -- # xtrace_disable 00:34:31.679 06:02:46 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:34:31.679 [2024-07-26 06:02:46.441690] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 00:34:31.679 [2024-07-26 06:02:46.441744] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:34:31.679 [2024-07-26 06:02:46.554833] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:34:31.938 [2024-07-26 06:02:46.658368] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:34:31.938 [2024-07-26 06:02:46.833038] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:34:31.938 [2024-07-26 06:02:46.833117] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:34:31.938 [2024-07-26 06:02:46.833133] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:31.938 [2024-07-26 06:02:46.841057] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:34:31.938 [2024-07-26 06:02:46.841078] 
bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:34:31.938 [2024-07-26 06:02:46.841090] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:32.197 [2024-07-26 06:02:46.849077] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:34:32.197 [2024-07-26 06:02:46.849096] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:34:32.198 [2024-07-26 06:02:46.849107] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:32.456 06:02:47 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:34:32.456 06:02:47 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@862 -- # return 0 00:34:32.456 06:02:47 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@321 -- # nbd_rpc_start_stop_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram3' 00:34:32.456 06:02:47 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@113 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:34:32.456 06:02:47 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@114 -- # bdev_list=('crypto_ram' 'crypto_ram3') 00:34:32.456 06:02:47 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@114 -- # local bdev_list 00:34:32.456 06:02:47 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@116 -- # nbd_start_disks_without_nbd_idx /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram3' 00:34:32.456 06:02:47 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@22 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:34:32.456 06:02:47 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@23 -- # bdev_list=('crypto_ram' 'crypto_ram3') 00:34:32.456 06:02:47 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@23 -- # local bdev_list 00:34:32.456 06:02:47 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@24 -- # local i 00:34:32.456 06:02:47 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@25 -- # local nbd_device 
00:34:32.456 06:02:47 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i = 0 )) 00:34:32.456 06:02:47 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 2 )) 00:34:32.456 06:02:47 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram 00:34:32.715 06:02:47 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd0 00:34:32.716 06:02:47 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd0 00:34:32.716 06:02:47 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd0 00:34:32.716 06:02:47 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:34:32.716 06:02:47 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:34:32.716 06:02:47 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:34:32.716 06:02:47 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:34:32.716 06:02:47 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:34:32.716 06:02:47 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:34:32.716 06:02:47 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:34:32.716 06:02:47 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:34:32.716 06:02:47 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:34:32.716 1+0 records in 00:34:32.716 1+0 records out 00:34:32.716 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000229746 s, 17.8 MB/s 00:34:32.716 06:02:47 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 
00:34:32.716 06:02:47 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:34:32.716 06:02:47 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:34:32.716 06:02:47 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:34:32.716 06:02:47 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:34:32.716 06:02:47 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:34:32.716 06:02:47 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 2 )) 00:34:32.716 06:02:47 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 00:34:32.975 06:02:47 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd1 00:34:32.975 06:02:47 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd1 00:34:32.975 06:02:47 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd1 00:34:32.975 06:02:47 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:34:32.975 06:02:47 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:34:32.975 06:02:47 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:34:32.975 06:02:47 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:34:32.975 06:02:47 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:34:32.975 06:02:47 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:34:32.975 06:02:47 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:34:32.975 06:02:47 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:34:32.975 06:02:47 
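The `waitfornbd` calls interleaved with the `dd` runs above are a bounded poll: re-check `/proc/partitions` for the device name up to 20 times before giving up. A sketch of that loop follows; the partitions file is parametrized here purely so the helper can be exercised without a real `/dev/nbd` device (the original hard-codes `/proc/partitions`):

```shell
#!/usr/bin/env bash
# Bounded-retry poll modelled on waitfornbd from autotest_common.sh.
# The second argument is a testability addition, not part of the original.
waitfornbd_sketch() {
    local nbd_name=$1
    local partitions_file=${2:-/proc/partitions}
    local i
    for ((i = 1; i <= 20; i++)); do
        # -w: match the device name as a whole word, as in the log's
        # `grep -q -w nbd0 /proc/partitions`.
        if grep -q -w "$nbd_name" "$partitions_file"; then
            return 0                  # device is visible to the kernel
        fi
        sleep 0.1                     # brief back-off between probes
    done
    echo "waitfornbd_sketch: $nbd_name never appeared" >&2
    return 1
}
```

Only after this loop succeeds does the test attempt the one-block `dd ... iflag=direct` read whose `1+0 records in / 1+0 records out` output appears above.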
blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:34:32.975 1+0 records in 00:34:32.975 1+0 records out 00:34:32.975 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000236451 s, 17.3 MB/s 00:34:32.975 06:02:47 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:34:32.975 06:02:47 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:34:32.975 06:02:47 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:34:32.975 06:02:47 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:34:32.975 06:02:47 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:34:32.975 06:02:47 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:34:32.975 06:02:47 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 2 )) 00:34:32.975 06:02:47 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@118 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:34:33.234 06:02:48 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@118 -- # nbd_disks_json='[ 00:34:33.234 { 00:34:33.234 "nbd_device": "/dev/nbd0", 00:34:33.234 "bdev_name": "crypto_ram" 00:34:33.234 }, 00:34:33.234 { 00:34:33.234 "nbd_device": "/dev/nbd1", 00:34:33.234 "bdev_name": "crypto_ram3" 00:34:33.234 } 00:34:33.234 ]' 00:34:33.234 06:02:48 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@119 -- # nbd_disks_name=($(echo "${nbd_disks_json}" | jq -r '.[] | .nbd_device')) 00:34:33.234 06:02:48 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@119 -- # echo '[ 00:34:33.234 { 00:34:33.234 "nbd_device": "/dev/nbd0", 00:34:33.234 "bdev_name": "crypto_ram" 00:34:33.234 }, 
00:34:33.234 { 00:34:33.234 "nbd_device": "/dev/nbd1", 00:34:33.234 "bdev_name": "crypto_ram3" 00:34:33.234 } 00:34:33.234 ]' 00:34:33.234 06:02:48 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@119 -- # jq -r '.[] | .nbd_device' 00:34:33.234 06:02:48 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@120 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:34:33.234 06:02:48 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:34:33.234 06:02:48 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:34:33.234 06:02:48 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:34:33.234 06:02:48 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:34:33.234 06:02:48 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:34:33.234 06:02:48 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:34:33.493 06:02:48 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:34:33.493 06:02:48 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:34:33.493 06:02:48 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:34:33.493 06:02:48 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:34:33.493 06:02:48 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:34:33.493 06:02:48 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:34:33.493 06:02:48 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:34:33.493 06:02:48 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:34:33.493 06:02:48 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:34:33.493 06:02:48 
blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:34:33.752 06:02:48 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:34:33.753 06:02:48 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:34:33.753 06:02:48 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:34:33.753 06:02:48 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:34:33.753 06:02:48 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:34:33.753 06:02:48 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:34:33.753 06:02:48 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:34:33.753 06:02:48 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:34:33.753 06:02:48 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@122 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:34:33.753 06:02:48 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:34:33.753 06:02:48 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:34:34.012 06:02:48 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:34:34.012 06:02:48 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:34:34.012 06:02:48 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:34:34.012 06:02:48 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:34:34.012 06:02:48 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:34:34.012 06:02:48 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:34:34.012 06:02:48 blockdev_crypto_sw.bdev_nbd -- 
bdev/nbd_common.sh@65 -- # true 00:34:34.012 06:02:48 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:34:34.012 06:02:48 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:34:34.012 06:02:48 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@122 -- # count=0 00:34:34.012 06:02:48 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@123 -- # '[' 0 -ne 0 ']' 00:34:34.012 06:02:48 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@127 -- # return 0 00:34:34.012 06:02:48 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@322 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram3' '/dev/nbd0 /dev/nbd1' 00:34:34.012 06:02:48 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:34:34.012 06:02:48 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@91 -- # bdev_list=('crypto_ram' 'crypto_ram3') 00:34:34.012 06:02:48 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@91 -- # local bdev_list 00:34:34.012 06:02:48 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:34:34.012 06:02:48 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@92 -- # local nbd_list 00:34:34.012 06:02:48 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram3' '/dev/nbd0 /dev/nbd1' 00:34:34.012 06:02:48 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:34:34.012 06:02:48 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@10 -- # bdev_list=('crypto_ram' 'crypto_ram3') 00:34:34.012 06:02:48 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@10 -- # local bdev_list 00:34:34.012 06:02:48 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:34:34.012 06:02:48 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@11 -- # local nbd_list 00:34:34.012 06:02:48 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@12 -- # local i 
00:34:34.012 06:02:48 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:34:34.012 06:02:48 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:34:34.012 06:02:48 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram /dev/nbd0 00:34:34.271 /dev/nbd0 00:34:34.272 06:02:49 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:34:34.272 06:02:49 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:34:34.272 06:02:49 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:34:34.272 06:02:49 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:34:34.272 06:02:49 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:34:34.272 06:02:49 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:34:34.272 06:02:49 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:34:34.272 06:02:49 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:34:34.272 06:02:49 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:34:34.272 06:02:49 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:34:34.272 06:02:49 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:34:34.272 1+0 records in 00:34:34.272 1+0 records out 00:34:34.272 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000177087 s, 23.1 MB/s 00:34:34.272 06:02:49 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:34:34.272 06:02:49 blockdev_crypto_sw.bdev_nbd -- 
common/autotest_common.sh@884 -- # size=4096 00:34:34.272 06:02:49 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:34:34.272 06:02:49 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:34:34.272 06:02:49 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:34:34.272 06:02:49 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:34:34.272 06:02:49 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:34:34.272 06:02:49 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 /dev/nbd1 00:34:34.531 /dev/nbd1 00:34:34.531 06:02:49 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:34:34.531 06:02:49 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:34:34.531 06:02:49 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:34:34.531 06:02:49 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:34:34.531 06:02:49 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:34:34.531 06:02:49 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:34:34.531 06:02:49 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:34:34.531 06:02:49 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:34:34.531 06:02:49 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:34:34.531 06:02:49 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:34:34.531 06:02:49 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 
of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:34:34.531 1+0 records in 00:34:34.531 1+0 records out 00:34:34.531 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000332071 s, 12.3 MB/s 00:34:34.531 06:02:49 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:34:34.531 06:02:49 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:34:34.531 06:02:49 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:34:34.531 06:02:49 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:34:34.531 06:02:49 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:34:34.531 06:02:49 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:34:34.531 06:02:49 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:34:34.531 06:02:49 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:34:34.531 06:02:49 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:34:34.531 06:02:49 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:34:34.790 06:02:49 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:34:34.790 { 00:34:34.790 "nbd_device": "/dev/nbd0", 00:34:34.790 "bdev_name": "crypto_ram" 00:34:34.790 }, 00:34:34.790 { 00:34:34.790 "nbd_device": "/dev/nbd1", 00:34:34.790 "bdev_name": "crypto_ram3" 00:34:34.790 } 00:34:34.790 ]' 00:34:34.790 06:02:49 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[ 00:34:34.790 { 00:34:34.790 "nbd_device": "/dev/nbd0", 00:34:34.790 "bdev_name": "crypto_ram" 00:34:34.790 }, 
00:34:34.790 { 00:34:34.790 "nbd_device": "/dev/nbd1", 00:34:34.790 "bdev_name": "crypto_ram3" 00:34:34.790 } 00:34:34.790 ]' 00:34:34.790 06:02:49 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:34:34.790 06:02:49 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:34:34.790 /dev/nbd1' 00:34:34.790 06:02:49 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:34:34.790 /dev/nbd1' 00:34:34.790 06:02:49 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:34:34.790 06:02:49 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=2 00:34:34.790 06:02:49 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 2 00:34:34.790 06:02:49 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@95 -- # count=2 00:34:34.790 06:02:49 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:34:34.790 06:02:49 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:34:34.791 06:02:49 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:34:34.791 06:02:49 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:34:34.791 06:02:49 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=write 00:34:34.791 06:02:49 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:34:34.791 06:02:49 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:34:34.791 06:02:49 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest bs=4096 count=256 00:34:34.791 256+0 records in 00:34:34.791 256+0 records out 00:34:34.791 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0114718 s, 91.4 MB/s 00:34:34.791 06:02:49 
blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:34:34.791 06:02:49 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:34:34.791 256+0 records in 00:34:34.791 256+0 records out 00:34:34.791 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0205751 s, 51.0 MB/s 00:34:34.791 06:02:49 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:34:34.791 06:02:49 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:34:35.050 256+0 records in 00:34:35.050 256+0 records out 00:34:35.050 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0416221 s, 25.2 MB/s 00:34:35.050 06:02:49 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:34:35.050 06:02:49 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:34:35.050 06:02:49 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:34:35.050 06:02:49 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=verify 00:34:35.050 06:02:49 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:34:35.050 06:02:49 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:34:35.050 06:02:49 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:34:35.050 06:02:49 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:34:35.050 06:02:49 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd0 00:34:35.050 06:02:49 
blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:34:35.050 06:02:49 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd1 00:34:35.050 06:02:49 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:34:35.050 06:02:49 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:34:35.050 06:02:49 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:34:35.050 06:02:49 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:34:35.050 06:02:49 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:34:35.050 06:02:49 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:34:35.050 06:02:49 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:34:35.050 06:02:49 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:34:35.050 06:02:49 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:34:35.050 06:02:49 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:34:35.050 06:02:49 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:34:35.050 06:02:49 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:34:35.050 06:02:49 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:34:35.050 06:02:49 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:34:35.050 06:02:49 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:34:35.050 06:02:49 
blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:34:35.050 06:02:49 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:34:35.050 06:02:49 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:34:35.308 06:02:50 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:34:35.308 06:02:50 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:34:35.308 06:02:50 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:34:35.308 06:02:50 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:34:35.308 06:02:50 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:34:35.308 06:02:50 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:34:35.308 06:02:50 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:34:35.308 06:02:50 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:34:35.308 06:02:50 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:34:35.308 06:02:50 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:34:35.308 06:02:50 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:34:35.566 06:02:50 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:34:35.566 06:02:50 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:34:35.567 06:02:50 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:34:35.825 06:02:50 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:34:35.825 06:02:50 
blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:34:35.825 06:02:50 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:34:35.825 06:02:50 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:34:35.825 06:02:50 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:34:35.825 06:02:50 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:34:35.825 06:02:50 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@104 -- # count=0 00:34:35.825 06:02:50 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:34:35.825 06:02:50 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@109 -- # return 0 00:34:35.825 06:02:50 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@323 -- # nbd_with_lvol_verify /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:34:35.825 06:02:50 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@131 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:34:35.825 06:02:50 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@132 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:34:35.825 06:02:50 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@132 -- # local nbd_list 00:34:35.825 06:02:50 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@133 -- # local mkfs_ret 00:34:35.825 06:02:50 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@135 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create -b malloc_lvol_verify 16 512 00:34:35.825 malloc_lvol_verify 00:34:35.825 06:02:50 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@136 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create_lvstore malloc_lvol_verify lvs 00:34:36.084 d925e6a3-d93d-47b7-b1c1-c3e1ba2dc4a7 00:34:36.084 06:02:50 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@137 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create lvol 4 -l lvs 
00:34:36.343 a348f000-e8cc-430f-af45-a6287ac0e675 00:34:36.343 06:02:51 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@138 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk lvs/lvol /dev/nbd0 00:34:36.343 /dev/nbd0 00:34:36.343 06:02:51 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@140 -- # mkfs.ext4 /dev/nbd0 00:34:36.343 mke2fs 1.46.5 (30-Dec-2021) 00:34:36.343 Discarding device blocks: 0/4096 done 00:34:36.343 Creating filesystem with 4096 1k blocks and 1024 inodes 00:34:36.343 00:34:36.343 Allocating group tables: 0/1 done 00:34:36.343 Writing inode tables: 0/1 done 00:34:36.343 Creating journal (1024 blocks): done 00:34:36.343 Writing superblocks and filesystem accounting information: 0/1 done 00:34:36.343 00:34:36.343 06:02:51 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@141 -- # mkfs_ret=0 00:34:36.343 06:02:51 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@142 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock /dev/nbd0 00:34:36.343 06:02:51 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:34:36.343 06:02:51 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:34:36.343 06:02:51 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:34:36.343 06:02:51 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:34:36.343 06:02:51 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:34:36.343 06:02:51 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:34:36.602 06:02:51 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:34:36.602 06:02:51 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:34:36.602 06:02:51 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@35 -- # local 
nbd_name=nbd0 00:34:36.602 06:02:51 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:34:36.602 06:02:51 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:34:36.602 06:02:51 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:34:36.602 06:02:51 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:34:36.602 06:02:51 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:34:36.602 06:02:51 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@143 -- # '[' 0 -ne 0 ']' 00:34:36.602 06:02:51 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@147 -- # return 0 00:34:36.602 06:02:51 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@325 -- # killprocess 1317017 00:34:36.602 06:02:51 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@948 -- # '[' -z 1317017 ']' 00:34:36.602 06:02:51 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@952 -- # kill -0 1317017 00:34:36.602 06:02:51 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@953 -- # uname 00:34:36.602 06:02:51 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:34:36.602 06:02:51 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1317017 00:34:36.602 06:02:51 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:34:36.602 06:02:51 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:34:36.602 06:02:51 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1317017' 00:34:36.602 killing process with pid 1317017 00:34:36.602 06:02:51 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@967 -- # kill 1317017 00:34:36.602 06:02:51 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@972 -- # wait 1317017 00:34:37.171 06:02:51 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@326 -- # trap - 
SIGINT SIGTERM EXIT 00:34:37.171 00:34:37.171 real 0m5.436s 00:34:37.171 user 0m7.495s 00:34:37.171 sys 0m2.210s 00:34:37.171 06:02:51 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@1124 -- # xtrace_disable 00:34:37.171 06:02:51 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:34:37.171 ************************************ 00:34:37.171 END TEST bdev_nbd 00:34:37.171 ************************************ 00:34:37.171 06:02:51 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 0 00:34:37.171 06:02:51 blockdev_crypto_sw -- bdev/blockdev.sh@762 -- # [[ y == y ]] 00:34:37.171 06:02:51 blockdev_crypto_sw -- bdev/blockdev.sh@763 -- # '[' crypto_sw = nvme ']' 00:34:37.171 06:02:51 blockdev_crypto_sw -- bdev/blockdev.sh@763 -- # '[' crypto_sw = gpt ']' 00:34:37.171 06:02:51 blockdev_crypto_sw -- bdev/blockdev.sh@767 -- # run_test bdev_fio fio_test_suite '' 00:34:37.171 06:02:51 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:34:37.171 06:02:51 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:34:37.171 06:02:51 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:34:37.171 ************************************ 00:34:37.171 START TEST bdev_fio 00:34:37.171 ************************************ 00:34:37.171 06:02:51 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1123 -- # fio_test_suite '' 00:34:37.171 06:02:51 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@330 -- # local env_context 00:34:37.171 06:02:51 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@334 -- # pushd /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:34:37.171 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev /var/jenkins/workspace/crypto-phy-autotest/spdk 00:34:37.171 06:02:51 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@335 -- # trap 'rm -f ./*.state; popd; exit 1' SIGINT SIGTERM EXIT 00:34:37.171 06:02:51 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@338 -- # 
echo '' 00:34:37.171 06:02:51 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@338 -- # sed s/--env-context=// 00:34:37.171 06:02:51 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@338 -- # env_context= 00:34:37.171 06:02:51 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@339 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio verify AIO '' 00:34:37.171 06:02:51 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:34:37.171 06:02:51 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=verify 00:34:37.171 06:02:51 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type=AIO 00:34:37.171 06:02:51 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:34:37.171 06:02:51 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:34:37.171 06:02:51 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:34:37.171 06:02:51 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z verify ']' 00:34:37.171 06:02:51 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:34:37.171 06:02:51 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:34:37.171 06:02:51 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:34:37.171 06:02:51 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1313 -- # '[' verify == verify ']' 00:34:37.171 06:02:51 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1314 -- # cat 00:34:37.171 06:02:51 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1323 -- # '[' AIO == AIO ']' 00:34:37.171 06:02:51 blockdev_crypto_sw.bdev_fio -- 
common/autotest_common.sh@1324 -- # /usr/src/fio/fio --version 00:34:37.171 06:02:51 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1324 -- # [[ fio-3.35 == *\f\i\o\-\3* ]] 00:34:37.171 06:02:51 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1325 -- # echo serialize_overlap=1 00:34:37.171 06:02:51 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:34:37.171 06:02:51 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_crypto_ram]' 00:34:37.171 06:02:51 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=crypto_ram 00:34:37.171 06:02:51 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:34:37.171 06:02:51 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_crypto_ram3]' 00:34:37.171 06:02:51 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=crypto_ram3 00:34:37.171 06:02:51 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@346 -- # local 'fio_params=--ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json' 00:34:37.171 06:02:51 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@348 -- # run_test bdev_fio_rw_verify fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:34:37.171 06:02:51 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:34:37.171 06:02:51 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1105 -- # xtrace_disable 00:34:37.171 06:02:51 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:34:37.171 
************************************ 00:34:37.171 START TEST bdev_fio_rw_verify 00:34:37.171 ************************************ 00:34:37.171 06:02:51 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1123 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:34:37.171 06:02:51 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:34:37.171 06:02:51 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:34:37.171 06:02:51 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:34:37.171 06:02:51 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # local sanitizers 00:34:37.171 06:02:51 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:34:37.171 06:02:51 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # shift 00:34:37.171 06:02:52 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1343 -- # local asan_lib= 00:34:37.171 06:02:52 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # 
for sanitizer in "${sanitizers[@]}" 00:34:37.171 06:02:52 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:34:37.171 06:02:52 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libasan 00:34:37.171 06:02:52 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:34:37.171 06:02:52 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:34:37.171 06:02:52 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:34:37.171 06:02:52 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:34:37.171 06:02:52 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:34:37.171 06:02:52 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:34:37.171 06:02:52 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:34:37.429 06:02:52 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:34:37.429 06:02:52 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:34:37.429 06:02:52 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:34:37.429 06:02:52 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 
--spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:34:37.687 job_crypto_ram: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:34:37.687 job_crypto_ram3: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:34:37.687 fio-3.35 00:34:37.687 Starting 2 threads 00:34:49.880 00:34:49.880 job_crypto_ram: (groupid=0, jobs=2): err= 0: pid=1317992: Fri Jul 26 06:03:02 2024 00:34:49.880 read: IOPS=21.7k, BW=84.9MiB/s (89.0MB/s)(849MiB/10001msec) 00:34:49.880 slat (usec): min=14, max=263, avg=20.16, stdev= 3.62 00:34:49.881 clat (usec): min=7, max=548, avg=146.31, stdev=58.00 00:34:49.881 lat (usec): min=26, max=573, avg=166.47, stdev=59.39 00:34:49.881 clat percentiles (usec): 00:34:49.881 | 50.000th=[ 143], 99.000th=[ 277], 99.900th=[ 297], 99.990th=[ 343], 00:34:49.881 | 99.999th=[ 510] 00:34:49.881 write: IOPS=26.1k, BW=102MiB/s (107MB/s)(967MiB/9479msec); 0 zone resets 00:34:49.881 slat (usec): min=14, max=433, avg=33.91, stdev= 4.35 00:34:49.881 clat (usec): min=13, max=877, avg=196.44, stdev=89.56 00:34:49.881 lat (usec): min=39, max=962, avg=230.35, stdev=91.12 00:34:49.881 clat percentiles (usec): 00:34:49.881 | 50.000th=[ 192], 99.000th=[ 388], 99.900th=[ 408], 99.990th=[ 660], 00:34:49.881 | 99.999th=[ 857] 00:34:49.881 bw ( KiB/s): min=92688, max=105424, per=94.86%, avg=99127.16, stdev=1792.50, samples=38 00:34:49.881 iops : min=23172, max=26356, avg=24781.79, stdev=448.13, samples=38 00:34:49.881 lat (usec) : 10=0.01%, 20=0.01%, 50=4.34%, 100=14.90%, 250=63.16% 00:34:49.881 lat (usec) : 500=17.58%, 750=0.01%, 1000=0.01% 00:34:49.881 cpu : usr=99.60%, sys=0.01%, ctx=43, majf=0, minf=469 00:34:49.881 IO depths : 1=12.5%, 2=25.0%, 4=50.0%, 8=12.5%, 16=0.0%, 32=0.0%, >=64=0.0% 00:34:49.881 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 
32=0.0%, 64=0.0%, >=64=0.0% 00:34:49.881 complete : 0=0.0%, 4=88.9%, 8=11.1%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:34:49.881 issued rwts: total=217389,247636,0,0 short=0,0,0,0 dropped=0,0,0,0 00:34:49.881 latency : target=0, window=0, percentile=100.00%, depth=8 00:34:49.881 00:34:49.881 Run status group 0 (all jobs): 00:34:49.881 READ: bw=84.9MiB/s (89.0MB/s), 84.9MiB/s-84.9MiB/s (89.0MB/s-89.0MB/s), io=849MiB (890MB), run=10001-10001msec 00:34:49.881 WRITE: bw=102MiB/s (107MB/s), 102MiB/s-102MiB/s (107MB/s-107MB/s), io=967MiB (1014MB), run=9479-9479msec 00:34:49.881 00:34:49.881 real 0m11.095s 00:34:49.881 user 0m23.498s 00:34:49.881 sys 0m0.338s 00:34:49.881 06:03:03 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:34:49.881 06:03:03 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@10 -- # set +x 00:34:49.881 ************************************ 00:34:49.881 END TEST bdev_fio_rw_verify 00:34:49.881 ************************************ 00:34:49.881 06:03:03 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1142 -- # return 0 00:34:49.881 06:03:03 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@349 -- # rm -f 00:34:49.881 06:03:03 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@350 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:34:49.881 06:03:03 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@353 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio trim '' '' 00:34:49.881 06:03:03 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:34:49.881 06:03:03 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=trim 00:34:49.881 06:03:03 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type= 00:34:49.881 06:03:03 blockdev_crypto_sw.bdev_fio -- 
common/autotest_common.sh@1283 -- # local env_context= 00:34:49.881 06:03:03 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:34:49.881 06:03:03 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:34:49.881 06:03:03 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z trim ']' 00:34:49.881 06:03:03 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:34:49.881 06:03:03 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:34:49.881 06:03:03 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:34:49.881 06:03:03 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1313 -- # '[' trim == verify ']' 00:34:49.881 06:03:03 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1328 -- # '[' trim == trim ']' 00:34:49.881 06:03:03 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1329 -- # echo rw=trimwrite 00:34:49.881 06:03:03 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@354 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "56ea7e59-6291-5c81-b13b-3523adb89b07"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "56ea7e59-6291-5c81-b13b-3523adb89b07",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' 
' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_sw"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "39fe11cb-f8a7-595f-b3e7-5f1f222d4a75"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 4096,' ' "uuid": "39fe11cb-f8a7-595f-b3e7-5f1f222d4a75",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "crypto_ram2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_sw3"' ' }' ' }' '}' 00:34:49.881 06:03:03 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@354 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:34:49.881 06:03:03 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@354 -- # [[ -n crypto_ram 00:34:49.881 crypto_ram3 ]] 00:34:49.881 06:03:03 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@355 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:34:49.882 06:03:03 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@355 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": 
[' ' "56ea7e59-6291-5c81-b13b-3523adb89b07"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "56ea7e59-6291-5c81-b13b-3523adb89b07",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_sw"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "39fe11cb-f8a7-595f-b3e7-5f1f222d4a75"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 4096,' ' "uuid": "39fe11cb-f8a7-595f-b3e7-5f1f222d4a75",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "crypto_ram2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_sw3"' ' }' ' }' '}' 00:34:49.882 06:03:03 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:34:49.882 06:03:03 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_crypto_ram]' 00:34:49.882 06:03:03 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=crypto_ram 00:34:49.882 06:03:03 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:34:49.882 06:03:03 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_crypto_ram3]' 00:34:49.882 06:03:03 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=crypto_ram3 00:34:49.882 06:03:03 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@366 -- # run_test bdev_fio_trim fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:34:49.882 06:03:03 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:34:49.882 06:03:03 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1105 -- # xtrace_disable 00:34:49.882 06:03:03 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:34:49.882 ************************************ 00:34:49.882 START TEST bdev_fio_trim 00:34:49.882 ************************************ 00:34:49.882 06:03:03 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1123 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 
--bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:34:49.882 06:03:03 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:34:49.882 06:03:03 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:34:49.882 06:03:03 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:34:49.882 06:03:03 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # local sanitizers 00:34:49.882 06:03:03 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:34:49.882 06:03:03 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # shift 00:34:49.882 06:03:03 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1343 -- # local asan_lib= 00:34:49.882 06:03:03 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:34:49.882 06:03:03 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:34:49.882 06:03:03 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- 
common/autotest_common.sh@1345 -- # grep libasan 00:34:49.882 06:03:03 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:34:49.882 06:03:03 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:34:49.882 06:03:03 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:34:49.882 06:03:03 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:34:49.882 06:03:03 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:34:49.882 06:03:03 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:34:49.882 06:03:03 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:34:49.882 06:03:03 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:34:49.882 06:03:03 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:34:49.882 06:03:03 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:34:49.882 06:03:03 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:34:49.882 job_crypto_ram: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:34:49.882 job_crypto_ram3: (g=0): rw=trimwrite, bs=(R) 
4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:34:49.882 fio-3.35 00:34:49.882 Starting 2 threads 00:34:59.892 00:34:59.892 job_crypto_ram: (groupid=0, jobs=2): err= 0: pid=1319708: Fri Jul 26 06:03:14 2024 00:34:59.892 write: IOPS=47.0k, BW=184MiB/s (193MB/s)(1838MiB/10001msec); 0 zone resets 00:34:59.892 slat (usec): min=9, max=2104, avg=18.13, stdev= 7.15 00:34:59.892 clat (usec): min=24, max=2428, avg=139.39, stdev=103.37 00:34:59.892 lat (usec): min=34, max=2452, avg=157.52, stdev=109.21 00:34:59.892 clat percentiles (usec): 00:34:59.892 | 50.000th=[ 90], 99.000th=[ 338], 99.900th=[ 355], 99.990th=[ 461], 00:34:59.892 | 99.999th=[ 725] 00:34:59.892 bw ( KiB/s): min=183504, max=190936, per=100.00%, avg=188473.68, stdev=1313.00, samples=38 00:34:59.892 iops : min=45876, max=47734, avg=47118.42, stdev=328.25, samples=38 00:34:59.892 trim: IOPS=47.0k, BW=184MiB/s (193MB/s)(1838MiB/10001msec); 0 zone resets 00:34:59.892 slat (nsec): min=4072, max=97626, avg=9174.62, stdev=3255.73 00:34:59.892 clat (usec): min=34, max=596, avg=92.29, stdev=30.49 00:34:59.892 lat (usec): min=38, max=602, avg=101.47, stdev=31.93 00:34:59.892 clat percentiles (usec): 00:34:59.892 | 50.000th=[ 91], 99.000th=[ 161], 99.900th=[ 174], 99.990th=[ 247], 00:34:59.892 | 99.999th=[ 510] 00:34:59.892 bw ( KiB/s): min=183528, max=190936, per=100.00%, avg=188474.95, stdev=1312.78, samples=38 00:34:59.892 iops : min=45882, max=47734, avg=47118.74, stdev=328.20, samples=38 00:34:59.892 lat (usec) : 50=12.57%, 100=46.33%, 250=27.98%, 500=13.11%, 750=0.01% 00:34:59.892 lat (usec) : 1000=0.01% 00:34:59.892 lat (msec) : 4=0.01% 00:34:59.892 cpu : usr=99.59%, sys=0.01%, ctx=28, majf=0, minf=335 00:34:59.892 IO depths : 1=8.4%, 2=18.9%, 4=58.1%, 8=14.5%, 16=0.0%, 32=0.0%, >=64=0.0% 00:34:59.892 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:34:59.892 complete : 0=0.0%, 4=87.3%, 8=12.7%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:34:59.892 issued 
rwts: total=0,470464,470464,0 short=0,0,0,0 dropped=0,0,0,0 00:34:59.892 latency : target=0, window=0, percentile=100.00%, depth=8 00:34:59.892 00:34:59.892 Run status group 0 (all jobs): 00:34:59.892 WRITE: bw=184MiB/s (193MB/s), 184MiB/s-184MiB/s (193MB/s-193MB/s), io=1838MiB (1927MB), run=10001-10001msec 00:34:59.892 TRIM: bw=184MiB/s (193MB/s), 184MiB/s-184MiB/s (193MB/s-193MB/s), io=1838MiB (1927MB), run=10001-10001msec 00:34:59.892 00:34:59.892 real 0m11.142s 00:34:59.892 user 0m23.810s 00:34:59.892 sys 0m0.341s 00:34:59.892 06:03:14 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1124 -- # xtrace_disable 00:34:59.892 06:03:14 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@10 -- # set +x 00:34:59.892 ************************************ 00:34:59.892 END TEST bdev_fio_trim 00:34:59.892 ************************************ 00:34:59.892 06:03:14 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1142 -- # return 0 00:34:59.892 06:03:14 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@367 -- # rm -f 00:34:59.892 06:03:14 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@368 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:34:59.892 06:03:14 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@369 -- # popd 00:34:59.892 /var/jenkins/workspace/crypto-phy-autotest/spdk 00:34:59.892 06:03:14 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@370 -- # trap - SIGINT SIGTERM EXIT 00:34:59.892 00:34:59.892 real 0m22.582s 00:34:59.892 user 0m47.489s 00:34:59.892 sys 0m0.865s 00:34:59.892 06:03:14 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1124 -- # xtrace_disable 00:34:59.892 06:03:14 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:34:59.892 ************************************ 00:34:59.892 END TEST bdev_fio 00:34:59.892 ************************************ 00:34:59.892 06:03:14 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 0 00:34:59.892 
06:03:14 blockdev_crypto_sw -- bdev/blockdev.sh@774 -- # trap cleanup SIGINT SIGTERM EXIT 00:34:59.892 06:03:14 blockdev_crypto_sw -- bdev/blockdev.sh@776 -- # run_test bdev_verify /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:34:59.892 06:03:14 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 16 -le 1 ']' 00:34:59.892 06:03:14 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:34:59.892 06:03:14 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:34:59.892 ************************************ 00:34:59.892 START TEST bdev_verify 00:34:59.892 ************************************ 00:34:59.892 06:03:14 blockdev_crypto_sw.bdev_verify -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:34:59.892 [2024-07-26 06:03:14.625338] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 
00:34:59.892 [2024-07-26 06:03:14.625401] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1321386 ] 00:34:59.892 [2024-07-26 06:03:14.756319] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:35:00.151 [2024-07-26 06:03:14.865052] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:35:00.151 [2024-07-26 06:03:14.865056] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:35:00.151 [2024-07-26 06:03:15.056808] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:35:00.151 [2024-07-26 06:03:15.056874] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:35:00.151 [2024-07-26 06:03:15.056889] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:35:00.408 [2024-07-26 06:03:15.064834] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:35:00.408 [2024-07-26 06:03:15.064857] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:35:00.408 [2024-07-26 06:03:15.064869] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:35:00.408 [2024-07-26 06:03:15.072851] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:35:00.408 [2024-07-26 06:03:15.072870] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:35:00.408 [2024-07-26 06:03:15.072882] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:35:00.408 Running I/O for 5 seconds... 
00:35:05.679 00:35:05.679 Latency(us) 00:35:05.679 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:35:05.679 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:35:05.679 Verification LBA range: start 0x0 length 0x800 00:35:05.679 crypto_ram : 5.01 6494.63 25.37 0.00 0.00 19634.85 1659.77 23251.03 00:35:05.679 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:35:05.679 Verification LBA range: start 0x800 length 0x800 00:35:05.679 crypto_ram : 5.02 6530.71 25.51 0.00 0.00 19530.72 1809.36 23137.06 00:35:05.679 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:35:05.679 Verification LBA range: start 0x0 length 0x800 00:35:05.679 crypto_ram3 : 5.02 3262.56 12.74 0.00 0.00 39029.44 1894.85 24960.67 00:35:05.679 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:35:05.679 Verification LBA range: start 0x800 length 0x800 00:35:05.679 crypto_ram3 : 5.02 3263.73 12.75 0.00 0.00 39005.72 7579.38 24276.81 00:35:05.679 =================================================================================================================== 00:35:05.679 Total : 19551.63 76.37 0.00 0.00 26077.83 1659.77 24960.67 00:35:05.679 00:35:05.679 real 0m5.840s 00:35:05.679 user 0m10.938s 00:35:05.679 sys 0m0.257s 00:35:05.679 06:03:20 blockdev_crypto_sw.bdev_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:35:05.679 06:03:20 blockdev_crypto_sw.bdev_verify -- common/autotest_common.sh@10 -- # set +x 00:35:05.679 ************************************ 00:35:05.679 END TEST bdev_verify 00:35:05.679 ************************************ 00:35:05.679 06:03:20 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 0 00:35:05.679 06:03:20 blockdev_crypto_sw -- bdev/blockdev.sh@777 -- # run_test bdev_verify_big_io /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:35:05.679 06:03:20 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 16 -le 1 ']' 00:35:05.679 06:03:20 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:35:05.679 06:03:20 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:35:05.679 ************************************ 00:35:05.679 START TEST bdev_verify_big_io 00:35:05.679 ************************************ 00:35:05.679 06:03:20 blockdev_crypto_sw.bdev_verify_big_io -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:35:05.679 [2024-07-26 06:03:20.552587] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 00:35:05.679 [2024-07-26 06:03:20.552667] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1322173 ] 00:35:05.939 [2024-07-26 06:03:20.683356] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:35:05.939 [2024-07-26 06:03:20.791686] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:35:05.939 [2024-07-26 06:03:20.791692] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:35:06.198 [2024-07-26 06:03:20.973899] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:35:06.198 [2024-07-26 06:03:20.973965] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:35:06.198 [2024-07-26 06:03:20.973980] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:35:06.198 [2024-07-26 06:03:20.981916] 
vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:35:06.198 [2024-07-26 06:03:20.981935] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:35:06.198 [2024-07-26 06:03:20.981947] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:35:06.198 [2024-07-26 06:03:20.989938] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:35:06.198 [2024-07-26 06:03:20.989957] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:35:06.198 [2024-07-26 06:03:20.989969] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:35:06.198 Running I/O for 5 seconds... 00:35:11.464 00:35:11.464 Latency(us) 00:35:11.464 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:35:11.464 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:35:11.464 Verification LBA range: start 0x0 length 0x80 00:35:11.464 crypto_ram : 5.10 426.71 26.67 0.00 0.00 292441.60 6240.17 403017.91 00:35:11.464 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:35:11.464 Verification LBA range: start 0x80 length 0x80 00:35:11.465 crypto_ram : 5.08 428.48 26.78 0.00 0.00 291393.13 6126.19 399370.69 00:35:11.465 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:35:11.465 Verification LBA range: start 0x0 length 0x80 00:35:11.465 crypto_ram3 : 5.27 242.78 15.17 0.00 0.00 493329.49 5755.77 412135.96 00:35:11.465 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:35:11.465 Verification LBA range: start 0x80 length 0x80 00:35:11.465 crypto_ram3 : 5.25 243.67 15.23 0.00 0.00 491760.14 5670.29 410312.35 00:35:11.465 =================================================================================================================== 00:35:11.465 Total : 
1341.64 83.85 0.00 0.00 366223.83 5670.29 412135.96 00:35:11.724 00:35:11.724 real 0m6.079s 00:35:11.724 user 0m11.417s 00:35:11.724 sys 0m0.262s 00:35:11.724 06:03:26 blockdev_crypto_sw.bdev_verify_big_io -- common/autotest_common.sh@1124 -- # xtrace_disable 00:35:11.724 06:03:26 blockdev_crypto_sw.bdev_verify_big_io -- common/autotest_common.sh@10 -- # set +x 00:35:11.724 ************************************ 00:35:11.724 END TEST bdev_verify_big_io 00:35:11.724 ************************************ 00:35:11.725 06:03:26 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 0 00:35:11.725 06:03:26 blockdev_crypto_sw -- bdev/blockdev.sh@778 -- # run_test bdev_write_zeroes /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:35:11.725 06:03:26 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:35:11.725 06:03:26 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:35:11.725 06:03:26 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:35:11.984 ************************************ 00:35:11.984 START TEST bdev_write_zeroes 00:35:11.984 ************************************ 00:35:11.984 06:03:26 blockdev_crypto_sw.bdev_write_zeroes -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:35:11.984 [2024-07-26 06:03:26.725193] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 
00:35:11.984 [2024-07-26 06:03:26.725260] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1322983 ] 00:35:11.984 [2024-07-26 06:03:26.858446] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:35:12.243 [2024-07-26 06:03:26.962446] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:35:12.243 [2024-07-26 06:03:27.135233] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:35:12.243 [2024-07-26 06:03:27.135304] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:35:12.243 [2024-07-26 06:03:27.135320] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:35:12.243 [2024-07-26 06:03:27.143251] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:35:12.243 [2024-07-26 06:03:27.143272] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:35:12.243 [2024-07-26 06:03:27.143283] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:35:12.502 [2024-07-26 06:03:27.151272] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:35:12.502 [2024-07-26 06:03:27.151296] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:35:12.502 [2024-07-26 06:03:27.151308] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:35:12.502 Running I/O for 1 seconds... 
00:35:13.435 00:35:13.435 Latency(us) 00:35:13.435 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:35:13.435 Job: crypto_ram (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:35:13.435 crypto_ram : 1.01 26211.52 102.39 0.00 0.00 4871.03 1296.47 6667.58 00:35:13.435 Job: crypto_ram3 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:35:13.435 crypto_ram3 : 1.01 13151.14 51.37 0.00 0.00 9664.12 1652.65 9972.87 00:35:13.435 =================================================================================================================== 00:35:13.435 Total : 39362.65 153.76 0.00 0.00 6479.03 1296.47 9972.87 00:35:13.694 00:35:13.694 real 0m1.762s 00:35:13.694 user 0m1.516s 00:35:13.694 sys 0m0.228s 00:35:13.694 06:03:28 blockdev_crypto_sw.bdev_write_zeroes -- common/autotest_common.sh@1124 -- # xtrace_disable 00:35:13.694 06:03:28 blockdev_crypto_sw.bdev_write_zeroes -- common/autotest_common.sh@10 -- # set +x 00:35:13.694 ************************************ 00:35:13.694 END TEST bdev_write_zeroes 00:35:13.694 ************************************ 00:35:13.694 06:03:28 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 0 00:35:13.694 06:03:28 blockdev_crypto_sw -- bdev/blockdev.sh@781 -- # run_test bdev_json_nonenclosed /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:35:13.694 06:03:28 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:35:13.694 06:03:28 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:35:13.694 06:03:28 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:35:13.694 ************************************ 00:35:13.694 START TEST bdev_json_nonenclosed 00:35:13.694 ************************************ 00:35:13.694 06:03:28 blockdev_crypto_sw.bdev_json_nonenclosed -- 
common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:35:13.694 [2024-07-26 06:03:28.553941] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 00:35:13.694 [2024-07-26 06:03:28.554001] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1323186 ] 00:35:13.953 [2024-07-26 06:03:28.681987] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:35:13.953 [2024-07-26 06:03:28.779421] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:35:13.953 [2024-07-26 06:03:28.779493] json_config.c: 608:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: not enclosed in {}. 00:35:13.953 [2024-07-26 06:03:28.779511] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:35:13.953 [2024-07-26 06:03:28.779523] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:35:14.211 00:35:14.211 real 0m0.380s 00:35:14.211 user 0m0.237s 00:35:14.211 sys 0m0.140s 00:35:14.211 06:03:28 blockdev_crypto_sw.bdev_json_nonenclosed -- common/autotest_common.sh@1123 -- # es=234 00:35:14.211 06:03:28 blockdev_crypto_sw.bdev_json_nonenclosed -- common/autotest_common.sh@1124 -- # xtrace_disable 00:35:14.211 06:03:28 blockdev_crypto_sw.bdev_json_nonenclosed -- common/autotest_common.sh@10 -- # set +x 00:35:14.211 ************************************ 00:35:14.211 END TEST bdev_json_nonenclosed 00:35:14.211 ************************************ 00:35:14.211 06:03:28 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 234 00:35:14.211 06:03:28 blockdev_crypto_sw -- bdev/blockdev.sh@781 -- # true 00:35:14.211 06:03:28 
blockdev_crypto_sw -- bdev/blockdev.sh@784 -- # run_test bdev_json_nonarray /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:35:14.211 06:03:28 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:35:14.211 06:03:28 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:35:14.211 06:03:28 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:35:14.211 ************************************ 00:35:14.211 START TEST bdev_json_nonarray 00:35:14.211 ************************************ 00:35:14.211 06:03:28 blockdev_crypto_sw.bdev_json_nonarray -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:35:14.211 [2024-07-26 06:03:29.029789] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 00:35:14.211 [2024-07-26 06:03:29.029853] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1323318 ] 00:35:14.469 [2024-07-26 06:03:29.158263] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:35:14.469 [2024-07-26 06:03:29.258802] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:35:14.469 [2024-07-26 06:03:29.258882] json_config.c: 614:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: 'subsystems' should be an array. 
00:35:14.469 [2024-07-26 06:03:29.258901] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:35:14.469 [2024-07-26 06:03:29.258914] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:35:14.469 00:35:14.469 real 0m0.396s 00:35:14.469 user 0m0.239s 00:35:14.469 sys 0m0.154s 00:35:14.469 06:03:29 blockdev_crypto_sw.bdev_json_nonarray -- common/autotest_common.sh@1123 -- # es=234 00:35:14.469 06:03:29 blockdev_crypto_sw.bdev_json_nonarray -- common/autotest_common.sh@1124 -- # xtrace_disable 00:35:14.469 06:03:29 blockdev_crypto_sw.bdev_json_nonarray -- common/autotest_common.sh@10 -- # set +x 00:35:14.469 ************************************ 00:35:14.469 END TEST bdev_json_nonarray 00:35:14.469 ************************************ 00:35:14.728 06:03:29 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 234 00:35:14.728 06:03:29 blockdev_crypto_sw -- bdev/blockdev.sh@784 -- # true 00:35:14.728 06:03:29 blockdev_crypto_sw -- bdev/blockdev.sh@786 -- # [[ crypto_sw == bdev ]] 00:35:14.728 06:03:29 blockdev_crypto_sw -- bdev/blockdev.sh@793 -- # [[ crypto_sw == gpt ]] 00:35:14.728 06:03:29 blockdev_crypto_sw -- bdev/blockdev.sh@797 -- # [[ crypto_sw == crypto_sw ]] 00:35:14.728 06:03:29 blockdev_crypto_sw -- bdev/blockdev.sh@798 -- # run_test bdev_crypto_enomem bdev_crypto_enomem 00:35:14.728 06:03:29 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:35:14.728 06:03:29 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:35:14.728 06:03:29 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:35:14.728 ************************************ 00:35:14.728 START TEST bdev_crypto_enomem 00:35:14.728 ************************************ 00:35:14.728 06:03:29 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@1123 -- # bdev_crypto_enomem 00:35:14.728 06:03:29 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@634 -- # local 
base_dev=base0 00:35:14.728 06:03:29 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@635 -- # local test_dev=crypt0 00:35:14.728 06:03:29 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@636 -- # local err_dev=EE_base0 00:35:14.728 06:03:29 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@637 -- # local qd=32 00:35:14.728 06:03:29 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@640 -- # ERR_PID=1323395 00:35:14.728 06:03:29 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@641 -- # trap 'cleanup; killprocess $ERR_PID; exit 1' SIGINT SIGTERM EXIT 00:35:14.728 06:03:29 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x2 -q 32 -o 4096 -w randwrite -t 5 -f '' 00:35:14.728 06:03:29 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@642 -- # waitforlisten 1323395 00:35:14.728 06:03:29 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@829 -- # '[' -z 1323395 ']' 00:35:14.728 06:03:29 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:35:14.728 06:03:29 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@834 -- # local max_retries=100 00:35:14.728 06:03:29 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:35:14.728 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:35:14.728 06:03:29 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@838 -- # xtrace_disable 00:35:14.728 06:03:29 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:35:14.728 [2024-07-26 06:03:29.512790] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 
00:35:14.728 [2024-07-26 06:03:29.512842] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1323395 ] 00:35:14.728 [2024-07-26 06:03:29.616187] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:35:14.987 [2024-07-26 06:03:29.716215] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:35:15.555 06:03:30 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:35:15.555 06:03:30 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@862 -- # return 0 00:35:15.555 06:03:30 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@644 -- # rpc_cmd 00:35:15.555 06:03:30 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:15.555 06:03:30 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:35:15.555 true 00:35:15.814 base0 00:35:15.814 true 00:35:15.814 [2024-07-26 06:03:30.476265] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:35:15.814 crypt0 00:35:15.814 06:03:30 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:15.814 06:03:30 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@651 -- # waitforbdev crypt0 00:35:15.814 06:03:30 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@897 -- # local bdev_name=crypt0 00:35:15.814 06:03:30 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:35:15.814 06:03:30 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@899 -- # local i 00:35:15.814 06:03:30 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:35:15.814 06:03:30 blockdev_crypto_sw.bdev_crypto_enomem -- 
common/autotest_common.sh@900 -- # bdev_timeout=2000 00:35:15.814 06:03:30 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine 00:35:15.814 06:03:30 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:15.814 06:03:30 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:35:15.814 06:03:30 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:15.814 06:03:30 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b crypt0 -t 2000 00:35:15.814 06:03:30 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:15.814 06:03:30 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:35:15.814 [ 00:35:15.814 { 00:35:15.814 "name": "crypt0", 00:35:15.814 "aliases": [ 00:35:15.814 "c81b38f8-ac56-5521-941d-bf361f8b9445" 00:35:15.814 ], 00:35:15.814 "product_name": "crypto", 00:35:15.814 "block_size": 512, 00:35:15.814 "num_blocks": 2097152, 00:35:15.814 "uuid": "c81b38f8-ac56-5521-941d-bf361f8b9445", 00:35:15.814 "assigned_rate_limits": { 00:35:15.814 "rw_ios_per_sec": 0, 00:35:15.814 "rw_mbytes_per_sec": 0, 00:35:15.814 "r_mbytes_per_sec": 0, 00:35:15.814 "w_mbytes_per_sec": 0 00:35:15.814 }, 00:35:15.814 "claimed": false, 00:35:15.814 "zoned": false, 00:35:15.814 "supported_io_types": { 00:35:15.814 "read": true, 00:35:15.814 "write": true, 00:35:15.814 "unmap": false, 00:35:15.814 "flush": false, 00:35:15.814 "reset": true, 00:35:15.814 "nvme_admin": false, 00:35:15.814 "nvme_io": false, 00:35:15.814 "nvme_io_md": false, 00:35:15.814 "write_zeroes": true, 00:35:15.814 "zcopy": false, 00:35:15.814 "get_zone_info": false, 00:35:15.814 "zone_management": false, 00:35:15.814 "zone_append": false, 00:35:15.814 "compare": false, 00:35:15.814 "compare_and_write": false, 00:35:15.814 "abort": false, 
00:35:15.814 "seek_hole": false, 00:35:15.814 "seek_data": false, 00:35:15.814 "copy": false, 00:35:15.814 "nvme_iov_md": false 00:35:15.814 }, 00:35:15.814 "memory_domains": [ 00:35:15.814 { 00:35:15.814 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:35:15.814 "dma_device_type": 2 00:35:15.814 } 00:35:15.814 ], 00:35:15.814 "driver_specific": { 00:35:15.814 "crypto": { 00:35:15.814 "base_bdev_name": "EE_base0", 00:35:15.814 "name": "crypt0", 00:35:15.814 "key_name": "test_dek_sw" 00:35:15.814 } 00:35:15.814 } 00:35:15.814 } 00:35:15.814 ] 00:35:15.814 06:03:30 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:15.814 06:03:30 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@905 -- # return 0 00:35:15.814 06:03:30 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@654 -- # rpcpid=1323570 00:35:15.814 06:03:30 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@656 -- # sleep 1 00:35:15.814 06:03:30 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@653 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:35:15.814 Running I/O for 5 seconds... 
00:35:16.747 06:03:31 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@657 -- # rpc_cmd bdev_error_inject_error EE_base0 -n 5 -q 31 write nomem 00:35:16.747 06:03:31 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:16.747 06:03:31 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:35:16.747 06:03:31 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:16.747 06:03:31 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@659 -- # wait 1323570 00:35:20.997 00:35:20.997 Latency(us) 00:35:20.997 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:35:20.997 Job: crypt0 (Core Mask 0x2, workload: randwrite, depth: 32, IO size: 4096) 00:35:20.997 crypt0 : 5.00 35942.83 140.40 0.00 0.00 886.55 418.50 1175.37 00:35:20.997 =================================================================================================================== 00:35:20.997 Total : 35942.83 140.40 0.00 0.00 886.55 418.50 1175.37 00:35:20.997 0 00:35:20.997 06:03:35 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@661 -- # rpc_cmd bdev_crypto_delete crypt0 00:35:20.997 06:03:35 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:20.997 06:03:35 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:35:20.997 06:03:35 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:20.997 06:03:35 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@663 -- # killprocess 1323395 00:35:20.997 06:03:35 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@948 -- # '[' -z 1323395 ']' 00:35:20.997 06:03:35 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@952 -- # kill -0 1323395 00:35:20.997 06:03:35 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@953 -- # uname 00:35:20.997 06:03:35 
blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:35:20.997 06:03:35 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1323395 00:35:20.997 06:03:35 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:35:20.997 06:03:35 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:35:20.997 06:03:35 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1323395' 00:35:20.998 killing process with pid 1323395 00:35:20.998 06:03:35 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@967 -- # kill 1323395 00:35:20.998 Received shutdown signal, test time was about 5.000000 seconds 00:35:20.998 00:35:20.998 Latency(us) 00:35:20.998 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:35:20.998 =================================================================================================================== 00:35:20.998 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:35:20.998 06:03:35 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@972 -- # wait 1323395 00:35:21.257 06:03:35 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@664 -- # trap - SIGINT SIGTERM EXIT 00:35:21.257 00:35:21.257 real 0m6.456s 00:35:21.257 user 0m6.732s 00:35:21.257 sys 0m0.366s 00:35:21.257 06:03:35 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@1124 -- # xtrace_disable 00:35:21.257 06:03:35 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:35:21.257 ************************************ 00:35:21.257 END TEST bdev_crypto_enomem 00:35:21.257 ************************************ 00:35:21.257 06:03:35 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 0 00:35:21.257 06:03:35 blockdev_crypto_sw -- bdev/blockdev.sh@809 -- # trap - 
SIGINT SIGTERM EXIT 00:35:21.257 06:03:35 blockdev_crypto_sw -- bdev/blockdev.sh@810 -- # cleanup 00:35:21.257 06:03:35 blockdev_crypto_sw -- bdev/blockdev.sh@23 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile 00:35:21.257 06:03:35 blockdev_crypto_sw -- bdev/blockdev.sh@24 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:35:21.258 06:03:35 blockdev_crypto_sw -- bdev/blockdev.sh@26 -- # [[ crypto_sw == rbd ]] 00:35:21.258 06:03:35 blockdev_crypto_sw -- bdev/blockdev.sh@30 -- # [[ crypto_sw == daos ]] 00:35:21.258 06:03:35 blockdev_crypto_sw -- bdev/blockdev.sh@34 -- # [[ crypto_sw = \g\p\t ]] 00:35:21.258 06:03:35 blockdev_crypto_sw -- bdev/blockdev.sh@40 -- # [[ crypto_sw == xnvme ]] 00:35:21.258 00:35:21.258 real 0m53.995s 00:35:21.258 user 1m32.777s 00:35:21.258 sys 0m6.296s 00:35:21.258 06:03:35 blockdev_crypto_sw -- common/autotest_common.sh@1124 -- # xtrace_disable 00:35:21.258 06:03:35 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:35:21.258 ************************************ 00:35:21.258 END TEST blockdev_crypto_sw 00:35:21.258 ************************************ 00:35:21.258 06:03:35 -- common/autotest_common.sh@1142 -- # return 0 00:35:21.258 06:03:35 -- spdk/autotest.sh@359 -- # run_test blockdev_crypto_qat /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_qat 00:35:21.258 06:03:35 -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:35:21.258 06:03:35 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:35:21.258 06:03:35 -- common/autotest_common.sh@10 -- # set +x 00:35:21.258 ************************************ 00:35:21.258 START TEST blockdev_crypto_qat 00:35:21.258 ************************************ 00:35:21.258 06:03:36 blockdev_crypto_qat -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_qat 00:35:21.258 * Looking for test storage... 
00:35:21.258 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:35:21.258 06:03:36 blockdev_crypto_qat -- bdev/blockdev.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:35:21.258 06:03:36 blockdev_crypto_qat -- bdev/nbd_common.sh@6 -- # set -e 00:35:21.258 06:03:36 blockdev_crypto_qat -- bdev/blockdev.sh@12 -- # rpc_py=rpc_cmd 00:35:21.258 06:03:36 blockdev_crypto_qat -- bdev/blockdev.sh@13 -- # conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:35:21.258 06:03:36 blockdev_crypto_qat -- bdev/blockdev.sh@14 -- # nonenclosed_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json 00:35:21.258 06:03:36 blockdev_crypto_qat -- bdev/blockdev.sh@15 -- # nonarray_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json 00:35:21.258 06:03:36 blockdev_crypto_qat -- bdev/blockdev.sh@17 -- # export RPC_PIPE_TIMEOUT=30 00:35:21.258 06:03:36 blockdev_crypto_qat -- bdev/blockdev.sh@17 -- # RPC_PIPE_TIMEOUT=30 00:35:21.258 06:03:36 blockdev_crypto_qat -- bdev/blockdev.sh@20 -- # : 00:35:21.258 06:03:36 blockdev_crypto_qat -- bdev/blockdev.sh@669 -- # QOS_DEV_1=Malloc_0 00:35:21.258 06:03:36 blockdev_crypto_qat -- bdev/blockdev.sh@670 -- # QOS_DEV_2=Null_1 00:35:21.258 06:03:36 blockdev_crypto_qat -- bdev/blockdev.sh@671 -- # QOS_RUN_TIME=5 00:35:21.258 06:03:36 blockdev_crypto_qat -- bdev/blockdev.sh@673 -- # uname -s 00:35:21.258 06:03:36 blockdev_crypto_qat -- bdev/blockdev.sh@673 -- # '[' Linux = Linux ']' 00:35:21.258 06:03:36 blockdev_crypto_qat -- bdev/blockdev.sh@675 -- # PRE_RESERVED_MEM=0 00:35:21.258 06:03:36 blockdev_crypto_qat -- bdev/blockdev.sh@681 -- # test_type=crypto_qat 00:35:21.258 06:03:36 blockdev_crypto_qat -- bdev/blockdev.sh@682 -- # crypto_device= 00:35:21.258 06:03:36 blockdev_crypto_qat -- bdev/blockdev.sh@683 -- # dek= 00:35:21.258 06:03:36 blockdev_crypto_qat -- bdev/blockdev.sh@684 -- # 
env_ctx= 00:35:21.258 06:03:36 blockdev_crypto_qat -- bdev/blockdev.sh@685 -- # wait_for_rpc= 00:35:21.258 06:03:36 blockdev_crypto_qat -- bdev/blockdev.sh@686 -- # '[' -n '' ']' 00:35:21.258 06:03:36 blockdev_crypto_qat -- bdev/blockdev.sh@689 -- # [[ crypto_qat == bdev ]] 00:35:21.258 06:03:36 blockdev_crypto_qat -- bdev/blockdev.sh@689 -- # [[ crypto_qat == crypto_* ]] 00:35:21.258 06:03:36 blockdev_crypto_qat -- bdev/blockdev.sh@690 -- # wait_for_rpc=--wait-for-rpc 00:35:21.258 06:03:36 blockdev_crypto_qat -- bdev/blockdev.sh@692 -- # start_spdk_tgt 00:35:21.258 06:03:36 blockdev_crypto_qat -- bdev/blockdev.sh@47 -- # spdk_tgt_pid=1324342 00:35:21.258 06:03:36 blockdev_crypto_qat -- bdev/blockdev.sh@48 -- # trap 'killprocess "$spdk_tgt_pid"; exit 1' SIGINT SIGTERM EXIT 00:35:21.258 06:03:36 blockdev_crypto_qat -- bdev/blockdev.sh@46 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt '' --wait-for-rpc 00:35:21.258 06:03:36 blockdev_crypto_qat -- bdev/blockdev.sh@49 -- # waitforlisten 1324342 00:35:21.258 06:03:36 blockdev_crypto_qat -- common/autotest_common.sh@829 -- # '[' -z 1324342 ']' 00:35:21.258 06:03:36 blockdev_crypto_qat -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:35:21.258 06:03:36 blockdev_crypto_qat -- common/autotest_common.sh@834 -- # local max_retries=100 00:35:21.258 06:03:36 blockdev_crypto_qat -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:35:21.258 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:35:21.258 06:03:36 blockdev_crypto_qat -- common/autotest_common.sh@838 -- # xtrace_disable 00:35:21.258 06:03:36 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:35:21.518 [2024-07-26 06:03:36.225237] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 
00:35:21.518 [2024-07-26 06:03:36.225312] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1324342 ] 00:35:21.518 [2024-07-26 06:03:36.355201] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:35:21.777 [2024-07-26 06:03:36.453451] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:35:22.345 06:03:37 blockdev_crypto_qat -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:35:22.345 06:03:37 blockdev_crypto_qat -- common/autotest_common.sh@862 -- # return 0 00:35:22.345 06:03:37 blockdev_crypto_qat -- bdev/blockdev.sh@693 -- # case "$test_type" in 00:35:22.345 06:03:37 blockdev_crypto_qat -- bdev/blockdev.sh@707 -- # setup_crypto_qat_conf 00:35:22.345 06:03:37 blockdev_crypto_qat -- bdev/blockdev.sh@169 -- # rpc_cmd 00:35:22.345 06:03:37 blockdev_crypto_qat -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:22.345 06:03:37 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:35:22.345 [2024-07-26 06:03:37.163689] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:35:22.345 [2024-07-26 06:03:37.171719] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:35:22.345 [2024-07-26 06:03:37.179736] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:35:22.345 [2024-07-26 06:03:37.251469] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:35:24.881 true 00:35:24.881 true 00:35:24.881 true 00:35:24.881 true 00:35:24.881 Malloc0 00:35:24.881 Malloc1 00:35:24.881 Malloc2 00:35:24.881 Malloc3 00:35:24.881 [2024-07-26 06:03:39.619689] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 
00:35:24.881 crypto_ram 00:35:24.881 [2024-07-26 06:03:39.627694] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:35:24.881 crypto_ram1 00:35:24.881 [2024-07-26 06:03:39.635708] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:35:24.881 crypto_ram2 00:35:24.881 [2024-07-26 06:03:39.643727] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:35:24.881 crypto_ram3 00:35:24.881 [ 00:35:24.881 { 00:35:24.881 "name": "Malloc1", 00:35:24.881 "aliases": [ 00:35:24.881 "232f9747-9b3d-4734-a85e-5691ed333074" 00:35:24.881 ], 00:35:24.881 "product_name": "Malloc disk", 00:35:24.881 "block_size": 512, 00:35:24.881 "num_blocks": 65536, 00:35:24.881 "uuid": "232f9747-9b3d-4734-a85e-5691ed333074", 00:35:24.881 "assigned_rate_limits": { 00:35:24.881 "rw_ios_per_sec": 0, 00:35:24.881 "rw_mbytes_per_sec": 0, 00:35:24.881 "r_mbytes_per_sec": 0, 00:35:24.881 "w_mbytes_per_sec": 0 00:35:24.881 }, 00:35:24.881 "claimed": true, 00:35:24.881 "claim_type": "exclusive_write", 00:35:24.881 "zoned": false, 00:35:24.881 "supported_io_types": { 00:35:24.881 "read": true, 00:35:24.881 "write": true, 00:35:24.881 "unmap": true, 00:35:24.881 "flush": true, 00:35:24.881 "reset": true, 00:35:24.881 "nvme_admin": false, 00:35:24.881 "nvme_io": false, 00:35:24.881 "nvme_io_md": false, 00:35:24.881 "write_zeroes": true, 00:35:24.881 "zcopy": true, 00:35:24.881 "get_zone_info": false, 00:35:24.881 "zone_management": false, 00:35:24.881 "zone_append": false, 00:35:24.881 "compare": false, 00:35:24.881 "compare_and_write": false, 00:35:24.881 "abort": true, 00:35:24.881 "seek_hole": false, 00:35:24.881 "seek_data": false, 00:35:24.881 "copy": true, 00:35:24.881 "nvme_iov_md": false 00:35:24.881 }, 00:35:24.881 "memory_domains": [ 00:35:24.881 { 00:35:24.881 "dma_device_id": "system", 00:35:24.881 "dma_device_type": 1 00:35:24.881 }, 00:35:24.881 { 00:35:24.881 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:35:24.881 "dma_device_type": 2 00:35:24.881 } 00:35:24.881 ], 00:35:24.881 "driver_specific": {} 00:35:24.881 } 00:35:24.881 ] 00:35:24.881 06:03:39 blockdev_crypto_qat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:24.881 06:03:39 blockdev_crypto_qat -- bdev/blockdev.sh@736 -- # rpc_cmd bdev_wait_for_examine 00:35:24.881 06:03:39 blockdev_crypto_qat -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:24.881 06:03:39 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:35:24.881 06:03:39 blockdev_crypto_qat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:24.881 06:03:39 blockdev_crypto_qat -- bdev/blockdev.sh@739 -- # cat 00:35:24.881 06:03:39 blockdev_crypto_qat -- bdev/blockdev.sh@739 -- # rpc_cmd save_subsystem_config -n accel 00:35:24.881 06:03:39 blockdev_crypto_qat -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:24.881 06:03:39 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:35:24.881 06:03:39 blockdev_crypto_qat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:24.881 06:03:39 blockdev_crypto_qat -- bdev/blockdev.sh@739 -- # rpc_cmd save_subsystem_config -n bdev 00:35:24.881 06:03:39 blockdev_crypto_qat -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:24.881 06:03:39 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:35:24.881 06:03:39 blockdev_crypto_qat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:24.881 06:03:39 blockdev_crypto_qat -- bdev/blockdev.sh@739 -- # rpc_cmd save_subsystem_config -n iobuf 00:35:24.881 06:03:39 blockdev_crypto_qat -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:24.881 06:03:39 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:35:24.881 06:03:39 blockdev_crypto_qat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:24.881 06:03:39 blockdev_crypto_qat -- bdev/blockdev.sh@747 -- # mapfile -t bdevs 00:35:24.881 06:03:39 blockdev_crypto_qat -- 
bdev/blockdev.sh@747 -- # rpc_cmd bdev_get_bdevs 00:35:24.881 06:03:39 blockdev_crypto_qat -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:24.881 06:03:39 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:35:24.881 06:03:39 blockdev_crypto_qat -- bdev/blockdev.sh@747 -- # jq -r '.[] | select(.claimed == false)' 00:35:25.141 06:03:39 blockdev_crypto_qat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:25.141 06:03:39 blockdev_crypto_qat -- bdev/blockdev.sh@748 -- # mapfile -t bdevs_name 00:35:25.141 06:03:39 blockdev_crypto_qat -- bdev/blockdev.sh@748 -- # jq -r .name 00:35:25.141 06:03:39 blockdev_crypto_qat -- bdev/blockdev.sh@748 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "62197cec-def0-5421-b4d2-12dd60f43734"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "62197cec-def0-5421-b4d2-12dd60f43734",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_qat_cbc"' ' }' ' }' '}' '{' ' "name": "crypto_ram1",' ' "aliases": [' ' "ee0222af-942a-5009-a776-2977ad0ee39d"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": 
"ee0222af-942a-5009-a776-2977ad0ee39d",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram1",' ' "key_name": "test_dek_qat_xts"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "9636b16c-2ed6-5c97-bca3-b867452af650"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "9636b16c-2ed6-5c97-bca3-b867452af650",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' 
"name": "crypto_ram2",' ' "key_name": "test_dek_qat_cbc2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "76e8e2a8-7d1f-5063-9e4c-5d57659a6096"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "76e8e2a8-7d1f-5063-9e4c-5d57659a6096",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_qat_xts2"' ' }' ' }' '}' 00:35:25.141 06:03:39 blockdev_crypto_qat -- bdev/blockdev.sh@749 -- # bdev_list=("${bdevs_name[@]}") 00:35:25.141 06:03:39 blockdev_crypto_qat -- bdev/blockdev.sh@751 -- # hello_world_bdev=crypto_ram 00:35:25.141 06:03:39 blockdev_crypto_qat -- bdev/blockdev.sh@752 -- # trap - SIGINT SIGTERM EXIT 00:35:25.141 06:03:39 blockdev_crypto_qat -- bdev/blockdev.sh@753 -- # killprocess 1324342 00:35:25.141 06:03:39 blockdev_crypto_qat -- common/autotest_common.sh@948 -- # '[' -z 1324342 ']' 00:35:25.141 06:03:39 blockdev_crypto_qat -- common/autotest_common.sh@952 -- # kill -0 1324342 00:35:25.141 06:03:39 blockdev_crypto_qat -- common/autotest_common.sh@953 -- # uname 00:35:25.141 06:03:39 blockdev_crypto_qat -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:35:25.141 06:03:39 
blockdev_crypto_qat -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1324342 00:35:25.141 06:03:39 blockdev_crypto_qat -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:35:25.141 06:03:39 blockdev_crypto_qat -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:35:25.141 06:03:39 blockdev_crypto_qat -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1324342' 00:35:25.141 killing process with pid 1324342 00:35:25.141 06:03:39 blockdev_crypto_qat -- common/autotest_common.sh@967 -- # kill 1324342 00:35:25.141 06:03:39 blockdev_crypto_qat -- common/autotest_common.sh@972 -- # wait 1324342 00:35:25.708 06:03:40 blockdev_crypto_qat -- bdev/blockdev.sh@757 -- # trap cleanup SIGINT SIGTERM EXIT 00:35:25.708 06:03:40 blockdev_crypto_qat -- bdev/blockdev.sh@759 -- # run_test bdev_hello_world /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:35:25.708 06:03:40 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:35:25.708 06:03:40 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable 00:35:25.708 06:03:40 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:35:25.708 ************************************ 00:35:25.708 START TEST bdev_hello_world 00:35:25.708 ************************************ 00:35:25.709 06:03:40 blockdev_crypto_qat.bdev_hello_world -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:35:25.709 [2024-07-26 06:03:40.530234] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 
00:35:25.709 [2024-07-26 06:03:40.530294] [ DPDK EAL parameters: hello_bdev --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1324893 ] 00:35:25.967 [2024-07-26 06:03:40.659873] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:35:25.967 [2024-07-26 06:03:40.759975] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:35:25.967 [2024-07-26 06:03:40.781332] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:35:25.967 [2024-07-26 06:03:40.789362] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:35:25.967 [2024-07-26 06:03:40.797378] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:35:26.224 [2024-07-26 06:03:40.907704] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:35:28.755 [2024-07-26 06:03:43.115860] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:35:28.755 [2024-07-26 06:03:43.115929] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:35:28.755 [2024-07-26 06:03:43.115944] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:35:28.755 [2024-07-26 06:03:43.123881] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:35:28.755 [2024-07-26 06:03:43.123903] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:35:28.755 [2024-07-26 06:03:43.123914] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:35:28.755 [2024-07-26 06:03:43.131900] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key 
"test_dek_qat_cbc2" 00:35:28.755 [2024-07-26 06:03:43.131919] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:35:28.755 [2024-07-26 06:03:43.131931] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:35:28.755 [2024-07-26 06:03:43.139922] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:35:28.755 [2024-07-26 06:03:43.139946] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:35:28.755 [2024-07-26 06:03:43.139957] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:35:28.755 [2024-07-26 06:03:43.217539] hello_bdev.c: 222:hello_start: *NOTICE*: Successfully started the application 00:35:28.755 [2024-07-26 06:03:43.217584] hello_bdev.c: 231:hello_start: *NOTICE*: Opening the bdev crypto_ram 00:35:28.755 [2024-07-26 06:03:43.217603] hello_bdev.c: 244:hello_start: *NOTICE*: Opening io channel 00:35:28.755 [2024-07-26 06:03:43.218880] hello_bdev.c: 138:hello_write: *NOTICE*: Writing to the bdev 00:35:28.755 [2024-07-26 06:03:43.218950] hello_bdev.c: 117:write_complete: *NOTICE*: bdev io write completed successfully 00:35:28.755 [2024-07-26 06:03:43.218967] hello_bdev.c: 84:hello_read: *NOTICE*: Reading io 00:35:28.755 [2024-07-26 06:03:43.219014] hello_bdev.c: 65:read_complete: *NOTICE*: Read string from bdev : Hello World! 
00:35:28.755 00:35:28.755 [2024-07-26 06:03:43.219034] hello_bdev.c: 74:read_complete: *NOTICE*: Stopping app 00:35:28.755 00:35:28.755 real 0m3.167s 00:35:28.755 user 0m2.744s 00:35:28.755 sys 0m0.389s 00:35:28.755 06:03:43 blockdev_crypto_qat.bdev_hello_world -- common/autotest_common.sh@1124 -- # xtrace_disable 00:35:28.755 06:03:43 blockdev_crypto_qat.bdev_hello_world -- common/autotest_common.sh@10 -- # set +x 00:35:28.755 ************************************ 00:35:28.755 END TEST bdev_hello_world 00:35:28.755 ************************************ 00:35:29.013 06:03:43 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 0 00:35:29.013 06:03:43 blockdev_crypto_qat -- bdev/blockdev.sh@760 -- # run_test bdev_bounds bdev_bounds '' 00:35:29.013 06:03:43 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:35:29.013 06:03:43 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable 00:35:29.013 06:03:43 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:35:29.013 ************************************ 00:35:29.013 START TEST bdev_bounds 00:35:29.013 ************************************ 00:35:29.013 06:03:43 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@1123 -- # bdev_bounds '' 00:35:29.013 06:03:43 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@289 -- # bdevio_pid=1325263 00:35:29.013 06:03:43 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@290 -- # trap 'cleanup; killprocess $bdevio_pid; exit 1' SIGINT SIGTERM EXIT 00:35:29.013 06:03:43 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@291 -- # echo 'Process bdevio pid: 1325263' 00:35:29.013 Process bdevio pid: 1325263 00:35:29.013 06:03:43 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@292 -- # waitforlisten 1325263 00:35:29.013 06:03:43 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@829 -- # '[' -z 1325263 ']' 00:35:29.013 06:03:43 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@833 -- # 
local rpc_addr=/var/tmp/spdk.sock 00:35:29.013 06:03:43 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@834 -- # local max_retries=100 00:35:29.013 06:03:43 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:35:29.013 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:35:29.013 06:03:43 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@838 -- # xtrace_disable 00:35:29.013 06:03:43 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:35:29.013 06:03:43 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@288 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -w -s 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:35:29.013 [2024-07-26 06:03:43.769932] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 00:35:29.013 [2024-07-26 06:03:43.769996] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1325263 ] 00:35:29.013 [2024-07-26 06:03:43.899542] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:35:29.271 [2024-07-26 06:03:44.008363] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:35:29.271 [2024-07-26 06:03:44.008450] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:35:29.271 [2024-07-26 06:03:44.008455] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:35:29.271 [2024-07-26 06:03:44.029868] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:35:29.271 [2024-07-26 06:03:44.037898] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module 
dpdk_cryptodev 00:35:29.271 [2024-07-26 06:03:44.045917] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:35:29.272 [2024-07-26 06:03:44.148022] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:35:31.802 [2024-07-26 06:03:46.357006] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:35:31.802 [2024-07-26 06:03:46.357089] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:35:31.802 [2024-07-26 06:03:46.357104] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:35:31.802 [2024-07-26 06:03:46.365021] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:35:31.802 [2024-07-26 06:03:46.365042] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:35:31.802 [2024-07-26 06:03:46.365055] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:35:31.802 [2024-07-26 06:03:46.373044] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:35:31.802 [2024-07-26 06:03:46.373062] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:35:31.802 [2024-07-26 06:03:46.373074] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:35:31.802 [2024-07-26 06:03:46.381069] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:35:31.802 [2024-07-26 06:03:46.381089] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:35:31.802 [2024-07-26 06:03:46.381101] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:35:31.802 06:03:46 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@858 -- # (( i == 0 
)) 00:35:31.802 06:03:46 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@862 -- # return 0 00:35:31.802 06:03:46 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:35:31.802 I/O targets: 00:35:31.802 crypto_ram: 65536 blocks of 512 bytes (32 MiB) 00:35:31.802 crypto_ram1: 65536 blocks of 512 bytes (32 MiB) 00:35:31.802 crypto_ram2: 8192 blocks of 4096 bytes (32 MiB) 00:35:31.802 crypto_ram3: 8192 blocks of 4096 bytes (32 MiB) 00:35:31.802 00:35:31.802 00:35:31.802 CUnit - A unit testing framework for C - Version 2.1-3 00:35:31.803 http://cunit.sourceforge.net/ 00:35:31.803 00:35:31.803 00:35:31.803 Suite: bdevio tests on: crypto_ram3 00:35:31.803 Test: blockdev write read block ...passed 00:35:31.803 Test: blockdev write zeroes read block ...passed 00:35:31.803 Test: blockdev write zeroes read no split ...passed 00:35:31.803 Test: blockdev write zeroes read split ...passed 00:35:31.803 Test: blockdev write zeroes read split partial ...passed 00:35:31.803 Test: blockdev reset ...passed 00:35:31.803 Test: blockdev write read 8 blocks ...passed 00:35:31.803 Test: blockdev write read size > 128k ...passed 00:35:31.803 Test: blockdev write read invalid size ...passed 00:35:31.803 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:35:31.803 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:35:31.803 Test: blockdev write read max offset ...passed 00:35:31.803 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:35:31.803 Test: blockdev writev readv 8 blocks ...passed 00:35:31.803 Test: blockdev writev readv 30 x 1block ...passed 00:35:31.803 Test: blockdev writev readv block ...passed 00:35:31.803 Test: blockdev writev readv size > 128k ...passed 00:35:31.803 Test: blockdev writev readv size > 128k in two iovs ...passed 00:35:31.803 Test: blockdev comparev and writev ...passed 00:35:31.803 
Test: blockdev nvme passthru rw ...passed 00:35:31.803 Test: blockdev nvme passthru vendor specific ...passed 00:35:31.803 Test: blockdev nvme admin passthru ...passed 00:35:31.803 Test: blockdev copy ...passed 00:35:31.803 Suite: bdevio tests on: crypto_ram2 00:35:31.803 Test: blockdev write read block ...passed 00:35:31.803 Test: blockdev write zeroes read block ...passed 00:35:31.803 Test: blockdev write zeroes read no split ...passed 00:35:31.803 Test: blockdev write zeroes read split ...passed 00:35:31.803 Test: blockdev write zeroes read split partial ...passed 00:35:31.803 Test: blockdev reset ...passed 00:35:31.803 Test: blockdev write read 8 blocks ...passed 00:35:31.803 Test: blockdev write read size > 128k ...passed 00:35:31.803 Test: blockdev write read invalid size ...passed 00:35:31.803 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:35:31.803 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:35:31.803 Test: blockdev write read max offset ...passed 00:35:31.803 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:35:31.803 Test: blockdev writev readv 8 blocks ...passed 00:35:31.803 Test: blockdev writev readv 30 x 1block ...passed 00:35:31.803 Test: blockdev writev readv block ...passed 00:35:31.803 Test: blockdev writev readv size > 128k ...passed 00:35:31.803 Test: blockdev writev readv size > 128k in two iovs ...passed 00:35:31.803 Test: blockdev comparev and writev ...passed 00:35:31.803 Test: blockdev nvme passthru rw ...passed 00:35:31.803 Test: blockdev nvme passthru vendor specific ...passed 00:35:31.803 Test: blockdev nvme admin passthru ...passed 00:35:31.803 Test: blockdev copy ...passed 00:35:31.803 Suite: bdevio tests on: crypto_ram1 00:35:31.803 Test: blockdev write read block ...passed 00:35:31.803 Test: blockdev write zeroes read block ...passed 00:35:31.803 Test: blockdev write zeroes read no split ...passed 00:35:32.062 Test: blockdev write zeroes read split 
...passed 00:35:32.062 Test: blockdev write zeroes read split partial ...passed 00:35:32.062 Test: blockdev reset ...passed 00:35:32.062 Test: blockdev write read 8 blocks ...passed 00:35:32.062 Test: blockdev write read size > 128k ...passed 00:35:32.062 Test: blockdev write read invalid size ...passed 00:35:32.062 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:35:32.062 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:35:32.062 Test: blockdev write read max offset ...passed 00:35:32.062 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:35:32.062 Test: blockdev writev readv 8 blocks ...passed 00:35:32.062 Test: blockdev writev readv 30 x 1block ...passed 00:35:32.062 Test: blockdev writev readv block ...passed 00:35:32.062 Test: blockdev writev readv size > 128k ...passed 00:35:32.062 Test: blockdev writev readv size > 128k in two iovs ...passed 00:35:32.062 Test: blockdev comparev and writev ...passed 00:35:32.062 Test: blockdev nvme passthru rw ...passed 00:35:32.062 Test: blockdev nvme passthru vendor specific ...passed 00:35:32.062 Test: blockdev nvme admin passthru ...passed 00:35:32.062 Test: blockdev copy ...passed 00:35:32.062 Suite: bdevio tests on: crypto_ram 00:35:32.062 Test: blockdev write read block ...passed 00:35:32.062 Test: blockdev write zeroes read block ...passed 00:35:32.062 Test: blockdev write zeroes read no split ...passed 00:35:32.062 Test: blockdev write zeroes read split ...passed 00:35:32.062 Test: blockdev write zeroes read split partial ...passed 00:35:32.062 Test: blockdev reset ...passed 00:35:32.062 Test: blockdev write read 8 blocks ...passed 00:35:32.062 Test: blockdev write read size > 128k ...passed 00:35:32.062 Test: blockdev write read invalid size ...passed 00:35:32.062 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:35:32.062 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:35:32.062 Test: 
blockdev write read max offset ...passed 00:35:32.062 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:35:32.062 Test: blockdev writev readv 8 blocks ...passed 00:35:32.062 Test: blockdev writev readv 30 x 1block ...passed 00:35:32.062 Test: blockdev writev readv block ...passed 00:35:32.062 Test: blockdev writev readv size > 128k ...passed 00:35:32.062 Test: blockdev writev readv size > 128k in two iovs ...passed 00:35:32.062 Test: blockdev comparev and writev ...passed 00:35:32.062 Test: blockdev nvme passthru rw ...passed 00:35:32.062 Test: blockdev nvme passthru vendor specific ...passed 00:35:32.062 Test: blockdev nvme admin passthru ...passed 00:35:32.062 Test: blockdev copy ...passed 00:35:32.062 00:35:32.062 Run Summary: Type Total Ran Passed Failed Inactive 00:35:32.062 suites 4 4 n/a 0 0 00:35:32.062 tests 92 92 92 0 0 00:35:32.062 asserts 520 520 520 0 n/a 00:35:32.062 00:35:32.062 Elapsed time = 0.520 seconds 00:35:32.062 0 00:35:32.062 06:03:46 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@294 -- # killprocess 1325263 00:35:32.062 06:03:46 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@948 -- # '[' -z 1325263 ']' 00:35:32.062 06:03:46 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@952 -- # kill -0 1325263 00:35:32.062 06:03:46 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@953 -- # uname 00:35:32.062 06:03:46 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:35:32.062 06:03:46 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1325263 00:35:32.062 06:03:46 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:35:32.062 06:03:46 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:35:32.062 06:03:46 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@966 -- # echo 'killing process with pid 
1325263' 00:35:32.062 killing process with pid 1325263 00:35:32.062 06:03:46 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@967 -- # kill 1325263 00:35:32.062 06:03:46 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@972 -- # wait 1325263 00:35:32.630 06:03:47 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@295 -- # trap - SIGINT SIGTERM EXIT 00:35:32.630 00:35:32.630 real 0m3.654s 00:35:32.630 user 0m10.213s 00:35:32.630 sys 0m0.578s 00:35:32.630 06:03:47 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@1124 -- # xtrace_disable 00:35:32.630 06:03:47 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:35:32.630 ************************************ 00:35:32.630 END TEST bdev_bounds 00:35:32.630 ************************************ 00:35:32.630 06:03:47 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 0 00:35:32.630 06:03:47 blockdev_crypto_qat -- bdev/blockdev.sh@761 -- # run_test bdev_nbd nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' '' 00:35:32.630 06:03:47 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:35:32.630 06:03:47 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable 00:35:32.630 06:03:47 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:35:32.630 ************************************ 00:35:32.630 START TEST bdev_nbd 00:35:32.630 ************************************ 00:35:32.630 06:03:47 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@1123 -- # nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' '' 00:35:32.630 06:03:47 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@299 -- # uname -s 00:35:32.630 06:03:47 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@299 -- # [[ Linux == Linux ]] 00:35:32.630 06:03:47 
blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@301 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:35:32.630 06:03:47 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@302 -- # local conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:35:32.630 06:03:47 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@303 -- # bdev_all=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:35:32.630 06:03:47 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@303 -- # local bdev_all 00:35:32.630 06:03:47 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@304 -- # local bdev_num=4 00:35:32.630 06:03:47 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@308 -- # [[ -e /sys/module/nbd ]] 00:35:32.630 06:03:47 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@310 -- # nbd_all=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:35:32.630 06:03:47 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@310 -- # local nbd_all 00:35:32.630 06:03:47 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@311 -- # bdev_num=4 00:35:32.630 06:03:47 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@313 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:35:32.630 06:03:47 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@313 -- # local nbd_list 00:35:32.630 06:03:47 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@314 -- # bdev_list=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:35:32.630 06:03:47 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@314 -- # local bdev_list 00:35:32.630 06:03:47 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@317 -- # nbd_pid=1325812 00:35:32.630 06:03:47 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@318 -- # trap 'cleanup; killprocess $nbd_pid' SIGINT SIGTERM EXIT 00:35:32.630 06:03:47 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@316 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-nbd.sock -i 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:35:32.630 06:03:47 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@319 -- # waitforlisten 1325812 /var/tmp/spdk-nbd.sock 00:35:32.630 06:03:47 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@829 -- # '[' -z 1325812 ']' 00:35:32.630 06:03:47 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:35:32.630 06:03:47 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@834 -- # local max_retries=100 00:35:32.630 06:03:47 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:35:32.630 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:35:32.630 06:03:47 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@838 -- # xtrace_disable 00:35:32.630 06:03:47 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:35:32.630 [2024-07-26 06:03:47.515716] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 
00:35:32.630 [2024-07-26 06:03:47.515767] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:35:32.889 [2024-07-26 06:03:47.629459] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:35:32.889 [2024-07-26 06:03:47.729611] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:35:32.889 [2024-07-26 06:03:47.751032] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:35:32.889 [2024-07-26 06:03:47.759055] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:35:32.889 [2024-07-26 06:03:47.767073] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:35:33.147 [2024-07-26 06:03:47.872708] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:35:35.680 [2024-07-26 06:03:50.079302] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:35:35.680 [2024-07-26 06:03:50.079370] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:35:35.680 [2024-07-26 06:03:50.079386] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:35:35.680 [2024-07-26 06:03:50.087319] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:35:35.680 [2024-07-26 06:03:50.087340] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:35:35.680 [2024-07-26 06:03:50.087353] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:35:35.680 [2024-07-26 06:03:50.095341] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 
00:35:35.680 [2024-07-26 06:03:50.095363] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:35:35.680 [2024-07-26 06:03:50.095374] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:35:35.680 [2024-07-26 06:03:50.103359] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:35:35.680 [2024-07-26 06:03:50.103378] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:35:35.680 [2024-07-26 06:03:50.103389] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:35:35.680 06:03:50 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:35:35.680 06:03:50 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@862 -- # return 0 00:35:35.680 06:03:50 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@321 -- # nbd_rpc_start_stop_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' 00:35:35.680 06:03:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@113 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:35:35.680 06:03:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@114 -- # bdev_list=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:35:35.680 06:03:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@114 -- # local bdev_list 00:35:35.680 06:03:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@116 -- # nbd_start_disks_without_nbd_idx /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' 00:35:35.680 06:03:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@22 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:35:35.680 06:03:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@23 -- # bdev_list=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:35:35.680 06:03:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@23 -- # local bdev_list 00:35:35.680 06:03:50 
blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@24 -- # local i 00:35:35.680 06:03:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@25 -- # local nbd_device 00:35:35.680 06:03:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i = 0 )) 00:35:35.680 06:03:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:35:35.680 06:03:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram 00:35:35.680 06:03:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd0 00:35:35.680 06:03:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd0 00:35:35.680 06:03:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd0 00:35:35.680 06:03:50 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:35:35.680 06:03:50 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:35:35.680 06:03:50 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:35:35.680 06:03:50 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:35:35.680 06:03:50 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:35:35.680 06:03:50 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:35:35.680 06:03:50 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:35:35.680 06:03:50 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:35:35.680 06:03:50 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:35:35.680 1+0 records in 00:35:35.680 1+0 records out 00:35:35.680 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000295145 
s, 13.9 MB/s 00:35:35.680 06:03:50 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:35:35.680 06:03:50 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:35:35.680 06:03:50 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:35:35.680 06:03:50 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:35:35.680 06:03:50 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:35:35.680 06:03:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:35:35.680 06:03:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:35:35.680 06:03:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram1 00:35:35.939 06:03:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd1 00:35:35.939 06:03:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd1 00:35:35.939 06:03:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd1 00:35:35.939 06:03:50 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:35:35.939 06:03:50 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:35:35.939 06:03:50 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:35:35.939 06:03:50 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:35:35.939 06:03:50 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:35:35.939 06:03:50 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:35:35.939 06:03:50 
blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:35:35.939 06:03:50 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:35:35.939 06:03:50 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:35:35.939 1+0 records in 00:35:35.939 1+0 records out 00:35:35.939 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000324532 s, 12.6 MB/s 00:35:35.939 06:03:50 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:35:35.939 06:03:50 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:35:35.939 06:03:50 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:35:35.939 06:03:50 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:35:35.939 06:03:50 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:35:35.939 06:03:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:35:35.939 06:03:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:35:35.939 06:03:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram2 00:35:36.197 06:03:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd2 00:35:36.197 06:03:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd2 00:35:36.197 06:03:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd2 00:35:36.197 06:03:51 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd2 00:35:36.197 06:03:51 blockdev_crypto_qat.bdev_nbd -- 
common/autotest_common.sh@867 -- # local i 00:35:36.197 06:03:51 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:35:36.197 06:03:51 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:35:36.197 06:03:51 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd2 /proc/partitions 00:35:36.197 06:03:51 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:35:36.197 06:03:51 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:35:36.197 06:03:51 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:35:36.197 06:03:51 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd2 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:35:36.197 1+0 records in 00:35:36.197 1+0 records out 00:35:36.197 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000290158 s, 14.1 MB/s 00:35:36.197 06:03:51 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:35:36.197 06:03:51 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:35:36.197 06:03:51 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:35:36.197 06:03:51 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:35:36.197 06:03:51 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:35:36.197 06:03:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:35:36.197 06:03:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:35:36.197 06:03:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock 
nbd_start_disk crypto_ram3 00:35:36.456 06:03:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd3 00:35:36.456 06:03:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd3 00:35:36.456 06:03:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd3 00:35:36.456 06:03:51 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd3 00:35:36.456 06:03:51 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:35:36.456 06:03:51 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:35:36.456 06:03:51 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:35:36.456 06:03:51 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd3 /proc/partitions 00:35:36.456 06:03:51 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:35:36.456 06:03:51 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:35:36.456 06:03:51 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:35:36.456 06:03:51 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd3 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:35:36.456 1+0 records in 00:35:36.456 1+0 records out 00:35:36.456 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000408295 s, 10.0 MB/s 00:35:36.456 06:03:51 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:35:36.456 06:03:51 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:35:36.456 06:03:51 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:35:36.456 06:03:51 blockdev_crypto_qat.bdev_nbd -- 
common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:35:36.456 06:03:51 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:35:36.456 06:03:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:35:36.456 06:03:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:35:36.456 06:03:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@118 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:35:36.714 06:03:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@118 -- # nbd_disks_json='[ 00:35:36.714 { 00:35:36.714 "nbd_device": "/dev/nbd0", 00:35:36.714 "bdev_name": "crypto_ram" 00:35:36.714 }, 00:35:36.714 { 00:35:36.714 "nbd_device": "/dev/nbd1", 00:35:36.714 "bdev_name": "crypto_ram1" 00:35:36.714 }, 00:35:36.714 { 00:35:36.714 "nbd_device": "/dev/nbd2", 00:35:36.714 "bdev_name": "crypto_ram2" 00:35:36.714 }, 00:35:36.714 { 00:35:36.714 "nbd_device": "/dev/nbd3", 00:35:36.714 "bdev_name": "crypto_ram3" 00:35:36.714 } 00:35:36.714 ]' 00:35:36.714 06:03:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@119 -- # nbd_disks_name=($(echo "${nbd_disks_json}" | jq -r '.[] | .nbd_device')) 00:35:36.714 06:03:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@119 -- # echo '[ 00:35:36.714 { 00:35:36.714 "nbd_device": "/dev/nbd0", 00:35:36.714 "bdev_name": "crypto_ram" 00:35:36.714 }, 00:35:36.714 { 00:35:36.714 "nbd_device": "/dev/nbd1", 00:35:36.714 "bdev_name": "crypto_ram1" 00:35:36.714 }, 00:35:36.714 { 00:35:36.714 "nbd_device": "/dev/nbd2", 00:35:36.714 "bdev_name": "crypto_ram2" 00:35:36.714 }, 00:35:36.714 { 00:35:36.714 "nbd_device": "/dev/nbd3", 00:35:36.714 "bdev_name": "crypto_ram3" 00:35:36.714 } 00:35:36.714 ]' 00:35:36.714 06:03:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@119 -- # jq -r '.[] | .nbd_device' 00:35:36.972 06:03:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@120 -- # nbd_stop_disks 
/var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd2 /dev/nbd3' 00:35:36.972 06:03:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:35:36.972 06:03:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd2' '/dev/nbd3') 00:35:36.972 06:03:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:35:36.972 06:03:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:35:36.972 06:03:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:35:36.972 06:03:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:35:37.231 06:03:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:35:37.231 06:03:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:35:37.231 06:03:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:35:37.231 06:03:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:35:37.231 06:03:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:35:37.231 06:03:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:35:37.231 06:03:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:35:37.231 06:03:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:35:37.231 06:03:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:35:37.231 06:03:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:35:37.231 06:03:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 
00:35:37.231 06:03:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:35:37.231 06:03:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:35:37.231 06:03:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:35:37.231 06:03:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:35:37.231 06:03:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:35:37.231 06:03:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:35:37.231 06:03:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:35:37.231 06:03:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:35:37.231 06:03:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd2 00:35:37.489 06:03:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd2 00:35:37.489 06:03:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd2 00:35:37.489 06:03:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd2 00:35:37.489 06:03:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:35:37.489 06:03:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:35:37.489 06:03:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd2 /proc/partitions 00:35:37.489 06:03:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:35:37.489 06:03:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:35:37.489 06:03:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:35:37.489 06:03:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd3 00:35:37.747 06:03:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd3 00:35:37.747 06:03:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd3 00:35:37.747 06:03:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd3 00:35:37.747 06:03:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:35:37.747 06:03:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:35:37.747 06:03:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd3 /proc/partitions 00:35:37.747 06:03:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@39 -- # sleep 0.1 00:35:38.005 06:03:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i++ )) 00:35:38.005 06:03:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:35:38.005 06:03:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd3 /proc/partitions 00:35:38.005 06:03:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:35:38.005 06:03:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:35:38.005 06:03:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@122 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:35:38.005 06:03:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:35:38.005 06:03:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:35:38.263 06:03:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:35:38.263 06:03:53 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:35:38.263 06:03:53 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | 
.nbd_device' 00:35:38.263 06:03:53 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:35:38.263 06:03:53 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:35:38.263 06:03:53 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:35:38.263 06:03:53 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:35:38.263 06:03:53 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:35:38.263 06:03:53 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:35:38.263 06:03:53 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@122 -- # count=0 00:35:38.263 06:03:53 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@123 -- # '[' 0 -ne 0 ']' 00:35:38.263 06:03:53 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@127 -- # return 0 00:35:38.263 06:03:53 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@322 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:35:38.263 06:03:53 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:35:38.263 06:03:53 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@91 -- # bdev_list=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:35:38.263 06:03:53 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@91 -- # local bdev_list 00:35:38.263 06:03:53 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:35:38.263 06:03:53 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@92 -- # local nbd_list 00:35:38.263 06:03:53 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:35:38.263 06:03:53 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 
00:35:38.263 06:03:53 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@10 -- # bdev_list=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:35:38.263 06:03:53 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@10 -- # local bdev_list 00:35:38.263 06:03:53 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:35:38.263 06:03:53 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@11 -- # local nbd_list 00:35:38.263 06:03:53 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@12 -- # local i 00:35:38.263 06:03:53 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:35:38.263 06:03:53 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:35:38.263 06:03:53 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram /dev/nbd0 00:35:38.572 /dev/nbd0 00:35:38.572 06:03:53 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:35:38.572 06:03:53 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:35:38.572 06:03:53 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:35:38.572 06:03:53 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:35:38.572 06:03:53 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:35:38.572 06:03:53 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:35:38.572 06:03:53 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:35:38.572 06:03:53 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:35:38.572 06:03:53 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:35:38.572 06:03:53 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- 
# (( i <= 20 )) 00:35:38.572 06:03:53 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:35:38.572 1+0 records in 00:35:38.572 1+0 records out 00:35:38.572 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000316622 s, 12.9 MB/s 00:35:38.573 06:03:53 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:35:38.573 06:03:53 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:35:38.573 06:03:53 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:35:38.573 06:03:53 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:35:38.573 06:03:53 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:35:38.573 06:03:53 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:35:38.573 06:03:53 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:35:38.573 06:03:53 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram1 /dev/nbd1 00:35:38.844 /dev/nbd1 00:35:38.844 06:03:53 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:35:38.844 06:03:53 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:35:38.844 06:03:53 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:35:38.844 06:03:53 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:35:38.844 06:03:53 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:35:38.844 06:03:53 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 
20 )) 00:35:38.844 06:03:53 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:35:38.844 06:03:53 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:35:38.844 06:03:53 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:35:38.844 06:03:53 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:35:38.844 06:03:53 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:35:38.844 1+0 records in 00:35:38.844 1+0 records out 00:35:38.844 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00034731 s, 11.8 MB/s 00:35:38.844 06:03:53 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:35:38.844 06:03:53 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:35:38.844 06:03:53 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:35:38.844 06:03:53 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:35:38.844 06:03:53 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:35:38.844 06:03:53 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:35:38.844 06:03:53 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:35:38.844 06:03:53 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram2 /dev/nbd10 00:35:39.103 /dev/nbd10 00:35:39.103 06:03:53 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd10 00:35:39.103 06:03:53 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # 
waitfornbd nbd10 00:35:39.103 06:03:53 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd10 00:35:39.103 06:03:53 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:35:39.103 06:03:53 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:35:39.103 06:03:53 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:35:39.103 06:03:53 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd10 /proc/partitions 00:35:39.103 06:03:53 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:35:39.103 06:03:53 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:35:39.103 06:03:53 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:35:39.103 06:03:53 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd10 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:35:39.103 1+0 records in 00:35:39.103 1+0 records out 00:35:39.103 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000333071 s, 12.3 MB/s 00:35:39.103 06:03:53 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:35:39.103 06:03:53 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:35:39.103 06:03:53 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:35:39.103 06:03:53 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:35:39.103 06:03:53 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:35:39.103 06:03:53 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:35:39.103 06:03:53 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 
)) 00:35:39.103 06:03:53 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 /dev/nbd11 00:35:39.362 /dev/nbd11 00:35:39.362 06:03:54 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd11 00:35:39.362 06:03:54 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd11 00:35:39.362 06:03:54 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd11 00:35:39.362 06:03:54 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:35:39.362 06:03:54 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:35:39.362 06:03:54 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:35:39.362 06:03:54 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd11 /proc/partitions 00:35:39.362 06:03:54 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:35:39.362 06:03:54 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:35:39.362 06:03:54 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:35:39.362 06:03:54 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd11 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:35:39.362 1+0 records in 00:35:39.362 1+0 records out 00:35:39.362 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000329648 s, 12.4 MB/s 00:35:39.362 06:03:54 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:35:39.362 06:03:54 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:35:39.362 06:03:54 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:35:39.362 06:03:54 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:35:39.362 06:03:54 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:35:39.362 06:03:54 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:35:39.362 06:03:54 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:35:39.362 06:03:54 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:35:39.362 06:03:54 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:35:39.362 06:03:54 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:35:39.621 06:03:54 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:35:39.621 { 00:35:39.621 "nbd_device": "/dev/nbd0", 00:35:39.621 "bdev_name": "crypto_ram" 00:35:39.621 }, 00:35:39.621 { 00:35:39.621 "nbd_device": "/dev/nbd1", 00:35:39.621 "bdev_name": "crypto_ram1" 00:35:39.621 }, 00:35:39.621 { 00:35:39.621 "nbd_device": "/dev/nbd10", 00:35:39.621 "bdev_name": "crypto_ram2" 00:35:39.621 }, 00:35:39.621 { 00:35:39.621 "nbd_device": "/dev/nbd11", 00:35:39.621 "bdev_name": "crypto_ram3" 00:35:39.621 } 00:35:39.621 ]' 00:35:39.621 06:03:54 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[ 00:35:39.621 { 00:35:39.621 "nbd_device": "/dev/nbd0", 00:35:39.621 "bdev_name": "crypto_ram" 00:35:39.621 }, 00:35:39.621 { 00:35:39.621 "nbd_device": "/dev/nbd1", 00:35:39.621 "bdev_name": "crypto_ram1" 00:35:39.621 }, 00:35:39.621 { 00:35:39.621 "nbd_device": "/dev/nbd10", 00:35:39.621 "bdev_name": "crypto_ram2" 00:35:39.621 }, 00:35:39.621 { 00:35:39.621 "nbd_device": "/dev/nbd11", 00:35:39.621 "bdev_name": "crypto_ram3" 00:35:39.621 } 00:35:39.621 ]' 
00:35:39.621 06:03:54 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:35:39.621 06:03:54 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:35:39.621 /dev/nbd1 00:35:39.621 /dev/nbd10 00:35:39.621 /dev/nbd11' 00:35:39.621 06:03:54 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:35:39.621 /dev/nbd1 00:35:39.621 /dev/nbd10 00:35:39.621 /dev/nbd11' 00:35:39.621 06:03:54 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:35:39.621 06:03:54 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=4 00:35:39.621 06:03:54 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 4 00:35:39.621 06:03:54 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@95 -- # count=4 00:35:39.621 06:03:54 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@96 -- # '[' 4 -ne 4 ']' 00:35:39.621 06:03:54 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' write 00:35:39.621 06:03:54 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:35:39.621 06:03:54 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:35:39.621 06:03:54 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=write 00:35:39.621 06:03:54 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:35:39.621 06:03:54 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:35:39.621 06:03:54 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest bs=4096 count=256 00:35:39.621 256+0 records in 00:35:39.621 256+0 records out 00:35:39.621 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.00672282 s, 156 
MB/s 00:35:39.621 06:03:54 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:35:39.621 06:03:54 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:35:39.880 256+0 records in 00:35:39.880 256+0 records out 00:35:39.880 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0820611 s, 12.8 MB/s 00:35:39.880 06:03:54 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:35:39.880 06:03:54 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:35:39.880 256+0 records in 00:35:39.880 256+0 records out 00:35:39.880 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0565387 s, 18.5 MB/s 00:35:39.880 06:03:54 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:35:39.880 06:03:54 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd10 bs=4096 count=256 oflag=direct 00:35:39.880 256+0 records in 00:35:39.880 256+0 records out 00:35:39.880 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0576251 s, 18.2 MB/s 00:35:39.880 06:03:54 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:35:39.880 06:03:54 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd11 bs=4096 count=256 oflag=direct 00:35:39.880 256+0 records in 00:35:39.880 256+0 records out 00:35:39.880 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0331617 s, 31.6 MB/s 00:35:39.880 06:03:54 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' verify 00:35:39.880 06:03:54 blockdev_crypto_qat.bdev_nbd -- 
bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:35:39.880 06:03:54 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:35:39.880 06:03:54 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=verify 00:35:39.880 06:03:54 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:35:39.880 06:03:54 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:35:39.880 06:03:54 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:35:39.880 06:03:54 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:35:39.880 06:03:54 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd0 00:35:39.880 06:03:54 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:35:39.880 06:03:54 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd1 00:35:39.880 06:03:54 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:35:39.880 06:03:54 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd10 00:35:39.880 06:03:54 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:35:39.880 06:03:54 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd11 00:35:39.880 06:03:54 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:35:40.139 06:03:54 blockdev_crypto_qat.bdev_nbd -- 
bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:35:40.139 06:03:54 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:35:40.139 06:03:54 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:35:40.139 06:03:54 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:35:40.139 06:03:54 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:35:40.139 06:03:54 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:35:40.139 06:03:54 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:35:40.139 06:03:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:35:40.397 06:03:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:35:40.397 06:03:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:35:40.397 06:03:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:35:40.397 06:03:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:35:40.397 06:03:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:35:40.397 06:03:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:35:40.397 06:03:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:35:40.397 06:03:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:35:40.397 06:03:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:35:40.655 06:03:55 blockdev_crypto_qat.bdev_nbd -- 
bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:35:40.655 06:03:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:35:40.655 06:03:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:35:40.655 06:03:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:35:40.655 06:03:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:35:40.655 06:03:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:35:40.655 06:03:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:35:40.655 06:03:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:35:40.655 06:03:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:35:40.655 06:03:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd10 00:35:40.655 06:03:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd10 00:35:40.655 06:03:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd10 00:35:40.655 06:03:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd10 00:35:40.655 06:03:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:35:40.655 06:03:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:35:40.655 06:03:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd10 /proc/partitions 00:35:40.655 06:03:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:35:40.655 06:03:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:35:40.655 06:03:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:35:40.655 06:03:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 
-- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd11 00:35:40.914 06:03:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd11 00:35:40.914 06:03:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd11 00:35:40.914 06:03:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd11 00:35:40.914 06:03:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:35:40.914 06:03:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:35:40.914 06:03:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd11 /proc/partitions 00:35:40.914 06:03:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:35:40.914 06:03:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:35:40.914 06:03:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:35:40.914 06:03:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:35:40.914 06:03:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:35:41.172 06:03:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:35:41.172 06:03:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:35:41.172 06:03:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:35:41.172 06:03:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:35:41.172 06:03:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:35:41.172 06:03:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:35:41.431 06:03:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 
00:35:41.431 06:03:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:35:41.431 06:03:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:35:41.431 06:03:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@104 -- # count=0 00:35:41.431 06:03:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:35:41.431 06:03:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@109 -- # return 0 00:35:41.431 06:03:56 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@323 -- # nbd_with_lvol_verify /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:35:41.431 06:03:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@131 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:35:41.431 06:03:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@132 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:35:41.431 06:03:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@132 -- # local nbd_list 00:35:41.431 06:03:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@133 -- # local mkfs_ret 00:35:41.431 06:03:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@135 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create -b malloc_lvol_verify 16 512 00:35:41.431 malloc_lvol_verify 00:35:41.431 06:03:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@136 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create_lvstore malloc_lvol_verify lvs 00:35:41.690 dbeb07a7-aa2d-40cd-884d-bdb65c3c910f 00:35:41.690 06:03:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@137 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create lvol 4 -l lvs 00:35:41.949 fabc2cba-1239-4cc6-aca0-21e14e20cfca 00:35:41.949 06:03:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@138 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
-s /var/tmp/spdk-nbd.sock nbd_start_disk lvs/lvol /dev/nbd0 00:35:42.208 /dev/nbd0 00:35:42.208 06:03:57 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@140 -- # mkfs.ext4 /dev/nbd0 00:35:42.208 mke2fs 1.46.5 (30-Dec-2021) 00:35:42.208 Discarding device blocks: 0/4096 done 00:35:42.208 Creating filesystem with 4096 1k blocks and 1024 inodes 00:35:42.208 00:35:42.208 Allocating group tables: 0/1 done 00:35:42.208 Writing inode tables: 0/1 done 00:35:42.208 Creating journal (1024 blocks): done 00:35:42.208 Writing superblocks and filesystem accounting information: 0/1 done 00:35:42.208 00:35:42.208 06:03:57 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@141 -- # mkfs_ret=0 00:35:42.208 06:03:57 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@142 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock /dev/nbd0 00:35:42.208 06:03:57 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:35:42.208 06:03:57 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:35:42.208 06:03:57 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:35:42.208 06:03:57 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:35:42.208 06:03:57 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:35:42.208 06:03:57 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:35:42.467 06:03:57 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:35:42.467 06:03:57 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:35:42.467 06:03:57 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:35:42.467 06:03:57 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:35:42.467 06:03:57 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 
-- # (( i <= 20 )) 00:35:42.467 06:03:57 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:35:42.467 06:03:57 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:35:42.467 06:03:57 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:35:42.467 06:03:57 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@143 -- # '[' 0 -ne 0 ']' 00:35:42.467 06:03:57 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@147 -- # return 0 00:35:42.467 06:03:57 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@325 -- # killprocess 1325812 00:35:42.467 06:03:57 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@948 -- # '[' -z 1325812 ']' 00:35:42.467 06:03:57 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@952 -- # kill -0 1325812 00:35:42.467 06:03:57 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@953 -- # uname 00:35:42.467 06:03:57 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:35:42.467 06:03:57 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1325812 00:35:42.726 06:03:57 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:35:42.726 06:03:57 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:35:42.726 06:03:57 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1325812' 00:35:42.726 killing process with pid 1325812 00:35:42.726 06:03:57 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@967 -- # kill 1325812 00:35:42.726 06:03:57 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@972 -- # wait 1325812 00:35:43.293 06:03:58 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@326 -- # trap - SIGINT SIGTERM EXIT 00:35:43.293 00:35:43.293 real 0m10.598s 00:35:43.293 user 0m13.788s 00:35:43.293 sys 0m4.011s 00:35:43.293 06:03:58 blockdev_crypto_qat.bdev_nbd 
-- common/autotest_common.sh@1124 -- # xtrace_disable 00:35:43.293 06:03:58 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:35:43.293 ************************************ 00:35:43.293 END TEST bdev_nbd 00:35:43.293 ************************************ 00:35:43.293 06:03:58 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 0 00:35:43.293 06:03:58 blockdev_crypto_qat -- bdev/blockdev.sh@762 -- # [[ y == y ]] 00:35:43.293 06:03:58 blockdev_crypto_qat -- bdev/blockdev.sh@763 -- # '[' crypto_qat = nvme ']' 00:35:43.293 06:03:58 blockdev_crypto_qat -- bdev/blockdev.sh@763 -- # '[' crypto_qat = gpt ']' 00:35:43.293 06:03:58 blockdev_crypto_qat -- bdev/blockdev.sh@767 -- # run_test bdev_fio fio_test_suite '' 00:35:43.293 06:03:58 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:35:43.293 06:03:58 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable 00:35:43.293 06:03:58 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:35:43.293 ************************************ 00:35:43.293 START TEST bdev_fio 00:35:43.293 ************************************ 00:35:43.293 06:03:58 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1123 -- # fio_test_suite '' 00:35:43.293 06:03:58 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@330 -- # local env_context 00:35:43.293 06:03:58 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@334 -- # pushd /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:35:43.293 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev /var/jenkins/workspace/crypto-phy-autotest/spdk 00:35:43.293 06:03:58 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@335 -- # trap 'rm -f ./*.state; popd; exit 1' SIGINT SIGTERM EXIT 00:35:43.293 06:03:58 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@338 -- # echo '' 00:35:43.293 06:03:58 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@338 -- # sed s/--env-context=// 00:35:43.293 06:03:58 
blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@338 -- # env_context= 00:35:43.293 06:03:58 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@339 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio verify AIO '' 00:35:43.293 06:03:58 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:35:43.293 06:03:58 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=verify 00:35:43.293 06:03:58 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type=AIO 00:35:43.293 06:03:58 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:35:43.293 06:03:58 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:35:43.293 06:03:58 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:35:43.293 06:03:58 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z verify ']' 00:35:43.293 06:03:58 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:35:43.293 06:03:58 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:35:43.293 06:03:58 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:35:43.293 06:03:58 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1313 -- # '[' verify == verify ']' 00:35:43.293 06:03:58 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1314 -- # cat 00:35:43.293 06:03:58 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1323 -- # '[' AIO == AIO ']' 00:35:43.293 06:03:58 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1324 -- # /usr/src/fio/fio --version 00:35:43.293 06:03:58 blockdev_crypto_qat.bdev_fio -- 
common/autotest_common.sh@1324 -- # [[ fio-3.35 == *\f\i\o\-\3* ]] 00:35:43.293 06:03:58 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1325 -- # echo serialize_overlap=1 00:35:43.552 06:03:58 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:35:43.552 06:03:58 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_crypto_ram]' 00:35:43.552 06:03:58 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=crypto_ram 00:35:43.552 06:03:58 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:35:43.552 06:03:58 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_crypto_ram1]' 00:35:43.552 06:03:58 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=crypto_ram1 00:35:43.552 06:03:58 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:35:43.552 06:03:58 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_crypto_ram2]' 00:35:43.552 06:03:58 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=crypto_ram2 00:35:43.552 06:03:58 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:35:43.552 06:03:58 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_crypto_ram3]' 00:35:43.552 06:03:58 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=crypto_ram3 00:35:43.552 06:03:58 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@346 -- # local 'fio_params=--ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json' 00:35:43.552 06:03:58 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@348 -- # run_test bdev_fio_rw_verify fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:35:43.552 06:03:58 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:35:43.552 06:03:58 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1105 -- # xtrace_disable 00:35:43.552 06:03:58 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:35:43.552 ************************************ 00:35:43.552 START TEST bdev_fio_rw_verify 00:35:43.552 ************************************ 00:35:43.552 06:03:58 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1123 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:35:43.552 06:03:58 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:35:43.552 06:03:58 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:35:43.552 06:03:58 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:35:43.552 06:03:58 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- 
common/autotest_common.sh@1339 -- # local sanitizers 00:35:43.552 06:03:58 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:35:43.552 06:03:58 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # shift 00:35:43.553 06:03:58 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1343 -- # local asan_lib= 00:35:43.553 06:03:58 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:35:43.553 06:03:58 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:35:43.553 06:03:58 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libasan 00:35:43.553 06:03:58 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:35:43.553 06:03:58 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:35:43.553 06:03:58 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:35:43.553 06:03:58 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:35:43.553 06:03:58 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:35:43.553 06:03:58 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:35:43.553 06:03:58 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:35:43.553 06:03:58 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 
00:35:43.553 06:03:58 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:35:43.553 06:03:58 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:35:43.553 06:03:58 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:35:43.811 job_crypto_ram: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:35:43.811 job_crypto_ram1: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:35:43.811 job_crypto_ram2: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:35:43.811 job_crypto_ram3: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:35:43.811 fio-3.35 00:35:43.811 Starting 4 threads 00:35:58.685 00:35:58.685 job_crypto_ram: (groupid=0, jobs=4): err= 0: pid=1327850: Fri Jul 26 06:04:11 2024 00:35:58.685 read: IOPS=20.3k, BW=79.3MiB/s (83.1MB/s)(793MiB/10001msec) 00:35:58.685 slat (usec): min=11, max=1535, avg=69.10, stdev=35.41 00:35:58.686 clat (usec): min=18, max=2406, avg=375.45, stdev=228.60 00:35:58.686 lat (usec): min=48, max=2534, avg=444.55, stdev=245.78 00:35:58.686 clat percentiles (usec): 00:35:58.686 | 50.000th=[ 318], 99.000th=[ 1057], 99.900th=[ 1287], 99.990th=[ 1565], 00:35:58.686 | 99.999th=[ 2245] 00:35:58.686 write: IOPS=22.3k, BW=87.1MiB/s (91.4MB/s)(849MiB/9741msec); 0 zone resets 00:35:58.686 slat (usec): 
min=18, max=346, avg=79.79, stdev=34.24 00:35:58.686 clat (usec): min=22, max=1731, avg=412.23, stdev=239.94 00:35:58.686 lat (usec): min=52, max=1969, avg=492.02, stdev=256.30 00:35:58.686 clat percentiles (usec): 00:35:58.686 | 50.000th=[ 363], 99.000th=[ 1123], 99.900th=[ 1418], 99.990th=[ 1549], 00:35:58.686 | 99.999th=[ 1680] 00:35:58.686 bw ( KiB/s): min=64944, max=116412, per=97.69%, avg=87159.79, stdev=3939.88, samples=76 00:35:58.686 iops : min=16236, max=29102, avg=21789.89, stdev=984.94, samples=76 00:35:58.686 lat (usec) : 20=0.01%, 50=0.02%, 100=3.23%, 250=30.30%, 500=38.22% 00:35:58.686 lat (usec) : 750=19.28%, 1000=7.13% 00:35:58.686 lat (msec) : 2=1.80%, 4=0.01% 00:35:58.686 cpu : usr=99.62%, sys=0.00%, ctx=54, majf=0, minf=254 00:35:58.686 IO depths : 1=5.9%, 2=26.9%, 4=53.8%, 8=13.4%, 16=0.0%, 32=0.0%, >=64=0.0% 00:35:58.686 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:35:58.686 complete : 0=0.0%, 4=88.2%, 8=11.8%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:35:58.686 issued rwts: total=202963,217277,0,0 short=0,0,0,0 dropped=0,0,0,0 00:35:58.686 latency : target=0, window=0, percentile=100.00%, depth=8 00:35:58.686 00:35:58.686 Run status group 0 (all jobs): 00:35:58.686 READ: bw=79.3MiB/s (83.1MB/s), 79.3MiB/s-79.3MiB/s (83.1MB/s-83.1MB/s), io=793MiB (831MB), run=10001-10001msec 00:35:58.686 WRITE: bw=87.1MiB/s (91.4MB/s), 87.1MiB/s-87.1MiB/s (91.4MB/s-91.4MB/s), io=849MiB (890MB), run=9741-9741msec 00:35:58.686 00:35:58.686 real 0m13.440s 00:35:58.686 user 0m45.756s 00:35:58.686 sys 0m0.469s 00:35:58.686 06:04:11 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:35:58.686 06:04:11 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@10 -- # set +x 00:35:58.686 ************************************ 00:35:58.686 END TEST bdev_fio_rw_verify 00:35:58.686 ************************************ 00:35:58.686 06:04:11 blockdev_crypto_qat.bdev_fio -- 
common/autotest_common.sh@1142 -- # return 0 00:35:58.686 06:04:11 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@349 -- # rm -f 00:35:58.686 06:04:11 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@350 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:35:58.686 06:04:11 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@353 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio trim '' '' 00:35:58.686 06:04:11 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:35:58.686 06:04:11 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=trim 00:35:58.686 06:04:11 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type= 00:35:58.686 06:04:11 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:35:58.686 06:04:11 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:35:58.686 06:04:11 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:35:58.686 06:04:11 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z trim ']' 00:35:58.686 06:04:11 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:35:58.686 06:04:11 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:35:58.686 06:04:11 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:35:58.686 06:04:11 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1313 -- # '[' trim == verify ']' 00:35:58.686 06:04:11 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1328 -- # '[' trim == trim ']' 00:35:58.686 06:04:11 blockdev_crypto_qat.bdev_fio -- 
common/autotest_common.sh@1329 -- # echo rw=trimwrite 00:35:58.686 06:04:11 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@354 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:35:58.686 06:04:11 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@354 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "62197cec-def0-5421-b4d2-12dd60f43734"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "62197cec-def0-5421-b4d2-12dd60f43734",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_qat_cbc"' ' }' ' }' '}' '{' ' "name": "crypto_ram1",' ' "aliases": [' ' "ee0222af-942a-5009-a776-2977ad0ee39d"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "ee0222af-942a-5009-a776-2977ad0ee39d",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' 
"write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram1",' ' "key_name": "test_dek_qat_xts"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "9636b16c-2ed6-5c97-bca3-b867452af650"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "9636b16c-2ed6-5c97-bca3-b867452af650",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_qat_cbc2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "76e8e2a8-7d1f-5063-9e4c-5d57659a6096"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "76e8e2a8-7d1f-5063-9e4c-5d57659a6096",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' 
"w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_qat_xts2"' ' }' ' }' '}' 00:35:58.686 06:04:11 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@354 -- # [[ -n crypto_ram 00:35:58.686 crypto_ram1 00:35:58.686 crypto_ram2 00:35:58.686 crypto_ram3 ]] 00:35:58.686 06:04:11 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@355 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:35:58.686 06:04:11 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@355 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "62197cec-def0-5421-b4d2-12dd60f43734"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "62197cec-def0-5421-b4d2-12dd60f43734",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": 
false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_qat_cbc"' ' }' ' }' '}' '{' ' "name": "crypto_ram1",' ' "aliases": [' ' "ee0222af-942a-5009-a776-2977ad0ee39d"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "ee0222af-942a-5009-a776-2977ad0ee39d",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram1",' ' "key_name": "test_dek_qat_xts"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "9636b16c-2ed6-5c97-bca3-b867452af650"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "9636b16c-2ed6-5c97-bca3-b867452af650",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' 
"nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_qat_cbc2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "76e8e2a8-7d1f-5063-9e4c-5d57659a6096"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "76e8e2a8-7d1f-5063-9e4c-5d57659a6096",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_qat_xts2"' ' }' ' }' '}' 00:35:58.686 06:04:11 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:35:58.686 06:04:11 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_crypto_ram]' 
00:35:58.686 06:04:11 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=crypto_ram 00:35:58.686 06:04:11 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:35:58.686 06:04:11 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_crypto_ram1]' 00:35:58.686 06:04:11 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=crypto_ram1 00:35:58.686 06:04:11 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:35:58.686 06:04:11 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_crypto_ram2]' 00:35:58.686 06:04:11 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=crypto_ram2 00:35:58.686 06:04:11 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:35:58.686 06:04:11 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_crypto_ram3]' 00:35:58.686 06:04:11 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=crypto_ram3 00:35:58.686 06:04:11 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@366 -- # run_test bdev_fio_trim fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:35:58.686 06:04:11 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:35:58.686 06:04:11 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1105 -- # xtrace_disable 00:35:58.686 06:04:11 blockdev_crypto_qat.bdev_fio -- 
common/autotest_common.sh@10 -- # set +x 00:35:58.686 ************************************ 00:35:58.686 START TEST bdev_fio_trim 00:35:58.686 ************************************ 00:35:58.686 06:04:11 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1123 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:35:58.686 06:04:11 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:35:58.686 06:04:11 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:35:58.686 06:04:11 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:35:58.686 06:04:11 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # local sanitizers 00:35:58.686 06:04:11 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:35:58.686 06:04:11 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # shift 00:35:58.686 06:04:11 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1343 -- # local asan_lib= 00:35:58.686 06:04:11 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- 
common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:35:58.686 06:04:11 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:35:58.686 06:04:11 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libasan 00:35:58.686 06:04:11 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:35:58.686 06:04:11 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:35:58.686 06:04:11 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:35:58.686 06:04:11 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:35:58.686 06:04:11 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:35:58.686 06:04:11 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:35:58.686 06:04:11 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:35:58.687 06:04:11 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:35:58.687 06:04:11 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:35:58.687 06:04:11 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:35:58.687 06:04:11 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 
--spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:35:58.687 job_crypto_ram: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:35:58.687 job_crypto_ram1: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:35:58.687 job_crypto_ram2: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:35:58.687 job_crypto_ram3: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:35:58.687 fio-3.35 00:35:58.687 Starting 4 threads 00:36:10.899 00:36:10.899 job_crypto_ram: (groupid=0, jobs=4): err= 0: pid=1329701: Fri Jul 26 06:04:24 2024 00:36:10.899 write: IOPS=27.3k, BW=107MiB/s (112MB/s)(1065MiB/10001msec); 0 zone resets 00:36:10.899 slat (usec): min=19, max=524, avg=87.07, stdev=42.88 00:36:10.899 clat (usec): min=42, max=1969, avg=304.47, stdev=166.66 00:36:10.899 lat (usec): min=72, max=2077, avg=391.54, stdev=189.93 00:36:10.899 clat percentiles (usec): 00:36:10.899 | 50.000th=[ 281], 99.000th=[ 832], 99.900th=[ 889], 99.990th=[ 963], 00:36:10.899 | 99.999th=[ 1713] 00:36:10.899 bw ( KiB/s): min=105472, max=148640, per=100.00%, avg=109120.00, stdev=2963.19, samples=76 00:36:10.899 iops : min=26368, max=37160, avg=27280.00, stdev=740.80, samples=76 00:36:10.899 trim: IOPS=27.3k, BW=107MiB/s (112MB/s)(1065MiB/10001msec); 0 zone resets 00:36:10.899 slat (usec): min=5, max=445, avg=23.74, stdev= 9.15 00:36:10.899 clat (usec): min=24, max=2077, avg=391.76, stdev=190.00 00:36:10.899 lat (usec): min=41, max=2102, avg=415.50, stdev=192.21 00:36:10.899 clat percentiles (usec): 00:36:10.899 | 50.000th=[ 363], 99.000th=[ 1020], 99.900th=[ 1090], 99.990th=[ 1172], 00:36:10.899 | 99.999th=[ 1860] 00:36:10.899 bw ( KiB/s): min=105472, 
max=148640, per=100.00%, avg=109120.00, stdev=2963.22, samples=76 00:36:10.899 iops : min=26368, max=37160, avg=27280.00, stdev=740.80, samples=76 00:36:10.899 lat (usec) : 50=0.06%, 100=3.44%, 250=30.74%, 500=49.26%, 750=12.03% 00:36:10.899 lat (usec) : 1000=3.78% 00:36:10.899 lat (msec) : 2=0.68%, 4=0.01% 00:36:10.899 cpu : usr=99.50%, sys=0.01%, ctx=105, majf=0, minf=95 00:36:10.899 IO depths : 1=12.5%, 2=25.0%, 4=50.0%, 8=12.5%, 16=0.0%, 32=0.0%, >=64=0.0% 00:36:10.899 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:36:10.899 complete : 0=0.0%, 4=88.9%, 8=11.1%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:36:10.899 issued rwts: total=0,272702,272704,0 short=0,0,0,0 dropped=0,0,0,0 00:36:10.899 latency : target=0, window=0, percentile=100.00%, depth=8 00:36:10.899 00:36:10.899 Run status group 0 (all jobs): 00:36:10.899 WRITE: bw=107MiB/s (112MB/s), 107MiB/s-107MiB/s (112MB/s-112MB/s), io=1065MiB (1117MB), run=10001-10001msec 00:36:10.899 TRIM: bw=107MiB/s (112MB/s), 107MiB/s-107MiB/s (112MB/s-112MB/s), io=1065MiB (1117MB), run=10001-10001msec 00:36:10.899 00:36:10.899 real 0m13.518s 00:36:10.899 user 0m45.644s 00:36:10.899 sys 0m0.499s 00:36:10.899 06:04:25 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1124 -- # xtrace_disable 00:36:10.899 06:04:25 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@10 -- # set +x 00:36:10.899 ************************************ 00:36:10.899 END TEST bdev_fio_trim 00:36:10.899 ************************************ 00:36:10.899 06:04:25 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1142 -- # return 0 00:36:10.899 06:04:25 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@367 -- # rm -f 00:36:10.900 06:04:25 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@368 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:36:10.900 06:04:25 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@369 -- # popd 00:36:10.900 
/var/jenkins/workspace/crypto-phy-autotest/spdk 00:36:10.900 06:04:25 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@370 -- # trap - SIGINT SIGTERM EXIT 00:36:10.900 00:36:10.900 real 0m27.309s 00:36:10.900 user 1m31.563s 00:36:10.900 sys 0m1.180s 00:36:10.900 06:04:25 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1124 -- # xtrace_disable 00:36:10.900 06:04:25 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:36:10.900 ************************************ 00:36:10.900 END TEST bdev_fio 00:36:10.900 ************************************ 00:36:10.900 06:04:25 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 0 00:36:10.900 06:04:25 blockdev_crypto_qat -- bdev/blockdev.sh@774 -- # trap cleanup SIGINT SIGTERM EXIT 00:36:10.900 06:04:25 blockdev_crypto_qat -- bdev/blockdev.sh@776 -- # run_test bdev_verify /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:36:10.900 06:04:25 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 16 -le 1 ']' 00:36:10.900 06:04:25 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable 00:36:10.900 06:04:25 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:36:10.900 ************************************ 00:36:10.900 START TEST bdev_verify 00:36:10.900 ************************************ 00:36:10.900 06:04:25 blockdev_crypto_qat.bdev_verify -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:36:10.900 [2024-07-26 06:04:25.585529] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 
00:36:10.900 [2024-07-26 06:04:25.585595] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1330972 ] 00:36:10.900 [2024-07-26 06:04:25.716720] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:36:11.159 [2024-07-26 06:04:25.820214] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:36:11.159 [2024-07-26 06:04:25.820219] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:36:11.159 [2024-07-26 06:04:25.841644] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:36:11.159 [2024-07-26 06:04:25.849678] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:36:11.159 [2024-07-26 06:04:25.857697] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:36:11.159 [2024-07-26 06:04:25.977668] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:36:13.706 [2024-07-26 06:04:28.186797] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:36:13.706 [2024-07-26 06:04:28.186874] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:36:13.706 [2024-07-26 06:04:28.186890] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:36:13.706 [2024-07-26 06:04:28.194815] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:36:13.706 [2024-07-26 06:04:28.194834] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:36:13.706 [2024-07-26 06:04:28.194845] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:36:13.706 
[2024-07-26 06:04:28.202839] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:36:13.706 [2024-07-26 06:04:28.202863] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:36:13.706 [2024-07-26 06:04:28.202875] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:36:13.706 [2024-07-26 06:04:28.210862] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:36:13.706 [2024-07-26 06:04:28.210880] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:36:13.706 [2024-07-26 06:04:28.210891] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:36:13.706 Running I/O for 5 seconds... 00:36:18.988 00:36:18.988 Latency(us) 00:36:18.988 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:36:18.988 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:36:18.988 Verification LBA range: start 0x0 length 0x1000 00:36:18.988 crypto_ram : 5.08 504.10 1.97 0.00 0.00 253328.82 4074.63 172331.19 00:36:18.988 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:36:18.988 Verification LBA range: start 0x1000 length 0x1000 00:36:18.988 crypto_ram : 5.07 504.48 1.97 0.00 0.00 253127.61 5157.40 171419.38 00:36:18.988 Job: crypto_ram1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:36:18.988 Verification LBA range: start 0x0 length 0x1000 00:36:18.988 crypto_ram1 : 5.08 503.99 1.97 0.00 0.00 252627.60 4473.54 163213.13 00:36:18.988 Job: crypto_ram1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:36:18.988 Verification LBA range: start 0x1000 length 0x1000 00:36:18.988 crypto_ram1 : 5.08 504.37 1.97 0.00 0.00 252423.23 5556.31 161389.52 00:36:18.988 Job: crypto_ram2 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:36:18.988 
Verification LBA range: start 0x0 length 0x1000 00:36:18.989 crypto_ram2 : 5.06 3886.70 15.18 0.00 0.00 32624.48 4188.61 26214.40 00:36:18.989 Job: crypto_ram2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:36:18.989 Verification LBA range: start 0x1000 length 0x1000 00:36:18.989 crypto_ram2 : 5.06 3920.72 15.32 0.00 0.00 32369.24 6240.17 26214.40 00:36:18.989 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:36:18.989 Verification LBA range: start 0x0 length 0x1000 00:36:18.989 crypto_ram3 : 5.06 3893.91 15.21 0.00 0.00 32496.11 3932.16 25872.47 00:36:18.989 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:36:18.989 Verification LBA range: start 0x1000 length 0x1000 00:36:18.989 crypto_ram3 : 5.06 3919.19 15.31 0.00 0.00 32278.70 6354.14 26100.42 00:36:18.989 =================================================================================================================== 00:36:18.989 Total : 17637.47 68.90 0.00 0.00 57724.49 3932.16 172331.19 00:36:18.989 00:36:18.989 real 0m8.304s 00:36:18.989 user 0m15.721s 00:36:18.989 sys 0m0.392s 00:36:18.989 06:04:33 blockdev_crypto_qat.bdev_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:36:18.989 06:04:33 blockdev_crypto_qat.bdev_verify -- common/autotest_common.sh@10 -- # set +x 00:36:18.989 ************************************ 00:36:18.989 END TEST bdev_verify 00:36:18.989 ************************************ 00:36:18.989 06:04:33 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 0 00:36:18.989 06:04:33 blockdev_crypto_qat -- bdev/blockdev.sh@777 -- # run_test bdev_verify_big_io /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:36:18.989 06:04:33 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 16 -le 1 ']' 00:36:18.989 06:04:33 blockdev_crypto_qat -- 
common/autotest_common.sh@1105 -- # xtrace_disable 00:36:18.989 06:04:33 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:36:19.246 ************************************ 00:36:19.246 START TEST bdev_verify_big_io 00:36:19.246 ************************************ 00:36:19.246 06:04:33 blockdev_crypto_qat.bdev_verify_big_io -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:36:19.246 [2024-07-26 06:04:33.967866] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 00:36:19.246 [2024-07-26 06:04:33.967928] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1332031 ] 00:36:19.246 [2024-07-26 06:04:34.087672] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:36:19.503 [2024-07-26 06:04:34.191035] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:36:19.503 [2024-07-26 06:04:34.191042] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:36:19.503 [2024-07-26 06:04:34.212478] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:36:19.503 [2024-07-26 06:04:34.220506] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:36:19.503 [2024-07-26 06:04:34.228528] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:36:19.503 [2024-07-26 06:04:34.334068] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:36:22.029 [2024-07-26 06:04:36.543354] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key 
"test_dek_qat_cbc" 00:36:22.029 [2024-07-26 06:04:36.543436] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:36:22.029 [2024-07-26 06:04:36.543450] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:36:22.029 [2024-07-26 06:04:36.551375] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:36:22.029 [2024-07-26 06:04:36.551393] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:36:22.029 [2024-07-26 06:04:36.551405] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:36:22.029 [2024-07-26 06:04:36.559397] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:36:22.029 [2024-07-26 06:04:36.559415] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:36:22.029 [2024-07-26 06:04:36.559426] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:36:22.029 [2024-07-26 06:04:36.567421] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:36:22.029 [2024-07-26 06:04:36.567438] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:36:22.029 [2024-07-26 06:04:36.567450] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:36:22.029 Running I/O for 5 seconds... 00:36:22.596 [2024-07-26 06:04:37.426960] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:22.596 [2024-07-26 06:04:37.427402] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:22.596 [2024-07-26 06:04:37.427790] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:22.596 [2024-07-26 06:04:37.428163] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:22.862 [identical "Failed to get src_mbufs!" errors from accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources repeat continuously from 06:04:37.428163 through 06:04:37.525412; repeats omitted]
00:36:22.862 [2024-07-26 06:04:37.528357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:22.862 [2024-07-26 06:04:37.528406] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:22.862 [2024-07-26 06:04:37.528449] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:22.862 [2024-07-26 06:04:37.528492] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:22.862 [2024-07-26 06:04:37.528948] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:22.862 [2024-07-26 06:04:37.528994] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:22.862 [2024-07-26 06:04:37.529036] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:22.862 [2024-07-26 06:04:37.529079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:22.862 [2024-07-26 06:04:37.529517] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:22.862 [2024-07-26 06:04:37.529535] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:22.862 [2024-07-26 06:04:37.529550] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:22.862 [2024-07-26 06:04:37.529565] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:22.862 [2024-07-26 06:04:37.532711] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:22.862 [2024-07-26 06:04:37.532760] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:22.862 [2024-07-26 06:04:37.532802] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:22.862 [2024-07-26 06:04:37.532844] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:22.862 [2024-07-26 06:04:37.533325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:22.862 [2024-07-26 06:04:37.533370] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:22.862 [2024-07-26 06:04:37.533412] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:22.862 [2024-07-26 06:04:37.533455] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:22.862 [2024-07-26 06:04:37.533856] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:22.862 [2024-07-26 06:04:37.533873] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:22.862 [2024-07-26 06:04:37.533888] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:22.862 [2024-07-26 06:04:37.533903] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:22.862 [2024-07-26 06:04:37.536780] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:22.862 [2024-07-26 06:04:37.536828] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:22.862 [2024-07-26 06:04:37.536870] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:22.862 [2024-07-26 06:04:37.536912] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:22.862 [2024-07-26 06:04:37.537387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:22.862 [2024-07-26 06:04:37.537434] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:22.862 [2024-07-26 06:04:37.537489] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:22.862 [2024-07-26 06:04:37.537543] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:22.862 [2024-07-26 06:04:37.538019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:22.862 [2024-07-26 06:04:37.538038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:22.862 [2024-07-26 06:04:37.538053] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:22.862 [2024-07-26 06:04:37.538067] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:22.862 [2024-07-26 06:04:37.541013] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:22.862 [2024-07-26 06:04:37.541060] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:22.862 [2024-07-26 06:04:37.541107] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:22.862 [2024-07-26 06:04:37.541149] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:22.862 [2024-07-26 06:04:37.541475] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:22.862 [2024-07-26 06:04:37.541518] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:22.862 [2024-07-26 06:04:37.541558] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:22.862 [2024-07-26 06:04:37.541599] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:22.862 [2024-07-26 06:04:37.541911] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:22.862 [2024-07-26 06:04:37.541927] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:22.862 [2024-07-26 06:04:37.541942] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:22.862 [2024-07-26 06:04:37.541956] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:22.862 [2024-07-26 06:04:37.543869] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:22.862 [2024-07-26 06:04:37.543916] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:22.862 [2024-07-26 06:04:37.543956] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:22.862 [2024-07-26 06:04:37.544008] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:22.862 [2024-07-26 06:04:37.544309] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:22.862 [2024-07-26 06:04:37.544355] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:22.862 [2024-07-26 06:04:37.544405] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:22.862 [2024-07-26 06:04:37.544446] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:22.862 [2024-07-26 06:04:37.544716] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:22.863 [2024-07-26 06:04:37.544732] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:22.863 [2024-07-26 06:04:37.544747] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:22.863 [2024-07-26 06:04:37.544762] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:22.863 [2024-07-26 06:04:37.547550] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:22.863 [2024-07-26 06:04:37.547597] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:22.863 [2024-07-26 06:04:37.547645] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:22.863 [2024-07-26 06:04:37.547691] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:22.863 [2024-07-26 06:04:37.548000] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:22.863 [2024-07-26 06:04:37.548051] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:22.863 [2024-07-26 06:04:37.548093] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:22.863 [2024-07-26 06:04:37.548137] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:22.863 [2024-07-26 06:04:37.548407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:22.863 [2024-07-26 06:04:37.548427] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:22.863 [2024-07-26 06:04:37.548441] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:22.863 [2024-07-26 06:04:37.548456] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:22.863 [2024-07-26 06:04:37.550348] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:22.863 [2024-07-26 06:04:37.550405] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:22.863 [2024-07-26 06:04:37.550446] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:22.863 [2024-07-26 06:04:37.550487] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:22.863 [2024-07-26 06:04:37.550801] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:22.863 [2024-07-26 06:04:37.550845] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:22.863 [2024-07-26 06:04:37.550886] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:22.863 [2024-07-26 06:04:37.550926] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:22.863 [2024-07-26 06:04:37.551238] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:22.863 [2024-07-26 06:04:37.551255] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:22.863 [2024-07-26 06:04:37.551270] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:22.863 [2024-07-26 06:04:37.551284] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:22.863 [2024-07-26 06:04:37.554198] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:22.863 [2024-07-26 06:04:37.554245] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:22.863 [2024-07-26 06:04:37.554290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:22.863 [2024-07-26 06:04:37.554330] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:22.863 [2024-07-26 06:04:37.554650] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:22.863 [2024-07-26 06:04:37.554705] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:22.863 [2024-07-26 06:04:37.554749] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:22.863 [2024-07-26 06:04:37.554789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:22.863 [2024-07-26 06:04:37.555058] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:22.863 [2024-07-26 06:04:37.555075] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:22.863 [2024-07-26 06:04:37.555089] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:22.863 [2024-07-26 06:04:37.555103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:22.863 [2024-07-26 06:04:37.556921] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:22.863 [2024-07-26 06:04:37.556979] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:22.863 [2024-07-26 06:04:37.557022] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:22.863 [2024-07-26 06:04:37.557057] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:22.863 [2024-07-26 06:04:37.557373] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:22.863 [2024-07-26 06:04:37.557417] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:22.863 [2024-07-26 06:04:37.557458] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:22.863 [2024-07-26 06:04:37.557498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:22.863 [2024-07-26 06:04:37.557812] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:22.863 [2024-07-26 06:04:37.557829] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:22.863 [2024-07-26 06:04:37.557845] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:22.863 [2024-07-26 06:04:37.557860] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:22.863 [2024-07-26 06:04:37.562134] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:22.863 [2024-07-26 06:04:37.563525] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:22.863 [2024-07-26 06:04:37.565079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:22.863 [2024-07-26 06:04:37.566722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:22.863 [2024-07-26 06:04:37.568377] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:22.863 [2024-07-26 06:04:37.569924] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:22.863 [2024-07-26 06:04:37.571469] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:22.863 [2024-07-26 06:04:37.572450] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:22.863 [2024-07-26 06:04:37.572922] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:22.863 [2024-07-26 06:04:37.572938] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:22.863 [2024-07-26 06:04:37.572954] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:22.863 [2024-07-26 06:04:37.572968] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:22.863 [2024-07-26 06:04:37.576784] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:22.863 [2024-07-26 06:04:37.578339] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:22.863 [2024-07-26 06:04:37.579875] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:22.863 [2024-07-26 06:04:37.580741] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:22.863 [2024-07-26 06:04:37.582647] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:22.863 [2024-07-26 06:04:37.584191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:22.863 [2024-07-26 06:04:37.585405] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:22.863 [2024-07-26 06:04:37.585799] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:22.863 [2024-07-26 06:04:37.586234] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:22.863 [2024-07-26 06:04:37.586251] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:22.863 [2024-07-26 06:04:37.586271] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:22.863 [2024-07-26 06:04:37.586287] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:22.863 [2024-07-26 06:04:37.590153] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:22.864 [2024-07-26 06:04:37.591702] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:22.864 [2024-07-26 06:04:37.592423] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:22.864 [2024-07-26 06:04:37.593730] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:22.864 [2024-07-26 06:04:37.595643] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:22.864 [2024-07-26 06:04:37.597114] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:22.864 [2024-07-26 06:04:37.597504] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:22.864 [2024-07-26 06:04:37.597895] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:22.864 [2024-07-26 06:04:37.598296] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:22.864 [2024-07-26 06:04:37.598312] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:22.864 [2024-07-26 06:04:37.598327] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:22.864 [2024-07-26 06:04:37.598342] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:22.864 [2024-07-26 06:04:37.601975] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:22.864 [2024-07-26 06:04:37.602808] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:22.864 [2024-07-26 06:04:37.604296] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:22.864 [2024-07-26 06:04:37.605845] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:22.864 [2024-07-26 06:04:37.607670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:22.864 [2024-07-26 06:04:37.608070] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:22.864 [2024-07-26 06:04:37.608458] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:22.864 [2024-07-26 06:04:37.608852] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:22.864 [2024-07-26 06:04:37.609287] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:22.864 [2024-07-26 06:04:37.609305] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:22.864 [2024-07-26 06:04:37.609320] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:22.864 [2024-07-26 06:04:37.609334] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:22.864 [2024-07-26 06:04:37.612388] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:22.864 [2024-07-26 06:04:37.613897] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:22.864 [2024-07-26 06:04:37.615364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:22.864 [2024-07-26 06:04:37.616992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:22.864 [2024-07-26 06:04:37.617840] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:22.864 [2024-07-26 06:04:37.618237] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:22.864 [2024-07-26 06:04:37.618626] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:22.864 [2024-07-26 06:04:37.619020] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:22.864 [2024-07-26 06:04:37.619452] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:22.864 [2024-07-26 06:04:37.619469] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:22.864 [2024-07-26 06:04:37.619484] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:22.864 [2024-07-26 06:04:37.619499] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:22.864 [2024-07-26 06:04:37.622309] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:22.864 [2024-07-26 06:04:37.623597] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.131 [last message repeated through 2024-07-26 06:04:37.864590]
00:36:23.131 [2024-07-26 06:04:37.864606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.131 [2024-07-26 06:04:37.864621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.131 [2024-07-26 06:04:37.867236] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.131 [2024-07-26 06:04:37.867629] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.131 [2024-07-26 06:04:37.868023] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.131 [2024-07-26 06:04:37.868412] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.131 [2024-07-26 06:04:37.869181] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.131 [2024-07-26 06:04:37.869576] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.131 [2024-07-26 06:04:37.869969] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.131 [2024-07-26 06:04:37.870371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.131 [2024-07-26 06:04:37.870818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.131 [2024-07-26 06:04:37.870836] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.131 [2024-07-26 06:04:37.870851] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:23.131 [2024-07-26 06:04:37.870866] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.131 [2024-07-26 06:04:37.873668] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.131 [2024-07-26 06:04:37.874065] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.131 [2024-07-26 06:04:37.874462] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.131 [2024-07-26 06:04:37.874864] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.131 [2024-07-26 06:04:37.875716] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.131 [2024-07-26 06:04:37.876107] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.131 [2024-07-26 06:04:37.876498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.131 [2024-07-26 06:04:37.876903] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.131 [2024-07-26 06:04:37.877287] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.131 [2024-07-26 06:04:37.877304] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.131 [2024-07-26 06:04:37.877328] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.131 [2024-07-26 06:04:37.877342] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:23.131 [2024-07-26 06:04:37.880153] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.131 [2024-07-26 06:04:37.880554] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.131 [2024-07-26 06:04:37.880947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.131 [2024-07-26 06:04:37.880992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.131 [2024-07-26 06:04:37.881807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.131 [2024-07-26 06:04:37.882201] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.131 [2024-07-26 06:04:37.882595] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.131 [2024-07-26 06:04:37.882991] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.131 [2024-07-26 06:04:37.883435] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.131 [2024-07-26 06:04:37.883456] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.131 [2024-07-26 06:04:37.883472] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.131 [2024-07-26 06:04:37.883487] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.131 [2024-07-26 06:04:37.886180] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:23.131 [2024-07-26 06:04:37.886572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.131 [2024-07-26 06:04:37.886977] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.131 [2024-07-26 06:04:37.887368] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.131 [2024-07-26 06:04:37.887419] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.131 [2024-07-26 06:04:37.887799] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.131 [2024-07-26 06:04:37.888205] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.131 [2024-07-26 06:04:37.888595] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.131 [2024-07-26 06:04:37.888988] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.131 [2024-07-26 06:04:37.889383] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.131 [2024-07-26 06:04:37.889816] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.131 [2024-07-26 06:04:37.889833] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.131 [2024-07-26 06:04:37.889847] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.132 [2024-07-26 06:04:37.889862] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:23.132 [2024-07-26 06:04:37.892211] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.132 [2024-07-26 06:04:37.892257] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.132 [2024-07-26 06:04:37.892297] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.132 [2024-07-26 06:04:37.892343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.132 [2024-07-26 06:04:37.892795] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.132 [2024-07-26 06:04:37.892846] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.132 [2024-07-26 06:04:37.892888] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.132 [2024-07-26 06:04:37.892930] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.132 [2024-07-26 06:04:37.892995] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.132 [2024-07-26 06:04:37.893387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.132 [2024-07-26 06:04:37.893404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.132 [2024-07-26 06:04:37.893419] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.132 [2024-07-26 06:04:37.893433] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:23.132 [2024-07-26 06:04:37.895769] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.132 [2024-07-26 06:04:37.895815] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.132 [2024-07-26 06:04:37.895857] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.132 [2024-07-26 06:04:37.895899] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.132 [2024-07-26 06:04:37.896364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.132 [2024-07-26 06:04:37.896439] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.132 [2024-07-26 06:04:37.896496] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.132 [2024-07-26 06:04:37.896537] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.132 [2024-07-26 06:04:37.896579] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.132 [2024-07-26 06:04:37.896928] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.132 [2024-07-26 06:04:37.896947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.132 [2024-07-26 06:04:37.896962] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.132 [2024-07-26 06:04:37.896976] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:23.132 [2024-07-26 06:04:37.899303] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.132 [2024-07-26 06:04:37.899350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.132 [2024-07-26 06:04:37.899391] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.132 [2024-07-26 06:04:37.899446] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.132 [2024-07-26 06:04:37.899825] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.132 [2024-07-26 06:04:37.899890] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.132 [2024-07-26 06:04:37.899944] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.132 [2024-07-26 06:04:37.899991] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.132 [2024-07-26 06:04:37.900053] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.132 [2024-07-26 06:04:37.900478] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.132 [2024-07-26 06:04:37.900495] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.132 [2024-07-26 06:04:37.900509] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.132 [2024-07-26 06:04:37.900524] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:23.132 [2024-07-26 06:04:37.903028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.132 [2024-07-26 06:04:37.903085] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.132 [2024-07-26 06:04:37.903128] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.132 [2024-07-26 06:04:37.903169] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.132 [2024-07-26 06:04:37.903532] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.132 [2024-07-26 06:04:37.903593] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.132 [2024-07-26 06:04:37.903636] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.132 [2024-07-26 06:04:37.903684] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.132 [2024-07-26 06:04:37.903725] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.132 [2024-07-26 06:04:37.904194] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.132 [2024-07-26 06:04:37.904211] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.132 [2024-07-26 06:04:37.904227] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.132 [2024-07-26 06:04:37.904242] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:23.132 [2024-07-26 06:04:37.906667] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.132 [2024-07-26 06:04:37.906723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.132 [2024-07-26 06:04:37.906767] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.132 [2024-07-26 06:04:37.906846] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.132 [2024-07-26 06:04:37.907297] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.132 [2024-07-26 06:04:37.907363] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.132 [2024-07-26 06:04:37.907406] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.132 [2024-07-26 06:04:37.907447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.132 [2024-07-26 06:04:37.907488] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.132 [2024-07-26 06:04:37.907928] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.132 [2024-07-26 06:04:37.907946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.132 [2024-07-26 06:04:37.907963] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.132 [2024-07-26 06:04:37.907983] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:23.132 [2024-07-26 06:04:37.910240] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.132 [2024-07-26 06:04:37.910290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.132 [2024-07-26 06:04:37.910332] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.132 [2024-07-26 06:04:37.910388] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.132 [2024-07-26 06:04:37.910876] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.132 [2024-07-26 06:04:37.910929] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.132 [2024-07-26 06:04:37.910972] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.132 [2024-07-26 06:04:37.911015] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.133 [2024-07-26 06:04:37.911057] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.133 [2024-07-26 06:04:37.911459] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.133 [2024-07-26 06:04:37.911476] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.133 [2024-07-26 06:04:37.911491] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.133 [2024-07-26 06:04:37.911505] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:23.133 [2024-07-26 06:04:37.913951] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.133 [2024-07-26 06:04:37.913995] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.133 [2024-07-26 06:04:37.914037] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.133 [2024-07-26 06:04:37.914078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.133 [2024-07-26 06:04:37.914505] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.133 [2024-07-26 06:04:37.914557] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.133 [2024-07-26 06:04:37.914599] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.133 [2024-07-26 06:04:37.914647] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.133 [2024-07-26 06:04:37.914689] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.133 [2024-07-26 06:04:37.915170] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.133 [2024-07-26 06:04:37.915187] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.133 [2024-07-26 06:04:37.915202] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.133 [2024-07-26 06:04:37.915217] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:23.133 [2024-07-26 06:04:37.917555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.133 [2024-07-26 06:04:37.917601] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.133 [2024-07-26 06:04:37.917653] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.133 [2024-07-26 06:04:37.917696] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.133 [2024-07-26 06:04:37.918123] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.133 [2024-07-26 06:04:37.918177] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.133 [2024-07-26 06:04:37.918219] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.133 [2024-07-26 06:04:37.918260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.133 [2024-07-26 06:04:37.918300] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.133 [2024-07-26 06:04:37.918723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.133 [2024-07-26 06:04:37.918740] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.133 [2024-07-26 06:04:37.918755] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.133 [2024-07-26 06:04:37.918770] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:23.133 [2024-07-26 06:04:37.921043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.133 [2024-07-26 06:04:37.921091] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.133 [2024-07-26 06:04:37.921133] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.133 [2024-07-26 06:04:37.921174] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.133 [2024-07-26 06:04:37.921637] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.133 [2024-07-26 06:04:37.921698] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.133 [2024-07-26 06:04:37.921741] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.133 [2024-07-26 06:04:37.921782] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.133 [2024-07-26 06:04:37.921824] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.133 [2024-07-26 06:04:37.922237] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.133 [2024-07-26 06:04:37.922258] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.133 [2024-07-26 06:04:37.922272] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.133 [2024-07-26 06:04:37.922287] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:23.133 [2024-07-26 06:04:37.924733] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.133 [2024-07-26 06:04:37.924778] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.133 [2024-07-26 06:04:37.924819] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.133 [2024-07-26 06:04:37.924859] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.133 [2024-07-26 06:04:37.925300] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.133 [2024-07-26 06:04:37.925352] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.133 [2024-07-26 06:04:37.925395] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.133 [2024-07-26 06:04:37.925436] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.133 [2024-07-26 06:04:37.925482] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.133 [2024-07-26 06:04:37.925821] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.133 [2024-07-26 06:04:37.925848] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.133 [2024-07-26 06:04:37.925863] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.133 [2024-07-26 06:04:37.925878] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:23.137 [2024-07-26 06:04:37.991697] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.137 [2024-07-26 06:04:37.991741] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.137 [2024-07-26 06:04:37.993280] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.137 [2024-07-26 06:04:37.993326] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.137 [2024-07-26 06:04:37.993709] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.137 [2024-07-26 06:04:37.993770] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.137 [2024-07-26 06:04:37.993825] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.137 [2024-07-26 06:04:37.993867] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.137 [2024-07-26 06:04:37.993908] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.137 [2024-07-26 06:04:37.994380] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.137 [2024-07-26 06:04:37.994402] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.137 [2024-07-26 06:04:37.994417] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.137 [2024-07-26 06:04:37.994432] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:23.137 [2024-07-26 06:04:37.996526] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.137 [2024-07-26 06:04:37.996570] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.137 [2024-07-26 06:04:37.996611] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.137 [2024-07-26 06:04:37.998164] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.137 [2024-07-26 06:04:37.998435] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.138 [2024-07-26 06:04:37.998494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.138 [2024-07-26 06:04:37.998535] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.138 [2024-07-26 06:04:37.998576] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.138 [2024-07-26 06:04:37.998617] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.138 [2024-07-26 06:04:37.998973] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.138 [2024-07-26 06:04:37.998990] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.138 [2024-07-26 06:04:37.999005] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.138 [2024-07-26 06:04:37.999019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:23.138 [2024-07-26 06:04:38.001076] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.138 [2024-07-26 06:04:38.001471] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.138 [2024-07-26 06:04:38.001864] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.138 [2024-07-26 06:04:38.002252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.138 [2024-07-26 06:04:38.002605] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.138 [2024-07-26 06:04:38.003918] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.138 [2024-07-26 06:04:38.005444] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.138 [2024-07-26 06:04:38.006980] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.138 [2024-07-26 06:04:38.007916] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.138 [2024-07-26 06:04:38.008187] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.138 [2024-07-26 06:04:38.008204] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.138 [2024-07-26 06:04:38.008218] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.138 [2024-07-26 06:04:38.008232] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:23.138 [2024-07-26 06:04:38.010225] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.138 [2024-07-26 06:04:38.010618] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.138 [2024-07-26 06:04:38.011020] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.138 [2024-07-26 06:04:38.011794] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.138 [2024-07-26 06:04:38.012102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.138 [2024-07-26 06:04:38.013739] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.138 [2024-07-26 06:04:38.015286] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.138 [2024-07-26 06:04:38.016563] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.138 [2024-07-26 06:04:38.017970] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.138 [2024-07-26 06:04:38.018277] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.138 [2024-07-26 06:04:38.018293] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.138 [2024-07-26 06:04:38.018307] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.138 [2024-07-26 06:04:38.018321] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:23.138 [2024-07-26 06:04:38.020480] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.138 [2024-07-26 06:04:38.020882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.138 [2024-07-26 06:04:38.021412] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.138 [2024-07-26 06:04:38.022710] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.138 [2024-07-26 06:04:38.022980] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.138 [2024-07-26 06:04:38.024626] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.138 [2024-07-26 06:04:38.026153] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.138 [2024-07-26 06:04:38.027319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.138 [2024-07-26 06:04:38.028601] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.138 [2024-07-26 06:04:38.028878] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.138 [2024-07-26 06:04:38.028896] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.138 [2024-07-26 06:04:38.028910] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.138 [2024-07-26 06:04:38.028924] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:23.138 [2024-07-26 06:04:38.031466] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.138 [2024-07-26 06:04:38.031863] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.400 [2024-07-26 06:04:38.033342] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.400 [2024-07-26 06:04:38.034897] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.400 [2024-07-26 06:04:38.035168] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.400 [2024-07-26 06:04:38.036755] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.400 [2024-07-26 06:04:38.037714] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.400 [2024-07-26 06:04:38.039023] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.400 [2024-07-26 06:04:38.040552] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.400 [2024-07-26 06:04:38.040831] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.400 [2024-07-26 06:04:38.040847] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.400 [2024-07-26 06:04:38.040862] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.400 [2024-07-26 06:04:38.040876] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:23.400 [2024-07-26 06:04:38.043507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.400 [2024-07-26 06:04:38.045041] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.400 [2024-07-26 06:04:38.046635] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.400 [2024-07-26 06:04:38.048165] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.400 [2024-07-26 06:04:38.048436] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.400 [2024-07-26 06:04:38.049306] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.400 [2024-07-26 06:04:38.050606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.400 [2024-07-26 06:04:38.052139] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.400 [2024-07-26 06:04:38.053706] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.400 [2024-07-26 06:04:38.054078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.400 [2024-07-26 06:04:38.054095] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.400 [2024-07-26 06:04:38.054110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.400 [2024-07-26 06:04:38.054124] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:23.400 [2024-07-26 06:04:38.058345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.400 [2024-07-26 06:04:38.059937] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.400 [2024-07-26 06:04:38.061480] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.400 [2024-07-26 06:04:38.062745] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.400 [2024-07-26 06:04:38.063054] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.400 [2024-07-26 06:04:38.064366] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.400 [2024-07-26 06:04:38.065924] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.400 [2024-07-26 06:04:38.067464] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.401 [2024-07-26 06:04:38.068049] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.401 [2024-07-26 06:04:38.068528] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.401 [2024-07-26 06:04:38.068546] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.401 [2024-07-26 06:04:38.068566] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.401 [2024-07-26 06:04:38.068580] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:23.401 [2024-07-26 06:04:38.072343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.401 [2024-07-26 06:04:38.073892] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.401 [2024-07-26 06:04:38.075145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.401 [2024-07-26 06:04:38.076577] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.401 [2024-07-26 06:04:38.076895] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.401 [2024-07-26 06:04:38.078462] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.401 [2024-07-26 06:04:38.079991] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.401 [2024-07-26 06:04:38.080591] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.401 [2024-07-26 06:04:38.080988] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.401 [2024-07-26 06:04:38.081415] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.401 [2024-07-26 06:04:38.081433] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.401 [2024-07-26 06:04:38.081448] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.401 [2024-07-26 06:04:38.081462] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:23.401 [2024-07-26 06:04:38.085127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.401 [2024-07-26 06:04:38.086517] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.401 [2024-07-26 06:04:38.087850] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.401 [2024-07-26 06:04:38.089159] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.401 [2024-07-26 06:04:38.089433] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.401 [2024-07-26 06:04:38.091015] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.401 [2024-07-26 06:04:38.091694] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.401 [2024-07-26 06:04:38.092085] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.401 [2024-07-26 06:04:38.092478] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.401 [2024-07-26 06:04:38.092954] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.401 [2024-07-26 06:04:38.092973] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.401 [2024-07-26 06:04:38.092991] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.401 [2024-07-26 06:04:38.093007] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:23.401 [2024-07-26 06:04:38.096191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.401 [2024-07-26 06:04:38.097329] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.401 [2024-07-26 06:04:38.098632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.401 [2024-07-26 06:04:38.100155] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.401 [2024-07-26 06:04:38.100428] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.401 [2024-07-26 06:04:38.101407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.401 [2024-07-26 06:04:38.101818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.401 [2024-07-26 06:04:38.102207] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.401 [2024-07-26 06:04:38.102595] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.401 [2024-07-26 06:04:38.103049] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.401 [2024-07-26 06:04:38.103066] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.401 [2024-07-26 06:04:38.103081] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.401 [2024-07-26 06:04:38.103095] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:23.401 [2024-07-26 06:04:38.105331] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.401 [2024-07-26 06:04:38.106645] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.401 [2024-07-26 06:04:38.108191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.401 [2024-07-26 06:04:38.109744] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.401 [2024-07-26 06:04:38.110068] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.401 [2024-07-26 06:04:38.110475] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.401 [2024-07-26 06:04:38.110869] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.401 [2024-07-26 06:04:38.111267] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.401 [2024-07-26 06:04:38.111689] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.401 [2024-07-26 06:04:38.111960] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.401 [2024-07-26 06:04:38.111977] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.401 [2024-07-26 06:04:38.111991] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.401 [2024-07-26 06:04:38.112006] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:23.401 [2024-07-26 06:04:38.114962] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.401 [2024-07-26 06:04:38.116516] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.401 [2024-07-26 06:04:38.118055] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.401 [2024-07-26 06:04:38.119154] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.401 [2024-07-26 06:04:38.119561] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.401 [2024-07-26 06:04:38.119963] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.401 [2024-07-26 06:04:38.120354] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.401 [2024-07-26 06:04:38.120751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.401 [2024-07-26 06:04:38.122393] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.401 [2024-07-26 06:04:38.122672] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.401 [2024-07-26 06:04:38.122688] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.401 [2024-07-26 06:04:38.122703] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.401 [2024-07-26 06:04:38.122717] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:23.401 [2024-07-26 06:04:38.125973] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.668 [identical error message repeated through 2024-07-26 06:04:38.316880]
00:36:23.668 [2024-07-26 06:04:38.319128] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.668 [2024-07-26 06:04:38.319176] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.668 [2024-07-26 06:04:38.319217] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.668 [2024-07-26 06:04:38.319259] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.668 [2024-07-26 06:04:38.319726] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.668 [2024-07-26 06:04:38.319779] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.668 [2024-07-26 06:04:38.319821] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.668 [2024-07-26 06:04:38.319864] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.668 [2024-07-26 06:04:38.319905] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.668 [2024-07-26 06:04:38.320341] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.668 [2024-07-26 06:04:38.320357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.668 [2024-07-26 06:04:38.320371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.668 [2024-07-26 06:04:38.320386] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:23.668 [2024-07-26 06:04:38.322706] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.668 [2024-07-26 06:04:38.322753] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.668 [2024-07-26 06:04:38.322794] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.668 [2024-07-26 06:04:38.322835] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.668 [2024-07-26 06:04:38.323260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.668 [2024-07-26 06:04:38.323313] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.668 [2024-07-26 06:04:38.323355] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.668 [2024-07-26 06:04:38.323397] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.668 [2024-07-26 06:04:38.323439] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.668 [2024-07-26 06:04:38.323846] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.668 [2024-07-26 06:04:38.323863] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.668 [2024-07-26 06:04:38.323877] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.668 [2024-07-26 06:04:38.323891] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:23.668 [2024-07-26 06:04:38.326114] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.668 [2024-07-26 06:04:38.326165] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.668 [2024-07-26 06:04:38.326207] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.668 [2024-07-26 06:04:38.326248] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.668 [2024-07-26 06:04:38.326673] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.668 [2024-07-26 06:04:38.326727] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.668 [2024-07-26 06:04:38.326770] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.668 [2024-07-26 06:04:38.326811] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.668 [2024-07-26 06:04:38.326851] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.668 [2024-07-26 06:04:38.327276] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.668 [2024-07-26 06:04:38.327294] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.668 [2024-07-26 06:04:38.327313] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.668 [2024-07-26 06:04:38.327328] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:23.668 [2024-07-26 06:04:38.329600] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.668 [2024-07-26 06:04:38.329653] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.668 [2024-07-26 06:04:38.329696] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.668 [2024-07-26 06:04:38.329740] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.668 [2024-07-26 06:04:38.330183] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.668 [2024-07-26 06:04:38.330238] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.668 [2024-07-26 06:04:38.330280] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.668 [2024-07-26 06:04:38.330321] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.669 [2024-07-26 06:04:38.330361] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.669 [2024-07-26 06:04:38.330809] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.669 [2024-07-26 06:04:38.330827] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.669 [2024-07-26 06:04:38.330842] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.669 [2024-07-26 06:04:38.330856] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:23.669 [2024-07-26 06:04:38.333311] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.669 [2024-07-26 06:04:38.333357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.669 [2024-07-26 06:04:38.333399] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.669 [2024-07-26 06:04:38.333440] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.669 [2024-07-26 06:04:38.333872] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.669 [2024-07-26 06:04:38.333938] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.669 [2024-07-26 06:04:38.333983] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.669 [2024-07-26 06:04:38.334025] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.669 [2024-07-26 06:04:38.334066] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.669 [2024-07-26 06:04:38.334400] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.669 [2024-07-26 06:04:38.334418] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.669 [2024-07-26 06:04:38.334432] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.669 [2024-07-26 06:04:38.334447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:23.669 [2024-07-26 06:04:38.336823] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.669 [2024-07-26 06:04:38.336871] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.669 [2024-07-26 06:04:38.336914] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.669 [2024-07-26 06:04:38.336957] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.669 [2024-07-26 06:04:38.337397] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.669 [2024-07-26 06:04:38.337461] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.669 [2024-07-26 06:04:38.337504] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.669 [2024-07-26 06:04:38.337565] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.669 [2024-07-26 06:04:38.337620] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.669 [2024-07-26 06:04:38.338046] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.669 [2024-07-26 06:04:38.338064] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.669 [2024-07-26 06:04:38.338079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.669 [2024-07-26 06:04:38.338095] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:23.669 [2024-07-26 06:04:38.340478] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.669 [2024-07-26 06:04:38.340526] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.669 [2024-07-26 06:04:38.340567] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.669 [2024-07-26 06:04:38.340609] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.669 [2024-07-26 06:04:38.340961] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.669 [2024-07-26 06:04:38.341025] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.669 [2024-07-26 06:04:38.341068] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.669 [2024-07-26 06:04:38.341109] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.669 [2024-07-26 06:04:38.341162] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.669 [2024-07-26 06:04:38.341551] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.669 [2024-07-26 06:04:38.341572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.669 [2024-07-26 06:04:38.341586] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.669 [2024-07-26 06:04:38.341601] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:23.669 [2024-07-26 06:04:38.344324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.669 [2024-07-26 06:04:38.344384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.669 [2024-07-26 06:04:38.344446] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.669 [2024-07-26 06:04:38.344499] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.669 [2024-07-26 06:04:38.344927] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.669 [2024-07-26 06:04:38.345008] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.669 [2024-07-26 06:04:38.345059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.669 [2024-07-26 06:04:38.345101] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.669 [2024-07-26 06:04:38.345142] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.669 [2024-07-26 06:04:38.345580] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.669 [2024-07-26 06:04:38.345596] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.669 [2024-07-26 06:04:38.345611] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.669 [2024-07-26 06:04:38.345625] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:23.669 [2024-07-26 06:04:38.347932] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.669 [2024-07-26 06:04:38.347979] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.669 [2024-07-26 06:04:38.348024] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.669 [2024-07-26 06:04:38.348075] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.669 [2024-07-26 06:04:38.348429] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.669 [2024-07-26 06:04:38.348495] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.669 [2024-07-26 06:04:38.348539] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.669 [2024-07-26 06:04:38.348580] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.669 [2024-07-26 06:04:38.348621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.669 [2024-07-26 06:04:38.349057] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.669 [2024-07-26 06:04:38.349076] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.669 [2024-07-26 06:04:38.349094] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.669 [2024-07-26 06:04:38.349110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:23.669 [2024-07-26 06:04:38.351253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.669 [2024-07-26 06:04:38.351306] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.669 [2024-07-26 06:04:38.351348] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.669 [2024-07-26 06:04:38.351389] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.669 [2024-07-26 06:04:38.351742] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.669 [2024-07-26 06:04:38.351806] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.669 [2024-07-26 06:04:38.351849] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.669 [2024-07-26 06:04:38.351889] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.669 [2024-07-26 06:04:38.351930] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.669 [2024-07-26 06:04:38.352287] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.669 [2024-07-26 06:04:38.352304] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.670 [2024-07-26 06:04:38.352318] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.670 [2024-07-26 06:04:38.352333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:23.670 [2024-07-26 06:04:38.355025] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.670 [2024-07-26 06:04:38.355073] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.670 [2024-07-26 06:04:38.355127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.670 [2024-07-26 06:04:38.355187] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.670 [2024-07-26 06:04:38.355577] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.670 [2024-07-26 06:04:38.355650] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.670 [2024-07-26 06:04:38.355705] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.670 [2024-07-26 06:04:38.355748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.670 [2024-07-26 06:04:38.355789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.670 [2024-07-26 06:04:38.356194] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.670 [2024-07-26 06:04:38.356211] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.670 [2024-07-26 06:04:38.356225] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.670 [2024-07-26 06:04:38.356240] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:23.670 [2024-07-26 06:04:38.358535] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.670 [2024-07-26 06:04:38.358581] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.670 [2024-07-26 06:04:38.358625] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.670 [2024-07-26 06:04:38.358671] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.670 [2024-07-26 06:04:38.359001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.670 [2024-07-26 06:04:38.359060] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.670 [2024-07-26 06:04:38.359106] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.670 [2024-07-26 06:04:38.359147] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.670 [2024-07-26 06:04:38.359188] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.670 [2024-07-26 06:04:38.359451] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.670 [2024-07-26 06:04:38.359468] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.670 [2024-07-26 06:04:38.359482] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.670 [2024-07-26 06:04:38.359496] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:23.670 [2024-07-26 06:04:38.361172] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.670 [2024-07-26 06:04:38.361220] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.670 [2024-07-26 06:04:38.361261] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.670 [2024-07-26 06:04:38.361302] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.670 [2024-07-26 06:04:38.361564] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.670 [2024-07-26 06:04:38.361627] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.670 [2024-07-26 06:04:38.361677] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.670 [2024-07-26 06:04:38.361718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.670 [2024-07-26 06:04:38.361759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.670 [2024-07-26 06:04:38.362092] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.670 [2024-07-26 06:04:38.362108] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.670 [2024-07-26 06:04:38.362124] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.670 [2024-07-26 06:04:38.362139] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:23.670 [2024-07-26 06:04:38.364904] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.670 [2024-07-26 06:04:38.364953] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.670 [2024-07-26 06:04:38.364993] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.670 [2024-07-26 06:04:38.365034] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.670 [2024-07-26 06:04:38.365332] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.670 [2024-07-26 06:04:38.365402] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.670 [2024-07-26 06:04:38.365444] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.670 [2024-07-26 06:04:38.365485] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.670 [2024-07-26 06:04:38.365525] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.670 [2024-07-26 06:04:38.365794] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.670 [2024-07-26 06:04:38.365811] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.670 [2024-07-26 06:04:38.365829] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.670 [2024-07-26 06:04:38.365843] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.670 [2024-07-26 06:04:38.367443] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.670 [2024-07-26 06:04:38.367487] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.670 [2024-07-26 06:04:38.367527] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.670 [2024-07-26 06:04:38.367568] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.670 [2024-07-26 06:04:38.367838] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.670 [2024-07-26 06:04:38.367899] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.670 [2024-07-26 06:04:38.367941] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.670 [2024-07-26 06:04:38.367982] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.670 [2024-07-26 06:04:38.368035] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.670 [2024-07-26 06:04:38.368304] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.670 [2024-07-26 06:04:38.368320] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.670 [2024-07-26 06:04:38.368334] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.670 [2024-07-26 06:04:38.368349] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.670 [2024-07-26 06:04:38.370768] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.670 [2024-07-26 06:04:38.370816] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.670 [2024-07-26 06:04:38.370860] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.670 [2024-07-26 06:04:38.370908] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.670 [2024-07-26 06:04:38.371207] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.670 [2024-07-26 06:04:38.371263] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.670 [2024-07-26 06:04:38.371304] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.670 [2024-07-26 06:04:38.371344] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.670 [2024-07-26 06:04:38.371386] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.670 [2024-07-26 06:04:38.371691] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.670 [2024-07-26 06:04:38.371708] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.670 [2024-07-26 06:04:38.371722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.670 [2024-07-26 06:04:38.371737] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.670 [2024-07-26 06:04:38.373378] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.671 [2024-07-26 06:04:38.373431] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.671 [2024-07-26 06:04:38.373480] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.671 [2024-07-26 06:04:38.373521] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.671 [2024-07-26 06:04:38.373850] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.671 [2024-07-26 06:04:38.373917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.671 [2024-07-26 06:04:38.373958] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.671 [2024-07-26 06:04:38.374000] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.671 [2024-07-26 06:04:38.374041] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.671 [2024-07-26 06:04:38.374303] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.671 [2024-07-26 06:04:38.374319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.671 [2024-07-26 06:04:38.374334] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.671 [2024-07-26 06:04:38.374348] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.671 [2024-07-26 06:04:38.376657] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.671 [2024-07-26 06:04:38.376703] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.671 [2024-07-26 06:04:38.376745] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.671 [2024-07-26 06:04:38.376787] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.671 [2024-07-26 06:04:38.377191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.671 [2024-07-26 06:04:38.377248] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.671 [2024-07-26 06:04:38.377289] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.671 [2024-07-26 06:04:38.377329] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.671 [2024-07-26 06:04:38.377370] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.671 [2024-07-26 06:04:38.377716] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.671 [2024-07-26 06:04:38.377734] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.671 [2024-07-26 06:04:38.377748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.671 [2024-07-26 06:04:38.377762] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.671 [2024-07-26 06:04:38.379370] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.671 [2024-07-26 06:04:38.379415] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.671 [2024-07-26 06:04:38.379456] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.671 [2024-07-26 06:04:38.379503] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.671 [2024-07-26 06:04:38.379779] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.671 [2024-07-26 06:04:38.379836] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.671 [2024-07-26 06:04:38.379878] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.671 [2024-07-26 06:04:38.379923] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.671 [2024-07-26 06:04:38.379984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.671 [2024-07-26 06:04:38.380254] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.671 [2024-07-26 06:04:38.380270] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.671 [2024-07-26 06:04:38.380284] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.671 [2024-07-26 06:04:38.380298] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.671 [2024-07-26 06:04:38.382395] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.671 [2024-07-26 06:04:38.382441] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.671 [2024-07-26 06:04:38.382485] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.671 [2024-07-26 06:04:38.382526] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.671 [2024-07-26 06:04:38.382979] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.671 [2024-07-26 06:04:38.383034] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.671 [2024-07-26 06:04:38.383079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.671 [2024-07-26 06:04:38.383121] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.671 [2024-07-26 06:04:38.383163] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.671 [2024-07-26 06:04:38.383428] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.671 [2024-07-26 06:04:38.383445] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.671 [2024-07-26 06:04:38.383459] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.671 [2024-07-26 06:04:38.383473] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.671 [2024-07-26 06:04:38.385092] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.671 [2024-07-26 06:04:38.385140] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.671 [2024-07-26 06:04:38.385180] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.671 [2024-07-26 06:04:38.385221] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.671 [2024-07-26 06:04:38.385525] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.671 [2024-07-26 06:04:38.385591] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.671 [2024-07-26 06:04:38.385633] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.671 [2024-07-26 06:04:38.385681] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.671 [2024-07-26 06:04:38.385722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.671 [2024-07-26 06:04:38.385988] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.671 [2024-07-26 06:04:38.386004] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.671 [2024-07-26 06:04:38.386022] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.671 [2024-07-26 06:04:38.386036] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.671 [2024-07-26 06:04:38.388030] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.671 [2024-07-26 06:04:38.388078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.671 [2024-07-26 06:04:38.388131] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.671 [2024-07-26 06:04:38.388172] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.671 [2024-07-26 06:04:38.388650] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.671 [2024-07-26 06:04:38.388706] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.671 [2024-07-26 06:04:38.388751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.671 [2024-07-26 06:04:38.388793] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.671 [2024-07-26 06:04:38.388834] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.671 [2024-07-26 06:04:38.389210] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.671 [2024-07-26 06:04:38.389227] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.671 [2024-07-26 06:04:38.389241] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.671 [2024-07-26 06:04:38.389255] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.671 [2024-07-26 06:04:38.390789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.671 [2024-07-26 06:04:38.390836] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.671 [2024-07-26 06:04:38.390879] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.671 [2024-07-26 06:04:38.390921] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.671 [2024-07-26 06:04:38.391193] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.671 [2024-07-26 06:04:38.391250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.672 [2024-07-26 06:04:38.391291] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.672 [2024-07-26 06:04:38.391340] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.672 [2024-07-26 06:04:38.391381] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.672 [2024-07-26 06:04:38.391661] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.672 [2024-07-26 06:04:38.391678] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.672 [2024-07-26 06:04:38.391693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.672 [2024-07-26 06:04:38.391707] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.672 [2024-07-26 06:04:38.393663] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.672 [2024-07-26 06:04:38.393712] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.672 [2024-07-26 06:04:38.393758] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.672 [2024-07-26 06:04:38.393804] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.672 [2024-07-26 06:04:38.394164] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.672 [2024-07-26 06:04:38.394226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.672 [2024-07-26 06:04:38.394268] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.672 [2024-07-26 06:04:38.394309] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.672 [2024-07-26 06:04:38.394350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.672 [2024-07-26 06:04:38.394801] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.672 [2024-07-26 06:04:38.394821] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.672 [2024-07-26 06:04:38.394836] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.672 [2024-07-26 06:04:38.394851] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.672 [2024-07-26 06:04:38.396390] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.672 [2024-07-26 06:04:38.396434] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.672 [2024-07-26 06:04:38.396475] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.672 [2024-07-26 06:04:38.396515] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.672 [2024-07-26 06:04:38.397010] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.672 [2024-07-26 06:04:38.397075] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.672 [2024-07-26 06:04:38.397117] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.672 [2024-07-26 06:04:38.397157] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.672 [2024-07-26 06:04:38.397198] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.672 [2024-07-26 06:04:38.397519] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.672 [2024-07-26 06:04:38.397535] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.672 [2024-07-26 06:04:38.397550] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.672 [2024-07-26 06:04:38.397564] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.672 [2024-07-26 06:04:38.399309] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.672 [2024-07-26 06:04:38.399355] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.672 [2024-07-26 06:04:38.399398] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.672 [2024-07-26 06:04:38.399439] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.672 [2024-07-26 06:04:38.399885] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.672 [2024-07-26 06:04:38.399940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.672 [2024-07-26 06:04:38.399984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.672 [2024-07-26 06:04:38.400037] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.672 [2024-07-26 06:04:38.400079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.672 [2024-07-26 06:04:38.400557] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.672 [2024-07-26 06:04:38.400576] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.672 [2024-07-26 06:04:38.400592] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.672 [2024-07-26 06:04:38.400607] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.672 [2024-07-26 06:04:38.402300] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.672 [2024-07-26 06:04:38.402362] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.672 [2024-07-26 06:04:38.402403] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.672 [2024-07-26 06:04:38.402445] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.672 [2024-07-26 06:04:38.402715] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.672 [2024-07-26 06:04:38.402779] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.672 [2024-07-26 06:04:38.402822] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.672 [2024-07-26 06:04:38.402864] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.672 [2024-07-26 06:04:38.402904] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.672 [2024-07-26 06:04:38.403174] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.672 [2024-07-26 06:04:38.403191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.672 [2024-07-26 06:04:38.403205] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.672 [2024-07-26 06:04:38.403219] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.672 [2024-07-26 06:04:38.404873] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.672 [2024-07-26 06:04:38.404931] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.672 [2024-07-26 06:04:38.404977] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.672 [2024-07-26 06:04:38.405018] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.672 [2024-07-26 06:04:38.405461] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.672 [2024-07-26 06:04:38.405514] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.672 [2024-07-26 06:04:38.405556] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.672 [2024-07-26 06:04:38.405598] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.672 [2024-07-26 06:04:38.405645] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.672 [2024-07-26 06:04:38.406002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.672 [2024-07-26 06:04:38.406019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.672 [2024-07-26 06:04:38.406034] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.672 [2024-07-26 06:04:38.406053] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.672 [2024-07-26 06:04:38.407918] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.672 [2024-07-26 06:04:38.407964] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.672 [2024-07-26 06:04:38.408004] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.672 [2024-07-26 06:04:38.408045] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.672 [2024-07-26 06:04:38.408311] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.672 [2024-07-26 06:04:38.408374] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.672 [2024-07-26 06:04:38.408415] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.672 [2024-07-26 06:04:38.408455] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.672 [2024-07-26 06:04:38.408495] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.672 [2024-07-26 06:04:38.408877] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.672 [2024-07-26 06:04:38.408894] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.673 [2024-07-26 06:04:38.408908] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.673 [2024-07-26 06:04:38.408922] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.673 [2024-07-26 06:04:38.410455] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.673 [2024-07-26 06:04:38.410509] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.673 [2024-07-26 06:04:38.410551] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.673 [2024-07-26 06:04:38.410592] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.673 [2024-07-26 06:04:38.411015] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.673 [2024-07-26 06:04:38.411067] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.673 [2024-07-26 06:04:38.411112] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.673 [2024-07-26 06:04:38.411154] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.673 [2024-07-26 06:04:38.411195] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.673 [2024-07-26 06:04:38.411636] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.673 [2024-07-26 06:04:38.411660] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.673 [2024-07-26 06:04:38.411675] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.673 [2024-07-26 06:04:38.411690] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.673 [2024-07-26 06:04:38.413874] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.673 [2024-07-26 06:04:38.413919] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.673 [2024-07-26 06:04:38.415551] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.673 [2024-07-26 06:04:38.415601] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.673 [2024-07-26 06:04:38.415876] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.673 [2024-07-26 06:04:38.415937] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.673 [2024-07-26 06:04:38.415981] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.673 [2024-07-26 06:04:38.416023] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.673 [2024-07-26 06:04:38.416063] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.673 [2024-07-26 06:04:38.416349] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.673 [2024-07-26 06:04:38.416366] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.673 [2024-07-26 06:04:38.416380] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.673 [2024-07-26 06:04:38.416394] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.673 [2024-07-26 06:04:38.418013] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.673 [2024-07-26 06:04:38.418062] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.673 [2024-07-26 06:04:38.418115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.673 [2024-07-26 06:04:38.418508] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.673 [2024-07-26 06:04:38.418967] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.673 [2024-07-26 06:04:38.419034] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.673 [2024-07-26 06:04:38.419089] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.673 [2024-07-26 06:04:38.419131] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.673 [2024-07-26 06:04:38.419172] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.673 [2024-07-26 06:04:38.419629] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.673 [2024-07-26 06:04:38.419652] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.673 [2024-07-26 06:04:38.419670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.673 [2024-07-26 06:04:38.419685] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.673 [2024-07-26 06:04:38.422729] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.673 [2024-07-26 06:04:38.424110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.673 [2024-07-26 06:04:38.425417] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.673 [2024-07-26 06:04:38.426975] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.673 [2024-07-26 06:04:38.427250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.673 [2024-07-26 06:04:38.427929] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.673 [2024-07-26 06:04:38.428320] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.673 [2024-07-26 06:04:38.428716] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.673 [2024-07-26 06:04:38.429109] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.673 [2024-07-26 06:04:38.429475] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.673 [2024-07-26 06:04:38.429491] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.673 [2024-07-26 06:04:38.429505] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.673 [2024-07-26 06:04:38.429520] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.673 [2024-07-26 06:04:38.432043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.673 [2024-07-26 06:04:38.433342] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.673 [2024-07-26 06:04:38.434884] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.673 [2024-07-26 06:04:38.436420] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.673 [2024-07-26 06:04:38.436818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.673 [2024-07-26 06:04:38.437239] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.673 [2024-07-26 06:04:38.437630] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.673 [2024-07-26 06:04:38.438033] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.673 [2024-07-26 06:04:38.438565] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.673 [2024-07-26 06:04:38.438849] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.673 [2024-07-26 06:04:38.438867] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.673 [2024-07-26 06:04:38.438882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.673 [2024-07-26 06:04:38.438897] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:23.674 [2024-07-26 06:04:38.441880] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.674 [2024-07-26 06:04:38.443431] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.674 [2024-07-26 06:04:38.444987] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.674 [2024-07-26 06:04:38.445752] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.674 [2024-07-26 06:04:38.446272] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.674 [2024-07-26 06:04:38.446680] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.674 [2024-07-26 06:04:38.447072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.674 [2024-07-26 06:04:38.447618] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.674 [2024-07-26 06:04:38.448903] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.674 [2024-07-26 06:04:38.449177] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.674 [2024-07-26 06:04:38.449194] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.674 [2024-07-26 06:04:38.449208] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.674 [2024-07-26 06:04:38.449223] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:23.674 [2024-07-26 06:04:38.452407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.674 [2024-07-26 06:04:38.453963] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.674 [2024-07-26 06:04:38.454903] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.674 [2024-07-26 06:04:38.455310] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.674 [2024-07-26 06:04:38.455765] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.674 [2024-07-26 06:04:38.456169] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.674 [2024-07-26 06:04:38.456559] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.674 [2024-07-26 06:04:38.458181] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.674 [2024-07-26 06:04:38.459735] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.674 [2024-07-26 06:04:38.460012] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.674 [2024-07-26 06:04:38.460028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.674 [2024-07-26 06:04:38.460043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.674 [2024-07-26 06:04:38.460057] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:23.674 [2024-07-26 06:04:38.463336] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.674 [2024-07-26 06:04:38.464484] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.674 [2024-07-26 06:04:38.464880] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.674 [2024-07-26 06:04:38.465270] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.674 [2024-07-26 06:04:38.465763] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.674 [2024-07-26 06:04:38.466165] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.674 [2024-07-26 06:04:38.467802] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.674 [2024-07-26 06:04:38.469441] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.674 [2024-07-26 06:04:38.470986] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.674 [2024-07-26 06:04:38.471261] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.674 [2024-07-26 06:04:38.471277] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.674 [2024-07-26 06:04:38.471292] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.674 [2024-07-26 06:04:38.471306] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:23.674 [2024-07-26 06:04:38.474294] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.674 [2024-07-26 06:04:38.474701] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.674 [2024-07-26 06:04:38.475093] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.674 [2024-07-26 06:04:38.475486] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.674 [2024-07-26 06:04:38.475935] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.674 [2024-07-26 06:04:38.477216] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.674 [2024-07-26 06:04:38.478508] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.674 [2024-07-26 06:04:38.480035] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.674 [2024-07-26 06:04:38.481566] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.674 [2024-07-26 06:04:38.481957] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.674 [2024-07-26 06:04:38.481975] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.674 [2024-07-26 06:04:38.481989] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.674 [2024-07-26 06:04:38.482003] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:23.674 [2024-07-26 06:04:38.483933] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.674 [2024-07-26 06:04:38.484328] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.674 [2024-07-26 06:04:38.484725] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.674 [2024-07-26 06:04:38.485115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.674 [2024-07-26 06:04:38.485385] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.674 [2024-07-26 06:04:38.486708] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.674 [2024-07-26 06:04:38.488259] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.674 [2024-07-26 06:04:38.489801] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.674 [2024-07-26 06:04:38.490518] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.674 [2024-07-26 06:04:38.490820] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.674 [2024-07-26 06:04:38.490837] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.674 [2024-07-26 06:04:38.490852] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.674 [2024-07-26 06:04:38.490866] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:23.674 [2024-07-26 06:04:38.492899] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.674 [2024-07-26 06:04:38.493300] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.674 [2024-07-26 06:04:38.493695] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.674 [2024-07-26 06:04:38.494807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.674 [2024-07-26 06:04:38.495132] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.674 [2024-07-26 06:04:38.496704] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.674 [2024-07-26 06:04:38.498253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.674 [2024-07-26 06:04:38.499143] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.674 [2024-07-26 06:04:38.500694] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.674 [2024-07-26 06:04:38.500979] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.674 [2024-07-26 06:04:38.500995] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.674 [2024-07-26 06:04:38.501009] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.674 [2024-07-26 06:04:38.501023] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:23.674 [2024-07-26 06:04:38.503486] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.674 [2024-07-26 06:04:38.503889] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.674 [2024-07-26 06:04:38.505130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.674 [2024-07-26 06:04:38.506432] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.675 [2024-07-26 06:04:38.506712] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.675 [2024-07-26 06:04:38.508274] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.675 [2024-07-26 06:04:38.509090] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.675 [2024-07-26 06:04:38.510586] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.675 [2024-07-26 06:04:38.512162] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.675 [2024-07-26 06:04:38.512437] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.675 [2024-07-26 06:04:38.512453] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.675 [2024-07-26 06:04:38.512467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.675 [2024-07-26 06:04:38.512482] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:23.675 [2024-07-26 06:04:38.514946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.675 [2024-07-26 06:04:38.515777] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.675 [2024-07-26 06:04:38.517089] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.675 [2024-07-26 06:04:38.518650] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.675 [2024-07-26 06:04:38.518927] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.675 [2024-07-26 06:04:38.520131] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.675 [2024-07-26 06:04:38.521637] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.675 [2024-07-26 06:04:38.523056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.675 [2024-07-26 06:04:38.524620] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.675 [2024-07-26 06:04:38.524903] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.675 [2024-07-26 06:04:38.524920] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.675 [2024-07-26 06:04:38.524935] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.675 [2024-07-26 06:04:38.524949] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:23.675 [2024-07-26 06:04:38.528459] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.675 [2024-07-26 06:04:38.529777] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.675 [2024-07-26 06:04:38.531322] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.675 [2024-07-26 06:04:38.532874] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.675 [2024-07-26 06:04:38.533284] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.675 [2024-07-26 06:04:38.534782] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.675 [2024-07-26 06:04:38.536331] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.675 [2024-07-26 06:04:38.537884] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.675 [2024-07-26 06:04:38.539076] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.675 [2024-07-26 06:04:38.539474] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.675 [2024-07-26 06:04:38.539491] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.675 [2024-07-26 06:04:38.539505] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.675 [2024-07-26 06:04:38.539520] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:23.675 [2024-07-26 06:04:38.543074] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.675 [2024-07-26 06:04:38.544630] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.675 [2024-07-26 06:04:38.546177] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.675 [2024-07-26 06:04:38.546902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.675 [2024-07-26 06:04:38.547175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.675 [2024-07-26 06:04:38.548815] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.675 [2024-07-26 06:04:38.550382] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.675 [2024-07-26 06:04:38.551755] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.675 [2024-07-26 06:04:38.552150] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.675 [2024-07-26 06:04:38.552600] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.675 [2024-07-26 06:04:38.552618] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.675 [2024-07-26 06:04:38.552633] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.675 [2024-07-26 06:04:38.552656] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:23.675 [2024-07-26 06:04:38.556233] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.675 [2024-07-26 06:04:38.557785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.675 [2024-07-26 06:04:38.558551] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.675 [2024-07-26 06:04:38.559837] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.675 [2024-07-26 06:04:38.560115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.675 [2024-07-26 06:04:38.561692] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.675 [2024-07-26 06:04:38.562964] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.675 [2024-07-26 06:04:38.563357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.675 [2024-07-26 06:04:38.563756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.675 [2024-07-26 06:04:38.564152] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.675 [2024-07-26 06:04:38.564169] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.675 [2024-07-26 06:04:38.564183] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.675 [2024-07-26 06:04:38.564198] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:23.675 [2024-07-26 06:04:38.567612] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.675 [2024-07-26 06:04:38.568442] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.675 [2024-07-26 06:04:38.569934] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.939 [2024-07-26 06:04:38.571483] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.939 [2024-07-26 06:04:38.571760] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.939 [2024-07-26 06:04:38.573358] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.939 [2024-07-26 06:04:38.573761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.939 [2024-07-26 06:04:38.574149] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.939 [2024-07-26 06:04:38.574538] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.939 [2024-07-26 06:04:38.574986] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.939 [2024-07-26 06:04:38.575004] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.939 [2024-07-26 06:04:38.575019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.939 [2024-07-26 06:04:38.575034] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:23.939 [2024-07-26 06:04:38.577544] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.939 [2024-07-26 06:04:38.579116] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.939 [2024-07-26 06:04:38.580763] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.939 [2024-07-26 06:04:38.582311] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.939 [2024-07-26 06:04:38.582587] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.939 [2024-07-26 06:04:38.583001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.939 [2024-07-26 06:04:38.583396] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.939 [2024-07-26 06:04:38.583793] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.939 [2024-07-26 06:04:38.584185] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.939 [2024-07-26 06:04:38.584534] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.939 [2024-07-26 06:04:38.584551] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.939 [2024-07-26 06:04:38.584565] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.939 [2024-07-26 06:04:38.584579] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:23.939 [2024-07-26 06:04:38.587339] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:23.944 (last message repeated 272 more times, [2024-07-26 06:04:38.588659] through [2024-07-26 06:04:38.726701])
00:36:23.944 [2024-07-26 06:04:38.729082] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.944 [2024-07-26 06:04:38.729127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.944 [2024-07-26 06:04:38.729167] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.944 [2024-07-26 06:04:38.729207] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.944 [2024-07-26 06:04:38.729541] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.944 [2024-07-26 06:04:38.729605] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.944 [2024-07-26 06:04:38.729652] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.944 [2024-07-26 06:04:38.729694] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.944 [2024-07-26 06:04:38.729735] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.944 [2024-07-26 06:04:38.730004] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.944 [2024-07-26 06:04:38.730021] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.944 [2024-07-26 06:04:38.730035] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.944 [2024-07-26 06:04:38.730049] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:23.944 [2024-07-26 06:04:38.731707] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.944 [2024-07-26 06:04:38.731763] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.944 [2024-07-26 06:04:38.731807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.944 [2024-07-26 06:04:38.731848] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.944 [2024-07-26 06:04:38.732115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.944 [2024-07-26 06:04:38.732176] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.944 [2024-07-26 06:04:38.732217] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.944 [2024-07-26 06:04:38.732257] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.944 [2024-07-26 06:04:38.732299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.944 [2024-07-26 06:04:38.732630] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.944 [2024-07-26 06:04:38.732652] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.944 [2024-07-26 06:04:38.732667] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.944 [2024-07-26 06:04:38.732681] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:23.944 [2024-07-26 06:04:38.735322] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.944 [2024-07-26 06:04:38.735373] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.944 [2024-07-26 06:04:38.735414] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.944 [2024-07-26 06:04:38.735459] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.944 [2024-07-26 06:04:38.735737] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.944 [2024-07-26 06:04:38.735798] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.944 [2024-07-26 06:04:38.735843] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.944 [2024-07-26 06:04:38.735884] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.944 [2024-07-26 06:04:38.735924] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.944 [2024-07-26 06:04:38.736192] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.944 [2024-07-26 06:04:38.736208] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.944 [2024-07-26 06:04:38.736223] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.944 [2024-07-26 06:04:38.736237] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:23.944 [2024-07-26 06:04:38.737853] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.944 [2024-07-26 06:04:38.737898] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.944 [2024-07-26 06:04:38.737942] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.944 [2024-07-26 06:04:38.737983] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.944 [2024-07-26 06:04:38.738254] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.944 [2024-07-26 06:04:38.738317] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.944 [2024-07-26 06:04:38.738358] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.944 [2024-07-26 06:04:38.738399] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.944 [2024-07-26 06:04:38.738448] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.944 [2024-07-26 06:04:38.738725] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.944 [2024-07-26 06:04:38.738742] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.944 [2024-07-26 06:04:38.738756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.944 [2024-07-26 06:04:38.738770] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:23.944 [2024-07-26 06:04:38.741233] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.944 [2024-07-26 06:04:38.741280] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.944 [2024-07-26 06:04:38.741324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.944 [2024-07-26 06:04:38.741366] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.944 [2024-07-26 06:04:38.741635] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.944 [2024-07-26 06:04:38.741699] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.944 [2024-07-26 06:04:38.741740] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.944 [2024-07-26 06:04:38.741792] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.944 [2024-07-26 06:04:38.741839] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.944 [2024-07-26 06:04:38.742116] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.944 [2024-07-26 06:04:38.742133] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.944 [2024-07-26 06:04:38.742147] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.944 [2024-07-26 06:04:38.742161] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:23.944 [2024-07-26 06:04:38.743900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.944 [2024-07-26 06:04:38.743944] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.944 [2024-07-26 06:04:38.743985] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.944 [2024-07-26 06:04:38.744025] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.944 [2024-07-26 06:04:38.744290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.944 [2024-07-26 06:04:38.744350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.944 [2024-07-26 06:04:38.744391] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.944 [2024-07-26 06:04:38.744433] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.944 [2024-07-26 06:04:38.744473] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.944 [2024-07-26 06:04:38.744745] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.944 [2024-07-26 06:04:38.744761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.944 [2024-07-26 06:04:38.744775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.944 [2024-07-26 06:04:38.744789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:23.944 [2024-07-26 06:04:38.747146] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.945 [2024-07-26 06:04:38.747197] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.945 [2024-07-26 06:04:38.747238] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.945 [2024-07-26 06:04:38.747280] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.945 [2024-07-26 06:04:38.747658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.945 [2024-07-26 06:04:38.747715] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.945 [2024-07-26 06:04:38.747756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.945 [2024-07-26 06:04:38.747797] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.945 [2024-07-26 06:04:38.747838] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.945 [2024-07-26 06:04:38.748164] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.945 [2024-07-26 06:04:38.748180] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.945 [2024-07-26 06:04:38.748194] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.945 [2024-07-26 06:04:38.748216] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:23.945 [2024-07-26 06:04:38.749822] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.945 [2024-07-26 06:04:38.749876] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.945 [2024-07-26 06:04:38.749921] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.945 [2024-07-26 06:04:38.749971] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.945 [2024-07-26 06:04:38.750240] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.945 [2024-07-26 06:04:38.750290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.945 [2024-07-26 06:04:38.750352] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.945 [2024-07-26 06:04:38.750396] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.945 [2024-07-26 06:04:38.750437] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.945 [2024-07-26 06:04:38.750712] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.945 [2024-07-26 06:04:38.750729] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.945 [2024-07-26 06:04:38.750743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.945 [2024-07-26 06:04:38.750757] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:23.945 [2024-07-26 06:04:38.753048] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.945 [2024-07-26 06:04:38.753095] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.945 [2024-07-26 06:04:38.753138] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.945 [2024-07-26 06:04:38.753180] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.945 [2024-07-26 06:04:38.753620] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.945 [2024-07-26 06:04:38.753681] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.945 [2024-07-26 06:04:38.753727] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.945 [2024-07-26 06:04:38.753767] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.945 [2024-07-26 06:04:38.753807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.945 [2024-07-26 06:04:38.754108] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.945 [2024-07-26 06:04:38.754124] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.945 [2024-07-26 06:04:38.754139] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.945 [2024-07-26 06:04:38.754153] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:23.945 [2024-07-26 06:04:38.755774] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.945 [2024-07-26 06:04:38.755819] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.945 [2024-07-26 06:04:38.755861] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.945 [2024-07-26 06:04:38.755901] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.945 [2024-07-26 06:04:38.756211] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.945 [2024-07-26 06:04:38.756269] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.945 [2024-07-26 06:04:38.756310] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.945 [2024-07-26 06:04:38.756352] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.945 [2024-07-26 06:04:38.756393] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.945 [2024-07-26 06:04:38.756665] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.945 [2024-07-26 06:04:38.756682] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.945 [2024-07-26 06:04:38.756696] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.945 [2024-07-26 06:04:38.756711] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:23.945 [2024-07-26 06:04:38.758918] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.945 [2024-07-26 06:04:38.758972] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.945 [2024-07-26 06:04:38.759013] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.945 [2024-07-26 06:04:38.759055] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.945 [2024-07-26 06:04:38.759503] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.945 [2024-07-26 06:04:38.759554] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.945 [2024-07-26 06:04:38.759597] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.945 [2024-07-26 06:04:38.759643] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.945 [2024-07-26 06:04:38.759687] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.945 [2024-07-26 06:04:38.759961] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.945 [2024-07-26 06:04:38.759977] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.945 [2024-07-26 06:04:38.759992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.945 [2024-07-26 06:04:38.760006] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:23.945 [2024-07-26 06:04:38.761626] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.946 [2024-07-26 06:04:38.761676] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.946 [2024-07-26 06:04:38.761722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.946 [2024-07-26 06:04:38.761763] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.946 [2024-07-26 06:04:38.762068] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.946 [2024-07-26 06:04:38.762127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.946 [2024-07-26 06:04:38.762168] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.946 [2024-07-26 06:04:38.762209] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.946 [2024-07-26 06:04:38.762254] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.946 [2024-07-26 06:04:38.762524] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.946 [2024-07-26 06:04:38.762540] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.946 [2024-07-26 06:04:38.762554] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.946 [2024-07-26 06:04:38.762568] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:23.946 [2024-07-26 06:04:38.764790] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.946 [2024-07-26 06:04:38.764841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.946 [2024-07-26 06:04:38.764893] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.946 [2024-07-26 06:04:38.764934] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.946 [2024-07-26 06:04:38.765420] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.946 [2024-07-26 06:04:38.765476] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.946 [2024-07-26 06:04:38.765519] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.946 [2024-07-26 06:04:38.765563] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.946 [2024-07-26 06:04:38.765606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.946 [2024-07-26 06:04:38.765952] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.946 [2024-07-26 06:04:38.765969] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.946 [2024-07-26 06:04:38.765983] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:23.946 [2024-07-26 06:04:38.765997] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:23.946 [2024-07-26 06:04:38.767558] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:24.211 [... same *ERROR* line repeated for every entry between 06:04:38.767558 and 06:04:38.885176 ...]
00:36:24.211 [2024-07-26 06:04:38.885176] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:24.211 [2024-07-26 06:04:38.888803] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.211 [2024-07-26 06:04:38.890209] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.211 [2024-07-26 06:04:38.891498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.211 [2024-07-26 06:04:38.892797] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.211 [2024-07-26 06:04:38.893066] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.211 [2024-07-26 06:04:38.894642] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.211 [2024-07-26 06:04:38.895399] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.211 [2024-07-26 06:04:38.895800] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.211 [2024-07-26 06:04:38.896192] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.211 [2024-07-26 06:04:38.896670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.211 [2024-07-26 06:04:38.896688] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.211 [2024-07-26 06:04:38.896703] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.211 [2024-07-26 06:04:38.896721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:24.211 [2024-07-26 06:04:38.900054] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.211 [2024-07-26 06:04:38.901165] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.211 [2024-07-26 06:04:38.902472] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.211 [2024-07-26 06:04:38.904019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.211 [2024-07-26 06:04:38.904290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.211 [2024-07-26 06:04:38.905268] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.211 [2024-07-26 06:04:38.905681] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.211 [2024-07-26 06:04:38.906072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.211 [2024-07-26 06:04:38.906469] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.211 [2024-07-26 06:04:38.906919] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.211 [2024-07-26 06:04:38.906937] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.211 [2024-07-26 06:04:38.906952] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.211 [2024-07-26 06:04:38.906966] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:24.211 [2024-07-26 06:04:38.909244] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.211 [2024-07-26 06:04:38.910549] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.211 [2024-07-26 06:04:38.912093] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.211 [2024-07-26 06:04:38.913643] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.212 [2024-07-26 06:04:38.913957] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.212 [2024-07-26 06:04:38.914361] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.212 [2024-07-26 06:04:38.914756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.212 [2024-07-26 06:04:38.915147] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.212 [2024-07-26 06:04:38.915539] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.212 [2024-07-26 06:04:38.915814] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.212 [2024-07-26 06:04:38.915831] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.212 [2024-07-26 06:04:38.915845] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.212 [2024-07-26 06:04:38.915859] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:24.212 [2024-07-26 06:04:38.918937] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.212 [2024-07-26 06:04:38.920477] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.212 [2024-07-26 06:04:38.922012] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.212 [2024-07-26 06:04:38.923192] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.212 [2024-07-26 06:04:38.923585] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.212 [2024-07-26 06:04:38.923988] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.212 [2024-07-26 06:04:38.924381] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.212 [2024-07-26 06:04:38.924779] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.212 [2024-07-26 06:04:38.926173] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.212 [2024-07-26 06:04:38.926484] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.212 [2024-07-26 06:04:38.926500] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.212 [2024-07-26 06:04:38.926514] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.212 [2024-07-26 06:04:38.926528] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:24.212 [2024-07-26 06:04:38.929702] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.212 [2024-07-26 06:04:38.931343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.212 [2024-07-26 06:04:38.932818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.212 [2024-07-26 06:04:38.933211] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.212 [2024-07-26 06:04:38.933657] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.212 [2024-07-26 06:04:38.934061] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.212 [2024-07-26 06:04:38.934452] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.212 [2024-07-26 06:04:38.935608] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.212 [2024-07-26 06:04:38.936906] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.212 [2024-07-26 06:04:38.937177] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.212 [2024-07-26 06:04:38.937193] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.212 [2024-07-26 06:04:38.937207] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.212 [2024-07-26 06:04:38.937222] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:24.212 [2024-07-26 06:04:38.940477] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.212 [2024-07-26 06:04:38.942114] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.212 [2024-07-26 06:04:38.942511] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.212 [2024-07-26 06:04:38.942906] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.212 [2024-07-26 06:04:38.943271] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.212 [2024-07-26 06:04:38.943686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.212 [2024-07-26 06:04:38.944673] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.212 [2024-07-26 06:04:38.945970] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.212 [2024-07-26 06:04:38.947530] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.212 [2024-07-26 06:04:38.947809] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.213 [2024-07-26 06:04:38.947827] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.213 [2024-07-26 06:04:38.947845] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.213 [2024-07-26 06:04:38.947859] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:24.213 [2024-07-26 06:04:38.951103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.213 [2024-07-26 06:04:38.951507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.213 [2024-07-26 06:04:38.951905] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.213 [2024-07-26 06:04:38.952298] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.213 [2024-07-26 06:04:38.952745] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.213 [2024-07-26 06:04:38.953546] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.213 [2024-07-26 06:04:38.954851] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.213 [2024-07-26 06:04:38.956378] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.213 [2024-07-26 06:04:38.957918] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.213 [2024-07-26 06:04:38.958236] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.213 [2024-07-26 06:04:38.958252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.213 [2024-07-26 06:04:38.958267] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.213 [2024-07-26 06:04:38.958282] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:24.213 [2024-07-26 06:04:38.960453] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.213 [2024-07-26 06:04:38.960856] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.213 [2024-07-26 06:04:38.961249] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.213 [2024-07-26 06:04:38.961650] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.213 [2024-07-26 06:04:38.962040] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.213 [2024-07-26 06:04:38.963390] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.213 [2024-07-26 06:04:38.964934] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.213 [2024-07-26 06:04:38.966478] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.213 [2024-07-26 06:04:38.967558] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.213 [2024-07-26 06:04:38.967838] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.213 [2024-07-26 06:04:38.967856] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.213 [2024-07-26 06:04:38.967870] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.213 [2024-07-26 06:04:38.967884] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:24.213 [2024-07-26 06:04:38.969944] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.213 [2024-07-26 06:04:38.970344] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.213 [2024-07-26 06:04:38.970741] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.213 [2024-07-26 06:04:38.971514] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.213 [2024-07-26 06:04:38.971835] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.213 [2024-07-26 06:04:38.973413] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.213 [2024-07-26 06:04:38.974942] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.213 [2024-07-26 06:04:38.976188] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.213 [2024-07-26 06:04:38.977623] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.213 [2024-07-26 06:04:38.977941] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.213 [2024-07-26 06:04:38.977958] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.213 [2024-07-26 06:04:38.977973] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.213 [2024-07-26 06:04:38.977987] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:24.213 [2024-07-26 06:04:38.980333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.213 [2024-07-26 06:04:38.980742] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.213 [2024-07-26 06:04:38.981136] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.213 [2024-07-26 06:04:38.982684] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.213 [2024-07-26 06:04:38.982957] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.213 [2024-07-26 06:04:38.984601] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.213 [2024-07-26 06:04:38.985313] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.213 [2024-07-26 06:04:38.986865] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.213 [2024-07-26 06:04:38.988421] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.213 [2024-07-26 06:04:38.988774] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.213 [2024-07-26 06:04:38.988791] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.213 [2024-07-26 06:04:38.988807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.213 [2024-07-26 06:04:38.988821] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:24.213 [2024-07-26 06:04:38.993416] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.213 [2024-07-26 06:04:38.995065] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.213 [2024-07-26 06:04:38.996617] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.213 [2024-07-26 06:04:38.997893] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.213 [2024-07-26 06:04:38.998248] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.213 [2024-07-26 06:04:38.999530] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.214 [2024-07-26 06:04:39.001102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.214 [2024-07-26 06:04:39.002657] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.214 [2024-07-26 06:04:39.003606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.214 [2024-07-26 06:04:39.004029] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.214 [2024-07-26 06:04:39.004046] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.214 [2024-07-26 06:04:39.004061] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.214 [2024-07-26 06:04:39.004075] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:24.214 [2024-07-26 06:04:39.006810] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.214 [2024-07-26 06:04:39.007211] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.214 [2024-07-26 06:04:39.007604] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.214 [2024-07-26 06:04:39.008002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.214 [2024-07-26 06:04:39.008443] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.214 [2024-07-26 06:04:39.008859] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.214 [2024-07-26 06:04:39.009255] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.214 [2024-07-26 06:04:39.009652] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.214 [2024-07-26 06:04:39.010046] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.214 [2024-07-26 06:04:39.010527] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.214 [2024-07-26 06:04:39.010544] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.214 [2024-07-26 06:04:39.010560] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.214 [2024-07-26 06:04:39.010575] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:24.214 [2024-07-26 06:04:39.013370] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.214 [2024-07-26 06:04:39.013775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.214 [2024-07-26 06:04:39.014168] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.214 [2024-07-26 06:04:39.014562] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.214 [2024-07-26 06:04:39.014941] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.214 [2024-07-26 06:04:39.015346] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.214 [2024-07-26 06:04:39.015743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.214 [2024-07-26 06:04:39.016136] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.214 [2024-07-26 06:04:39.016527] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.214 [2024-07-26 06:04:39.016923] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.214 [2024-07-26 06:04:39.016940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.214 [2024-07-26 06:04:39.016955] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.214 [2024-07-26 06:04:39.016974] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:24.214 [2024-07-26 06:04:39.019705] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.214 [... identical *ERROR* message repeated ...] 00:36:24.480 [2024-07-26 06:04:39.142909] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:24.480 [2024-07-26 06:04:39.145076] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.480 [2024-07-26 06:04:39.145126] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.480 [2024-07-26 06:04:39.145167] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.480 [2024-07-26 06:04:39.145208] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.480 [2024-07-26 06:04:39.145472] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.480 [2024-07-26 06:04:39.145533] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.480 [2024-07-26 06:04:39.145580] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.480 [2024-07-26 06:04:39.145620] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.480 [2024-07-26 06:04:39.145666] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.480 [2024-07-26 06:04:39.145935] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.480 [2024-07-26 06:04:39.145951] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.480 [2024-07-26 06:04:39.145965] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.480 [2024-07-26 06:04:39.145979] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:24.480 [2024-07-26 06:04:39.147763] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.480 [2024-07-26 06:04:39.147811] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.480 [2024-07-26 06:04:39.147852] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.480 [2024-07-26 06:04:39.147892] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.480 [2024-07-26 06:04:39.148185] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.480 [2024-07-26 06:04:39.148243] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.481 [2024-07-26 06:04:39.148285] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.481 [2024-07-26 06:04:39.148326] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.481 [2024-07-26 06:04:39.148380] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.481 [2024-07-26 06:04:39.148848] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.481 [2024-07-26 06:04:39.148866] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.481 [2024-07-26 06:04:39.148881] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.481 [2024-07-26 06:04:39.148897] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:24.481 [2024-07-26 06:04:39.151065] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.481 [2024-07-26 06:04:39.151112] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.481 [2024-07-26 06:04:39.151157] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.481 [2024-07-26 06:04:39.151198] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.481 [2024-07-26 06:04:39.151467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.481 [2024-07-26 06:04:39.151537] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.481 [2024-07-26 06:04:39.151584] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.481 [2024-07-26 06:04:39.151625] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.481 [2024-07-26 06:04:39.151670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.481 [2024-07-26 06:04:39.151934] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.481 [2024-07-26 06:04:39.151950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.481 [2024-07-26 06:04:39.151964] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.481 [2024-07-26 06:04:39.151979] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:24.481 [2024-07-26 06:04:39.153675] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.481 [2024-07-26 06:04:39.153720] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.481 [2024-07-26 06:04:39.153764] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.481 [2024-07-26 06:04:39.153812] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.481 [2024-07-26 06:04:39.154081] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.481 [2024-07-26 06:04:39.154132] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.481 [2024-07-26 06:04:39.154181] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.481 [2024-07-26 06:04:39.154225] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.481 [2024-07-26 06:04:39.154267] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.481 [2024-07-26 06:04:39.154655] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.481 [2024-07-26 06:04:39.154672] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.481 [2024-07-26 06:04:39.154687] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.481 [2024-07-26 06:04:39.154702] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:24.481 [2024-07-26 06:04:39.157033] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.481 [2024-07-26 06:04:39.157077] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.481 [2024-07-26 06:04:39.157118] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.481 [2024-07-26 06:04:39.157166] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.481 [2024-07-26 06:04:39.157435] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.481 [2024-07-26 06:04:39.157492] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.481 [2024-07-26 06:04:39.157534] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.481 [2024-07-26 06:04:39.157582] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.481 [2024-07-26 06:04:39.157627] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.481 [2024-07-26 06:04:39.157900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.481 [2024-07-26 06:04:39.157921] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.481 [2024-07-26 06:04:39.157935] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.481 [2024-07-26 06:04:39.157949] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:24.481 [2024-07-26 06:04:39.159685] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.481 [2024-07-26 06:04:39.159730] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.481 [2024-07-26 06:04:39.159771] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.481 [2024-07-26 06:04:39.159811] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.481 [2024-07-26 06:04:39.160077] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.481 [2024-07-26 06:04:39.160135] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.481 [2024-07-26 06:04:39.160176] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.481 [2024-07-26 06:04:39.160217] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.481 [2024-07-26 06:04:39.160258] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.481 [2024-07-26 06:04:39.160764] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.481 [2024-07-26 06:04:39.160781] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.481 [2024-07-26 06:04:39.160796] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.481 [2024-07-26 06:04:39.160812] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:24.481 [2024-07-26 06:04:39.163285] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.481 [2024-07-26 06:04:39.163329] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.481 [2024-07-26 06:04:39.163369] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.481 [2024-07-26 06:04:39.163409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.481 [2024-07-26 06:04:39.163717] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.481 [2024-07-26 06:04:39.163777] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.481 [2024-07-26 06:04:39.163818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.482 [2024-07-26 06:04:39.163858] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.482 [2024-07-26 06:04:39.163899] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.482 [2024-07-26 06:04:39.164162] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.482 [2024-07-26 06:04:39.164179] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.482 [2024-07-26 06:04:39.164193] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.482 [2024-07-26 06:04:39.164207] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:24.482 [2024-07-26 06:04:39.165930] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.482 [2024-07-26 06:04:39.165984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.482 [2024-07-26 06:04:39.166033] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.482 [2024-07-26 06:04:39.166074] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.482 [2024-07-26 06:04:39.166340] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.482 [2024-07-26 06:04:39.166399] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.482 [2024-07-26 06:04:39.166439] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.482 [2024-07-26 06:04:39.166480] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.482 [2024-07-26 06:04:39.166521] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.482 [2024-07-26 06:04:39.166853] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.482 [2024-07-26 06:04:39.166870] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.482 [2024-07-26 06:04:39.166885] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.482 [2024-07-26 06:04:39.166900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:24.482 [2024-07-26 06:04:39.169696] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.482 [2024-07-26 06:04:39.169747] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.482 [2024-07-26 06:04:39.169802] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.482 [2024-07-26 06:04:39.169846] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.482 [2024-07-26 06:04:39.170115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.482 [2024-07-26 06:04:39.170175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.482 [2024-07-26 06:04:39.170220] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.482 [2024-07-26 06:04:39.170260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.482 [2024-07-26 06:04:39.170302] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.482 [2024-07-26 06:04:39.170574] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.482 [2024-07-26 06:04:39.170591] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.482 [2024-07-26 06:04:39.170606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.482 [2024-07-26 06:04:39.170622] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:24.482 [2024-07-26 06:04:39.175633] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.482 [2024-07-26 06:04:39.175696] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.482 [2024-07-26 06:04:39.175739] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.482 [2024-07-26 06:04:39.175780] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.482 [2024-07-26 06:04:39.176132] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.482 [2024-07-26 06:04:39.176194] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.482 [2024-07-26 06:04:39.176243] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.482 [2024-07-26 06:04:39.176284] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.482 [2024-07-26 06:04:39.176325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.482 [2024-07-26 06:04:39.176765] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.482 [2024-07-26 06:04:39.176783] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.482 [2024-07-26 06:04:39.176798] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.482 [2024-07-26 06:04:39.176813] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:24.482 [2024-07-26 06:04:39.181421] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.482 [2024-07-26 06:04:39.181478] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.482 [2024-07-26 06:04:39.181521] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.482 [2024-07-26 06:04:39.181562] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.482 [2024-07-26 06:04:39.181833] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.482 [2024-07-26 06:04:39.181893] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.482 [2024-07-26 06:04:39.181935] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.482 [2024-07-26 06:04:39.181976] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.482 [2024-07-26 06:04:39.182017] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.482 [2024-07-26 06:04:39.182286] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.483 [2024-07-26 06:04:39.182303] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.483 [2024-07-26 06:04:39.182318] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.483 [2024-07-26 06:04:39.182332] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:24.483 [2024-07-26 06:04:39.186019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.483 [2024-07-26 06:04:39.186077] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.483 [2024-07-26 06:04:39.186129] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.483 [2024-07-26 06:04:39.186170] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.483 [2024-07-26 06:04:39.186438] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.483 [2024-07-26 06:04:39.186505] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.483 [2024-07-26 06:04:39.186549] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.483 [2024-07-26 06:04:39.186590] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.483 [2024-07-26 06:04:39.186632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.483 [2024-07-26 06:04:39.186906] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.483 [2024-07-26 06:04:39.186922] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.483 [2024-07-26 06:04:39.186942] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.483 [2024-07-26 06:04:39.186956] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:24.483 [2024-07-26 06:04:39.191291] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.483 [2024-07-26 06:04:39.191342] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.483 [2024-07-26 06:04:39.191384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.483 [2024-07-26 06:04:39.191425] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.483 [2024-07-26 06:04:39.191843] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.483 [2024-07-26 06:04:39.191901] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.483 [2024-07-26 06:04:39.191944] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.483 [2024-07-26 06:04:39.191985] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.483 [2024-07-26 06:04:39.192027] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.483 [2024-07-26 06:04:39.192418] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.483 [2024-07-26 06:04:39.192435] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.483 [2024-07-26 06:04:39.192450] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.483 [2024-07-26 06:04:39.192465] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:24.483 [2024-07-26 06:04:39.196128] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.483 [2024-07-26 06:04:39.196179] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.483 [2024-07-26 06:04:39.196220] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.483 [2024-07-26 06:04:39.196261] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.483 [2024-07-26 06:04:39.196563] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.483 [2024-07-26 06:04:39.196620] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.483 [2024-07-26 06:04:39.196668] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.483 [2024-07-26 06:04:39.196711] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.483 [2024-07-26 06:04:39.196752] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.483 [2024-07-26 06:04:39.197018] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.483 [2024-07-26 06:04:39.197035] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.483 [2024-07-26 06:04:39.197049] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.483 [2024-07-26 06:04:39.197064] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:24.749 [2024-07-26 06:04:39.421053] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.750 [2024-07-26 06:04:39.421440] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.750 [2024-07-26 06:04:39.422211] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.750 [2024-07-26 06:04:39.422603] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.750 [2024-07-26 06:04:39.422994] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.750 [2024-07-26 06:04:39.423383] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.750 [2024-07-26 06:04:39.423838] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.750 [2024-07-26 06:04:39.423857] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.750 [2024-07-26 06:04:39.423872] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.750 [2024-07-26 06:04:39.423887] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.750 [2024-07-26 06:04:39.426551] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.750 [2024-07-26 06:04:39.426950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.750 [2024-07-26 06:04:39.427346] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:24.750 [2024-07-26 06:04:39.427746] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.750 [2024-07-26 06:04:39.428586] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.750 [2024-07-26 06:04:39.428984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.750 [2024-07-26 06:04:39.429375] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.750 [2024-07-26 06:04:39.429781] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.750 [2024-07-26 06:04:39.430185] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.750 [2024-07-26 06:04:39.430203] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.750 [2024-07-26 06:04:39.430217] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.750 [2024-07-26 06:04:39.430232] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.750 [2024-07-26 06:04:39.433133] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.750 [2024-07-26 06:04:39.433533] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.750 [2024-07-26 06:04:39.433928] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.750 [2024-07-26 06:04:39.434318] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:24.750 [2024-07-26 06:04:39.435117] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.750 [2024-07-26 06:04:39.435519] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.750 [2024-07-26 06:04:39.435914] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.750 [2024-07-26 06:04:39.436302] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.750 [2024-07-26 06:04:39.436694] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.750 [2024-07-26 06:04:39.436711] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.750 [2024-07-26 06:04:39.436726] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.750 [2024-07-26 06:04:39.436741] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.750 [2024-07-26 06:04:39.439424] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.750 [2024-07-26 06:04:39.439832] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.750 [2024-07-26 06:04:39.440224] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.750 [2024-07-26 06:04:39.440617] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.750 [2024-07-26 06:04:39.441393] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:24.750 [2024-07-26 06:04:39.441787] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.750 [2024-07-26 06:04:39.442175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.750 [2024-07-26 06:04:39.442569] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.750 [2024-07-26 06:04:39.443038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.750 [2024-07-26 06:04:39.443056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.750 [2024-07-26 06:04:39.443071] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.750 [2024-07-26 06:04:39.443086] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.750 [2024-07-26 06:04:39.445852] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.750 [2024-07-26 06:04:39.446253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.750 [2024-07-26 06:04:39.446658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.750 [2024-07-26 06:04:39.447049] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.750 [2024-07-26 06:04:39.447853] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.750 [2024-07-26 06:04:39.448249] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:24.750 [2024-07-26 06:04:39.448649] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.750 [2024-07-26 06:04:39.449046] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.750 [2024-07-26 06:04:39.449508] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.750 [2024-07-26 06:04:39.449528] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.750 [2024-07-26 06:04:39.449544] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.750 [2024-07-26 06:04:39.449559] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.750 [2024-07-26 06:04:39.452173] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.750 [2024-07-26 06:04:39.452566] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.750 [2024-07-26 06:04:39.452961] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.750 [2024-07-26 06:04:39.453350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.750 [2024-07-26 06:04:39.454083] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.750 [2024-07-26 06:04:39.454477] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.750 [2024-07-26 06:04:39.454869] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:24.750 [2024-07-26 06:04:39.455259] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.750 [2024-07-26 06:04:39.455696] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.750 [2024-07-26 06:04:39.455714] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.750 [2024-07-26 06:04:39.455729] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.750 [2024-07-26 06:04:39.455744] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.750 [2024-07-26 06:04:39.458479] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.750 [2024-07-26 06:04:39.458878] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.750 [2024-07-26 06:04:39.459278] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.750 [2024-07-26 06:04:39.459676] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.750 [2024-07-26 06:04:39.460564] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.750 [2024-07-26 06:04:39.460974] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.750 [2024-07-26 06:04:39.461366] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.750 [2024-07-26 06:04:39.461765] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:24.750 [2024-07-26 06:04:39.462279] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.750 [2024-07-26 06:04:39.462301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.750 [2024-07-26 06:04:39.462317] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.751 [2024-07-26 06:04:39.462332] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.751 [2024-07-26 06:04:39.465142] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.751 [2024-07-26 06:04:39.465548] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.751 [2024-07-26 06:04:39.465945] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.751 [2024-07-26 06:04:39.466334] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.751 [2024-07-26 06:04:39.467167] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.751 [2024-07-26 06:04:39.467576] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.751 [2024-07-26 06:04:39.467974] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.751 [2024-07-26 06:04:39.468368] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.751 [2024-07-26 06:04:39.468777] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:24.751 [2024-07-26 06:04:39.468794] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.751 [2024-07-26 06:04:39.468809] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.751 [2024-07-26 06:04:39.468823] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.751 [2024-07-26 06:04:39.471434] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.751 [2024-07-26 06:04:39.471833] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.751 [2024-07-26 06:04:39.472224] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.751 [2024-07-26 06:04:39.472614] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.751 [2024-07-26 06:04:39.473355] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.751 [2024-07-26 06:04:39.473753] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.751 [2024-07-26 06:04:39.474141] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.751 [2024-07-26 06:04:39.474528] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.751 [2024-07-26 06:04:39.474902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.751 [2024-07-26 06:04:39.474919] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:24.751 [2024-07-26 06:04:39.474934] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.751 [2024-07-26 06:04:39.474949] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.751 [2024-07-26 06:04:39.477644] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.751 [2024-07-26 06:04:39.478042] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.751 [2024-07-26 06:04:39.478436] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.751 [2024-07-26 06:04:39.478836] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.751 [2024-07-26 06:04:39.479633] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.751 [2024-07-26 06:04:39.480031] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.751 [2024-07-26 06:04:39.480429] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.751 [2024-07-26 06:04:39.480833] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.751 [2024-07-26 06:04:39.481292] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.751 [2024-07-26 06:04:39.481310] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.751 [2024-07-26 06:04:39.481327] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:24.751 [2024-07-26 06:04:39.481342] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.751 [2024-07-26 06:04:39.483984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.751 [2024-07-26 06:04:39.485188] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.751 [2024-07-26 06:04:39.485580] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.751 [2024-07-26 06:04:39.485973] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.751 [2024-07-26 06:04:39.486777] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.751 [2024-07-26 06:04:39.487169] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.751 [2024-07-26 06:04:39.487562] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.751 [2024-07-26 06:04:39.487958] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.751 [2024-07-26 06:04:39.488398] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.751 [2024-07-26 06:04:39.488415] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.751 [2024-07-26 06:04:39.488430] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.751 [2024-07-26 06:04:39.488445] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:24.751 [2024-07-26 06:04:39.491313] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.751 [2024-07-26 06:04:39.491717] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.751 [2024-07-26 06:04:39.492107] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.751 [2024-07-26 06:04:39.492950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.751 [2024-07-26 06:04:39.494914] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.751 [2024-07-26 06:04:39.496462] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.751 [2024-07-26 06:04:39.497753] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.751 [2024-07-26 06:04:39.499153] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.751 [2024-07-26 06:04:39.499469] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.751 [2024-07-26 06:04:39.499485] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.751 [2024-07-26 06:04:39.499506] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.751 [2024-07-26 06:04:39.499521] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.751 [2024-07-26 06:04:39.501733] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:24.751 [2024-07-26 06:04:39.502126] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.751 [2024-07-26 06:04:39.502719] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.751 [2024-07-26 06:04:39.504025] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.751 [2024-07-26 06:04:39.505932] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.751 [2024-07-26 06:04:39.507422] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.751 [2024-07-26 06:04:39.508610] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.751 [2024-07-26 06:04:39.509917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.751 [2024-07-26 06:04:39.510186] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.751 [2024-07-26 06:04:39.510202] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.751 [2024-07-26 06:04:39.510216] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.751 [2024-07-26 06:04:39.510231] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.751 [2024-07-26 06:04:39.512790] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.751 [2024-07-26 06:04:39.512842] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:24.751 [2024-07-26 06:04:39.513454] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.751 [2024-07-26 06:04:39.514761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.751 [2024-07-26 06:04:39.516665] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.751 [2024-07-26 06:04:39.518164] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.751 [2024-07-26 06:04:39.519344] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.751 [2024-07-26 06:04:39.520644] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.752 [2024-07-26 06:04:39.520915] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.752 [2024-07-26 06:04:39.520931] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.752 [2024-07-26 06:04:39.520945] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.752 [2024-07-26 06:04:39.520959] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.752 [2024-07-26 06:04:39.526658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.752 [2024-07-26 06:04:39.528215] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.752 [2024-07-26 06:04:39.529771] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:24.752 [2024-07-26 06:04:39.529818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
[... identical "Failed to get src_mbufs!" error repeated from 2024-07-26 06:04:39.529818 through 06:04:39.607561; duplicate entries omitted ...]
00:36:24.755 [2024-07-26 06:04:39.607561] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:24.755 [2024-07-26 06:04:39.609192] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.755 [2024-07-26 06:04:39.609236] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.755 [2024-07-26 06:04:39.609277] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.755 [2024-07-26 06:04:39.609321] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.755 [2024-07-26 06:04:39.609717] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.755 [2024-07-26 06:04:39.609761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.755 [2024-07-26 06:04:39.609807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.755 [2024-07-26 06:04:39.609847] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.755 [2024-07-26 06:04:39.610119] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.755 [2024-07-26 06:04:39.610136] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.755 [2024-07-26 06:04:39.610151] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.755 [2024-07-26 06:04:39.610165] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.755 [2024-07-26 06:04:39.611853] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:24.755 [2024-07-26 06:04:39.611899] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.755 [2024-07-26 06:04:39.611939] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.755 [2024-07-26 06:04:39.611980] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.755 [2024-07-26 06:04:39.612446] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.755 [2024-07-26 06:04:39.612491] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.755 [2024-07-26 06:04:39.612534] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.755 [2024-07-26 06:04:39.612576] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.755 [2024-07-26 06:04:39.613050] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.755 [2024-07-26 06:04:39.613067] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.755 [2024-07-26 06:04:39.613082] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.755 [2024-07-26 06:04:39.613098] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.756 [2024-07-26 06:04:39.614882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.756 [2024-07-26 06:04:39.614928] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:24.756 [2024-07-26 06:04:39.614969] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.756 [2024-07-26 06:04:39.615026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.756 [2024-07-26 06:04:39.615332] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.756 [2024-07-26 06:04:39.615383] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.756 [2024-07-26 06:04:39.615426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.756 [2024-07-26 06:04:39.615467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.756 [2024-07-26 06:04:39.615793] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.756 [2024-07-26 06:04:39.615810] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.756 [2024-07-26 06:04:39.615828] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.756 [2024-07-26 06:04:39.615842] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.756 [2024-07-26 06:04:39.617480] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.756 [2024-07-26 06:04:39.617526] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.756 [2024-07-26 06:04:39.617587] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:24.756 [2024-07-26 06:04:39.617629] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.756 [2024-07-26 06:04:39.618135] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.756 [2024-07-26 06:04:39.618180] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.756 [2024-07-26 06:04:39.618222] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.756 [2024-07-26 06:04:39.618265] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.756 [2024-07-26 06:04:39.618662] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.756 [2024-07-26 06:04:39.618679] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.756 [2024-07-26 06:04:39.618693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.756 [2024-07-26 06:04:39.618708] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.756 [2024-07-26 06:04:39.620648] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.756 [2024-07-26 06:04:39.620694] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.756 [2024-07-26 06:04:39.620735] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.756 [2024-07-26 06:04:39.620776] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:24.756 [2024-07-26 06:04:39.621086] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.756 [2024-07-26 06:04:39.621129] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.756 [2024-07-26 06:04:39.621170] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.756 [2024-07-26 06:04:39.621211] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.756 [2024-07-26 06:04:39.621557] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.756 [2024-07-26 06:04:39.621574] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.756 [2024-07-26 06:04:39.621588] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.756 [2024-07-26 06:04:39.621603] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.756 [2024-07-26 06:04:39.623135] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.756 [2024-07-26 06:04:39.623181] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.756 [2024-07-26 06:04:39.623228] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.756 [2024-07-26 06:04:39.623270] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.756 [2024-07-26 06:04:39.623741] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:24.756 [2024-07-26 06:04:39.623790] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.756 [2024-07-26 06:04:39.623831] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.756 [2024-07-26 06:04:39.623874] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.756 [2024-07-26 06:04:39.624325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.756 [2024-07-26 06:04:39.624343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.756 [2024-07-26 06:04:39.624359] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.756 [2024-07-26 06:04:39.624374] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.756 [2024-07-26 06:04:39.626427] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.756 [2024-07-26 06:04:39.626472] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.756 [2024-07-26 06:04:39.626513] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.756 [2024-07-26 06:04:39.626873] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.756 [2024-07-26 06:04:39.626919] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.756 [2024-07-26 06:04:39.626970] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:24.756 [2024-07-26 06:04:39.627238] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.756 [2024-07-26 06:04:39.627254] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.756 [2024-07-26 06:04:39.627268] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:24.756 [2024-07-26 06:04:39.652734] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.017 [2024-07-26 06:04:39.654285] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.017 [2024-07-26 06:04:39.654335] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.017 [2024-07-26 06:04:39.654387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.017 [2024-07-26 06:04:39.655075] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.017 [2024-07-26 06:04:39.655350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.017 [2024-07-26 06:04:39.656993] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.017 [2024-07-26 06:04:39.657361] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.017 [2024-07-26 06:04:39.657415] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.017 [2024-07-26 06:04:39.657784] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:25.017 [2024-07-26 06:04:39.658174] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.017 [2024-07-26 06:04:39.659249] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.017 [2024-07-26 06:04:39.659593] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.017 [2024-07-26 06:04:39.659609] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.017 [2024-07-26 06:04:39.659680] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.017 [2024-07-26 06:04:39.659733] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.017 [2024-07-26 06:04:39.661264] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.017 [2024-07-26 06:04:39.661309] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.017 [2024-07-26 06:04:39.661359] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.017 [2024-07-26 06:04:39.662886] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.018 [2024-07-26 06:04:39.663267] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.018 [2024-07-26 06:04:39.663285] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.018 [2024-07-26 06:04:39.663299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:25.018 [2024-07-26 06:04:39.665194] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.018 [2024-07-26 06:04:39.665588] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.018 [2024-07-26 06:04:39.665984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.018 [2024-07-26 06:04:39.666374] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.018 [2024-07-26 06:04:39.666703] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.018 [2024-07-26 06:04:39.666720] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.018 [2024-07-26 06:04:39.668018] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.018 [2024-07-26 06:04:39.669551] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.018 [2024-07-26 06:04:39.671086] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.018 [2024-07-26 06:04:39.671801] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.018 [2024-07-26 06:04:39.672075] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.018 [2024-07-26 06:04:39.672093] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.018 [2024-07-26 06:04:39.672108] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:25.018 [2024-07-26 06:04:39.674096] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.018 [2024-07-26 06:04:39.674491] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.018 [2024-07-26 06:04:39.674888] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.018 [2024-07-26 06:04:39.675926] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.018 [2024-07-26 06:04:39.676273] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.018 [2024-07-26 06:04:39.676290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.018 [2024-07-26 06:04:39.677862] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.018 [2024-07-26 06:04:39.679407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.018 [2024-07-26 06:04:39.680313] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.018 [2024-07-26 06:04:39.681895] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.018 [2024-07-26 06:04:39.682169] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.018 [2024-07-26 06:04:39.682186] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.018 [2024-07-26 06:04:39.682200] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:25.018 [2024-07-26 06:04:39.684490] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.018 [2024-07-26 06:04:39.684892] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.018 [2024-07-26 06:04:39.685794] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.018 [2024-07-26 06:04:39.687096] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.018 [2024-07-26 06:04:39.687370] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.018 [2024-07-26 06:04:39.687387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.018 [2024-07-26 06:04:39.688953] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.018 [2024-07-26 06:04:39.690043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.018 [2024-07-26 06:04:39.691656] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.018 [2024-07-26 06:04:39.693188] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.018 [2024-07-26 06:04:39.693463] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.018 [2024-07-26 06:04:39.693480] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.018 [2024-07-26 06:04:39.693494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:25.018 [2024-07-26 06:04:39.696016] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.018 [2024-07-26 06:04:39.696849] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.018 [2024-07-26 06:04:39.698162] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.018 [2024-07-26 06:04:39.699721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.018 [2024-07-26 06:04:39.699994] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.018 [2024-07-26 06:04:39.700011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.018 [2024-07-26 06:04:39.701158] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.018 [2024-07-26 06:04:39.702738] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.018 [2024-07-26 06:04:39.704227] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.018 [2024-07-26 06:04:39.705855] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.018 [2024-07-26 06:04:39.706129] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.018 [2024-07-26 06:04:39.706145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.018 [2024-07-26 06:04:39.706159] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:25.018 [2024-07-26 06:04:39.709426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.018 [2024-07-26 06:04:39.710740] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.018 [2024-07-26 06:04:39.712281] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.018 [2024-07-26 06:04:39.713811] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.018 [2024-07-26 06:04:39.714186] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.018 [2024-07-26 06:04:39.714205] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.018 [2024-07-26 06:04:39.715791] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.018 [2024-07-26 06:04:39.717300] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.018 [2024-07-26 06:04:39.718879] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.018 [2024-07-26 06:04:39.720521] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.018 [2024-07-26 06:04:39.720913] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.018 [2024-07-26 06:04:39.720930] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.018 [2024-07-26 06:04:39.720945] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:25.018 [2024-07-26 06:04:39.724356] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:25.022 [... preceding "Failed to get src_mbufs!" error repeated continuously from 06:04:39.724356 through 06:04:39.900351; identical repeats omitted]
00:36:25.022 [2024-07-26 06:04:39.902735] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.022 [2024-07-26 06:04:39.903136] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.022 [2024-07-26 06:04:39.904581] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.022 [2024-07-26 06:04:39.904632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.022 [2024-07-26 06:04:39.904914] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.022 [2024-07-26 06:04:39.904930] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.022 [2024-07-26 06:04:39.906570] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.022 [2024-07-26 06:04:39.907994] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.022 [2024-07-26 06:04:39.908039] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.022 [2024-07-26 06:04:39.909474] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.022 [2024-07-26 06:04:39.909799] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.022 [2024-07-26 06:04:39.909816] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.022 [2024-07-26 06:04:39.909830] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:25.022 [2024-07-26 06:04:39.912008] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.022 [2024-07-26 06:04:39.912410] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.022 [2024-07-26 06:04:39.912457] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.022 [2024-07-26 06:04:39.913148] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.022 [2024-07-26 06:04:39.913436] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.022 [2024-07-26 06:04:39.913453] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.022 [2024-07-26 06:04:39.915072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.022 [2024-07-26 06:04:39.915126] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.022 [2024-07-26 06:04:39.916668] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.022 [2024-07-26 06:04:39.917740] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.022 [2024-07-26 06:04:39.918017] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.022 [2024-07-26 06:04:39.918033] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.022 [2024-07-26 06:04:39.918048] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:25.022 [2024-07-26 06:04:39.921199] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.022 [2024-07-26 06:04:39.921257] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.022 [2024-07-26 06:04:39.921655] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.022 [2024-07-26 06:04:39.922256] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.022 [2024-07-26 06:04:39.922531] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.022 [2024-07-26 06:04:39.922549] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.317 [2024-07-26 06:04:39.922608] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.317 [2024-07-26 06:04:39.923007] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.317 [2024-07-26 06:04:39.923885] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.317 [2024-07-26 06:04:39.925180] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.317 [2024-07-26 06:04:39.925454] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.317 [2024-07-26 06:04:39.925470] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.317 [2024-07-26 06:04:39.925484] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:25.317 [2024-07-26 06:04:39.928624] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.317 [2024-07-26 06:04:39.928682] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.317 [2024-07-26 06:04:39.930224] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.317 [2024-07-26 06:04:39.930800] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.317 [2024-07-26 06:04:39.931276] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.317 [2024-07-26 06:04:39.931294] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.317 [2024-07-26 06:04:39.931348] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.317 [2024-07-26 06:04:39.931746] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.317 [2024-07-26 06:04:39.931790] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.317 [2024-07-26 06:04:39.932177] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.317 [2024-07-26 06:04:39.932555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.317 [2024-07-26 06:04:39.932572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.317 [2024-07-26 06:04:39.932591] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:25.317 [2024-07-26 06:04:39.935168] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.317 [2024-07-26 06:04:39.935218] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.317 [2024-07-26 06:04:39.936515] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.317 [2024-07-26 06:04:39.936560] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.317 [2024-07-26 06:04:39.936840] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.317 [2024-07-26 06:04:39.936857] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.317 [2024-07-26 06:04:39.938424] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.317 [2024-07-26 06:04:39.938473] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.317 [2024-07-26 06:04:39.939113] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.317 [2024-07-26 06:04:39.940290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.317 [2024-07-26 06:04:39.940755] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.317 [2024-07-26 06:04:39.940774] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.317 [2024-07-26 06:04:39.940789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:25.317 [2024-07-26 06:04:39.944169] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.317 [2024-07-26 06:04:39.944220] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.317 [2024-07-26 06:04:39.944262] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.317 [2024-07-26 06:04:39.944303] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.317 [2024-07-26 06:04:39.944574] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.317 [2024-07-26 06:04:39.944591] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.317 [2024-07-26 06:04:39.944655] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.317 [2024-07-26 06:04:39.946199] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.317 [2024-07-26 06:04:39.946918] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.317 [2024-07-26 06:04:39.946964] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.317 [2024-07-26 06:04:39.947236] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.317 [2024-07-26 06:04:39.947252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.317 [2024-07-26 06:04:39.947266] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:25.317 [2024-07-26 06:04:39.948938] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.317 [2024-07-26 06:04:39.948982] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.317 [2024-07-26 06:04:39.949023] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.317 [2024-07-26 06:04:39.949069] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.317 [2024-07-26 06:04:39.949499] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.317 [2024-07-26 06:04:39.949517] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.317 [2024-07-26 06:04:39.949567] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.317 [2024-07-26 06:04:39.949609] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.317 [2024-07-26 06:04:39.949659] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.317 [2024-07-26 06:04:39.949714] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.317 [2024-07-26 06:04:39.950177] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.317 [2024-07-26 06:04:39.950193] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.317 [2024-07-26 06:04:39.950209] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:25.317 [2024-07-26 06:04:39.951897] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.317 [2024-07-26 06:04:39.951942] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.317 [2024-07-26 06:04:39.951990] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.317 [2024-07-26 06:04:39.952035] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.317 [2024-07-26 06:04:39.952304] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.317 [2024-07-26 06:04:39.952320] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.317 [2024-07-26 06:04:39.952375] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.317 [2024-07-26 06:04:39.952418] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.317 [2024-07-26 06:04:39.952461] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.317 [2024-07-26 06:04:39.952503] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.317 [2024-07-26 06:04:39.952778] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.317 [2024-07-26 06:04:39.952795] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.317 [2024-07-26 06:04:39.952809] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:25.317 [2024-07-26 06:04:39.954443] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.317 [2024-07-26 06:04:39.954488] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.317 [2024-07-26 06:04:39.954532] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.318 [2024-07-26 06:04:39.954573] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.318 [2024-07-26 06:04:39.954845] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.318 [2024-07-26 06:04:39.954868] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.318 [2024-07-26 06:04:39.954926] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.318 [2024-07-26 06:04:39.954969] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.318 [2024-07-26 06:04:39.955015] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.318 [2024-07-26 06:04:39.955056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.318 [2024-07-26 06:04:39.955490] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.318 [2024-07-26 06:04:39.955506] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.318 [2024-07-26 06:04:39.955521] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:25.318 [2024-07-26 06:04:39.957549] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.318 [2024-07-26 06:04:39.957593] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.318 [2024-07-26 06:04:39.957634] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.318 [2024-07-26 06:04:39.957682] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.318 [2024-07-26 06:04:39.957998] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.318 [2024-07-26 06:04:39.958014] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.318 [2024-07-26 06:04:39.958071] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.318 [2024-07-26 06:04:39.958113] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.318 [2024-07-26 06:04:39.958154] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.318 [2024-07-26 06:04:39.958194] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.318 [2024-07-26 06:04:39.958458] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.318 [2024-07-26 06:04:39.958475] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.318 [2024-07-26 06:04:39.958489] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:25.318 [2024-07-26 06:04:39.960154] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.318 [2024-07-26 06:04:39.960220] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.318 [2024-07-26 06:04:39.960264] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.318 [2024-07-26 06:04:39.960304] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.318 [2024-07-26 06:04:39.960571] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.318 [2024-07-26 06:04:39.960587] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.318 [2024-07-26 06:04:39.960651] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.318 [2024-07-26 06:04:39.960693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.318 [2024-07-26 06:04:39.960734] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.318 [2024-07-26 06:04:39.960775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.318 [2024-07-26 06:04:39.961101] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.318 [2024-07-26 06:04:39.961117] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.318 [2024-07-26 06:04:39.961136] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:25.318 [2024-07-26 06:04:39.963766] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.318 [2024-07-26 06:04:39.963815] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.318 [2024-07-26 06:04:39.963860] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.318 [2024-07-26 06:04:39.963901] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.318 [2024-07-26 06:04:39.964176] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.318 [2024-07-26 06:04:39.964192] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.318 [2024-07-26 06:04:39.964263] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.318 [2024-07-26 06:04:39.964305] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.318 [2024-07-26 06:04:39.964345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.318 [2024-07-26 06:04:39.964386] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.318 [2024-07-26 06:04:39.964658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.318 [2024-07-26 06:04:39.964675] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.318 [2024-07-26 06:04:39.964690] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:25.318 [2024-07-26 06:04:39.966291] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.318 [2024-07-26 06:04:39.966335] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.318 [2024-07-26 06:04:39.966376] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.318 [2024-07-26 06:04:39.966416] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.318 [2024-07-26 06:04:39.966686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.318 [2024-07-26 06:04:39.966703] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.318 [2024-07-26 06:04:39.966763] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.318 [2024-07-26 06:04:39.966804] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.318 [2024-07-26 06:04:39.966845] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.318 [2024-07-26 06:04:39.966892] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.318 [2024-07-26 06:04:39.967162] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.318 [2024-07-26 06:04:39.967178] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.318 [2024-07-26 06:04:39.967192] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:25.318 [2024-07-26 06:04:39.969277] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
[... identical "Failed to get src_mbufs!" error from accel_dpdk_cryptodev.c:468 repeated continuously from 06:04:39.969277 through 06:04:40.033590 ...]
00:36:25.320 [2024-07-26 06:04:40.033590] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:25.320 [2024-07-26 06:04:40.035208] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.320 [2024-07-26 06:04:40.035972] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.320 [2024-07-26 06:04:40.036020] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.320 [2024-07-26 06:04:40.036067] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.320 [2024-07-26 06:04:40.036500] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.320 [2024-07-26 06:04:40.036521] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.320 [2024-07-26 06:04:40.038088] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.320 [2024-07-26 06:04:40.038147] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.320 [2024-07-26 06:04:40.038193] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.320 [2024-07-26 06:04:40.038239] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.320 [2024-07-26 06:04:40.038686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.320 [2024-07-26 06:04:40.038710] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.320 [2024-07-26 06:04:40.038728] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:25.320 [2024-07-26 06:04:40.040369] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.320 [2024-07-26 06:04:40.041590] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.320 [2024-07-26 06:04:40.041643] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.320 [2024-07-26 06:04:40.041690] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.321 [2024-07-26 06:04:40.041960] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.321 [2024-07-26 06:04:40.041977] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.321 [2024-07-26 06:04:40.043278] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.321 [2024-07-26 06:04:40.043325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.321 [2024-07-26 06:04:40.044939] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.321 [2024-07-26 06:04:40.044992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.321 [2024-07-26 06:04:40.045262] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.321 [2024-07-26 06:04:40.045279] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.321 [2024-07-26 06:04:40.045293] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:25.321 [2024-07-26 06:04:40.048098] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.321 [2024-07-26 06:04:40.049642] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.321 [2024-07-26 06:04:40.049689] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.321 [2024-07-26 06:04:40.051215] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.321 [2024-07-26 06:04:40.051658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.321 [2024-07-26 06:04:40.051675] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.321 [2024-07-26 06:04:40.051732] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.321 [2024-07-26 06:04:40.053099] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.321 [2024-07-26 06:04:40.053145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.321 [2024-07-26 06:04:40.053186] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.321 [2024-07-26 06:04:40.053455] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.321 [2024-07-26 06:04:40.053475] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.321 [2024-07-26 06:04:40.053490] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:25.321 [2024-07-26 06:04:40.055135] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.321 [2024-07-26 06:04:40.055530] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.321 [2024-07-26 06:04:40.056778] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.321 [2024-07-26 06:04:40.057344] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.321 [2024-07-26 06:04:40.057790] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.321 [2024-07-26 06:04:40.057808] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.321 [2024-07-26 06:04:40.059178] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.321 [2024-07-26 06:04:40.059225] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.321 [2024-07-26 06:04:40.059272] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.321 [2024-07-26 06:04:40.060891] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.321 [2024-07-26 06:04:40.061165] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.321 [2024-07-26 06:04:40.061181] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.321 [2024-07-26 06:04:40.061195] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:25.321 [2024-07-26 06:04:40.066392] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.321 [2024-07-26 06:04:40.066797] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.321 [2024-07-26 06:04:40.067188] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.321 [2024-07-26 06:04:40.067576] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.321 [2024-07-26 06:04:40.067853] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.321 [2024-07-26 06:04:40.067870] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.321 [2024-07-26 06:04:40.069171] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.321 [2024-07-26 06:04:40.070729] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.321 [2024-07-26 06:04:40.072273] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.321 [2024-07-26 06:04:40.073170] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.321 [2024-07-26 06:04:40.073481] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.321 [2024-07-26 06:04:40.073497] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.321 [2024-07-26 06:04:40.073512] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:25.321 [2024-07-26 06:04:40.076104] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.321 [2024-07-26 06:04:40.076499] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.321 [2024-07-26 06:04:40.077758] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.321 [2024-07-26 06:04:40.078300] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.321 [2024-07-26 06:04:40.078752] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.321 [2024-07-26 06:04:40.078770] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.321 [2024-07-26 06:04:40.080148] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.321 [2024-07-26 06:04:40.081443] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.321 [2024-07-26 06:04:40.082991] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.321 [2024-07-26 06:04:40.084558] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.321 [2024-07-26 06:04:40.085023] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.321 [2024-07-26 06:04:40.085040] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.321 [2024-07-26 06:04:40.085054] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:25.321 [2024-07-26 06:04:40.089960] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.321 [2024-07-26 06:04:40.090358] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.321 [2024-07-26 06:04:40.091861] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.321 [2024-07-26 06:04:40.093267] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.321 [2024-07-26 06:04:40.093538] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.321 [2024-07-26 06:04:40.093554] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.321 [2024-07-26 06:04:40.095119] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.321 [2024-07-26 06:04:40.095850] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.321 [2024-07-26 06:04:40.097146] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.321 [2024-07-26 06:04:40.098684] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.321 [2024-07-26 06:04:40.098955] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.321 [2024-07-26 06:04:40.098971] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.321 [2024-07-26 06:04:40.098986] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:25.321 [2024-07-26 06:04:40.101884] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.321 [2024-07-26 06:04:40.102760] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.321 [2024-07-26 06:04:40.103154] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.321 [2024-07-26 06:04:40.104584] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.321 [2024-07-26 06:04:40.104900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.321 [2024-07-26 06:04:40.104917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.321 [2024-07-26 06:04:40.106363] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.321 [2024-07-26 06:04:40.107193] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.321 [2024-07-26 06:04:40.108711] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.321 [2024-07-26 06:04:40.110321] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.321 [2024-07-26 06:04:40.110603] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.321 [2024-07-26 06:04:40.110620] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.321 [2024-07-26 06:04:40.110634] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:25.321 [2024-07-26 06:04:40.114960] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.321 [2024-07-26 06:04:40.116497] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.321 [2024-07-26 06:04:40.117442] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.321 [2024-07-26 06:04:40.119062] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.321 [2024-07-26 06:04:40.119334] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.321 [2024-07-26 06:04:40.119350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.321 [2024-07-26 06:04:40.120918] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.321 [2024-07-26 06:04:40.122460] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.321 [2024-07-26 06:04:40.123155] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.321 [2024-07-26 06:04:40.124270] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.321 [2024-07-26 06:04:40.124722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.321 [2024-07-26 06:04:40.124740] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.321 [2024-07-26 06:04:40.124755] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:25.321 [2024-07-26 06:04:40.128035] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.321 [2024-07-26 06:04:40.129590] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.321 [2024-07-26 06:04:40.131147] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.321 [2024-07-26 06:04:40.132084] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.321 [2024-07-26 06:04:40.132354] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.321 [2024-07-26 06:04:40.132370] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.321 [2024-07-26 06:04:40.133682] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.321 [2024-07-26 06:04:40.135229] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.321 [2024-07-26 06:04:40.136783] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.321 [2024-07-26 06:04:40.137184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.321 [2024-07-26 06:04:40.137633] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.321 [2024-07-26 06:04:40.137661] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.321 [2024-07-26 06:04:40.137677] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:25.321 [2024-07-26 06:04:40.142402] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.321 [2024-07-26 06:04:40.143482] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.321 [2024-07-26 06:04:40.144769] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.321 [2024-07-26 06:04:40.146345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.321 [2024-07-26 06:04:40.146617] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.321 [2024-07-26 06:04:40.146634] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.321 [2024-07-26 06:04:40.147576] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.321 [2024-07-26 06:04:40.148854] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.321 [2024-07-26 06:04:40.149388] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.321 [2024-07-26 06:04:40.149782] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.321 [2024-07-26 06:04:40.150054] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.321 [2024-07-26 06:04:40.150070] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.321 [2024-07-26 06:04:40.150084] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:25.321 [2024-07-26 06:04:40.152661] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.321 [2024-07-26 06:04:40.153053] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.321 [2024-07-26 06:04:40.153441] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.321 [2024-07-26 06:04:40.153838] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.321 [2024-07-26 06:04:40.154271] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.321 [2024-07-26 06:04:40.154287] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.321 [2024-07-26 06:04:40.154922] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.321 [2024-07-26 06:04:40.156107] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.321 [2024-07-26 06:04:40.156496] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.321 [2024-07-26 06:04:40.157513] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.321 [2024-07-26 06:04:40.157844] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.321 [2024-07-26 06:04:40.157861] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.321 [2024-07-26 06:04:40.157876] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:25.321 [2024-07-26 06:04:40.161254] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.321 [2024-07-26 06:04:40.161660] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.321 [2024-07-26 06:04:40.162055] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.321 [2024-07-26 06:04:40.163567] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.321 [2024-07-26 06:04:40.164038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.321 [2024-07-26 06:04:40.164056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.321 [2024-07-26 06:04:40.164460] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.321 [2024-07-26 06:04:40.166028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.321 [2024-07-26 06:04:40.166422] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.321 [2024-07-26 06:04:40.166817] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.321 [2024-07-26 06:04:40.167197] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.321 [2024-07-26 06:04:40.167213] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.321 [2024-07-26 06:04:40.167228] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:25.322 [2024-07-26 06:04:40.170034] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:25.587 [... previous message repeated 272 more times through 2024-07-26 06:04:40.367317 ...] 
00:36:25.587 [2024-07-26 06:04:40.372964] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.587 [2024-07-26 06:04:40.373017] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.587 [2024-07-26 06:04:40.373405] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.587 [2024-07-26 06:04:40.373449] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.587 [2024-07-26 06:04:40.373729] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.587 [2024-07-26 06:04:40.373746] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.587 [2024-07-26 06:04:40.375054] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.587 [2024-07-26 06:04:40.375101] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.587 [2024-07-26 06:04:40.376637] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.587 [2024-07-26 06:04:40.378181] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.587 [2024-07-26 06:04:40.378573] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.587 [2024-07-26 06:04:40.378589] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.587 [2024-07-26 06:04:40.378603] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:25.587 [2024-07-26 06:04:40.380817] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.587 [2024-07-26 06:04:40.380868] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.588 [2024-07-26 06:04:40.380910] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.588 [2024-07-26 06:04:40.380951] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.588 [2024-07-26 06:04:40.381326] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.588 [2024-07-26 06:04:40.381343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.588 [2024-07-26 06:04:40.381400] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.588 [2024-07-26 06:04:40.381799] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.588 [2024-07-26 06:04:40.383170] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.588 [2024-07-26 06:04:40.383221] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.588 [2024-07-26 06:04:40.383678] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.588 [2024-07-26 06:04:40.383695] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.588 [2024-07-26 06:04:40.383711] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:25.588 [2024-07-26 06:04:40.387332] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.588 [2024-07-26 06:04:40.387382] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.588 [2024-07-26 06:04:40.387423] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.588 [2024-07-26 06:04:40.387463] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.588 [2024-07-26 06:04:40.387786] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.588 [2024-07-26 06:04:40.387803] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.588 [2024-07-26 06:04:40.387862] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.588 [2024-07-26 06:04:40.387903] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.588 [2024-07-26 06:04:40.387944] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.588 [2024-07-26 06:04:40.387984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.588 [2024-07-26 06:04:40.388252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.588 [2024-07-26 06:04:40.388269] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.588 [2024-07-26 06:04:40.388283] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:25.588 [2024-07-26 06:04:40.390140] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.588 [2024-07-26 06:04:40.390187] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.588 [2024-07-26 06:04:40.390229] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.588 [2024-07-26 06:04:40.390271] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.588 [2024-07-26 06:04:40.390683] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.588 [2024-07-26 06:04:40.390699] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.588 [2024-07-26 06:04:40.390753] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.588 [2024-07-26 06:04:40.390794] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.588 [2024-07-26 06:04:40.390834] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.588 [2024-07-26 06:04:40.390875] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.588 [2024-07-26 06:04:40.391184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.588 [2024-07-26 06:04:40.391200] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.588 [2024-07-26 06:04:40.391215] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:25.588 [2024-07-26 06:04:40.394832] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.588 [2024-07-26 06:04:40.394888] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.588 [2024-07-26 06:04:40.394933] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.588 [2024-07-26 06:04:40.394975] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.588 [2024-07-26 06:04:40.395241] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.588 [2024-07-26 06:04:40.395257] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.588 [2024-07-26 06:04:40.395313] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.588 [2024-07-26 06:04:40.395364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.588 [2024-07-26 06:04:40.395407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.588 [2024-07-26 06:04:40.395448] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.588 [2024-07-26 06:04:40.395720] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.588 [2024-07-26 06:04:40.395737] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.588 [2024-07-26 06:04:40.395751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:25.588 [2024-07-26 06:04:40.397390] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.588 [2024-07-26 06:04:40.397435] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.588 [2024-07-26 06:04:40.397478] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.588 [2024-07-26 06:04:40.397519] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.588 [2024-07-26 06:04:40.397966] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.588 [2024-07-26 06:04:40.397983] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.588 [2024-07-26 06:04:40.398036] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.588 [2024-07-26 06:04:40.398078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.588 [2024-07-26 06:04:40.398141] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.588 [2024-07-26 06:04:40.398185] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.588 [2024-07-26 06:04:40.398456] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.588 [2024-07-26 06:04:40.398472] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.588 [2024-07-26 06:04:40.398486] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:25.588 [2024-07-26 06:04:40.402203] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.588 [2024-07-26 06:04:40.402262] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.588 [2024-07-26 06:04:40.402304] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.588 [2024-07-26 06:04:40.402346] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.588 [2024-07-26 06:04:40.402655] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.588 [2024-07-26 06:04:40.402671] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.588 [2024-07-26 06:04:40.402731] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.588 [2024-07-26 06:04:40.402772] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.588 [2024-07-26 06:04:40.402813] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.588 [2024-07-26 06:04:40.402860] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.588 [2024-07-26 06:04:40.403137] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.588 [2024-07-26 06:04:40.403153] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.588 [2024-07-26 06:04:40.403167] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:25.588 [2024-07-26 06:04:40.404837] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.588 [2024-07-26 06:04:40.404883] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.588 [2024-07-26 06:04:40.404928] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.588 [2024-07-26 06:04:40.404969] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.588 [2024-07-26 06:04:40.405442] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.588 [2024-07-26 06:04:40.405459] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.588 [2024-07-26 06:04:40.405510] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.589 [2024-07-26 06:04:40.405552] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.589 [2024-07-26 06:04:40.405595] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.589 [2024-07-26 06:04:40.405636] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.589 [2024-07-26 06:04:40.405959] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.589 [2024-07-26 06:04:40.405976] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.589 [2024-07-26 06:04:40.405990] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:25.589 [2024-07-26 06:04:40.409920] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.589 [2024-07-26 06:04:40.409970] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.589 [2024-07-26 06:04:40.410011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.589 [2024-07-26 06:04:40.410051] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.589 [2024-07-26 06:04:40.410445] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.589 [2024-07-26 06:04:40.410461] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.589 [2024-07-26 06:04:40.410523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.589 [2024-07-26 06:04:40.410565] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.589 [2024-07-26 06:04:40.410606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.589 [2024-07-26 06:04:40.410651] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.589 [2024-07-26 06:04:40.410979] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.589 [2024-07-26 06:04:40.410996] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.589 [2024-07-26 06:04:40.411010] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:25.589 [2024-07-26 06:04:40.412623] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.589 [2024-07-26 06:04:40.412672] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.589 [2024-07-26 06:04:40.412713] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.589 [2024-07-26 06:04:40.412753] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.589 [2024-07-26 06:04:40.413232] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.589 [2024-07-26 06:04:40.413249] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.589 [2024-07-26 06:04:40.413305] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.589 [2024-07-26 06:04:40.413347] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.589 [2024-07-26 06:04:40.413391] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.589 [2024-07-26 06:04:40.413434] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.589 [2024-07-26 06:04:40.413861] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.589 [2024-07-26 06:04:40.413879] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.589 [2024-07-26 06:04:40.413893] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:25.589 [2024-07-26 06:04:40.418507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.589 [2024-07-26 06:04:40.418557] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.589 [2024-07-26 06:04:40.418598] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.589 [2024-07-26 06:04:40.418643] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.589 [2024-07-26 06:04:40.418956] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.589 [2024-07-26 06:04:40.418972] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.589 [2024-07-26 06:04:40.419031] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.589 [2024-07-26 06:04:40.419073] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.589 [2024-07-26 06:04:40.419118] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.589 [2024-07-26 06:04:40.419159] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.589 [2024-07-26 06:04:40.419425] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.589 [2024-07-26 06:04:40.419441] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.589 [2024-07-26 06:04:40.419456] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:25.589 [2024-07-26 06:04:40.421086] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.589 [2024-07-26 06:04:40.421131] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.589 [2024-07-26 06:04:40.421179] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.589 [2024-07-26 06:04:40.421220] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.589 [2024-07-26 06:04:40.421532] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.589 [2024-07-26 06:04:40.421548] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.589 [2024-07-26 06:04:40.421604] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.589 [2024-07-26 06:04:40.421849] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.589 [2024-07-26 06:04:40.421895] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.589 [2024-07-26 06:04:40.421937] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.589 [2024-07-26 06:04:40.422323] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.589 [2024-07-26 06:04:40.422339] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.589 [2024-07-26 06:04:40.422353] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:25.589 [2024-07-26 06:04:40.426743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.589 [2024-07-26 06:04:40.426794] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.589 [2024-07-26 06:04:40.426835] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.589 [2024-07-26 06:04:40.426876] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.589 [2024-07-26 06:04:40.427208] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.589 [2024-07-26 06:04:40.427225] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.589 [2024-07-26 06:04:40.427287] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.589 [2024-07-26 06:04:40.427338] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.589 [2024-07-26 06:04:40.427379] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.589 [2024-07-26 06:04:40.427420] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.589 [2024-07-26 06:04:40.427693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.589 [2024-07-26 06:04:40.427710] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.589 [2024-07-26 06:04:40.427725] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:25.589 [2024-07-26 06:04:40.429362] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.589 [2024-07-26 06:04:40.429407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.589 [2024-07-26 06:04:40.429447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.589 [2024-07-26 06:04:40.429488] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.589 [2024-07-26 06:04:40.429820] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.590 [2024-07-26 06:04:40.429837] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.590 [2024-07-26 06:04:40.429896] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.590 [2024-07-26 06:04:40.429937] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.590 [2024-07-26 06:04:40.429980] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.590 [2024-07-26 06:04:40.430022] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.590 [2024-07-26 06:04:40.430461] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.590 [2024-07-26 06:04:40.430478] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.590 [2024-07-26 06:04:40.430494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:25.855 [2024-07-26 06:04:40.535946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.855 [2024-07-26 06:04:40.537495] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.855 [2024-07-26 06:04:40.539064] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.855 [2024-07-26 06:04:40.539860] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.855 [2024-07-26 06:04:40.540143] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.855 [2024-07-26 06:04:40.540160] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.855 [2024-07-26 06:04:40.540560] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.855 [2024-07-26 06:04:40.541359] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.855 [2024-07-26 06:04:40.542356] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.855 [2024-07-26 06:04:40.542748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.855 [2024-07-26 06:04:40.543068] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.855 [2024-07-26 06:04:40.543084] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.855 [2024-07-26 06:04:40.543098] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:25.855 [2024-07-26 06:04:40.548413] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.855 [2024-07-26 06:04:40.549964] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.855 [2024-07-26 06:04:40.550551] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.855 [2024-07-26 06:04:40.551799] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.855 [2024-07-26 06:04:40.552256] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.855 [2024-07-26 06:04:40.552274] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.855 [2024-07-26 06:04:40.552894] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.855 [2024-07-26 06:04:40.554083] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.855 [2024-07-26 06:04:40.554473] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.855 [2024-07-26 06:04:40.555260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.855 [2024-07-26 06:04:40.555537] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.855 [2024-07-26 06:04:40.555553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.855 [2024-07-26 06:04:40.555567] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:25.855 [2024-07-26 06:04:40.561997] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.855 [2024-07-26 06:04:40.562403] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.855 [2024-07-26 06:04:40.562880] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.855 [2024-07-26 06:04:40.564211] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.855 [2024-07-26 06:04:40.564676] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.855 [2024-07-26 06:04:40.564693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.855 [2024-07-26 06:04:40.565361] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.855 [2024-07-26 06:04:40.566670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.855 [2024-07-26 06:04:40.568218] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.855 [2024-07-26 06:04:40.569765] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.855 [2024-07-26 06:04:40.570096] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.855 [2024-07-26 06:04:40.570113] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.855 [2024-07-26 06:04:40.570127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:25.855 [2024-07-26 06:04:40.575272] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.855 [2024-07-26 06:04:40.575707] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.855 [2024-07-26 06:04:40.577093] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.855 [2024-07-26 06:04:40.577483] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.855 [2024-07-26 06:04:40.577875] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.855 [2024-07-26 06:04:40.577892] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.855 [2024-07-26 06:04:40.579198] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.855 [2024-07-26 06:04:40.580712] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.856 [2024-07-26 06:04:40.582253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.856 [2024-07-26 06:04:40.583347] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.856 [2024-07-26 06:04:40.583622] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.856 [2024-07-26 06:04:40.583642] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.856 [2024-07-26 06:04:40.583656] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:25.856 [2024-07-26 06:04:40.587711] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.856 [2024-07-26 06:04:40.589342] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.856 [2024-07-26 06:04:40.589736] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.856 [2024-07-26 06:04:40.590414] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.856 [2024-07-26 06:04:40.590691] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.856 [2024-07-26 06:04:40.590708] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.856 [2024-07-26 06:04:40.592367] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.856 [2024-07-26 06:04:40.593926] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.856 [2024-07-26 06:04:40.595207] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.856 [2024-07-26 06:04:40.596252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.856 [2024-07-26 06:04:40.596582] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.856 [2024-07-26 06:04:40.596600] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.856 [2024-07-26 06:04:40.596616] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:25.856 [2024-07-26 06:04:40.602179] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.856 [2024-07-26 06:04:40.602929] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.856 [2024-07-26 06:04:40.603322] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.856 [2024-07-26 06:04:40.603721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.856 [2024-07-26 06:04:40.604192] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.856 [2024-07-26 06:04:40.604214] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.856 [2024-07-26 06:04:40.605855] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.856 [2024-07-26 06:04:40.606254] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.856 [2024-07-26 06:04:40.606864] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.856 [2024-07-26 06:04:40.608043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.856 [2024-07-26 06:04:40.608525] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.856 [2024-07-26 06:04:40.608544] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.856 [2024-07-26 06:04:40.608559] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:25.856 [2024-07-26 06:04:40.613271] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.856 [2024-07-26 06:04:40.613680] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.856 [2024-07-26 06:04:40.614073] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.856 [2024-07-26 06:04:40.614469] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.856 [2024-07-26 06:04:40.614818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.856 [2024-07-26 06:04:40.614836] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.856 [2024-07-26 06:04:40.615766] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.856 [2024-07-26 06:04:40.616159] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.856 [2024-07-26 06:04:40.617437] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.856 [2024-07-26 06:04:40.617960] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.856 [2024-07-26 06:04:40.618409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.856 [2024-07-26 06:04:40.618427] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.856 [2024-07-26 06:04:40.618443] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:25.856 [2024-07-26 06:04:40.621864] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.856 [2024-07-26 06:04:40.622262] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.856 [2024-07-26 06:04:40.622675] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.856 [2024-07-26 06:04:40.623072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.856 [2024-07-26 06:04:40.623347] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.856 [2024-07-26 06:04:40.623363] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.856 [2024-07-26 06:04:40.623813] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.856 [2024-07-26 06:04:40.624204] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.856 [2024-07-26 06:04:40.625780] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.856 [2024-07-26 06:04:40.626174] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.856 [2024-07-26 06:04:40.626600] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.856 [2024-07-26 06:04:40.626616] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.856 [2024-07-26 06:04:40.626630] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:25.856 [2024-07-26 06:04:40.629611] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.856 [2024-07-26 06:04:40.630018] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.856 [2024-07-26 06:04:40.630435] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.856 [2024-07-26 06:04:40.631101] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.856 [2024-07-26 06:04:40.631377] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.856 [2024-07-26 06:04:40.631394] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.856 [2024-07-26 06:04:40.631807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.856 [2024-07-26 06:04:40.632461] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.856 [2024-07-26 06:04:40.633620] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.856 [2024-07-26 06:04:40.634016] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.856 [2024-07-26 06:04:40.634407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.857 [2024-07-26 06:04:40.634423] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.857 [2024-07-26 06:04:40.634438] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:25.857 [2024-07-26 06:04:40.637375] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.857 [2024-07-26 06:04:40.637786] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.857 [2024-07-26 06:04:40.638183] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.857 [2024-07-26 06:04:40.639314] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.857 [2024-07-26 06:04:40.639681] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.857 [2024-07-26 06:04:40.639698] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.857 [2024-07-26 06:04:40.640105] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.857 [2024-07-26 06:04:40.641268] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.857 [2024-07-26 06:04:40.641914] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.857 [2024-07-26 06:04:40.642307] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.857 [2024-07-26 06:04:40.642693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.857 [2024-07-26 06:04:40.642711] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.857 [2024-07-26 06:04:40.642725] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:25.857 [2024-07-26 06:04:40.645707] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.857 [2024-07-26 06:04:40.646116] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.857 [2024-07-26 06:04:40.646511] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.857 [2024-07-26 06:04:40.648117] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.857 [2024-07-26 06:04:40.648596] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.857 [2024-07-26 06:04:40.648614] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.857 [2024-07-26 06:04:40.649021] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.857 [2024-07-26 06:04:40.650580] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.857 [2024-07-26 06:04:40.650974] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.857 [2024-07-26 06:04:40.651367] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.857 [2024-07-26 06:04:40.651782] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.857 [2024-07-26 06:04:40.651800] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.857 [2024-07-26 06:04:40.651814] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:25.857 [2024-07-26 06:04:40.655489] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.857 [2024-07-26 06:04:40.655906] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.857 [2024-07-26 06:04:40.656646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.857 [2024-07-26 06:04:40.657713] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.857 [2024-07-26 06:04:40.658143] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.857 [2024-07-26 06:04:40.658162] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.857 [2024-07-26 06:04:40.658920] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.857 [2024-07-26 06:04:40.659972] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.857 [2024-07-26 06:04:40.660365] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.857 [2024-07-26 06:04:40.660759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.857 [2024-07-26 06:04:40.661142] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.857 [2024-07-26 06:04:40.661159] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.857 [2024-07-26 06:04:40.661174] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:25.857 [2024-07-26 06:04:40.666248] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.857 [2024-07-26 06:04:40.666654] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.857 [2024-07-26 06:04:40.668072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.857 [2024-07-26 06:04:40.668468] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.857 [2024-07-26 06:04:40.668915] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.857 [2024-07-26 06:04:40.668938] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.857 [2024-07-26 06:04:40.670374] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.857 [2024-07-26 06:04:40.670773] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.857 [2024-07-26 06:04:40.671165] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.857 [2024-07-26 06:04:40.671558] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.857 [2024-07-26 06:04:40.671923] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.857 [2024-07-26 06:04:40.671941] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:25.857 [2024-07-26 06:04:40.671955] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:25.857 [2024-07-26 06:04:40.678288] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:26.123 [2024-07-26 06:04:40.897203] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! (previous message repeated continuously from 06:04:40.678288 through 06:04:40.897203) 
00:36:26.123 [2024-07-26 06:04:40.902058] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.123 [2024-07-26 06:04:40.902110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.123 [2024-07-26 06:04:40.902152] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.123 [2024-07-26 06:04:40.902194] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.123 [2024-07-26 06:04:40.902616] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.123 [2024-07-26 06:04:40.902633] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.123 [2024-07-26 06:04:40.902690] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.123 [2024-07-26 06:04:40.902733] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.123 [2024-07-26 06:04:40.902775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.123 [2024-07-26 06:04:40.902817] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.123 [2024-07-26 06:04:40.903094] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.123 [2024-07-26 06:04:40.903110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.123 [2024-07-26 06:04:40.903125] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:26.123 [2024-07-26 06:04:40.906768] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.123 [2024-07-26 06:04:40.906817] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.123 [2024-07-26 06:04:40.906859] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.123 [2024-07-26 06:04:40.906907] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.123 [2024-07-26 06:04:40.907278] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.123 [2024-07-26 06:04:40.907294] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.123 [2024-07-26 06:04:40.907348] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.123 [2024-07-26 06:04:40.907389] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.123 [2024-07-26 06:04:40.907430] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.123 [2024-07-26 06:04:40.907471] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.123 [2024-07-26 06:04:40.907773] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.123 [2024-07-26 06:04:40.907790] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.123 [2024-07-26 06:04:40.907808] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:26.123 [2024-07-26 06:04:40.913094] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.124 [2024-07-26 06:04:40.913149] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.124 [2024-07-26 06:04:40.913190] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.124 [2024-07-26 06:04:40.913230] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.124 [2024-07-26 06:04:40.913502] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.124 [2024-07-26 06:04:40.913518] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.124 [2024-07-26 06:04:40.913577] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.124 [2024-07-26 06:04:40.913619] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.124 [2024-07-26 06:04:40.913665] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.124 [2024-07-26 06:04:40.913707] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.124 [2024-07-26 06:04:40.914133] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.124 [2024-07-26 06:04:40.914151] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.124 [2024-07-26 06:04:40.914169] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:26.124 [2024-07-26 06:04:40.918643] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.124 [2024-07-26 06:04:40.918701] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.124 [2024-07-26 06:04:40.918748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.124 [2024-07-26 06:04:40.918792] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.124 [2024-07-26 06:04:40.919060] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.124 [2024-07-26 06:04:40.919076] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.124 [2024-07-26 06:04:40.919140] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.124 [2024-07-26 06:04:40.919184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.124 [2024-07-26 06:04:40.919225] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.124 [2024-07-26 06:04:40.919265] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.124 [2024-07-26 06:04:40.919535] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.124 [2024-07-26 06:04:40.919551] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.124 [2024-07-26 06:04:40.919565] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:26.124 [2024-07-26 06:04:40.922895] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.124 [2024-07-26 06:04:40.922946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.124 [2024-07-26 06:04:40.922989] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.124 [2024-07-26 06:04:40.923036] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.124 [2024-07-26 06:04:40.923469] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.124 [2024-07-26 06:04:40.923485] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.124 [2024-07-26 06:04:40.923544] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.124 [2024-07-26 06:04:40.923585] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.124 [2024-07-26 06:04:40.923626] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.124 [2024-07-26 06:04:40.923671] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.124 [2024-07-26 06:04:40.923984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.124 [2024-07-26 06:04:40.924000] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.124 [2024-07-26 06:04:40.924014] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:26.124 [2024-07-26 06:04:40.928522] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.124 [2024-07-26 06:04:40.928572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.124 [2024-07-26 06:04:40.928613] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.124 [2024-07-26 06:04:40.928657] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.124 [2024-07-26 06:04:40.928930] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.124 [2024-07-26 06:04:40.928947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.124 [2024-07-26 06:04:40.929006] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.124 [2024-07-26 06:04:40.929047] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.124 [2024-07-26 06:04:40.929095] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.124 [2024-07-26 06:04:40.929136] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.124 [2024-07-26 06:04:40.929556] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.124 [2024-07-26 06:04:40.929572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.124 [2024-07-26 06:04:40.929587] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:26.124 [2024-07-26 06:04:40.933163] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.124 [2024-07-26 06:04:40.933213] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.124 [2024-07-26 06:04:40.933255] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.124 [2024-07-26 06:04:40.933295] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.124 [2024-07-26 06:04:40.933561] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.124 [2024-07-26 06:04:40.933577] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.124 [2024-07-26 06:04:40.933634] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.124 [2024-07-26 06:04:40.933681] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.124 [2024-07-26 06:04:40.933729] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.124 [2024-07-26 06:04:40.933779] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.124 [2024-07-26 06:04:40.934220] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.124 [2024-07-26 06:04:40.934236] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.124 [2024-07-26 06:04:40.934250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:26.124 [2024-07-26 06:04:40.938585] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.124 [2024-07-26 06:04:40.938635] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.124 [2024-07-26 06:04:40.938700] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.124 [2024-07-26 06:04:40.938743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.124 [2024-07-26 06:04:40.939212] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.124 [2024-07-26 06:04:40.939229] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.124 [2024-07-26 06:04:40.939281] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.124 [2024-07-26 06:04:40.939324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.124 [2024-07-26 06:04:40.939366] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.124 [2024-07-26 06:04:40.939409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.124 [2024-07-26 06:04:40.939774] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.124 [2024-07-26 06:04:40.939791] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.124 [2024-07-26 06:04:40.939805] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:26.124 [2024-07-26 06:04:40.943827] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.125 [2024-07-26 06:04:40.943886] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.125 [2024-07-26 06:04:40.943927] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.125 [2024-07-26 06:04:40.943968] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.125 [2024-07-26 06:04:40.944241] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.125 [2024-07-26 06:04:40.944257] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.125 [2024-07-26 06:04:40.944319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.125 [2024-07-26 06:04:40.944367] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.125 [2024-07-26 06:04:40.944413] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.125 [2024-07-26 06:04:40.944453] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.125 [2024-07-26 06:04:40.944729] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.125 [2024-07-26 06:04:40.944747] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.125 [2024-07-26 06:04:40.944763] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:26.125 [2024-07-26 06:04:40.947564] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.125 [2024-07-26 06:04:40.947617] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.125 [2024-07-26 06:04:40.947664] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.125 [2024-07-26 06:04:40.947705] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.125 [2024-07-26 06:04:40.947975] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.125 [2024-07-26 06:04:40.947991] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.125 [2024-07-26 06:04:40.948049] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.125 [2024-07-26 06:04:40.948091] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.125 [2024-07-26 06:04:40.948139] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.125 [2024-07-26 06:04:40.948184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.125 [2024-07-26 06:04:40.948456] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.125 [2024-07-26 06:04:40.948473] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.125 [2024-07-26 06:04:40.948487] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:26.125 [2024-07-26 06:04:40.953006] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.125 [2024-07-26 06:04:40.953060] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.125 [2024-07-26 06:04:40.953106] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.125 [2024-07-26 06:04:40.953148] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.125 [2024-07-26 06:04:40.953563] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.125 [2024-07-26 06:04:40.953580] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.125 [2024-07-26 06:04:40.953630] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.125 [2024-07-26 06:04:40.953680] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.125 [2024-07-26 06:04:40.953721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.125 [2024-07-26 06:04:40.953762] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.125 [2024-07-26 06:04:40.954184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.125 [2024-07-26 06:04:40.954202] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.125 [2024-07-26 06:04:40.954217] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:26.125 [2024-07-26 06:04:40.958497] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.125 [2024-07-26 06:04:40.958548] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.125 [2024-07-26 06:04:40.958600] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.125 [2024-07-26 06:04:40.958658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.125 [2024-07-26 06:04:40.958935] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.125 [2024-07-26 06:04:40.958951] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.125 [2024-07-26 06:04:40.959002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.125 [2024-07-26 06:04:40.959054] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.125 [2024-07-26 06:04:40.959095] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.125 [2024-07-26 06:04:40.959136] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.125 [2024-07-26 06:04:40.959403] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.125 [2024-07-26 06:04:40.959420] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.125 [2024-07-26 06:04:40.959434] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:26.125 [2024-07-26 06:04:40.963206] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.125 [2024-07-26 06:04:40.963267] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.125 [2024-07-26 06:04:40.963310] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.125 [2024-07-26 06:04:40.963359] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.125 [2024-07-26 06:04:40.963632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.125 [2024-07-26 06:04:40.963656] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.125 [2024-07-26 06:04:40.963718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.125 [2024-07-26 06:04:40.963765] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.125 [2024-07-26 06:04:40.963806] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.125 [2024-07-26 06:04:40.963846] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.125 [2024-07-26 06:04:40.964116] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.125 [2024-07-26 06:04:40.964133] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.125 [2024-07-26 06:04:40.964147] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:26.125 [2024-07-26 06:04:40.968519] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:26.391 [... identical *ERROR* line repeated continuously from 2024-07-26 06:04:40.968519 through 2024-07-26 06:04:41.101092 ...] 
00:36:26.391 [2024-07-26 06:04:41.104522] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.391 [2024-07-26 06:04:41.104929] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.391 [2024-07-26 06:04:41.105319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.391 [2024-07-26 06:04:41.105718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.391 [2024-07-26 06:04:41.106071] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.391 [2024-07-26 06:04:41.106088] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.391 [2024-07-26 06:04:41.106491] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.391 [2024-07-26 06:04:41.106887] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.391 [2024-07-26 06:04:41.107277] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.391 [2024-07-26 06:04:41.107673] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.391 [2024-07-26 06:04:41.108070] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.391 [2024-07-26 06:04:41.108091] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.391 [2024-07-26 06:04:41.108107] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:26.391 [2024-07-26 06:04:41.111793] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.391 [2024-07-26 06:04:41.112201] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.391 [2024-07-26 06:04:41.112591] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.391 [2024-07-26 06:04:41.112987] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.391 [2024-07-26 06:04:41.113406] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.391 [2024-07-26 06:04:41.113424] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.391 [2024-07-26 06:04:41.113840] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.391 [2024-07-26 06:04:41.114238] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.391 [2024-07-26 06:04:41.114629] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.391 [2024-07-26 06:04:41.115027] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.391 [2024-07-26 06:04:41.115458] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.391 [2024-07-26 06:04:41.115478] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.392 [2024-07-26 06:04:41.115493] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:26.392 [2024-07-26 06:04:41.118992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.392 [2024-07-26 06:04:41.119396] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.392 [2024-07-26 06:04:41.119795] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.392 [2024-07-26 06:04:41.120186] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.392 [2024-07-26 06:04:41.120664] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.392 [2024-07-26 06:04:41.120684] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.392 [2024-07-26 06:04:41.121084] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.392 [2024-07-26 06:04:41.121475] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.392 [2024-07-26 06:04:41.121881] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.392 [2024-07-26 06:04:41.122274] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.392 [2024-07-26 06:04:41.122724] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.392 [2024-07-26 06:04:41.122743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.392 [2024-07-26 06:04:41.122757] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:26.392 [2024-07-26 06:04:41.126291] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.392 [2024-07-26 06:04:41.126700] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.392 [2024-07-26 06:04:41.127105] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.392 [2024-07-26 06:04:41.127501] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.392 [2024-07-26 06:04:41.127965] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.392 [2024-07-26 06:04:41.127984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.392 [2024-07-26 06:04:41.128384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.392 [2024-07-26 06:04:41.128781] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.392 [2024-07-26 06:04:41.129174] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.392 [2024-07-26 06:04:41.129566] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.392 [2024-07-26 06:04:41.129985] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.392 [2024-07-26 06:04:41.130004] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.392 [2024-07-26 06:04:41.130019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:26.392 [2024-07-26 06:04:41.133392] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.392 [2024-07-26 06:04:41.133800] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.392 [2024-07-26 06:04:41.134191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.392 [2024-07-26 06:04:41.134603] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.392 [2024-07-26 06:04:41.134979] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.392 [2024-07-26 06:04:41.134997] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.392 [2024-07-26 06:04:41.135402] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.392 [2024-07-26 06:04:41.135799] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.392 [2024-07-26 06:04:41.136202] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.392 [2024-07-26 06:04:41.136595] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.392 [2024-07-26 06:04:41.136965] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.392 [2024-07-26 06:04:41.136984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.392 [2024-07-26 06:04:41.136999] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:26.392 [2024-07-26 06:04:41.140485] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.392 [2024-07-26 06:04:41.140900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.392 [2024-07-26 06:04:41.141292] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.392 [2024-07-26 06:04:41.141689] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.392 [2024-07-26 06:04:41.142069] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.392 [2024-07-26 06:04:41.142086] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.392 [2024-07-26 06:04:41.142490] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.392 [2024-07-26 06:04:41.142895] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.392 [2024-07-26 06:04:41.143285] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.392 [2024-07-26 06:04:41.143683] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.392 [2024-07-26 06:04:41.144112] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.392 [2024-07-26 06:04:41.144130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.392 [2024-07-26 06:04:41.144145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:26.392 [2024-07-26 06:04:41.148547] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.392 [2024-07-26 06:04:41.148959] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.392 [2024-07-26 06:04:41.149358] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.392 [2024-07-26 06:04:41.149763] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.392 [2024-07-26 06:04:41.150252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.392 [2024-07-26 06:04:41.150270] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.392 [2024-07-26 06:04:41.150682] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.392 [2024-07-26 06:04:41.151075] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.392 [2024-07-26 06:04:41.151469] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.392 [2024-07-26 06:04:41.151876] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.392 [2024-07-26 06:04:41.152272] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.392 [2024-07-26 06:04:41.152289] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.392 [2024-07-26 06:04:41.152303] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:26.392 [2024-07-26 06:04:41.155715] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.392 [2024-07-26 06:04:41.156115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.392 [2024-07-26 06:04:41.156505] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.392 [2024-07-26 06:04:41.156900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.392 [2024-07-26 06:04:41.157250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.393 [2024-07-26 06:04:41.157267] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.393 [2024-07-26 06:04:41.157679] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.393 [2024-07-26 06:04:41.158070] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.393 [2024-07-26 06:04:41.158470] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.393 [2024-07-26 06:04:41.158889] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.393 [2024-07-26 06:04:41.159162] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.393 [2024-07-26 06:04:41.159178] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.393 [2024-07-26 06:04:41.159197] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:26.393 [2024-07-26 06:04:41.164779] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.393 [2024-07-26 06:04:41.165956] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.393 [2024-07-26 06:04:41.166348] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.393 [2024-07-26 06:04:41.166744] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.393 [2024-07-26 06:04:41.167201] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.393 [2024-07-26 06:04:41.167219] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.393 [2024-07-26 06:04:41.167616] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.393 [2024-07-26 06:04:41.169273] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.393 [2024-07-26 06:04:41.170910] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.393 [2024-07-26 06:04:41.172464] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.393 [2024-07-26 06:04:41.172746] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.393 [2024-07-26 06:04:41.172764] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.393 [2024-07-26 06:04:41.172778] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:26.393 [2024-07-26 06:04:41.175757] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.393 [2024-07-26 06:04:41.176157] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.393 [2024-07-26 06:04:41.176547] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.393 [2024-07-26 06:04:41.176942] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.393 [2024-07-26 06:04:41.177396] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.393 [2024-07-26 06:04:41.177416] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.393 [2024-07-26 06:04:41.178974] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.393 [2024-07-26 06:04:41.180435] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.393 [2024-07-26 06:04:41.182039] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.393 [2024-07-26 06:04:41.183587] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.393 [2024-07-26 06:04:41.183948] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.393 [2024-07-26 06:04:41.183965] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.393 [2024-07-26 06:04:41.183979] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:26.393 [2024-07-26 06:04:41.185955] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.393 [2024-07-26 06:04:41.186349] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.393 [2024-07-26 06:04:41.186745] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.393 [2024-07-26 06:04:41.187139] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.393 [2024-07-26 06:04:41.187409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.393 [2024-07-26 06:04:41.187426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.393 [2024-07-26 06:04:41.188735] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.393 [2024-07-26 06:04:41.190286] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.393 [2024-07-26 06:04:41.191901] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.393 [2024-07-26 06:04:41.192874] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.393 [2024-07-26 06:04:41.193193] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.393 [2024-07-26 06:04:41.193209] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.393 [2024-07-26 06:04:41.193223] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:26.393 [2024-07-26 06:04:41.195324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.393 [2024-07-26 06:04:41.195725] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.393 [2024-07-26 06:04:41.196114] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.393 [2024-07-26 06:04:41.197741] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.393 [2024-07-26 06:04:41.198017] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.393 [2024-07-26 06:04:41.198033] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.393 [2024-07-26 06:04:41.199596] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.393 [2024-07-26 06:04:41.201140] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.393 [2024-07-26 06:04:41.201922] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.393 [2024-07-26 06:04:41.203215] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.393 [2024-07-26 06:04:41.203486] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.393 [2024-07-26 06:04:41.203503] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.393 [2024-07-26 06:04:41.203518] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:26.393 [2024-07-26 06:04:41.205892] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.393 [2024-07-26 06:04:41.206286] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.394 [2024-07-26 06:04:41.207827] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.394 [2024-07-26 06:04:41.209304] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.394 [2024-07-26 06:04:41.209580] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.394 [2024-07-26 06:04:41.209596] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.394 [2024-07-26 06:04:41.211167] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.394 [2024-07-26 06:04:41.211899] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.394 [2024-07-26 06:04:41.213205] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.394 [2024-07-26 06:04:41.214758] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.394 [2024-07-26 06:04:41.215030] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.394 [2024-07-26 06:04:41.215046] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.394 [2024-07-26 06:04:41.215061] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:26.394 [2024-07-26 06:04:41.217604] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.662 [message repeated through 2024-07-26 06:04:41.382167]
00:36:26.662 [2024-07-26 06:04:41.383808] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.662 [2024-07-26 06:04:41.383856] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.662 [2024-07-26 06:04:41.383897] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.662 [2024-07-26 06:04:41.383939] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.662 [2024-07-26 06:04:41.384406] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.662 [2024-07-26 06:04:41.384423] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.662 [2024-07-26 06:04:41.384480] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.662 [2024-07-26 06:04:41.384524] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.662 [2024-07-26 06:04:41.384568] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.662 [2024-07-26 06:04:41.384611] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.662 [2024-07-26 06:04:41.385044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.662 [2024-07-26 06:04:41.385061] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.662 [2024-07-26 06:04:41.385075] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:26.662 [2024-07-26 06:04:41.386993] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.662 [2024-07-26 06:04:41.387038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.662 [2024-07-26 06:04:41.387079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.662 [2024-07-26 06:04:41.387120] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.662 [2024-07-26 06:04:41.387391] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.662 [2024-07-26 06:04:41.387407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.662 [2024-07-26 06:04:41.387463] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.662 [2024-07-26 06:04:41.387504] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.662 [2024-07-26 06:04:41.387554] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.662 [2024-07-26 06:04:41.387599] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.662 [2024-07-26 06:04:41.387997] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.662 [2024-07-26 06:04:41.388014] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.662 [2024-07-26 06:04:41.388028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:26.662 [2024-07-26 06:04:41.389622] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.662 [2024-07-26 06:04:41.389679] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.662 [2024-07-26 06:04:41.389721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.662 [2024-07-26 06:04:41.389783] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.662 [2024-07-26 06:04:41.390269] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.662 [2024-07-26 06:04:41.390286] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.662 [2024-07-26 06:04:41.390337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.662 [2024-07-26 06:04:41.390381] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.662 [2024-07-26 06:04:41.390423] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.662 [2024-07-26 06:04:41.390466] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.662 [2024-07-26 06:04:41.390878] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.662 [2024-07-26 06:04:41.390895] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.662 [2024-07-26 06:04:41.390910] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:26.662 [2024-07-26 06:04:41.392875] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.662 [2024-07-26 06:04:41.392920] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.662 [2024-07-26 06:04:41.392966] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.662 [2024-07-26 06:04:41.393008] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.662 [2024-07-26 06:04:41.393278] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.662 [2024-07-26 06:04:41.393294] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.662 [2024-07-26 06:04:41.393350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.662 [2024-07-26 06:04:41.393392] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.662 [2024-07-26 06:04:41.393433] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.662 [2024-07-26 06:04:41.393475] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.662 [2024-07-26 06:04:41.393878] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.662 [2024-07-26 06:04:41.393894] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.662 [2024-07-26 06:04:41.393908] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:26.662 [2024-07-26 06:04:41.395447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.662 [2024-07-26 06:04:41.395500] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.662 [2024-07-26 06:04:41.395543] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.662 [2024-07-26 06:04:41.395586] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.662 [2024-07-26 06:04:41.396028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.662 [2024-07-26 06:04:41.396045] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.662 [2024-07-26 06:04:41.396102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.662 [2024-07-26 06:04:41.396147] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.662 [2024-07-26 06:04:41.396190] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.663 [2024-07-26 06:04:41.396233] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.663 [2024-07-26 06:04:41.396701] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.663 [2024-07-26 06:04:41.396719] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.663 [2024-07-26 06:04:41.396734] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:26.663 [2024-07-26 06:04:41.398985] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.663 [2024-07-26 06:04:41.399053] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.663 [2024-07-26 06:04:41.399097] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.663 [2024-07-26 06:04:41.399139] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.663 [2024-07-26 06:04:41.399409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.663 [2024-07-26 06:04:41.399424] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.663 [2024-07-26 06:04:41.399480] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.663 [2024-07-26 06:04:41.399522] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.663 [2024-07-26 06:04:41.399564] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.663 [2024-07-26 06:04:41.399606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.663 [2024-07-26 06:04:41.399938] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.663 [2024-07-26 06:04:41.399955] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.663 [2024-07-26 06:04:41.399970] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:26.663 [2024-07-26 06:04:41.401519] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.663 [2024-07-26 06:04:41.401565] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.663 [2024-07-26 06:04:41.401607] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.663 [2024-07-26 06:04:41.401663] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.663 [2024-07-26 06:04:41.402075] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.663 [2024-07-26 06:04:41.402091] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.663 [2024-07-26 06:04:41.402142] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.663 [2024-07-26 06:04:41.402184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.663 [2024-07-26 06:04:41.402225] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.663 [2024-07-26 06:04:41.402270] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.663 [2024-07-26 06:04:41.402730] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.663 [2024-07-26 06:04:41.402748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.663 [2024-07-26 06:04:41.402763] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:26.663 [2024-07-26 06:04:41.404886] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.663 [2024-07-26 06:04:41.404931] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.663 [2024-07-26 06:04:41.404976] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.663 [2024-07-26 06:04:41.405026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.663 [2024-07-26 06:04:41.405298] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.663 [2024-07-26 06:04:41.405314] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.663 [2024-07-26 06:04:41.405367] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.663 [2024-07-26 06:04:41.405419] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.663 [2024-07-26 06:04:41.405463] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.663 [2024-07-26 06:04:41.405505] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.663 [2024-07-26 06:04:41.405883] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.663 [2024-07-26 06:04:41.405899] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.663 [2024-07-26 06:04:41.405913] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:26.663 [2024-07-26 06:04:41.407727] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.663 [2024-07-26 06:04:41.407775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.663 [2024-07-26 06:04:41.407818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.663 [2024-07-26 06:04:41.407860] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.663 [2024-07-26 06:04:41.408285] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.663 [2024-07-26 06:04:41.408302] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.663 [2024-07-26 06:04:41.408356] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.663 [2024-07-26 06:04:41.408399] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.663 [2024-07-26 06:04:41.408440] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.663 [2024-07-26 06:04:41.408482] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.663 [2024-07-26 06:04:41.408918] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.663 [2024-07-26 06:04:41.408936] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.663 [2024-07-26 06:04:41.408953] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:26.663 [2024-07-26 06:04:41.410584] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.663 [2024-07-26 06:04:41.410629] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.663 [2024-07-26 06:04:41.410683] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.663 [2024-07-26 06:04:41.410725] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.663 [2024-07-26 06:04:41.411098] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.663 [2024-07-26 06:04:41.411116] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.663 [2024-07-26 06:04:41.411172] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.663 [2024-07-26 06:04:41.411218] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.663 [2024-07-26 06:04:41.411260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.663 [2024-07-26 06:04:41.411301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.663 [2024-07-26 06:04:41.411602] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.663 [2024-07-26 06:04:41.411618] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.663 [2024-07-26 06:04:41.411633] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:26.664 [2024-07-26 06:04:41.413363] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.664 [2024-07-26 06:04:41.413408] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.664 [2024-07-26 06:04:41.413450] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.664 [2024-07-26 06:04:41.413494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.664 [2024-07-26 06:04:41.413945] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.664 [2024-07-26 06:04:41.413962] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.664 [2024-07-26 06:04:41.414013] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.664 [2024-07-26 06:04:41.414056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.664 [2024-07-26 06:04:41.414103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.664 [2024-07-26 06:04:41.414145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.664 [2024-07-26 06:04:41.414625] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.664 [2024-07-26 06:04:41.414649] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.664 [2024-07-26 06:04:41.414665] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:26.664 [2024-07-26 06:04:41.416373] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.664 [2024-07-26 06:04:41.416421] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.664 [2024-07-26 06:04:41.416465] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.664 [2024-07-26 06:04:41.416507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.664 [2024-07-26 06:04:41.416801] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.664 [2024-07-26 06:04:41.416818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.664 [2024-07-26 06:04:41.416876] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.664 [2024-07-26 06:04:41.416924] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.664 [2024-07-26 06:04:41.416966] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.664 [2024-07-26 06:04:41.417011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.664 [2024-07-26 06:04:41.417282] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.664 [2024-07-26 06:04:41.417298] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.664 [2024-07-26 06:04:41.417312] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:26.664 [2024-07-26 06:04:41.418968] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.664 [2024-07-26 06:04:41.419014] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.664 [2024-07-26 06:04:41.419056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.664 [2024-07-26 06:04:41.419097] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.664 [2024-07-26 06:04:41.419527] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.664 [2024-07-26 06:04:41.419544] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.664 [2024-07-26 06:04:41.419597] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.664 [2024-07-26 06:04:41.419647] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.664 [2024-07-26 06:04:41.419691] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.664 [2024-07-26 06:04:41.419734] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.664 [2024-07-26 06:04:41.420221] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.664 [2024-07-26 06:04:41.420238] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.664 [2024-07-26 06:04:41.420253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:26.664 [2024-07-26 06:04:41.421995] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:26.668 [2024-07-26 06:04:41.525752] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.668 [2024-07-26 06:04:41.527052] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.668 [2024-07-26 06:04:41.528597] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.668 [2024-07-26 06:04:41.530151] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.668 [2024-07-26 06:04:41.530424] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.668 [2024-07-26 06:04:41.530440] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.668 [2024-07-26 06:04:41.530852] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.668 [2024-07-26 06:04:41.531245] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.668 [2024-07-26 06:04:41.531632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.668 [2024-07-26 06:04:41.532028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.668 [2024-07-26 06:04:41.532298] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.668 [2024-07-26 06:04:41.532314] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.668 [2024-07-26 06:04:41.532328] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:26.668 [2024-07-26 06:04:41.535513] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.668 [2024-07-26 06:04:41.537186] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.668 [2024-07-26 06:04:41.538729] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.668 [2024-07-26 06:04:41.540041] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.668 [2024-07-26 06:04:41.540483] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.668 [2024-07-26 06:04:41.540502] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.668 [2024-07-26 06:04:41.540913] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.669 [2024-07-26 06:04:41.541304] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.669 [2024-07-26 06:04:41.541699] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.669 [2024-07-26 06:04:41.543114] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.669 [2024-07-26 06:04:41.543418] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.669 [2024-07-26 06:04:41.543434] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.669 [2024-07-26 06:04:41.543448] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:26.669 [2024-07-26 06:04:41.546366] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.669 [2024-07-26 06:04:41.547921] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.669 [2024-07-26 06:04:41.549562] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.669 [2024-07-26 06:04:41.549959] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.669 [2024-07-26 06:04:41.550402] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.669 [2024-07-26 06:04:41.550419] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.669 [2024-07-26 06:04:41.550821] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.669 [2024-07-26 06:04:41.551211] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.669 [2024-07-26 06:04:41.552435] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.669 [2024-07-26 06:04:41.553738] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.669 [2024-07-26 06:04:41.554009] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.669 [2024-07-26 06:04:41.554026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.669 [2024-07-26 06:04:41.554040] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:26.669 [2024-07-26 06:04:41.557187] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.669 [2024-07-26 06:04:41.558733] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.669 [2024-07-26 06:04:41.559130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.669 [2024-07-26 06:04:41.559521] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.669 [2024-07-26 06:04:41.559929] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.669 [2024-07-26 06:04:41.559951] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.669 [2024-07-26 06:04:41.560353] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.669 [2024-07-26 06:04:41.561085] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.669 [2024-07-26 06:04:41.562380] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.935 [2024-07-26 06:04:41.563912] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.935 [2024-07-26 06:04:41.564184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.935 [2024-07-26 06:04:41.564200] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.935 [2024-07-26 06:04:41.564214] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:26.935 [2024-07-26 06:04:41.567424] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.935 [2024-07-26 06:04:41.568285] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.935 [2024-07-26 06:04:41.568679] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.935 [2024-07-26 06:04:41.569069] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.935 [2024-07-26 06:04:41.569582] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.935 [2024-07-26 06:04:41.569599] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.935 [2024-07-26 06:04:41.570073] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.935 [2024-07-26 06:04:41.571403] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.935 [2024-07-26 06:04:41.572947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.935 [2024-07-26 06:04:41.574488] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.935 [2024-07-26 06:04:41.574766] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.935 [2024-07-26 06:04:41.574783] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.935 [2024-07-26 06:04:41.574798] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:26.935 [2024-07-26 06:04:41.577582] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.935 [2024-07-26 06:04:41.577989] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.935 [2024-07-26 06:04:41.578380] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.935 [2024-07-26 06:04:41.578777] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.935 [2024-07-26 06:04:41.579229] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.935 [2024-07-26 06:04:41.579247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.936 [2024-07-26 06:04:41.580709] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.936 [2024-07-26 06:04:41.582076] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.936 [2024-07-26 06:04:41.583618] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.936 [2024-07-26 06:04:41.585274] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.936 [2024-07-26 06:04:41.585623] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.936 [2024-07-26 06:04:41.585645] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.936 [2024-07-26 06:04:41.585660] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:26.936 [2024-07-26 06:04:41.587528] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.936 [2024-07-26 06:04:41.587925] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.936 [2024-07-26 06:04:41.588315] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.936 [2024-07-26 06:04:41.588708] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.936 [2024-07-26 06:04:41.588986] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.936 [2024-07-26 06:04:41.589001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.936 [2024-07-26 06:04:41.590317] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.936 [2024-07-26 06:04:41.591857] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.936 [2024-07-26 06:04:41.593383] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.936 [2024-07-26 06:04:41.594098] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.936 [2024-07-26 06:04:41.594373] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.936 [2024-07-26 06:04:41.594389] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.936 [2024-07-26 06:04:41.594403] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:26.936 [2024-07-26 06:04:41.596409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.936 [2024-07-26 06:04:41.596809] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.936 [2024-07-26 06:04:41.597199] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.936 [2024-07-26 06:04:41.598185] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.936 [2024-07-26 06:04:41.598494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.936 [2024-07-26 06:04:41.598511] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.936 [2024-07-26 06:04:41.600078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.936 [2024-07-26 06:04:41.601620] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.936 [2024-07-26 06:04:41.602633] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.936 [2024-07-26 06:04:41.604278] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.936 [2024-07-26 06:04:41.604559] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.936 [2024-07-26 06:04:41.604575] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.936 [2024-07-26 06:04:41.604589] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:26.936 [2024-07-26 06:04:41.606854] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.936 [2024-07-26 06:04:41.607253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.936 [2024-07-26 06:04:41.608053] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.936 [2024-07-26 06:04:41.609347] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.936 [2024-07-26 06:04:41.609618] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.936 [2024-07-26 06:04:41.609634] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.936 [2024-07-26 06:04:41.611206] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.936 [2024-07-26 06:04:41.612406] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.936 [2024-07-26 06:04:41.613891] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.936 [2024-07-26 06:04:41.615283] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.936 [2024-07-26 06:04:41.615558] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.936 [2024-07-26 06:04:41.615574] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.936 [2024-07-26 06:04:41.615588] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:26.936 [2024-07-26 06:04:41.618180] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.936 [2024-07-26 06:04:41.618708] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.936 [2024-07-26 06:04:41.620002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.936 [2024-07-26 06:04:41.621545] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.936 [2024-07-26 06:04:41.621827] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.936 [2024-07-26 06:04:41.621845] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.936 [2024-07-26 06:04:41.623319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.936 [2024-07-26 06:04:41.624544] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.936 [2024-07-26 06:04:41.625849] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.936 [2024-07-26 06:04:41.627397] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.936 [2024-07-26 06:04:41.627679] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.936 [2024-07-26 06:04:41.627696] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.936 [2024-07-26 06:04:41.627711] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:26.936 [2024-07-26 06:04:41.630506] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.936 [2024-07-26 06:04:41.631819] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.936 [2024-07-26 06:04:41.633357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.936 [2024-07-26 06:04:41.634909] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.936 [2024-07-26 06:04:41.635182] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.936 [2024-07-26 06:04:41.635199] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.936 [2024-07-26 06:04:41.636409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.936 [2024-07-26 06:04:41.637717] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.937 [2024-07-26 06:04:41.639247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.937 [2024-07-26 06:04:41.640786] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.937 [2024-07-26 06:04:41.641187] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.937 [2024-07-26 06:04:41.641204] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.937 [2024-07-26 06:04:41.641218] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:26.937 [2024-07-26 06:04:41.645344] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.937 [2024-07-26 06:04:41.646915] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.937 [2024-07-26 06:04:41.648470] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.937 [2024-07-26 06:04:41.649732] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.937 [2024-07-26 06:04:41.650049] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.937 [2024-07-26 06:04:41.650066] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.937 [2024-07-26 06:04:41.651376] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.937 [2024-07-26 06:04:41.652932] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.937 [2024-07-26 06:04:41.654468] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.937 [2024-07-26 06:04:41.655035] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.937 [2024-07-26 06:04:41.655497] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.937 [2024-07-26 06:04:41.655514] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.937 [2024-07-26 06:04:41.655529] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:26.937 [2024-07-26 06:04:41.659292] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.937 [2024-07-26 06:04:41.660844] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.937 [2024-07-26 06:04:41.662151] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.937 [2024-07-26 06:04:41.663498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.937 [2024-07-26 06:04:41.663814] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.937 [2024-07-26 06:04:41.663830] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.937 [2024-07-26 06:04:41.665396] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.937 [2024-07-26 06:04:41.666946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.937 [2024-07-26 06:04:41.667580] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.937 [2024-07-26 06:04:41.667972] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.937 [2024-07-26 06:04:41.668430] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.937 [2024-07-26 06:04:41.668446] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.937 [2024-07-26 06:04:41.668460] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:26.937 [2024-07-26 06:04:41.672112] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:26.941 [2024-07-26 06:04:41.799059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! (message repeated for every entry between these timestamps)
00:36:26.941 [2024-07-26 06:04:41.800691] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.941 [2024-07-26 06:04:41.800740] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.941 [2024-07-26 06:04:41.800781] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.941 [2024-07-26 06:04:41.800821] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.941 [2024-07-26 06:04:41.801092] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.941 [2024-07-26 06:04:41.801107] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.941 [2024-07-26 06:04:41.801166] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.941 [2024-07-26 06:04:41.801208] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.941 [2024-07-26 06:04:41.801249] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.941 [2024-07-26 06:04:41.801290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.941 [2024-07-26 06:04:41.801557] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.941 [2024-07-26 06:04:41.801573] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.941 [2024-07-26 06:04:41.801587] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:26.941 [2024-07-26 06:04:41.803906] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.941 [2024-07-26 06:04:41.803954] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.941 [2024-07-26 06:04:41.803995] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.941 [2024-07-26 06:04:41.804036] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.941 [2024-07-26 06:04:41.804484] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.941 [2024-07-26 06:04:41.804501] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.941 [2024-07-26 06:04:41.804555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.941 [2024-07-26 06:04:41.804597] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.941 [2024-07-26 06:04:41.804643] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.941 [2024-07-26 06:04:41.804685] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.941 [2024-07-26 06:04:41.805024] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.941 [2024-07-26 06:04:41.805040] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.941 [2024-07-26 06:04:41.805054] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:26.941 [2024-07-26 06:04:41.806654] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.941 [2024-07-26 06:04:41.806698] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.941 [2024-07-26 06:04:41.806739] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.941 [2024-07-26 06:04:41.806780] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.941 [2024-07-26 06:04:41.807089] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.941 [2024-07-26 06:04:41.807105] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.941 [2024-07-26 06:04:41.807161] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.941 [2024-07-26 06:04:41.807202] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.941 [2024-07-26 06:04:41.807243] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.941 [2024-07-26 06:04:41.807285] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.941 [2024-07-26 06:04:41.807554] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.941 [2024-07-26 06:04:41.807570] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.941 [2024-07-26 06:04:41.807584] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:26.941 [2024-07-26 06:04:41.809752] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.941 [2024-07-26 06:04:41.809796] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.941 [2024-07-26 06:04:41.809841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.941 [2024-07-26 06:04:41.809883] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.941 [2024-07-26 06:04:41.810311] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.941 [2024-07-26 06:04:41.810328] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.942 [2024-07-26 06:04:41.810376] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.942 [2024-07-26 06:04:41.810419] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.942 [2024-07-26 06:04:41.810460] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.942 [2024-07-26 06:04:41.810502] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.942 [2024-07-26 06:04:41.810775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.942 [2024-07-26 06:04:41.810792] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.942 [2024-07-26 06:04:41.810806] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:26.942 [2024-07-26 06:04:41.812424] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.942 [2024-07-26 06:04:41.812469] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.942 [2024-07-26 06:04:41.812511] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.942 [2024-07-26 06:04:41.812574] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.942 [2024-07-26 06:04:41.812849] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.942 [2024-07-26 06:04:41.812866] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.942 [2024-07-26 06:04:41.812923] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.942 [2024-07-26 06:04:41.812971] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.942 [2024-07-26 06:04:41.813033] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.942 [2024-07-26 06:04:41.813077] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.942 [2024-07-26 06:04:41.813351] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.942 [2024-07-26 06:04:41.813367] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.942 [2024-07-26 06:04:41.813382] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:26.942 [2024-07-26 06:04:41.815335] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.942 [2024-07-26 06:04:41.815381] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.942 [2024-07-26 06:04:41.815423] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.942 [2024-07-26 06:04:41.815465] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.942 [2024-07-26 06:04:41.815832] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.942 [2024-07-26 06:04:41.815849] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.942 [2024-07-26 06:04:41.815909] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.942 [2024-07-26 06:04:41.815952] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.942 [2024-07-26 06:04:41.815993] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.942 [2024-07-26 06:04:41.816034] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.942 [2024-07-26 06:04:41.816474] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.942 [2024-07-26 06:04:41.816491] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.942 [2024-07-26 06:04:41.816507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:26.942 [2024-07-26 06:04:41.818821] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.942 [2024-07-26 06:04:41.818880] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.942 [2024-07-26 06:04:41.818923] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.942 [2024-07-26 06:04:41.818964] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.942 [2024-07-26 06:04:41.819435] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.942 [2024-07-26 06:04:41.819451] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.942 [2024-07-26 06:04:41.819503] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.942 [2024-07-26 06:04:41.819546] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.942 [2024-07-26 06:04:41.819588] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.942 [2024-07-26 06:04:41.819630] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.942 [2024-07-26 06:04:41.820021] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.942 [2024-07-26 06:04:41.820038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.942 [2024-07-26 06:04:41.820052] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:26.942 [2024-07-26 06:04:41.822354] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.942 [2024-07-26 06:04:41.822399] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.942 [2024-07-26 06:04:41.822443] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.942 [2024-07-26 06:04:41.822486] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.942 [2024-07-26 06:04:41.822931] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.942 [2024-07-26 06:04:41.822949] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.942 [2024-07-26 06:04:41.822999] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.942 [2024-07-26 06:04:41.823042] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.942 [2024-07-26 06:04:41.823085] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.942 [2024-07-26 06:04:41.823137] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.942 [2024-07-26 06:04:41.823512] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.942 [2024-07-26 06:04:41.823529] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.942 [2024-07-26 06:04:41.823544] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:26.942 [2024-07-26 06:04:41.825885] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.942 [2024-07-26 06:04:41.825930] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.942 [2024-07-26 06:04:41.825972] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.942 [2024-07-26 06:04:41.826016] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.942 [2024-07-26 06:04:41.826443] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.942 [2024-07-26 06:04:41.826459] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.943 [2024-07-26 06:04:41.826539] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.943 [2024-07-26 06:04:41.826593] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.943 [2024-07-26 06:04:41.826636] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.943 [2024-07-26 06:04:41.826684] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.943 [2024-07-26 06:04:41.827024] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.943 [2024-07-26 06:04:41.827041] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.943 [2024-07-26 06:04:41.827055] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:26.943 [2024-07-26 06:04:41.829493] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.943 [2024-07-26 06:04:41.829539] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.943 [2024-07-26 06:04:41.829581] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.943 [2024-07-26 06:04:41.829623] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.943 [2024-07-26 06:04:41.830036] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.943 [2024-07-26 06:04:41.830054] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.943 [2024-07-26 06:04:41.830115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.943 [2024-07-26 06:04:41.830169] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.943 [2024-07-26 06:04:41.830212] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.943 [2024-07-26 06:04:41.830281] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.943 [2024-07-26 06:04:41.830780] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.943 [2024-07-26 06:04:41.830798] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.943 [2024-07-26 06:04:41.830812] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:26.943 [2024-07-26 06:04:41.833185] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.943 [2024-07-26 06:04:41.833243] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.943 [2024-07-26 06:04:41.833285] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.943 [2024-07-26 06:04:41.833327] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.943 [2024-07-26 06:04:41.833716] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.943 [2024-07-26 06:04:41.833733] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.943 [2024-07-26 06:04:41.833793] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.943 [2024-07-26 06:04:41.833835] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.943 [2024-07-26 06:04:41.833877] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.943 [2024-07-26 06:04:41.833918] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.943 [2024-07-26 06:04:41.834363] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.943 [2024-07-26 06:04:41.834380] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:26.943 [2024-07-26 06:04:41.834394] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:27.241 [2024-07-26 06:04:41.836690] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:27.241 [2024-07-26 06:04:41.836739] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:27.241 [2024-07-26 06:04:41.836810] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:27.241 [2024-07-26 06:04:41.836854] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:27.241 [2024-07-26 06:04:41.837217] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:27.241 [2024-07-26 06:04:41.837234] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:27.241 [2024-07-26 06:04:41.837296] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:27.241 [2024-07-26 06:04:41.837350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:27.241 [2024-07-26 06:04:41.837392] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:27.241 [2024-07-26 06:04:41.837437] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:27.241 [2024-07-26 06:04:41.837932] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:27.241 [2024-07-26 06:04:41.837952] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:27.241 [2024-07-26 06:04:41.837967] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:27.241 [2024-07-26 06:04:41.840316] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:27.241 [2024-07-26 06:04:41.840376] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:27.241 [2024-07-26 06:04:41.840434] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:27.241 [2024-07-26 06:04:41.840477] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:27.241 [2024-07-26 06:04:41.840895] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:27.241 [2024-07-26 06:04:41.840914] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:27.241 [2024-07-26 06:04:41.840979] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:27.241 [2024-07-26 06:04:41.841021] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:27.241 [2024-07-26 06:04:41.841062] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:27.241 [2024-07-26 06:04:41.841104] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:27.241 [2024-07-26 06:04:41.841541] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:27.241 [2024-07-26 06:04:41.841558] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:27.241 [2024-07-26 06:04:41.841574] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:27.241 [2024-07-26 06:04:41.843832] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:27.244 [identical "Failed to get src_mbufs!" errors from accel_dpdk_cryptodev_task_alloc_resources repeated through 2024-07-26 06:04:41.986166; duplicates omitted]
00:36:27.244 [2024-07-26 06:04:41.988139] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:27.244 [2024-07-26 06:04:41.988536] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:27.244 [2024-07-26 06:04:41.988935] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:27.244 [2024-07-26 06:04:41.989327] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:27.244 [2024-07-26 06:04:41.989622] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:27.244 [2024-07-26 06:04:41.989644] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:27.244 [2024-07-26 06:04:41.990940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:27.244 [2024-07-26 06:04:41.992487] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:27.244 [2024-07-26 06:04:41.994032] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:27.244 [2024-07-26 06:04:41.994772] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:27.244 [2024-07-26 06:04:41.995044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:27.244 [2024-07-26 06:04:41.995062] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:27.244 [2024-07-26 06:04:41.995076] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:27.245 [2024-07-26 06:04:41.997096] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:27.245 [2024-07-26 06:04:41.997496] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:27.245 [2024-07-26 06:04:41.997892] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:27.245 [2024-07-26 06:04:41.998859] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:27.245 [2024-07-26 06:04:41.999169] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:27.245 [2024-07-26 06:04:41.999185] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:27.245 [2024-07-26 06:04:42.000749] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:27.245 [2024-07-26 06:04:42.002288] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:27.245 [2024-07-26 06:04:42.003308] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:27.245 [2024-07-26 06:04:42.004953] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:27.245 [2024-07-26 06:04:42.005229] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:27.245 [2024-07-26 06:04:42.005245] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:27.245 [2024-07-26 06:04:42.005259] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:27.245 [2024-07-26 06:04:42.007632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:27.245 [2024-07-26 06:04:42.008034] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:27.245 [2024-07-26 06:04:42.008731] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:27.245 [2024-07-26 06:04:42.010026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:27.245 [2024-07-26 06:04:42.010300] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:27.245 [2024-07-26 06:04:42.010317] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:27.245 [2024-07-26 06:04:42.011883] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:27.245 [2024-07-26 06:04:42.013202] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:27.245 [2024-07-26 06:04:42.014570] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:27.245 [2024-07-26 06:04:42.015871] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:27.245 [2024-07-26 06:04:42.016145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:27.245 [2024-07-26 06:04:42.016162] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:27.245 [2024-07-26 06:04:42.016176] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:27.245 [2024-07-26 06:04:42.018819] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:27.245 [2024-07-26 06:04:42.019218] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:27.245 [2024-07-26 06:04:42.020768] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:27.245 [2024-07-26 06:04:42.022338] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:27.245 [2024-07-26 06:04:42.022616] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:27.245 [2024-07-26 06:04:42.022632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:27.245 [2024-07-26 06:04:42.024275] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:27.245 [2024-07-26 06:04:42.025350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:27.245 [2024-07-26 06:04:42.026650] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:27.245 [2024-07-26 06:04:42.028184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:27.245 [2024-07-26 06:04:42.028457] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:27.245 [2024-07-26 06:04:42.028473] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:27.245 [2024-07-26 06:04:42.028487] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:27.245 [2024-07-26 06:04:42.031145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:27.245 [2024-07-26 06:04:42.032758] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:27.245 [2024-07-26 06:04:42.034397] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:27.245 [2024-07-26 06:04:42.035950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:27.245 [2024-07-26 06:04:42.036225] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:27.245 [2024-07-26 06:04:42.036241] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:27.245 [2024-07-26 06:04:42.037173] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:27.245 [2024-07-26 06:04:42.038467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:27.245 [2024-07-26 06:04:42.040018] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:27.245 [2024-07-26 06:04:42.041573] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:27.245 [2024-07-26 06:04:42.041952] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:27.245 [2024-07-26 06:04:42.041970] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:27.245 [2024-07-26 06:04:42.041985] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:27.245 [2024-07-26 06:04:42.046174] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:27.245 [2024-07-26 06:04:42.047594] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:27.245 [2024-07-26 06:04:42.049176] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:27.245 [2024-07-26 06:04:42.050803] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:27.245 [2024-07-26 06:04:42.051176] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:27.245 [2024-07-26 06:04:42.051193] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:27.245 [2024-07-26 06:04:42.052496] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:27.245 [2024-07-26 06:04:42.054053] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:27.245 [2024-07-26 06:04:42.055597] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:27.245 [2024-07-26 06:04:42.056477] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:27.245 [2024-07-26 06:04:42.056973] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:27.245 [2024-07-26 06:04:42.056989] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:27.245 [2024-07-26 06:04:42.057011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:27.245 [2024-07-26 06:04:42.060451] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:27.245 [2024-07-26 06:04:42.062012] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:27.245 [2024-07-26 06:04:42.063636] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:27.245 [2024-07-26 06:04:42.064602] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:27.245 [2024-07-26 06:04:42.064922] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:27.245 [2024-07-26 06:04:42.064938] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:27.245 [2024-07-26 06:04:42.066504] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:27.245 [2024-07-26 06:04:42.068054] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:27.245 [2024-07-26 06:04:42.069033] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:27.245 [2024-07-26 06:04:42.069432] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:27.245 [2024-07-26 06:04:42.069896] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:27.245 [2024-07-26 06:04:42.069914] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:27.245 [2024-07-26 06:04:42.069930] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:27.245 [2024-07-26 06:04:42.073469] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:27.245 [2024-07-26 06:04:42.075017] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:27.245 [2024-07-26 06:04:42.075723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:27.245 [2024-07-26 06:04:42.077026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:27.245 [2024-07-26 06:04:42.077300] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:27.246 [2024-07-26 06:04:42.077316] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:27.246 [2024-07-26 06:04:42.078938] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:27.246 [2024-07-26 06:04:42.080371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:27.246 [2024-07-26 06:04:42.080767] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:27.246 [2024-07-26 06:04:42.081158] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:27.246 [2024-07-26 06:04:42.081538] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:27.246 [2024-07-26 06:04:42.081555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:36:27.246 [2024-07-26 06:04:42.081569] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:36:27.246 [2024-07-26 06:04:42.085042] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:36:28.180
00:36:28.180 Latency(us)
00:36:28.180 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:36:28.180 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:36:28.180 Verification LBA range: start 0x0 length 0x100
00:36:28.180 crypto_ram : 6.04 42.36 2.65 0.00 0.00 2936051.98 260776.29 2523876.84
00:36:28.180 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:36:28.180 Verification LBA range: start 0x100 length 0x100
00:36:28.180 crypto_ram : 5.94 43.07 2.69 0.00 0.00 2880944.75 317308.22 2348810.24
00:36:28.180 Job: crypto_ram1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:36:28.180 Verification LBA range: start 0x0 length 0x100
00:36:28.180 crypto_ram1 : 6.04 42.36 2.65 0.00 0.00 2838164.70 258952.68 2334221.36
00:36:28.180 Job: crypto_ram1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:36:28.180 Verification LBA range: start 0x100 length 0x100
00:36:28.180 crypto_ram1 : 5.94 43.06 2.69 0.00 0.00 2788702.83 315484.61 2173743.64
00:36:28.180 Job: crypto_ram2 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:36:28.180 Verification LBA range: start 0x0 length 0x100
00:36:28.180 crypto_ram2 : 5.63 267.28 16.70 0.00 0.00 429039.05 2564.45 623674.77
00:36:28.180 Job: crypto_ram2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:36:28.180 Verification LBA range: start 0x100 length 0x100
00:36:28.181 crypto_ram2 : 5.59 286.13 17.88 0.00 0.00 401658.39 43310.75 616380.33
00:36:28.181 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:36:28.181 Verification LBA range: start 0x0 length 0x100
00:36:28.181 crypto_ram3 : 5.71 273.61 17.10 0.00 0.00 406011.17 15614.66 357427.65
00:36:28.181 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:36:28.181 Verification LBA range: start 0x100 length 0x100
00:36:28.181 crypto_ram3 : 5.66 293.83 18.36 0.00 0.00 381485.90 63826.37 470491.49
00:36:28.181 ===================================================================================================================
00:36:28.181 Total : 1291.70 80.73 0.00 0.00 746114.64 2564.45 2523876.84
00:36:28.439
00:36:28.439 real 0m9.269s
00:36:28.439 user 0m17.546s
00:36:28.439 sys 0m0.505s
00:36:28.439 06:04:43 blockdev_crypto_qat.bdev_verify_big_io -- common/autotest_common.sh@1124 -- # xtrace_disable
00:36:28.439 06:04:43 blockdev_crypto_qat.bdev_verify_big_io -- common/autotest_common.sh@10 -- # set +x
00:36:28.439 ************************************
00:36:28.439 END TEST bdev_verify_big_io
00:36:28.439 ************************************
00:36:28.439 06:04:43 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 0
00:36:28.439 06:04:43 blockdev_crypto_qat -- bdev/blockdev.sh@778 -- # run_test bdev_write_zeroes /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:36:28.439 06:04:43 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']'
00:36:28.439 06:04:43 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable
00:36:28.439 06:04:43 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x
00:36:28.439 ************************************
00:36:28.439 START TEST bdev_write_zeroes
00:36:28.439 ************************************
00:36:28.439 06:04:43 blockdev_crypto_qat.bdev_write_zeroes -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:36:28.439 [2024-07-26 06:04:43.321014] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK
24.03.0 initialization...
00:36:28.439 [2024-07-26 06:04:43.321076] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1333264 ]
00:36:28.714 [2024-07-26 06:04:43.447236] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:36:28.714 [2024-07-26 06:04:43.547965] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:36:28.714 [2024-07-26 06:04:43.569303] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat
00:36:28.714 [2024-07-26 06:04:43.577325] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev
00:36:28.714 [2024-07-26 06:04:43.585343] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev
00:36:28.972 [2024-07-26 06:04:43.694353] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96
00:36:31.501 [2024-07-26 06:04:45.902259] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc"
00:36:31.501 [2024-07-26 06:04:45.902326] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0
00:36:31.501 [2024-07-26 06:04:45.902342] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:36:31.501 [2024-07-26 06:04:45.910277] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts"
00:36:31.501 [2024-07-26 06:04:45.910297] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1
00:36:31.501 [2024-07-26 06:04:45.910308] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:36:31.501 [2024-07-26 06:04:45.918298] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2"
00:36:31.501 [2024-07-26 06:04:45.918315] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2
00:36:31.501 [2024-07-26 06:04:45.918327] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:36:31.501 [2024-07-26 06:04:45.926319] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2"
00:36:31.501 [2024-07-26 06:04:45.926336] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3
00:36:31.501 [2024-07-26 06:04:45.926347] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:36:31.501 Running I/O for 1 seconds...
00:36:32.435
00:36:32.435 Latency(us)
00:36:32.435 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:36:32.435 Job: crypto_ram (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:36:32.435 crypto_ram : 1.03 2003.94 7.83 0.00 0.00 63429.13 5613.30 76591.64
00:36:32.435 Job: crypto_ram1 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:36:32.435 crypto_ram1 : 1.03 2009.46 7.85 0.00 0.00 62898.84 5584.81 71120.81
00:36:32.435 Job: crypto_ram2 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:36:32.435 crypto_ram2 : 1.02 15426.81 60.26 0.00 0.00 8172.16 2464.72 10770.70
00:36:32.435 Job: crypto_ram3 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:36:32.435 crypto_ram3 : 1.02 15459.00 60.39 0.00 0.00 8129.27 2464.72 8491.19
00:36:32.435 ===================================================================================================================
00:36:32.435 Total : 34899.20 136.33 0.00 0.00 14504.47 2464.72 76591.64
00:36:32.693
00:36:32.693 real 0m4.208s
00:36:32.693 user 0m3.795s
00:36:32.693 sys 0m0.371s
00:36:32.693 06:04:47 blockdev_crypto_qat.bdev_write_zeroes -- common/autotest_common.sh@1124 -- # xtrace_disable
00:36:32.693 06:04:47 blockdev_crypto_qat.bdev_write_zeroes -- common/autotest_common.sh@10 -- # set +x
00:36:32.693 ************************************
00:36:32.693 END TEST bdev_write_zeroes
00:36:32.693 ************************************
00:36:32.693 06:04:47 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 0
00:36:32.693 06:04:47 blockdev_crypto_qat -- bdev/blockdev.sh@781 -- # run_test bdev_json_nonenclosed /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:36:32.693 06:04:47 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']'
00:36:32.693 06:04:47 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable
00:36:32.693 06:04:47 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x
00:36:32.693 ************************************
00:36:32.693 START TEST bdev_json_nonenclosed
00:36:32.693 ************************************
00:36:32.693 06:04:47 blockdev_crypto_qat.bdev_json_nonenclosed -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:36:32.951 [2024-07-26 06:04:47.604540] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization...
00:36:32.951 [2024-07-26 06:04:47.604599] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1333804 ]
00:36:32.951 [2024-07-26 06:04:47.731146] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:36:32.951 [2024-07-26 06:04:47.829996] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:36:32.951 [2024-07-26 06:04:47.830060] json_config.c: 608:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: not enclosed in {}.
00:36:32.951 [2024-07-26 06:04:47.830078] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address:
00:36:32.951 [2024-07-26 06:04:47.830091] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero
00:36:33.210
00:36:33.210 real 0m0.385s
00:36:33.210 user 0m0.232s
00:36:33.210 sys 0m0.149s
00:36:33.210 06:04:47 blockdev_crypto_qat.bdev_json_nonenclosed -- common/autotest_common.sh@1123 -- # es=234
00:36:33.210 06:04:47 blockdev_crypto_qat.bdev_json_nonenclosed -- common/autotest_common.sh@1124 -- # xtrace_disable
00:36:33.210 06:04:47 blockdev_crypto_qat.bdev_json_nonenclosed -- common/autotest_common.sh@10 -- # set +x
00:36:33.210 ************************************
00:36:33.210 END TEST bdev_json_nonenclosed
00:36:33.210 ************************************
00:36:33.210 06:04:47 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 234
00:36:33.210 06:04:47 blockdev_crypto_qat -- bdev/blockdev.sh@781 -- # true
00:36:33.210 06:04:47 blockdev_crypto_qat -- bdev/blockdev.sh@784 -- # run_test bdev_json_nonarray /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:36:33.210 06:04:47 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- #
'[' 13 -le 1 ']'
00:36:33.210 06:04:47 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable
00:36:33.210 06:04:47 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x
00:36:33.210 ************************************
00:36:33.210 START TEST bdev_json_nonarray
00:36:33.210 ************************************
00:36:33.210 06:04:48 blockdev_crypto_qat.bdev_json_nonarray -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:36:33.210 [2024-07-26 06:04:48.071750] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization...
00:36:33.210 [2024-07-26 06:04:48.071811] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1333861 ]
00:36:33.468 [2024-07-26 06:04:48.199673] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:36:33.468 [2024-07-26 06:04:48.303570] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:36:33.468 [2024-07-26 06:04:48.303647] json_config.c: 614:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: 'subsystems' should be an array.
00:36:33.468 [2024-07-26 06:04:48.303666] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address:
00:36:33.469 [2024-07-26 06:04:48.303678] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero
00:36:33.727
00:36:33.727 real 0m0.395s
00:36:33.727 user 0m0.232s
00:36:33.727 sys 0m0.161s
00:36:33.727 06:04:48 blockdev_crypto_qat.bdev_json_nonarray -- common/autotest_common.sh@1123 -- # es=234
00:36:33.727 06:04:48 blockdev_crypto_qat.bdev_json_nonarray -- common/autotest_common.sh@1124 -- # xtrace_disable
00:36:33.727 06:04:48 blockdev_crypto_qat.bdev_json_nonarray -- common/autotest_common.sh@10 -- # set +x
00:36:33.727 ************************************
00:36:33.727 END TEST bdev_json_nonarray
00:36:33.727 ************************************
00:36:33.727 06:04:48 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 234
00:36:33.727 06:04:48 blockdev_crypto_qat -- bdev/blockdev.sh@784 -- # true
00:36:33.727 06:04:48 blockdev_crypto_qat -- bdev/blockdev.sh@786 -- # [[ crypto_qat == bdev ]]
00:36:33.727 06:04:48 blockdev_crypto_qat -- bdev/blockdev.sh@793 -- # [[ crypto_qat == gpt ]]
00:36:33.727 06:04:48 blockdev_crypto_qat -- bdev/blockdev.sh@797 -- # [[ crypto_qat == crypto_sw ]]
00:36:33.727 06:04:48 blockdev_crypto_qat -- bdev/blockdev.sh@809 -- # trap - SIGINT SIGTERM EXIT
00:36:33.727 06:04:48 blockdev_crypto_qat -- bdev/blockdev.sh@810 -- # cleanup
00:36:33.727 06:04:48 blockdev_crypto_qat -- bdev/blockdev.sh@23 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile
00:36:33.727 06:04:48 blockdev_crypto_qat -- bdev/blockdev.sh@24 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json
00:36:33.727 06:04:48 blockdev_crypto_qat -- bdev/blockdev.sh@26 -- # [[ crypto_qat == rbd ]]
00:36:33.727 06:04:48 blockdev_crypto_qat -- bdev/blockdev.sh@30 -- # [[ crypto_qat == daos ]]
00:36:33.727 06:04:48 blockdev_crypto_qat -- bdev/blockdev.sh@34 -- # [[ crypto_qat = \g\p\t
]] 00:36:33.727 06:04:48 blockdev_crypto_qat -- bdev/blockdev.sh@40 -- # [[ crypto_qat == xnvme ]] 00:36:33.727 00:36:33.727 real 1m12.428s 00:36:33.727 user 2m40.518s 00:36:33.727 sys 0m8.975s 00:36:33.727 06:04:48 blockdev_crypto_qat -- common/autotest_common.sh@1124 -- # xtrace_disable 00:36:33.727 06:04:48 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:36:33.727 ************************************ 00:36:33.727 END TEST blockdev_crypto_qat 00:36:33.727 ************************************ 00:36:33.727 06:04:48 -- common/autotest_common.sh@1142 -- # return 0 00:36:33.727 06:04:48 -- spdk/autotest.sh@360 -- # run_test chaining /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/chaining.sh 00:36:33.727 06:04:48 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:36:33.727 06:04:48 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:36:33.727 06:04:48 -- common/autotest_common.sh@10 -- # set +x 00:36:33.727 ************************************ 00:36:33.727 START TEST chaining 00:36:33.727 ************************************ 00:36:33.727 06:04:48 chaining -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/chaining.sh 00:36:33.985 * Looking for test storage... 
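The chaining test starting here drives its pass/fail checks off `accel_get_stats` counters, extracted with the two jq filter shapes that recur throughout the rest of this log (`get_stat` with and without an opcode argument). A standalone sketch against a hypothetical stats payload — the field values are modeled on the numbers seen later in this run, not queried live over the RPC socket:

```shell
# Hypothetical accel_get_stats JSON; a real run fetches this over the
# RPC socket with `rpc_cmd accel_get_stats`.
stats='{"sequence_executed": 12, "operations": [
  {"opcode": "encrypt", "executed": 2},
  {"opcode": "decrypt", "executed": 12},
  {"opcode": "copy", "executed": 4}]}'

# Top-level counter (the no-opcode get_stat path, chaining.sh line 41).
echo "$stats" | jq -r .sequence_executed

# Per-opcode counter (the opcode path, chaining.sh line 44).
echo "$stats" | jq -r '.operations[] | select(.opcode == "decrypt").executed'
```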
00:36:33.985 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:36:33.985 06:04:48 chaining -- bdev/chaining.sh@14 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:36:33.985 06:04:48 chaining -- nvmf/common.sh@7 -- # uname -s 00:36:33.985 06:04:48 chaining -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:36:33.985 06:04:48 chaining -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:36:33.985 06:04:48 chaining -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:36:33.985 06:04:48 chaining -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:36:33.985 06:04:48 chaining -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:36:33.985 06:04:48 chaining -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:36:33.985 06:04:48 chaining -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:36:33.985 06:04:48 chaining -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:36:33.985 06:04:48 chaining -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:36:33.985 06:04:48 chaining -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:36:33.985 06:04:48 chaining -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809e2efd-7f71-e711-906e-0017a4403562 00:36:33.985 06:04:48 chaining -- nvmf/common.sh@18 -- # NVME_HOSTID=809e2efd-7f71-e711-906e-0017a4403562 00:36:33.985 06:04:48 chaining -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:36:33.985 06:04:48 chaining -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:36:33.985 06:04:48 chaining -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:36:33.985 06:04:48 chaining -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:36:33.985 06:04:48 chaining -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:36:33.985 06:04:48 chaining -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:36:33.985 06:04:48 chaining -- scripts/common.sh@516 -- # 
[[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:36:33.985 06:04:48 chaining -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:36:33.985 06:04:48 chaining -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:36:33.985 06:04:48 chaining -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:36:33.985 06:04:48 chaining -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:36:33.985 06:04:48 chaining -- paths/export.sh@5 -- # export PATH 00:36:33.985 06:04:48 chaining -- paths/export.sh@6 -- # echo 
/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:36:33.985 06:04:48 chaining -- nvmf/common.sh@47 -- # : 0 00:36:33.985 06:04:48 chaining -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:36:33.985 06:04:48 chaining -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:36:33.985 06:04:48 chaining -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:36:33.985 06:04:48 chaining -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:36:33.985 06:04:48 chaining -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:36:33.985 06:04:48 chaining -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:36:33.985 06:04:48 chaining -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:36:33.985 06:04:48 chaining -- nvmf/common.sh@51 -- # have_pci_nics=0 00:36:33.985 06:04:48 chaining -- bdev/chaining.sh@16 -- # nqn=nqn.2016-06.io.spdk:cnode0 00:36:33.985 06:04:48 chaining -- bdev/chaining.sh@17 -- # key0=(00112233445566778899001122334455 11223344556677889900112233445500) 00:36:33.985 06:04:48 chaining -- bdev/chaining.sh@18 -- # key1=(22334455667788990011223344550011 33445566778899001122334455001122) 00:36:33.985 06:04:48 chaining -- bdev/chaining.sh@19 -- # bperfsock=/var/tmp/bperf.sock 00:36:33.985 06:04:48 chaining -- bdev/chaining.sh@20 -- # declare -A stats 00:36:33.985 06:04:48 chaining -- bdev/chaining.sh@66 -- # nvmftestinit 00:36:33.985 06:04:48 chaining -- nvmf/common.sh@441 -- # '[' -z tcp ']' 00:36:33.985 06:04:48 chaining -- nvmf/common.sh@446 -- # trap nvmftestfini SIGINT SIGTERM EXIT 00:36:33.985 06:04:48 chaining -- nvmf/common.sh@448 -- # prepare_net_devs 00:36:33.985 06:04:48 chaining -- nvmf/common.sh@410 -- # local 
-g is_hw=no 00:36:33.985 06:04:48 chaining -- nvmf/common.sh@412 -- # remove_spdk_ns 00:36:33.986 06:04:48 chaining -- nvmf/common.sh@628 -- # xtrace_disable_per_cmd _remove_spdk_ns 00:36:33.986 06:04:48 chaining -- common/autotest_common.sh@22 -- # eval '_remove_spdk_ns 13> /dev/null' 00:36:33.986 06:04:48 chaining -- common/autotest_common.sh@22 -- # _remove_spdk_ns 00:36:33.986 06:04:48 chaining -- nvmf/common.sh@414 -- # [[ phy-fallback != virt ]] 00:36:33.986 06:04:48 chaining -- nvmf/common.sh@414 -- # gather_supported_nvmf_pci_devs 00:36:33.986 06:04:48 chaining -- nvmf/common.sh@285 -- # xtrace_disable 00:36:33.986 06:04:48 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:42.095 06:04:56 chaining -- nvmf/common.sh@289 -- # local intel=0x8086 mellanox=0x15b3 pci net_dev 00:36:42.095 06:04:56 chaining -- nvmf/common.sh@291 -- # pci_devs=() 00:36:42.095 06:04:56 chaining -- nvmf/common.sh@291 -- # local -a pci_devs 00:36:42.095 06:04:56 chaining -- nvmf/common.sh@292 -- # pci_net_devs=() 00:36:42.095 06:04:56 chaining -- nvmf/common.sh@292 -- # local -a pci_net_devs 00:36:42.095 06:04:56 chaining -- nvmf/common.sh@293 -- # pci_drivers=() 00:36:42.096 06:04:56 chaining -- nvmf/common.sh@293 -- # local -A pci_drivers 00:36:42.096 06:04:56 chaining -- nvmf/common.sh@295 -- # net_devs=() 00:36:42.096 06:04:56 chaining -- nvmf/common.sh@295 -- # local -ga net_devs 00:36:42.096 06:04:56 chaining -- nvmf/common.sh@296 -- # e810=() 00:36:42.096 06:04:56 chaining -- nvmf/common.sh@296 -- # local -ga e810 00:36:42.096 06:04:56 chaining -- nvmf/common.sh@297 -- # x722=() 00:36:42.096 06:04:56 chaining -- nvmf/common.sh@297 -- # local -ga x722 00:36:42.096 06:04:56 chaining -- nvmf/common.sh@298 -- # mlx=() 00:36:42.096 06:04:56 chaining -- nvmf/common.sh@298 -- # local -ga mlx 00:36:42.096 06:04:56 chaining -- nvmf/common.sh@301 -- # e810+=(${pci_bus_cache["$intel:0x1592"]}) 00:36:42.096 06:04:56 chaining -- nvmf/common.sh@302 -- # 
e810+=(${pci_bus_cache["$intel:0x159b"]}) 00:36:42.096 06:04:56 chaining -- nvmf/common.sh@304 -- # x722+=(${pci_bus_cache["$intel:0x37d2"]}) 00:36:42.096 06:04:56 chaining -- nvmf/common.sh@306 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2dc"]}) 00:36:42.096 06:04:56 chaining -- nvmf/common.sh@308 -- # mlx+=(${pci_bus_cache["$mellanox:0x1021"]}) 00:36:42.096 06:04:56 chaining -- nvmf/common.sh@310 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2d6"]}) 00:36:42.096 06:04:56 chaining -- nvmf/common.sh@312 -- # mlx+=(${pci_bus_cache["$mellanox:0x101d"]}) 00:36:42.096 06:04:56 chaining -- nvmf/common.sh@314 -- # mlx+=(${pci_bus_cache["$mellanox:0x1017"]}) 00:36:42.096 06:04:56 chaining -- nvmf/common.sh@315 -- # mlx+=(${pci_bus_cache["$mellanox:0x1019"]}) 00:36:42.096 06:04:56 chaining -- nvmf/common.sh@317 -- # mlx+=(${pci_bus_cache["$mellanox:0x1015"]}) 00:36:42.096 06:04:56 chaining -- nvmf/common.sh@318 -- # mlx+=(${pci_bus_cache["$mellanox:0x1013"]}) 00:36:42.096 06:04:56 chaining -- nvmf/common.sh@320 -- # pci_devs+=("${e810[@]}") 00:36:42.096 06:04:56 chaining -- nvmf/common.sh@321 -- # [[ tcp == rdma ]] 00:36:42.096 06:04:56 chaining -- nvmf/common.sh@327 -- # [[ '' == mlx5 ]] 00:36:42.096 06:04:56 chaining -- nvmf/common.sh@329 -- # [[ '' == e810 ]] 00:36:42.096 06:04:56 chaining -- nvmf/common.sh@331 -- # [[ '' == x722 ]] 00:36:42.096 06:04:56 chaining -- nvmf/common.sh@335 -- # (( 0 == 0 )) 00:36:42.096 06:04:56 chaining -- nvmf/common.sh@336 -- # return 1 00:36:42.096 06:04:56 chaining -- nvmf/common.sh@416 -- # [[ no == yes ]] 00:36:42.096 06:04:56 chaining -- nvmf/common.sh@423 -- # [[ phy-fallback == phy ]] 00:36:42.096 06:04:56 chaining -- nvmf/common.sh@426 -- # [[ phy-fallback == phy-fallback ]] 00:36:42.096 06:04:56 chaining -- nvmf/common.sh@427 -- # echo 'WARNING: No supported devices were found, fallback requested for tcp test' 00:36:42.096 WARNING: No supported devices were found, fallback requested for tcp test 00:36:42.096 06:04:56 chaining -- 
nvmf/common.sh@431 -- # [[ tcp == tcp ]] 00:36:42.096 06:04:56 chaining -- nvmf/common.sh@432 -- # nvmf_veth_init 00:36:42.096 06:04:56 chaining -- nvmf/common.sh@141 -- # NVMF_INITIATOR_IP=10.0.0.1 00:36:42.096 06:04:56 chaining -- nvmf/common.sh@142 -- # NVMF_FIRST_TARGET_IP=10.0.0.2 00:36:42.096 06:04:56 chaining -- nvmf/common.sh@143 -- # NVMF_SECOND_TARGET_IP=10.0.0.3 00:36:42.096 06:04:56 chaining -- nvmf/common.sh@144 -- # NVMF_BRIDGE=nvmf_br 00:36:42.096 06:04:56 chaining -- nvmf/common.sh@145 -- # NVMF_INITIATOR_INTERFACE=nvmf_init_if 00:36:42.096 06:04:56 chaining -- nvmf/common.sh@146 -- # NVMF_INITIATOR_BRIDGE=nvmf_init_br 00:36:42.096 06:04:56 chaining -- nvmf/common.sh@147 -- # NVMF_TARGET_NAMESPACE=nvmf_tgt_ns_spdk 00:36:42.096 06:04:56 chaining -- nvmf/common.sh@148 -- # NVMF_TARGET_NS_CMD=(ip netns exec "$NVMF_TARGET_NAMESPACE") 00:36:42.096 06:04:56 chaining -- nvmf/common.sh@149 -- # NVMF_TARGET_INTERFACE=nvmf_tgt_if 00:36:42.096 06:04:56 chaining -- nvmf/common.sh@150 -- # NVMF_TARGET_INTERFACE2=nvmf_tgt_if2 00:36:42.096 06:04:56 chaining -- nvmf/common.sh@151 -- # NVMF_TARGET_BRIDGE=nvmf_tgt_br 00:36:42.096 06:04:56 chaining -- nvmf/common.sh@152 -- # NVMF_TARGET_BRIDGE2=nvmf_tgt_br2 00:36:42.096 06:04:56 chaining -- nvmf/common.sh@154 -- # ip link set nvmf_init_br nomaster 00:36:42.096 06:04:56 chaining -- nvmf/common.sh@155 -- # ip link set nvmf_tgt_br nomaster 00:36:42.096 Cannot find device "nvmf_tgt_br" 00:36:42.096 06:04:56 chaining -- nvmf/common.sh@155 -- # true 00:36:42.096 06:04:56 chaining -- nvmf/common.sh@156 -- # ip link set nvmf_tgt_br2 nomaster 00:36:42.096 Cannot find device "nvmf_tgt_br2" 00:36:42.096 06:04:56 chaining -- nvmf/common.sh@156 -- # true 00:36:42.096 06:04:56 chaining -- nvmf/common.sh@157 -- # ip link set nvmf_init_br down 00:36:42.096 06:04:56 chaining -- nvmf/common.sh@158 -- # ip link set nvmf_tgt_br down 00:36:42.096 Cannot find device "nvmf_tgt_br" 00:36:42.096 06:04:56 chaining -- nvmf/common.sh@158 -- # 
true 00:36:42.096 06:04:56 chaining -- nvmf/common.sh@159 -- # ip link set nvmf_tgt_br2 down 00:36:42.096 Cannot find device "nvmf_tgt_br2" 00:36:42.096 06:04:56 chaining -- nvmf/common.sh@159 -- # true 00:36:42.096 06:04:56 chaining -- nvmf/common.sh@160 -- # ip link delete nvmf_br type bridge 00:36:42.096 06:04:56 chaining -- nvmf/common.sh@161 -- # ip link delete nvmf_init_if 00:36:42.096 06:04:56 chaining -- nvmf/common.sh@162 -- # ip netns exec nvmf_tgt_ns_spdk ip link delete nvmf_tgt_if 00:36:42.096 Cannot open network namespace "nvmf_tgt_ns_spdk": No such file or directory 00:36:42.096 06:04:56 chaining -- nvmf/common.sh@162 -- # true 00:36:42.096 06:04:56 chaining -- nvmf/common.sh@163 -- # ip netns exec nvmf_tgt_ns_spdk ip link delete nvmf_tgt_if2 00:36:42.096 Cannot open network namespace "nvmf_tgt_ns_spdk": No such file or directory 00:36:42.096 06:04:56 chaining -- nvmf/common.sh@163 -- # true 00:36:42.096 06:04:56 chaining -- nvmf/common.sh@166 -- # ip netns add nvmf_tgt_ns_spdk 00:36:42.096 06:04:56 chaining -- nvmf/common.sh@169 -- # ip link add nvmf_init_if type veth peer name nvmf_init_br 00:36:42.096 06:04:56 chaining -- nvmf/common.sh@170 -- # ip link add nvmf_tgt_if type veth peer name nvmf_tgt_br 00:36:42.096 06:04:56 chaining -- nvmf/common.sh@171 -- # ip link add nvmf_tgt_if2 type veth peer name nvmf_tgt_br2 00:36:42.096 06:04:56 chaining -- nvmf/common.sh@174 -- # ip link set nvmf_tgt_if netns nvmf_tgt_ns_spdk 00:36:42.096 06:04:56 chaining -- nvmf/common.sh@175 -- # ip link set nvmf_tgt_if2 netns nvmf_tgt_ns_spdk 00:36:42.096 06:04:56 chaining -- nvmf/common.sh@178 -- # ip addr add 10.0.0.1/24 dev nvmf_init_if 00:36:42.096 06:04:56 chaining -- nvmf/common.sh@179 -- # ip netns exec nvmf_tgt_ns_spdk ip addr add 10.0.0.2/24 dev nvmf_tgt_if 00:36:42.096 06:04:56 chaining -- nvmf/common.sh@180 -- # ip netns exec nvmf_tgt_ns_spdk ip addr add 10.0.0.3/24 dev nvmf_tgt_if2 00:36:42.096 06:04:56 chaining -- nvmf/common.sh@183 -- # ip link set 
nvmf_init_if up 00:36:42.096 06:04:56 chaining -- nvmf/common.sh@184 -- # ip link set nvmf_init_br up 00:36:42.096 06:04:56 chaining -- nvmf/common.sh@185 -- # ip link set nvmf_tgt_br up 00:36:42.096 06:04:56 chaining -- nvmf/common.sh@186 -- # ip link set nvmf_tgt_br2 up 00:36:42.096 06:04:56 chaining -- nvmf/common.sh@187 -- # ip netns exec nvmf_tgt_ns_spdk ip link set nvmf_tgt_if up 00:36:42.096 06:04:56 chaining -- nvmf/common.sh@188 -- # ip netns exec nvmf_tgt_ns_spdk ip link set nvmf_tgt_if2 up 00:36:42.096 06:04:56 chaining -- nvmf/common.sh@189 -- # ip netns exec nvmf_tgt_ns_spdk ip link set lo up 00:36:42.096 06:04:56 chaining -- nvmf/common.sh@192 -- # ip link add nvmf_br type bridge 00:36:42.097 06:04:56 chaining -- nvmf/common.sh@193 -- # ip link set nvmf_br up 00:36:42.097 06:04:56 chaining -- nvmf/common.sh@196 -- # ip link set nvmf_init_br master nvmf_br 00:36:42.097 06:04:56 chaining -- nvmf/common.sh@197 -- # ip link set nvmf_tgt_br master nvmf_br 00:36:42.097 06:04:56 chaining -- nvmf/common.sh@198 -- # ip link set nvmf_tgt_br2 master nvmf_br 00:36:42.097 06:04:56 chaining -- nvmf/common.sh@201 -- # iptables -I INPUT 1 -i nvmf_init_if -p tcp --dport 4420 -j ACCEPT 00:36:42.097 06:04:56 chaining -- nvmf/common.sh@202 -- # iptables -A FORWARD -i nvmf_br -o nvmf_br -j ACCEPT 00:36:42.097 06:04:56 chaining -- nvmf/common.sh@205 -- # ping -c 1 10.0.0.2 00:36:42.097 PING 10.0.0.2 (10.0.0.2) 56(84) bytes of data. 00:36:42.097 64 bytes from 10.0.0.2: icmp_seq=1 ttl=64 time=0.105 ms 00:36:42.097 00:36:42.097 --- 10.0.0.2 ping statistics --- 00:36:42.097 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:36:42.097 rtt min/avg/max/mdev = 0.105/0.105/0.105/0.000 ms 00:36:42.097 06:04:56 chaining -- nvmf/common.sh@206 -- # ping -c 1 10.0.0.3 00:36:42.097 PING 10.0.0.3 (10.0.0.3) 56(84) bytes of data. 
00:36:42.097 64 bytes from 10.0.0.3: icmp_seq=1 ttl=64 time=0.071 ms 00:36:42.097 00:36:42.097 --- 10.0.0.3 ping statistics --- 00:36:42.097 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:36:42.097 rtt min/avg/max/mdev = 0.071/0.071/0.071/0.000 ms 00:36:42.097 06:04:56 chaining -- nvmf/common.sh@207 -- # ip netns exec nvmf_tgt_ns_spdk ping -c 1 10.0.0.1 00:36:42.097 PING 10.0.0.1 (10.0.0.1) 56(84) bytes of data. 00:36:42.097 64 bytes from 10.0.0.1: icmp_seq=1 ttl=64 time=0.038 ms 00:36:42.097 00:36:42.097 --- 10.0.0.1 ping statistics --- 00:36:42.097 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:36:42.097 rtt min/avg/max/mdev = 0.038/0.038/0.038/0.000 ms 00:36:42.097 06:04:56 chaining -- nvmf/common.sh@209 -- # NVMF_APP=("${NVMF_TARGET_NS_CMD[@]}" "${NVMF_APP[@]}") 00:36:42.097 06:04:56 chaining -- nvmf/common.sh@433 -- # return 0 00:36:42.097 06:04:56 chaining -- nvmf/common.sh@450 -- # '[' '' == iso ']' 00:36:42.097 06:04:56 chaining -- nvmf/common.sh@454 -- # NVMF_TRANSPORT_OPTS='-t tcp' 00:36:42.097 06:04:56 chaining -- nvmf/common.sh@455 -- # [[ tcp == \r\d\m\a ]] 00:36:42.097 06:04:56 chaining -- nvmf/common.sh@464 -- # [[ tcp == \t\c\p ]] 00:36:42.097 06:04:56 chaining -- nvmf/common.sh@465 -- # NVMF_TRANSPORT_OPTS='-t tcp -o' 00:36:42.097 06:04:56 chaining -- nvmf/common.sh@468 -- # '[' tcp == tcp ']' 00:36:42.097 06:04:56 chaining -- nvmf/common.sh@474 -- # modprobe nvme-tcp 00:36:42.355 06:04:57 chaining -- bdev/chaining.sh@67 -- # nvmfappstart -m 0x2 00:36:42.355 06:04:57 chaining -- nvmf/common.sh@479 -- # timing_enter start_nvmf_tgt 00:36:42.355 06:04:57 chaining -- common/autotest_common.sh@722 -- # xtrace_disable 00:36:42.355 06:04:57 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:42.355 06:04:57 chaining -- nvmf/common.sh@481 -- # nvmfpid=1337680 00:36:42.355 06:04:57 chaining -- nvmf/common.sh@480 -- # ip netns exec nvmf_tgt_ns_spdk /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/nvmf_tgt -i 0 
-e 0xFFFF -m 0x2 00:36:42.355 06:04:57 chaining -- nvmf/common.sh@482 -- # waitforlisten 1337680 00:36:42.355 06:04:57 chaining -- common/autotest_common.sh@829 -- # '[' -z 1337680 ']' 00:36:42.355 06:04:57 chaining -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:36:42.355 06:04:57 chaining -- common/autotest_common.sh@834 -- # local max_retries=100 00:36:42.355 06:04:57 chaining -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:36:42.355 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:36:42.355 06:04:57 chaining -- common/autotest_common.sh@838 -- # xtrace_disable 00:36:42.355 06:04:57 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:42.355 [2024-07-26 06:04:57.100020] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 00:36:42.355 [2024-07-26 06:04:57.100084] [ DPDK EAL parameters: nvmf -c 0x2 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:36:42.355 [2024-07-26 06:04:57.226725] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:36:42.613 [2024-07-26 06:04:57.336228] app.c: 603:app_setup_trace: *NOTICE*: Tracepoint Group Mask 0xFFFF specified. 00:36:42.613 [2024-07-26 06:04:57.336277] app.c: 604:app_setup_trace: *NOTICE*: Use 'spdk_trace -s nvmf -i 0' to capture a snapshot of events at runtime. 00:36:42.613 [2024-07-26 06:04:57.336291] app.c: 609:app_setup_trace: *NOTICE*: 'spdk_trace' without parameters will also work if this is the only 00:36:42.613 [2024-07-26 06:04:57.336304] app.c: 610:app_setup_trace: *NOTICE*: SPDK application currently running. 00:36:42.613 [2024-07-26 06:04:57.336315] app.c: 611:app_setup_trace: *NOTICE*: Or copy /dev/shm/nvmf_trace.0 for offline analysis/debug. 
00:36:42.613 [2024-07-26 06:04:57.336344] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:36:43.178 06:04:58 chaining -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:36:43.178 06:04:58 chaining -- common/autotest_common.sh@862 -- # return 0 00:36:43.178 06:04:58 chaining -- nvmf/common.sh@483 -- # timing_exit start_nvmf_tgt 00:36:43.178 06:04:58 chaining -- common/autotest_common.sh@728 -- # xtrace_disable 00:36:43.178 06:04:58 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:43.178 06:04:58 chaining -- nvmf/common.sh@484 -- # trap 'process_shm --id $NVMF_APP_SHM_ID || :; nvmftestfini' SIGINT SIGTERM EXIT 00:36:43.178 06:04:58 chaining -- bdev/chaining.sh@69 -- # mktemp 00:36:43.178 06:04:58 chaining -- bdev/chaining.sh@69 -- # input=/tmp/tmp.T6ti0eocTx 00:36:43.178 06:04:58 chaining -- bdev/chaining.sh@69 -- # mktemp 00:36:43.436 06:04:58 chaining -- bdev/chaining.sh@69 -- # output=/tmp/tmp.P1FA4doV33 00:36:43.436 06:04:58 chaining -- bdev/chaining.sh@70 -- # trap 'tgtcleanup; exit 1' SIGINT SIGTERM EXIT 00:36:43.436 06:04:58 chaining -- bdev/chaining.sh@72 -- # rpc_cmd 00:36:43.436 06:04:58 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:36:43.436 06:04:58 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:43.436 malloc0 00:36:43.436 true 00:36:43.436 true 00:36:43.436 [2024-07-26 06:04:58.128981] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key0" 00:36:43.436 crypto0 00:36:43.436 [2024-07-26 06:04:58.137008] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key1" 00:36:43.436 crypto1 00:36:43.436 [2024-07-26 06:04:58.145132] tcp.c: 677:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:36:43.436 [2024-07-26 06:04:58.161372] tcp.c:1006:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 10.0.0.2 port 4420 *** 00:36:43.436 06:04:58 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:36:43.436 06:04:58 chaining -- bdev/chaining.sh@85 -- # 
update_stats 00:36:43.436 06:04:58 chaining -- bdev/chaining.sh@51 -- # get_stat sequence_executed 00:36:43.436 06:04:58 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:36:43.436 06:04:58 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:36:43.436 06:04:58 chaining -- bdev/chaining.sh@39 -- # opcode= 00:36:43.436 06:04:58 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:36:43.436 06:04:58 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:36:43.436 06:04:58 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:36:43.436 06:04:58 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:36:43.436 06:04:58 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:36:43.436 06:04:58 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:43.436 06:04:58 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:36:43.436 06:04:58 chaining -- bdev/chaining.sh@51 -- # stats["sequence_executed"]=12 00:36:43.436 06:04:58 chaining -- bdev/chaining.sh@52 -- # get_stat executed encrypt 00:36:43.436 06:04:58 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:36:43.436 06:04:58 chaining -- bdev/chaining.sh@39 -- # event=executed 00:36:43.436 06:04:58 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:36:43.436 06:04:58 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:36:43.436 06:04:58 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:36:43.436 06:04:58 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:36:43.436 06:04:58 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:36:43.436 06:04:58 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:36:43.436 06:04:58 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:43.436 06:04:58 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:36:43.436 06:04:58 chaining -- bdev/chaining.sh@52 -- # stats["encrypt_executed"]= 00:36:43.436 06:04:58 chaining -- 
bdev/chaining.sh@53 -- # get_stat executed decrypt 00:36:43.436 06:04:58 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:36:43.436 06:04:58 chaining -- bdev/chaining.sh@39 -- # event=executed 00:36:43.436 06:04:58 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:36:43.436 06:04:58 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:36:43.436 06:04:58 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:36:43.436 06:04:58 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:36:43.436 06:04:58 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:36:43.436 06:04:58 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:36:43.436 06:04:58 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:43.436 06:04:58 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:36:43.436 06:04:58 chaining -- bdev/chaining.sh@53 -- # stats["decrypt_executed"]=12 00:36:43.436 06:04:58 chaining -- bdev/chaining.sh@54 -- # get_stat executed copy 00:36:43.437 06:04:58 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:36:43.437 06:04:58 chaining -- bdev/chaining.sh@39 -- # event=executed 00:36:43.437 06:04:58 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:36:43.437 06:04:58 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:36:43.437 06:04:58 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:36:43.437 06:04:58 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:36:43.437 06:04:58 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:36:43.437 06:04:58 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:36:43.437 06:04:58 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:43.437 06:04:58 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:36:43.695 06:04:58 chaining -- bdev/chaining.sh@54 -- # stats["copy_executed"]=4 00:36:43.695 06:04:58 chaining -- bdev/chaining.sh@88 -- # 
dd if=/dev/urandom of=/tmp/tmp.T6ti0eocTx bs=1K count=64 00:36:43.695 64+0 records in 00:36:43.695 64+0 records out 00:36:43.695 65536 bytes (66 kB, 64 KiB) copied, 0.00106776 s, 61.4 MB/s 00:36:43.695 06:04:58 chaining -- bdev/chaining.sh@89 -- # spdk_dd --if /tmp/tmp.T6ti0eocTx --ob Nvme0n1 --bs 65536 --count 1 00:36:43.695 06:04:58 chaining -- bdev/chaining.sh@25 -- # local config 00:36:43.695 06:04:58 chaining -- bdev/chaining.sh@32 -- # jq '.subsystems[0].config[.subsystems[0].config | length] |= 00:36:43.695 {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}' 00:36:43.695 06:04:58 chaining -- bdev/chaining.sh@31 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --mode=remote --json-with-subsystems --trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0 00:36:43.695 06:04:58 chaining -- bdev/chaining.sh@31 -- # config='{ 00:36:43.695 "subsystems": [ 00:36:43.695 { 00:36:43.695 "subsystem": "bdev", 00:36:43.695 "config": [ 00:36:43.695 { 00:36:43.695 "method": "bdev_nvme_attach_controller", 00:36:43.695 "params": { 00:36:43.695 "trtype": "tcp", 00:36:43.695 "adrfam": "IPv4", 00:36:43.695 "name": "Nvme0", 00:36:43.695 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:36:43.695 "traddr": "10.0.0.2", 00:36:43.695 "trsvcid": "4420" 00:36:43.695 } 00:36:43.695 }, 00:36:43.695 { 00:36:43.695 "method": "bdev_set_options", 00:36:43.695 "params": { 00:36:43.695 "bdev_auto_examine": false 00:36:43.695 } 00:36:43.695 } 00:36:43.695 ] 00:36:43.695 } 00:36:43.695 ] 00:36:43.695 }' 00:36:43.695 06:04:58 chaining -- bdev/chaining.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_dd -c /dev/fd/62 --if /tmp/tmp.T6ti0eocTx --ob Nvme0n1 --bs 65536 --count 1 00:36:43.695 06:04:58 chaining -- bdev/chaining.sh@33 -- # echo '{ 00:36:43.695 "subsystems": [ 00:36:43.695 { 00:36:43.695 "subsystem": "bdev", 00:36:43.695 "config": [ 00:36:43.695 { 00:36:43.695 "method": "bdev_nvme_attach_controller", 00:36:43.695 "params": { 
00:36:43.695 "trtype": "tcp", 00:36:43.695 "adrfam": "IPv4", 00:36:43.695 "name": "Nvme0", 00:36:43.695 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:36:43.695 "traddr": "10.0.0.2", 00:36:43.695 "trsvcid": "4420" 00:36:43.695 } 00:36:43.695 }, 00:36:43.695 { 00:36:43.695 "method": "bdev_set_options", 00:36:43.695 "params": { 00:36:43.695 "bdev_auto_examine": false 00:36:43.695 } 00:36:43.695 } 00:36:43.695 ] 00:36:43.695 } 00:36:43.695 ] 00:36:43.695 }' 00:36:43.695 [2024-07-26 06:04:58.475538] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 00:36:43.695 [2024-07-26 06:04:58.475606] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1337900 ] 00:36:43.953 [2024-07-26 06:04:58.606212] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:36:43.953 [2024-07-26 06:04:58.707094] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:36:44.468  Copying: 64/64 [kB] (average 20 MBps) 00:36:44.468 00:36:44.468 06:04:59 chaining -- bdev/chaining.sh@90 -- # get_stat sequence_executed 00:36:44.468 06:04:59 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:36:44.468 06:04:59 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:36:44.468 06:04:59 chaining -- bdev/chaining.sh@39 -- # opcode= 00:36:44.468 06:04:59 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:36:44.468 06:04:59 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:36:44.468 06:04:59 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:36:44.468 06:04:59 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:36:44.468 06:04:59 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:36:44.468 06:04:59 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:44.468 06:04:59 chaining -- common/autotest_common.sh@587 -- 
# [[ 0 == 0 ]] 00:36:44.468 06:04:59 chaining -- bdev/chaining.sh@90 -- # (( 13 == stats[sequence_executed] + 1 )) 00:36:44.468 06:04:59 chaining -- bdev/chaining.sh@91 -- # get_stat executed encrypt 00:36:44.468 06:04:59 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:36:44.468 06:04:59 chaining -- bdev/chaining.sh@39 -- # event=executed 00:36:44.468 06:04:59 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:36:44.468 06:04:59 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:36:44.468 06:04:59 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:36:44.468 06:04:59 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:36:44.468 06:04:59 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:36:44.468 06:04:59 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:36:44.468 06:04:59 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:44.468 06:04:59 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:36:44.468 06:04:59 chaining -- bdev/chaining.sh@91 -- # (( 2 == stats[encrypt_executed] + 2 )) 00:36:44.468 06:04:59 chaining -- bdev/chaining.sh@92 -- # get_stat executed decrypt 00:36:44.468 06:04:59 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:36:44.468 06:04:59 chaining -- bdev/chaining.sh@39 -- # event=executed 00:36:44.468 06:04:59 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:36:44.468 06:04:59 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:36:44.469 06:04:59 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:36:44.469 06:04:59 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:36:44.469 06:04:59 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:36:44.469 06:04:59 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:36:44.469 06:04:59 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:44.469 06:04:59 chaining -- 
common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:36:44.469 06:04:59 chaining -- bdev/chaining.sh@92 -- # (( 12 == stats[decrypt_executed] )) 00:36:44.469 06:04:59 chaining -- bdev/chaining.sh@95 -- # get_stat executed copy 00:36:44.469 06:04:59 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:36:44.469 06:04:59 chaining -- bdev/chaining.sh@39 -- # event=executed 00:36:44.469 06:04:59 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:36:44.469 06:04:59 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:36:44.469 06:04:59 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:36:44.469 06:04:59 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:36:44.469 06:04:59 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:36:44.469 06:04:59 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:36:44.469 06:04:59 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:44.469 06:04:59 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:36:44.469 06:04:59 chaining -- bdev/chaining.sh@95 -- # (( 4 == stats[copy_executed] )) 00:36:44.469 06:04:59 chaining -- bdev/chaining.sh@96 -- # update_stats 00:36:44.469 06:04:59 chaining -- bdev/chaining.sh@51 -- # get_stat sequence_executed 00:36:44.469 06:04:59 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:36:44.469 06:04:59 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:36:44.469 06:04:59 chaining -- bdev/chaining.sh@39 -- # opcode= 00:36:44.469 06:04:59 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:36:44.469 06:04:59 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:36:44.469 06:04:59 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:36:44.469 06:04:59 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:36:44.469 06:04:59 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:36:44.469 06:04:59 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:44.469 
06:04:59 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:36:44.727 06:04:59 chaining -- bdev/chaining.sh@51 -- # stats["sequence_executed"]=13 00:36:44.727 06:04:59 chaining -- bdev/chaining.sh@52 -- # get_stat executed encrypt 00:36:44.727 06:04:59 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:36:44.727 06:04:59 chaining -- bdev/chaining.sh@39 -- # event=executed 00:36:44.727 06:04:59 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:36:44.727 06:04:59 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:36:44.727 06:04:59 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:36:44.727 06:04:59 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:36:44.727 06:04:59 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:36:44.727 06:04:59 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:36:44.727 06:04:59 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:44.727 06:04:59 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:36:44.727 06:04:59 chaining -- bdev/chaining.sh@52 -- # stats["encrypt_executed"]=2 00:36:44.727 06:04:59 chaining -- bdev/chaining.sh@53 -- # get_stat executed decrypt 00:36:44.727 06:04:59 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:36:44.727 06:04:59 chaining -- bdev/chaining.sh@39 -- # event=executed 00:36:44.727 06:04:59 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:36:44.727 06:04:59 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:36:44.727 06:04:59 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:36:44.727 06:04:59 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:36:44.727 06:04:59 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:36:44.727 06:04:59 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:36:44.727 06:04:59 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:44.727 06:04:59 
chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:36:44.727 06:04:59 chaining -- bdev/chaining.sh@53 -- # stats["decrypt_executed"]=12 00:36:44.727 06:04:59 chaining -- bdev/chaining.sh@54 -- # get_stat executed copy 00:36:44.727 06:04:59 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:36:44.727 06:04:59 chaining -- bdev/chaining.sh@39 -- # event=executed 00:36:44.727 06:04:59 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:36:44.727 06:04:59 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:36:44.727 06:04:59 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:36:44.727 06:04:59 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:36:44.727 06:04:59 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:36:44.727 06:04:59 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:36:44.727 06:04:59 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:44.727 06:04:59 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:36:44.727 06:04:59 chaining -- bdev/chaining.sh@54 -- # stats["copy_executed"]=4 00:36:44.727 06:04:59 chaining -- bdev/chaining.sh@99 -- # spdk_dd --of /tmp/tmp.P1FA4doV33 --ib Nvme0n1 --bs 65536 --count 1 00:36:44.727 06:04:59 chaining -- bdev/chaining.sh@25 -- # local config 00:36:44.727 06:04:59 chaining -- bdev/chaining.sh@31 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --mode=remote --json-with-subsystems --trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0 00:36:44.727 06:04:59 chaining -- bdev/chaining.sh@32 -- # jq '.subsystems[0].config[.subsystems[0].config | length] |= 00:36:44.727 {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}' 00:36:44.727 06:04:59 chaining -- bdev/chaining.sh@31 -- # config='{ 00:36:44.727 "subsystems": [ 00:36:44.727 { 00:36:44.727 "subsystem": "bdev", 00:36:44.727 "config": [ 00:36:44.727 { 00:36:44.727 "method": "bdev_nvme_attach_controller", 00:36:44.727 
"params": { 00:36:44.727 "trtype": "tcp", 00:36:44.727 "adrfam": "IPv4", 00:36:44.727 "name": "Nvme0", 00:36:44.727 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:36:44.727 "traddr": "10.0.0.2", 00:36:44.727 "trsvcid": "4420" 00:36:44.727 } 00:36:44.727 }, 00:36:44.727 { 00:36:44.727 "method": "bdev_set_options", 00:36:44.727 "params": { 00:36:44.727 "bdev_auto_examine": false 00:36:44.727 } 00:36:44.727 } 00:36:44.727 ] 00:36:44.727 } 00:36:44.727 ] 00:36:44.727 }' 00:36:44.727 06:04:59 chaining -- bdev/chaining.sh@33 -- # echo '{ 00:36:44.727 "subsystems": [ 00:36:44.727 { 00:36:44.727 "subsystem": "bdev", 00:36:44.727 "config": [ 00:36:44.727 { 00:36:44.727 "method": "bdev_nvme_attach_controller", 00:36:44.727 "params": { 00:36:44.727 "trtype": "tcp", 00:36:44.727 "adrfam": "IPv4", 00:36:44.727 "name": "Nvme0", 00:36:44.727 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:36:44.727 "traddr": "10.0.0.2", 00:36:44.727 "trsvcid": "4420" 00:36:44.727 } 00:36:44.727 }, 00:36:44.727 { 00:36:44.727 "method": "bdev_set_options", 00:36:44.727 "params": { 00:36:44.727 "bdev_auto_examine": false 00:36:44.727 } 00:36:44.727 } 00:36:44.727 ] 00:36:44.727 } 00:36:44.727 ] 00:36:44.727 }' 00:36:44.727 06:04:59 chaining -- bdev/chaining.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_dd -c /dev/fd/62 --of /tmp/tmp.P1FA4doV33 --ib Nvme0n1 --bs 65536 --count 1 00:36:44.985 [2024-07-26 06:04:59.654290] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 
00:36:44.985 [2024-07-26 06:04:59.654356] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1338108 ] 00:36:44.985 [2024-07-26 06:04:59.783422] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:36:44.985 [2024-07-26 06:04:59.880218] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:36:45.500  Copying: 64/64 [kB] (average 12 MBps) 00:36:45.500 00:36:45.500 06:05:00 chaining -- bdev/chaining.sh@100 -- # get_stat sequence_executed 00:36:45.500 06:05:00 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:36:45.500 06:05:00 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:36:45.500 06:05:00 chaining -- bdev/chaining.sh@39 -- # opcode= 00:36:45.500 06:05:00 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:36:45.500 06:05:00 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:36:45.500 06:05:00 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:36:45.500 06:05:00 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:36:45.500 06:05:00 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:36:45.500 06:05:00 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:45.500 06:05:00 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:36:45.500 06:05:00 chaining -- bdev/chaining.sh@100 -- # (( 14 == stats[sequence_executed] + 1 )) 00:36:45.500 06:05:00 chaining -- bdev/chaining.sh@101 -- # get_stat executed encrypt 00:36:45.500 06:05:00 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:36:45.500 06:05:00 chaining -- bdev/chaining.sh@39 -- # event=executed 00:36:45.500 06:05:00 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:36:45.500 06:05:00 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:36:45.500 06:05:00 chaining -- bdev/chaining.sh@40 -- # [[ -z 
encrypt ]] 00:36:45.500 06:05:00 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:36:45.500 06:05:00 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:36:45.500 06:05:00 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:36:45.500 06:05:00 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:45.500 06:05:00 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:36:45.500 06:05:00 chaining -- bdev/chaining.sh@101 -- # (( 2 == stats[encrypt_executed] )) 00:36:45.500 06:05:00 chaining -- bdev/chaining.sh@102 -- # get_stat executed decrypt 00:36:45.500 06:05:00 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:36:45.500 06:05:00 chaining -- bdev/chaining.sh@39 -- # event=executed 00:36:45.500 06:05:00 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:36:45.500 06:05:00 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:36:45.500 06:05:00 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:36:45.500 06:05:00 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:36:45.500 06:05:00 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:36:45.500 06:05:00 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:36:45.500 06:05:00 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:45.500 06:05:00 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:36:45.758 06:05:00 chaining -- bdev/chaining.sh@102 -- # (( 14 == stats[decrypt_executed] + 2 )) 00:36:45.758 06:05:00 chaining -- bdev/chaining.sh@103 -- # get_stat executed copy 00:36:45.758 06:05:00 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:36:45.758 06:05:00 chaining -- bdev/chaining.sh@39 -- # event=executed 00:36:45.758 06:05:00 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:36:45.758 06:05:00 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:36:45.758 06:05:00 chaining -- bdev/chaining.sh@40 -- # 
[[ -z copy ]] 00:36:45.758 06:05:00 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:36:45.758 06:05:00 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:36:45.758 06:05:00 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:36:45.758 06:05:00 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:45.758 06:05:00 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:36:45.758 06:05:00 chaining -- bdev/chaining.sh@103 -- # (( 4 == stats[copy_executed] )) 00:36:45.758 06:05:00 chaining -- bdev/chaining.sh@104 -- # cmp /tmp/tmp.T6ti0eocTx /tmp/tmp.P1FA4doV33 00:36:45.758 06:05:00 chaining -- bdev/chaining.sh@105 -- # spdk_dd --if /dev/zero --ob Nvme0n1 --bs 65536 --count 1 00:36:45.758 06:05:00 chaining -- bdev/chaining.sh@25 -- # local config 00:36:45.758 06:05:00 chaining -- bdev/chaining.sh@31 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --mode=remote --json-with-subsystems --trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0 00:36:45.758 06:05:00 chaining -- bdev/chaining.sh@32 -- # jq '.subsystems[0].config[.subsystems[0].config | length] |= 00:36:45.758 {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}' 00:36:45.758 06:05:00 chaining -- bdev/chaining.sh@31 -- # config='{ 00:36:45.758 "subsystems": [ 00:36:45.758 { 00:36:45.758 "subsystem": "bdev", 00:36:45.758 "config": [ 00:36:45.758 { 00:36:45.758 "method": "bdev_nvme_attach_controller", 00:36:45.758 "params": { 00:36:45.758 "trtype": "tcp", 00:36:45.758 "adrfam": "IPv4", 00:36:45.758 "name": "Nvme0", 00:36:45.758 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:36:45.758 "traddr": "10.0.0.2", 00:36:45.758 "trsvcid": "4420" 00:36:45.758 } 00:36:45.758 }, 00:36:45.758 { 00:36:45.758 "method": "bdev_set_options", 00:36:45.758 "params": { 00:36:45.758 "bdev_auto_examine": false 00:36:45.758 } 00:36:45.758 } 00:36:45.758 ] 00:36:45.758 } 00:36:45.758 ] 00:36:45.758 }' 00:36:45.758 
06:05:00 chaining -- bdev/chaining.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_dd -c /dev/fd/62 --if /dev/zero --ob Nvme0n1 --bs 65536 --count 1 00:36:45.758 06:05:00 chaining -- bdev/chaining.sh@33 -- # echo '{ 00:36:45.758 "subsystems": [ 00:36:45.758 { 00:36:45.758 "subsystem": "bdev", 00:36:45.758 "config": [ 00:36:45.758 { 00:36:45.758 "method": "bdev_nvme_attach_controller", 00:36:45.758 "params": { 00:36:45.758 "trtype": "tcp", 00:36:45.758 "adrfam": "IPv4", 00:36:45.758 "name": "Nvme0", 00:36:45.758 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:36:45.758 "traddr": "10.0.0.2", 00:36:45.758 "trsvcid": "4420" 00:36:45.758 } 00:36:45.758 }, 00:36:45.758 { 00:36:45.758 "method": "bdev_set_options", 00:36:45.758 "params": { 00:36:45.758 "bdev_auto_examine": false 00:36:45.758 } 00:36:45.758 } 00:36:45.758 ] 00:36:45.758 } 00:36:45.758 ] 00:36:45.759 }' 00:36:45.759 [2024-07-26 06:05:00.577825] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 
00:36:45.759 [2024-07-26 06:05:00.577899] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1338144 ] 00:36:46.016 [2024-07-26 06:05:00.707880] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:36:46.016 [2024-07-26 06:05:00.810032] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:36:46.532  Copying: 64/64 [kB] (average 31 MBps) 00:36:46.532 00:36:46.532 06:05:01 chaining -- bdev/chaining.sh@106 -- # update_stats 00:36:46.532 06:05:01 chaining -- bdev/chaining.sh@51 -- # get_stat sequence_executed 00:36:46.532 06:05:01 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:36:46.532 06:05:01 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:36:46.532 06:05:01 chaining -- bdev/chaining.sh@39 -- # opcode= 00:36:46.532 06:05:01 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:36:46.532 06:05:01 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:36:46.532 06:05:01 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:36:46.532 06:05:01 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:36:46.532 06:05:01 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:36:46.532 06:05:01 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:46.532 06:05:01 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:36:46.532 06:05:01 chaining -- bdev/chaining.sh@51 -- # stats["sequence_executed"]=15 00:36:46.532 06:05:01 chaining -- bdev/chaining.sh@52 -- # get_stat executed encrypt 00:36:46.532 06:05:01 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:36:46.532 06:05:01 chaining -- bdev/chaining.sh@39 -- # event=executed 00:36:46.532 06:05:01 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:36:46.532 06:05:01 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 
00:36:46.532 06:05:01 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:36:46.532 06:05:01 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:36:46.532 06:05:01 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:36:46.532 06:05:01 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:36:46.532 06:05:01 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:46.532 06:05:01 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:36:46.532 06:05:01 chaining -- bdev/chaining.sh@52 -- # stats["encrypt_executed"]=4 00:36:46.532 06:05:01 chaining -- bdev/chaining.sh@53 -- # get_stat executed decrypt 00:36:46.532 06:05:01 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:36:46.532 06:05:01 chaining -- bdev/chaining.sh@39 -- # event=executed 00:36:46.532 06:05:01 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:36:46.532 06:05:01 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:36:46.532 06:05:01 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:36:46.532 06:05:01 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:36:46.532 06:05:01 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:36:46.532 06:05:01 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:46.532 06:05:01 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:36:46.532 06:05:01 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:36:46.532 06:05:01 chaining -- bdev/chaining.sh@53 -- # stats["decrypt_executed"]=14 00:36:46.532 06:05:01 chaining -- bdev/chaining.sh@54 -- # get_stat executed copy 00:36:46.532 06:05:01 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:36:46.532 06:05:01 chaining -- bdev/chaining.sh@39 -- # event=executed 00:36:46.532 06:05:01 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:36:46.532 06:05:01 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:36:46.532 
06:05:01 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:36:46.532 06:05:01 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:36:46.532 06:05:01 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:36:46.532 06:05:01 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:36:46.532 06:05:01 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:46.532 06:05:01 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:36:46.790 06:05:01 chaining -- bdev/chaining.sh@54 -- # stats["copy_executed"]=4 00:36:46.790 06:05:01 chaining -- bdev/chaining.sh@109 -- # spdk_dd --if /tmp/tmp.T6ti0eocTx --ob Nvme0n1 --bs 4096 --count 16 00:36:46.790 06:05:01 chaining -- bdev/chaining.sh@25 -- # local config 00:36:46.790 06:05:01 chaining -- bdev/chaining.sh@31 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --mode=remote --json-with-subsystems --trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0 00:36:46.790 06:05:01 chaining -- bdev/chaining.sh@32 -- # jq '.subsystems[0].config[.subsystems[0].config | length] |= 00:36:46.790 {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}' 00:36:46.790 06:05:01 chaining -- bdev/chaining.sh@31 -- # config='{ 00:36:46.790 "subsystems": [ 00:36:46.790 { 00:36:46.790 "subsystem": "bdev", 00:36:46.790 "config": [ 00:36:46.790 { 00:36:46.790 "method": "bdev_nvme_attach_controller", 00:36:46.790 "params": { 00:36:46.790 "trtype": "tcp", 00:36:46.790 "adrfam": "IPv4", 00:36:46.790 "name": "Nvme0", 00:36:46.790 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:36:46.790 "traddr": "10.0.0.2", 00:36:46.790 "trsvcid": "4420" 00:36:46.790 } 00:36:46.790 }, 00:36:46.790 { 00:36:46.790 "method": "bdev_set_options", 00:36:46.790 "params": { 00:36:46.790 "bdev_auto_examine": false 00:36:46.790 } 00:36:46.790 } 00:36:46.790 ] 00:36:46.790 } 00:36:46.790 ] 00:36:46.790 }' 00:36:46.790 06:05:01 chaining -- bdev/chaining.sh@33 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_dd -c /dev/fd/62 --if /tmp/tmp.T6ti0eocTx --ob Nvme0n1 --bs 4096 --count 16 00:36:46.790 06:05:01 chaining -- bdev/chaining.sh@33 -- # echo '{ 00:36:46.790 "subsystems": [ 00:36:46.790 { 00:36:46.790 "subsystem": "bdev", 00:36:46.790 "config": [ 00:36:46.790 { 00:36:46.790 "method": "bdev_nvme_attach_controller", 00:36:46.790 "params": { 00:36:46.790 "trtype": "tcp", 00:36:46.790 "adrfam": "IPv4", 00:36:46.790 "name": "Nvme0", 00:36:46.790 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:36:46.790 "traddr": "10.0.0.2", 00:36:46.790 "trsvcid": "4420" 00:36:46.790 } 00:36:46.790 }, 00:36:46.790 { 00:36:46.790 "method": "bdev_set_options", 00:36:46.790 "params": { 00:36:46.790 "bdev_auto_examine": false 00:36:46.790 } 00:36:46.790 } 00:36:46.790 ] 00:36:46.790 } 00:36:46.790 ] 00:36:46.790 }' 00:36:46.790 [2024-07-26 06:05:01.571563] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 00:36:46.790 [2024-07-26 06:05:01.571632] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1338339 ] 00:36:47.048 [2024-07-26 06:05:01.701832] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:36:47.048 [2024-07-26 06:05:01.798344] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:36:47.306  Copying: 64/64 [kB] (average 15 MBps) 00:36:47.306 00:36:47.306 06:05:02 chaining -- bdev/chaining.sh@110 -- # get_stat sequence_executed 00:36:47.306 06:05:02 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:36:47.306 06:05:02 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:36:47.306 06:05:02 chaining -- bdev/chaining.sh@39 -- # opcode= 00:36:47.306 06:05:02 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:36:47.306 06:05:02 chaining -- bdev/chaining.sh@40 -- # 
[[ -z '' ]] 00:36:47.563 06:05:02 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:36:47.563 06:05:02 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:36:47.563 06:05:02 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:36:47.563 06:05:02 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:47.563 06:05:02 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:36:47.563 06:05:02 chaining -- bdev/chaining.sh@110 -- # (( 31 == stats[sequence_executed] + 16 )) 00:36:47.563 06:05:02 chaining -- bdev/chaining.sh@111 -- # get_stat executed encrypt 00:36:47.563 06:05:02 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:36:47.563 06:05:02 chaining -- bdev/chaining.sh@39 -- # event=executed 00:36:47.563 06:05:02 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:36:47.563 06:05:02 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:36:47.563 06:05:02 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:36:47.563 06:05:02 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:36:47.563 06:05:02 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:36:47.563 06:05:02 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:36:47.563 06:05:02 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:47.563 06:05:02 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:36:47.563 06:05:02 chaining -- bdev/chaining.sh@111 -- # (( 36 == stats[encrypt_executed] + 32 )) 00:36:47.563 06:05:02 chaining -- bdev/chaining.sh@112 -- # get_stat executed decrypt 00:36:47.564 06:05:02 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:36:47.564 06:05:02 chaining -- bdev/chaining.sh@39 -- # event=executed 00:36:47.564 06:05:02 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:36:47.564 06:05:02 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:36:47.564 06:05:02 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 
00:36:47.564 06:05:02 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:36:47.564 06:05:02 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:36:47.564 06:05:02 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:36:47.564 06:05:02 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:47.564 06:05:02 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:36:47.564 06:05:02 chaining -- bdev/chaining.sh@112 -- # (( 14 == stats[decrypt_executed] )) 00:36:47.564 06:05:02 chaining -- bdev/chaining.sh@113 -- # get_stat executed copy 00:36:47.564 06:05:02 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:36:47.564 06:05:02 chaining -- bdev/chaining.sh@39 -- # event=executed 00:36:47.564 06:05:02 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:36:47.564 06:05:02 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:36:47.564 06:05:02 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:36:47.564 06:05:02 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:36:47.564 06:05:02 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:36:47.564 06:05:02 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:36:47.564 06:05:02 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:47.564 06:05:02 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:36:47.564 06:05:02 chaining -- bdev/chaining.sh@113 -- # (( 4 == stats[copy_executed] )) 00:36:47.564 06:05:02 chaining -- bdev/chaining.sh@114 -- # update_stats 00:36:47.564 06:05:02 chaining -- bdev/chaining.sh@51 -- # get_stat sequence_executed 00:36:47.564 06:05:02 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:36:47.564 06:05:02 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:36:47.564 06:05:02 chaining -- bdev/chaining.sh@39 -- # opcode= 00:36:47.564 06:05:02 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 
00:36:47.564 06:05:02 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:36:47.564 06:05:02 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:36:47.564 06:05:02 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:36:47.564 06:05:02 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:36:47.564 06:05:02 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:47.564 06:05:02 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:36:47.564 06:05:02 chaining -- bdev/chaining.sh@51 -- # stats["sequence_executed"]=31 00:36:47.564 06:05:02 chaining -- bdev/chaining.sh@52 -- # get_stat executed encrypt 00:36:47.564 06:05:02 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:36:47.564 06:05:02 chaining -- bdev/chaining.sh@39 -- # event=executed 00:36:47.564 06:05:02 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:36:47.564 06:05:02 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:36:47.564 06:05:02 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:36:47.564 06:05:02 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:36:47.564 06:05:02 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:36:47.564 06:05:02 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:36:47.564 06:05:02 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:47.564 06:05:02 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:36:47.822 06:05:02 chaining -- bdev/chaining.sh@52 -- # stats["encrypt_executed"]=36 00:36:47.822 06:05:02 chaining -- bdev/chaining.sh@53 -- # get_stat executed decrypt 00:36:47.822 06:05:02 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:36:47.822 06:05:02 chaining -- bdev/chaining.sh@39 -- # event=executed 00:36:47.822 06:05:02 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:36:47.822 06:05:02 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:36:47.822 06:05:02 chaining -- 
bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:36:47.822 06:05:02 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:36:47.822 06:05:02 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:36:47.822 06:05:02 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:36:47.822 06:05:02 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:47.822 06:05:02 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:36:47.822 06:05:02 chaining -- bdev/chaining.sh@53 -- # stats["decrypt_executed"]=14 00:36:47.822 06:05:02 chaining -- bdev/chaining.sh@54 -- # get_stat executed copy 00:36:47.822 06:05:02 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:36:47.822 06:05:02 chaining -- bdev/chaining.sh@39 -- # event=executed 00:36:47.822 06:05:02 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:36:47.822 06:05:02 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:36:47.822 06:05:02 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:36:47.822 06:05:02 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:36:47.822 06:05:02 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:36:47.822 06:05:02 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:36:47.822 06:05:02 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:47.822 06:05:02 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:36:47.822 06:05:02 chaining -- bdev/chaining.sh@54 -- # stats["copy_executed"]=4 00:36:47.822 06:05:02 chaining -- bdev/chaining.sh@117 -- # : 00:36:47.822 06:05:02 chaining -- bdev/chaining.sh@118 -- # spdk_dd --of /tmp/tmp.P1FA4doV33 --ib Nvme0n1 --bs 4096 --count 16 00:36:47.822 06:05:02 chaining -- bdev/chaining.sh@25 -- # local config 00:36:47.822 06:05:02 chaining -- bdev/chaining.sh@31 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --mode=remote --json-with-subsystems 
--trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0 00:36:47.822 06:05:02 chaining -- bdev/chaining.sh@32 -- # jq '.subsystems[0].config[.subsystems[0].config | length] |= 00:36:47.822 {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}' 00:36:47.822 06:05:02 chaining -- bdev/chaining.sh@31 -- # config='{ 00:36:47.822 "subsystems": [ 00:36:47.822 { 00:36:47.822 "subsystem": "bdev", 00:36:47.822 "config": [ 00:36:47.822 { 00:36:47.822 "method": "bdev_nvme_attach_controller", 00:36:47.822 "params": { 00:36:47.822 "trtype": "tcp", 00:36:47.822 "adrfam": "IPv4", 00:36:47.822 "name": "Nvme0", 00:36:47.822 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:36:47.822 "traddr": "10.0.0.2", 00:36:47.822 "trsvcid": "4420" 00:36:47.822 } 00:36:47.822 }, 00:36:47.822 { 00:36:47.822 "method": "bdev_set_options", 00:36:47.822 "params": { 00:36:47.822 "bdev_auto_examine": false 00:36:47.822 } 00:36:47.822 } 00:36:47.822 ] 00:36:47.822 } 00:36:47.822 ] 00:36:47.822 }' 00:36:47.822 06:05:02 chaining -- bdev/chaining.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_dd -c /dev/fd/62 --of /tmp/tmp.P1FA4doV33 --ib Nvme0n1 --bs 4096 --count 16 00:36:47.822 06:05:02 chaining -- bdev/chaining.sh@33 -- # echo '{ 00:36:47.822 "subsystems": [ 00:36:47.822 { 00:36:47.822 "subsystem": "bdev", 00:36:47.822 "config": [ 00:36:47.822 { 00:36:47.822 "method": "bdev_nvme_attach_controller", 00:36:47.822 "params": { 00:36:47.822 "trtype": "tcp", 00:36:47.822 "adrfam": "IPv4", 00:36:47.822 "name": "Nvme0", 00:36:47.822 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:36:47.822 "traddr": "10.0.0.2", 00:36:47.822 "trsvcid": "4420" 00:36:47.822 } 00:36:47.822 }, 00:36:47.822 { 00:36:47.822 "method": "bdev_set_options", 00:36:47.822 "params": { 00:36:47.822 "bdev_auto_examine": false 00:36:47.822 } 00:36:47.822 } 00:36:47.822 ] 00:36:47.822 } 00:36:47.822 ] 00:36:47.822 }' 00:36:47.822 [2024-07-26 06:05:02.695176] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 
initialization... 00:36:47.822 [2024-07-26 06:05:02.695244] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1338547 ] 00:36:48.080 [2024-07-26 06:05:02.825884] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:36:48.080 [2024-07-26 06:05:02.923562] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:36:48.600  Copying: 64/64 [kB] (average 1391 kBps) 00:36:48.600 00:36:48.600 06:05:03 chaining -- bdev/chaining.sh@119 -- # get_stat sequence_executed 00:36:48.600 06:05:03 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:36:48.600 06:05:03 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:36:48.600 06:05:03 chaining -- bdev/chaining.sh@39 -- # opcode= 00:36:48.600 06:05:03 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:36:48.600 06:05:03 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:36:48.600 06:05:03 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:36:48.600 06:05:03 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:36:48.600 06:05:03 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:36:48.600 06:05:03 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:48.600 06:05:03 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:36:48.600 06:05:03 chaining -- bdev/chaining.sh@119 -- # (( 47 == stats[sequence_executed] + 16 )) 00:36:48.600 06:05:03 chaining -- bdev/chaining.sh@120 -- # get_stat executed encrypt 00:36:48.600 06:05:03 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:36:48.600 06:05:03 chaining -- bdev/chaining.sh@39 -- # event=executed 00:36:48.600 06:05:03 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:36:48.600 06:05:03 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:36:48.600 06:05:03 chaining -- 
bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:36:48.600 06:05:03 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:36:48.600 06:05:03 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:36:48.600 06:05:03 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:36:48.600 06:05:03 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:48.600 06:05:03 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:36:48.600 06:05:03 chaining -- bdev/chaining.sh@120 -- # (( 36 == stats[encrypt_executed] )) 00:36:48.600 06:05:03 chaining -- bdev/chaining.sh@121 -- # get_stat executed decrypt 00:36:48.600 06:05:03 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:36:48.600 06:05:03 chaining -- bdev/chaining.sh@39 -- # event=executed 00:36:48.600 06:05:03 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:36:48.600 06:05:03 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:36:48.600 06:05:03 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:36:48.600 06:05:03 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:36:48.600 06:05:03 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:36:48.600 06:05:03 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:36:48.600 06:05:03 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:48.600 06:05:03 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:36:48.857 06:05:03 chaining -- bdev/chaining.sh@121 -- # (( 46 == stats[decrypt_executed] + 32 )) 00:36:48.857 06:05:03 chaining -- bdev/chaining.sh@122 -- # get_stat executed copy 00:36:48.857 06:05:03 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:36:48.857 06:05:03 chaining -- bdev/chaining.sh@39 -- # event=executed 00:36:48.857 06:05:03 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:36:48.857 06:05:03 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:36:48.857 06:05:03 
chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:36:48.857 06:05:03 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:36:48.857 06:05:03 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:36:48.857 06:05:03 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:36:48.857 06:05:03 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:48.857 06:05:03 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:36:48.857 06:05:03 chaining -- bdev/chaining.sh@122 -- # (( 4 == stats[copy_executed] )) 00:36:48.857 06:05:03 chaining -- bdev/chaining.sh@123 -- # cmp /tmp/tmp.T6ti0eocTx /tmp/tmp.P1FA4doV33 00:36:48.857 06:05:03 chaining -- bdev/chaining.sh@125 -- # trap - SIGINT SIGTERM EXIT 00:36:48.857 06:05:03 chaining -- bdev/chaining.sh@126 -- # tgtcleanup 00:36:48.857 06:05:03 chaining -- bdev/chaining.sh@58 -- # rm -f /tmp/tmp.T6ti0eocTx /tmp/tmp.P1FA4doV33 00:36:48.857 06:05:03 chaining -- bdev/chaining.sh@59 -- # nvmftestfini 00:36:48.857 06:05:03 chaining -- nvmf/common.sh@488 -- # nvmfcleanup 00:36:48.857 06:05:03 chaining -- nvmf/common.sh@117 -- # sync 00:36:48.857 06:05:03 chaining -- nvmf/common.sh@119 -- # '[' tcp == tcp ']' 00:36:48.857 06:05:03 chaining -- nvmf/common.sh@120 -- # set +e 00:36:48.857 06:05:03 chaining -- nvmf/common.sh@121 -- # for i in {1..20} 00:36:48.857 06:05:03 chaining -- nvmf/common.sh@122 -- # modprobe -v -r nvme-tcp 00:36:48.857 rmmod nvme_tcp 00:36:48.857 rmmod nvme_fabrics 00:36:48.857 rmmod nvme_keyring 00:36:48.857 06:05:03 chaining -- nvmf/common.sh@123 -- # modprobe -v -r nvme-fabrics 00:36:48.857 06:05:03 chaining -- nvmf/common.sh@124 -- # set -e 00:36:48.857 06:05:03 chaining -- nvmf/common.sh@125 -- # return 0 00:36:48.857 06:05:03 chaining -- nvmf/common.sh@489 -- # '[' -n 1337680 ']' 00:36:48.857 06:05:03 chaining -- nvmf/common.sh@490 -- # killprocess 1337680 00:36:48.857 06:05:03 chaining -- common/autotest_common.sh@948 -- # '[' -z 
1337680 ']' 00:36:48.857 06:05:03 chaining -- common/autotest_common.sh@952 -- # kill -0 1337680 00:36:48.857 06:05:03 chaining -- common/autotest_common.sh@953 -- # uname 00:36:48.857 06:05:03 chaining -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:36:48.857 06:05:03 chaining -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1337680 00:36:48.857 06:05:03 chaining -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:36:48.857 06:05:03 chaining -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:36:48.857 06:05:03 chaining -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1337680' 00:36:48.857 killing process with pid 1337680 00:36:48.857 06:05:03 chaining -- common/autotest_common.sh@967 -- # kill 1337680 00:36:48.857 06:05:03 chaining -- common/autotest_common.sh@972 -- # wait 1337680 00:36:49.144 06:05:03 chaining -- nvmf/common.sh@492 -- # '[' '' == iso ']' 00:36:49.144 06:05:03 chaining -- nvmf/common.sh@495 -- # [[ tcp == \t\c\p ]] 00:36:49.144 06:05:03 chaining -- nvmf/common.sh@496 -- # nvmf_tcp_fini 00:36:49.144 06:05:03 chaining -- nvmf/common.sh@274 -- # [[ nvmf_tgt_ns_spdk == \n\v\m\f\_\t\g\t\_\n\s ]] 00:36:49.144 06:05:03 chaining -- nvmf/common.sh@278 -- # remove_spdk_ns 00:36:49.144 06:05:03 chaining -- nvmf/common.sh@628 -- # xtrace_disable_per_cmd _remove_spdk_ns 00:36:49.144 06:05:03 chaining -- common/autotest_common.sh@22 -- # eval '_remove_spdk_ns 13> /dev/null' 00:36:49.144 06:05:03 chaining -- common/autotest_common.sh@22 -- # _remove_spdk_ns 00:36:49.144 06:05:04 chaining -- nvmf/common.sh@279 -- # ip -4 addr flush nvmf_init_if 00:36:49.403 06:05:04 chaining -- bdev/chaining.sh@129 -- # trap 'bperfcleanup; exit 1' SIGINT SIGTERM EXIT 00:36:49.403 06:05:04 chaining -- bdev/chaining.sh@132 -- # bperfpid=1338755 00:36:49.403 06:05:04 chaining -- bdev/chaining.sh@131 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -t 5 -w verify -o 4096 -q 256 
--wait-for-rpc -z 00:36:49.403 06:05:04 chaining -- bdev/chaining.sh@134 -- # waitforlisten 1338755 00:36:49.403 06:05:04 chaining -- common/autotest_common.sh@829 -- # '[' -z 1338755 ']' 00:36:49.403 06:05:04 chaining -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:36:49.403 06:05:04 chaining -- common/autotest_common.sh@834 -- # local max_retries=100 00:36:49.403 06:05:04 chaining -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:36:49.403 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:36:49.403 06:05:04 chaining -- common/autotest_common.sh@838 -- # xtrace_disable 00:36:49.403 06:05:04 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:49.403 [2024-07-26 06:05:04.098014] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 00:36:49.403 [2024-07-26 06:05:04.098081] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1338755 ] 00:36:49.403 [2024-07-26 06:05:04.229164] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:36:49.661 [2024-07-26 06:05:04.336700] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:36:50.228 06:05:05 chaining -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:36:50.228 06:05:05 chaining -- common/autotest_common.sh@862 -- # return 0 00:36:50.228 06:05:05 chaining -- bdev/chaining.sh@135 -- # rpc_cmd 00:36:50.228 06:05:05 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:36:50.228 06:05:05 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:50.487 malloc0 00:36:50.487 true 00:36:50.487 true 00:36:50.487 [2024-07-26 06:05:05.173091] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key0" 
00:36:50.487 crypto0 00:36:50.487 [2024-07-26 06:05:05.181117] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key1" 00:36:50.487 crypto1 00:36:50.487 06:05:05 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:36:50.487 06:05:05 chaining -- bdev/chaining.sh@145 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:36:50.487 Running I/O for 5 seconds... 00:36:55.751 00:36:55.751 Latency(us) 00:36:55.751 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:36:55.751 Job: crypto1 (Core Mask 0x1, workload: verify, depth: 256, IO size: 4096) 00:36:55.751 Verification LBA range: start 0x0 length 0x2000 00:36:55.751 crypto1 : 5.01 11441.20 44.69 0.00 0.00 22314.05 6411.13 15158.76 00:36:55.751 =================================================================================================================== 00:36:55.751 Total : 11441.20 44.69 0.00 0.00 22314.05 6411.13 15158.76 00:36:55.751 0 00:36:55.751 06:05:10 chaining -- bdev/chaining.sh@146 -- # killprocess 1338755 00:36:55.751 06:05:10 chaining -- common/autotest_common.sh@948 -- # '[' -z 1338755 ']' 00:36:55.751 06:05:10 chaining -- common/autotest_common.sh@952 -- # kill -0 1338755 00:36:55.751 06:05:10 chaining -- common/autotest_common.sh@953 -- # uname 00:36:55.751 06:05:10 chaining -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:36:55.751 06:05:10 chaining -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1338755 00:36:55.751 06:05:10 chaining -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:36:55.751 06:05:10 chaining -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:36:55.751 06:05:10 chaining -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1338755' 00:36:55.751 killing process with pid 1338755 00:36:55.751 06:05:10 chaining -- common/autotest_common.sh@967 -- # kill 1338755 00:36:55.751 Received shutdown signal, test time 
was about 5.000000 seconds 00:36:55.751 00:36:55.751 Latency(us) 00:36:55.751 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:36:55.751 =================================================================================================================== 00:36:55.751 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:36:55.751 06:05:10 chaining -- common/autotest_common.sh@972 -- # wait 1338755 00:36:55.751 06:05:10 chaining -- bdev/chaining.sh@152 -- # bperfpid=1339596 00:36:55.751 06:05:10 chaining -- bdev/chaining.sh@151 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -t 5 -w verify -o 4096 -q 256 --wait-for-rpc -z 00:36:55.751 06:05:10 chaining -- bdev/chaining.sh@154 -- # waitforlisten 1339596 00:36:55.751 06:05:10 chaining -- common/autotest_common.sh@829 -- # '[' -z 1339596 ']' 00:36:55.751 06:05:10 chaining -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:36:55.751 06:05:10 chaining -- common/autotest_common.sh@834 -- # local max_retries=100 00:36:55.751 06:05:10 chaining -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:36:55.751 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:36:55.751 06:05:10 chaining -- common/autotest_common.sh@838 -- # xtrace_disable 00:36:55.751 06:05:10 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:56.009 [2024-07-26 06:05:10.664050] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 
00:36:56.009 [2024-07-26 06:05:10.664116] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1339596 ] 00:36:56.009 [2024-07-26 06:05:10.794517] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:36:56.009 [2024-07-26 06:05:10.902374] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:36:56.941 06:05:11 chaining -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:36:56.941 06:05:11 chaining -- common/autotest_common.sh@862 -- # return 0 00:36:56.941 06:05:11 chaining -- bdev/chaining.sh@155 -- # rpc_cmd 00:36:56.941 06:05:11 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:36:56.941 06:05:11 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:56.941 malloc0 00:36:56.941 true 00:36:56.941 true 00:36:56.941 [2024-07-26 06:05:11.741398] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc0 00:36:56.941 [2024-07-26 06:05:11.741445] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:36:56.941 [2024-07-26 06:05:11.741467] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x271b730 00:36:56.941 [2024-07-26 06:05:11.741480] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:36:56.941 [2024-07-26 06:05:11.742559] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:36:56.941 [2024-07-26 06:05:11.742585] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt0 00:36:56.941 pt0 00:36:56.941 [2024-07-26 06:05:11.749429] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key0" 00:36:56.941 crypto0 00:36:56.941 [2024-07-26 06:05:11.757449] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key1" 00:36:56.941 crypto1 00:36:56.941 06:05:11 chaining -- 
common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:36:56.941 06:05:11 chaining -- bdev/chaining.sh@166 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:36:57.198 Running I/O for 5 seconds... 00:37:02.453 00:37:02.453 Latency(us) 00:37:02.453 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:37:02.453 Job: crypto1 (Core Mask 0x1, workload: verify, depth: 256, IO size: 4096) 00:37:02.453 Verification LBA range: start 0x0 length 0x2000 00:37:02.453 crypto1 : 5.02 8931.61 34.89 0.00 0.00 28584.33 6525.11 17210.32 00:37:02.453 =================================================================================================================== 00:37:02.453 Total : 8931.61 34.89 0.00 0.00 28584.33 6525.11 17210.32 00:37:02.453 0 00:37:02.453 06:05:16 chaining -- bdev/chaining.sh@167 -- # killprocess 1339596 00:37:02.453 06:05:16 chaining -- common/autotest_common.sh@948 -- # '[' -z 1339596 ']' 00:37:02.453 06:05:16 chaining -- common/autotest_common.sh@952 -- # kill -0 1339596 00:37:02.453 06:05:16 chaining -- common/autotest_common.sh@953 -- # uname 00:37:02.453 06:05:16 chaining -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:37:02.453 06:05:16 chaining -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1339596 00:37:02.453 06:05:16 chaining -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:37:02.453 06:05:16 chaining -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:37:02.453 06:05:16 chaining -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1339596' 00:37:02.453 killing process with pid 1339596 00:37:02.453 06:05:16 chaining -- common/autotest_common.sh@967 -- # kill 1339596 00:37:02.453 Received shutdown signal, test time was about 5.000000 seconds 00:37:02.453 00:37:02.454 Latency(us) 00:37:02.454 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:37:02.454 
=================================================================================================================== 00:37:02.454 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:37:02.454 06:05:16 chaining -- common/autotest_common.sh@972 -- # wait 1339596 00:37:02.454 06:05:17 chaining -- bdev/chaining.sh@169 -- # trap - SIGINT SIGTERM EXIT 00:37:02.454 06:05:17 chaining -- bdev/chaining.sh@170 -- # killprocess 1339596 00:37:02.454 06:05:17 chaining -- common/autotest_common.sh@948 -- # '[' -z 1339596 ']' 00:37:02.454 06:05:17 chaining -- common/autotest_common.sh@952 -- # kill -0 1339596 00:37:02.454 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/autotest_common.sh: line 952: kill: (1339596) - No such process 00:37:02.454 06:05:17 chaining -- common/autotest_common.sh@975 -- # echo 'Process with pid 1339596 is not found' 00:37:02.454 Process with pid 1339596 is not found 00:37:02.454 06:05:17 chaining -- bdev/chaining.sh@171 -- # wait 1339596 00:37:02.454 06:05:17 chaining -- bdev/chaining.sh@175 -- # nvmftestinit 00:37:02.454 06:05:17 chaining -- nvmf/common.sh@441 -- # '[' -z tcp ']' 00:37:02.454 06:05:17 chaining -- nvmf/common.sh@446 -- # trap nvmftestfini SIGINT SIGTERM EXIT 00:37:02.454 06:05:17 chaining -- nvmf/common.sh@448 -- # prepare_net_devs 00:37:02.454 06:05:17 chaining -- nvmf/common.sh@410 -- # local -g is_hw=no 00:37:02.454 06:05:17 chaining -- nvmf/common.sh@412 -- # remove_spdk_ns 00:37:02.454 06:05:17 chaining -- nvmf/common.sh@628 -- # xtrace_disable_per_cmd _remove_spdk_ns 00:37:02.454 06:05:17 chaining -- common/autotest_common.sh@22 -- # eval '_remove_spdk_ns 13> /dev/null' 00:37:02.454 06:05:17 chaining -- common/autotest_common.sh@22 -- # _remove_spdk_ns 00:37:02.454 06:05:17 chaining -- nvmf/common.sh@414 -- # [[ phy-fallback != virt ]] 00:37:02.454 06:05:17 chaining -- nvmf/common.sh@414 -- # gather_supported_nvmf_pci_devs 00:37:02.454 06:05:17 chaining -- nvmf/common.sh@285 -- # xtrace_disable 00:37:02.454 06:05:17 chaining 
-- common/autotest_common.sh@10 -- # set +x 00:37:02.454 06:05:17 chaining -- nvmf/common.sh@289 -- # local intel=0x8086 mellanox=0x15b3 pci net_dev 00:37:02.454 06:05:17 chaining -- nvmf/common.sh@291 -- # pci_devs=() 00:37:02.454 06:05:17 chaining -- nvmf/common.sh@291 -- # local -a pci_devs 00:37:02.454 06:05:17 chaining -- nvmf/common.sh@292 -- # pci_net_devs=() 00:37:02.454 06:05:17 chaining -- nvmf/common.sh@292 -- # local -a pci_net_devs 00:37:02.454 06:05:17 chaining -- nvmf/common.sh@293 -- # pci_drivers=() 00:37:02.454 06:05:17 chaining -- nvmf/common.sh@293 -- # local -A pci_drivers 00:37:02.454 06:05:17 chaining -- nvmf/common.sh@295 -- # net_devs=() 00:37:02.454 06:05:17 chaining -- nvmf/common.sh@295 -- # local -ga net_devs 00:37:02.454 06:05:17 chaining -- nvmf/common.sh@296 -- # e810=() 00:37:02.454 06:05:17 chaining -- nvmf/common.sh@296 -- # local -ga e810 00:37:02.454 06:05:17 chaining -- nvmf/common.sh@297 -- # x722=() 00:37:02.454 06:05:17 chaining -- nvmf/common.sh@297 -- # local -ga x722 00:37:02.454 06:05:17 chaining -- nvmf/common.sh@298 -- # mlx=() 00:37:02.454 06:05:17 chaining -- nvmf/common.sh@298 -- # local -ga mlx 00:37:02.454 06:05:17 chaining -- nvmf/common.sh@301 -- # e810+=(${pci_bus_cache["$intel:0x1592"]}) 00:37:02.454 06:05:17 chaining -- nvmf/common.sh@302 -- # e810+=(${pci_bus_cache["$intel:0x159b"]}) 00:37:02.454 06:05:17 chaining -- nvmf/common.sh@304 -- # x722+=(${pci_bus_cache["$intel:0x37d2"]}) 00:37:02.454 06:05:17 chaining -- nvmf/common.sh@306 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2dc"]}) 00:37:02.454 06:05:17 chaining -- nvmf/common.sh@308 -- # mlx+=(${pci_bus_cache["$mellanox:0x1021"]}) 00:37:02.454 06:05:17 chaining -- nvmf/common.sh@310 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2d6"]}) 00:37:02.454 06:05:17 chaining -- nvmf/common.sh@312 -- # mlx+=(${pci_bus_cache["$mellanox:0x101d"]}) 00:37:02.454 06:05:17 chaining -- nvmf/common.sh@314 -- # mlx+=(${pci_bus_cache["$mellanox:0x1017"]}) 00:37:02.454 06:05:17 
chaining -- nvmf/common.sh@315 -- # mlx+=(${pci_bus_cache["$mellanox:0x1019"]}) 00:37:02.454 06:05:17 chaining -- nvmf/common.sh@317 -- # mlx+=(${pci_bus_cache["$mellanox:0x1015"]}) 00:37:02.454 06:05:17 chaining -- nvmf/common.sh@318 -- # mlx+=(${pci_bus_cache["$mellanox:0x1013"]}) 00:37:02.454 06:05:17 chaining -- nvmf/common.sh@320 -- # pci_devs+=("${e810[@]}") 00:37:02.454 06:05:17 chaining -- nvmf/common.sh@321 -- # [[ tcp == rdma ]] 00:37:02.454 06:05:17 chaining -- nvmf/common.sh@327 -- # [[ '' == mlx5 ]] 00:37:02.454 06:05:17 chaining -- nvmf/common.sh@329 -- # [[ '' == e810 ]] 00:37:02.454 06:05:17 chaining -- nvmf/common.sh@331 -- # [[ '' == x722 ]] 00:37:02.454 06:05:17 chaining -- nvmf/common.sh@335 -- # (( 0 == 0 )) 00:37:02.454 06:05:17 chaining -- nvmf/common.sh@336 -- # return 1 00:37:02.454 06:05:17 chaining -- nvmf/common.sh@416 -- # [[ no == yes ]] 00:37:02.454 06:05:17 chaining -- nvmf/common.sh@423 -- # [[ phy-fallback == phy ]] 00:37:02.454 06:05:17 chaining -- nvmf/common.sh@426 -- # [[ phy-fallback == phy-fallback ]] 00:37:02.454 06:05:17 chaining -- nvmf/common.sh@427 -- # echo 'WARNING: No supported devices were found, fallback requested for tcp test' 00:37:02.454 WARNING: No supported devices were found, fallback requested for tcp test 00:37:02.454 06:05:17 chaining -- nvmf/common.sh@431 -- # [[ tcp == tcp ]] 00:37:02.454 06:05:17 chaining -- nvmf/common.sh@432 -- # nvmf_veth_init 00:37:02.454 06:05:17 chaining -- nvmf/common.sh@141 -- # NVMF_INITIATOR_IP=10.0.0.1 00:37:02.454 06:05:17 chaining -- nvmf/common.sh@142 -- # NVMF_FIRST_TARGET_IP=10.0.0.2 00:37:02.454 06:05:17 chaining -- nvmf/common.sh@143 -- # NVMF_SECOND_TARGET_IP=10.0.0.3 00:37:02.454 06:05:17 chaining -- nvmf/common.sh@144 -- # NVMF_BRIDGE=nvmf_br 00:37:02.454 06:05:17 chaining -- nvmf/common.sh@145 -- # NVMF_INITIATOR_INTERFACE=nvmf_init_if 00:37:02.454 06:05:17 chaining -- nvmf/common.sh@146 -- # NVMF_INITIATOR_BRIDGE=nvmf_init_br 00:37:02.454 06:05:17 chaining -- 
nvmf/common.sh@147 -- # NVMF_TARGET_NAMESPACE=nvmf_tgt_ns_spdk 00:37:02.454 06:05:17 chaining -- nvmf/common.sh@148 -- # NVMF_TARGET_NS_CMD=(ip netns exec "$NVMF_TARGET_NAMESPACE") 00:37:02.454 06:05:17 chaining -- nvmf/common.sh@149 -- # NVMF_TARGET_INTERFACE=nvmf_tgt_if 00:37:02.454 06:05:17 chaining -- nvmf/common.sh@150 -- # NVMF_TARGET_INTERFACE2=nvmf_tgt_if2 00:37:02.454 06:05:17 chaining -- nvmf/common.sh@151 -- # NVMF_TARGET_BRIDGE=nvmf_tgt_br 00:37:02.454 06:05:17 chaining -- nvmf/common.sh@152 -- # NVMF_TARGET_BRIDGE2=nvmf_tgt_br2 00:37:02.454 06:05:17 chaining -- nvmf/common.sh@154 -- # ip link set nvmf_init_br nomaster 00:37:02.454 06:05:17 chaining -- nvmf/common.sh@155 -- # ip link set nvmf_tgt_br nomaster 00:37:02.454 Cannot find device "nvmf_tgt_br" 00:37:02.454 06:05:17 chaining -- nvmf/common.sh@155 -- # true 00:37:02.454 06:05:17 chaining -- nvmf/common.sh@156 -- # ip link set nvmf_tgt_br2 nomaster 00:37:02.454 Cannot find device "nvmf_tgt_br2" 00:37:02.454 06:05:17 chaining -- nvmf/common.sh@156 -- # true 00:37:02.454 06:05:17 chaining -- nvmf/common.sh@157 -- # ip link set nvmf_init_br down 00:37:02.454 06:05:17 chaining -- nvmf/common.sh@158 -- # ip link set nvmf_tgt_br down 00:37:02.454 Cannot find device "nvmf_tgt_br" 00:37:02.454 06:05:17 chaining -- nvmf/common.sh@158 -- # true 00:37:02.454 06:05:17 chaining -- nvmf/common.sh@159 -- # ip link set nvmf_tgt_br2 down 00:37:02.454 Cannot find device "nvmf_tgt_br2" 00:37:02.454 06:05:17 chaining -- nvmf/common.sh@159 -- # true 00:37:02.454 06:05:17 chaining -- nvmf/common.sh@160 -- # ip link delete nvmf_br type bridge 00:37:02.454 06:05:17 chaining -- nvmf/common.sh@161 -- # ip link delete nvmf_init_if 00:37:02.454 06:05:17 chaining -- nvmf/common.sh@162 -- # ip netns exec nvmf_tgt_ns_spdk ip link delete nvmf_tgt_if 00:37:02.711 Cannot open network namespace "nvmf_tgt_ns_spdk": No such file or directory 00:37:02.711 06:05:17 chaining -- nvmf/common.sh@162 -- # true 00:37:02.711 06:05:17 
chaining -- nvmf/common.sh@163 -- # ip netns exec nvmf_tgt_ns_spdk ip link delete nvmf_tgt_if2 00:37:02.711 Cannot open network namespace "nvmf_tgt_ns_spdk": No such file or directory 00:37:02.711 06:05:17 chaining -- nvmf/common.sh@163 -- # true 00:37:02.711 06:05:17 chaining -- nvmf/common.sh@166 -- # ip netns add nvmf_tgt_ns_spdk 00:37:02.711 06:05:17 chaining -- nvmf/common.sh@169 -- # ip link add nvmf_init_if type veth peer name nvmf_init_br 00:37:02.711 06:05:17 chaining -- nvmf/common.sh@170 -- # ip link add nvmf_tgt_if type veth peer name nvmf_tgt_br 00:37:02.711 06:05:17 chaining -- nvmf/common.sh@171 -- # ip link add nvmf_tgt_if2 type veth peer name nvmf_tgt_br2 00:37:02.711 06:05:17 chaining -- nvmf/common.sh@174 -- # ip link set nvmf_tgt_if netns nvmf_tgt_ns_spdk 00:37:02.711 06:05:17 chaining -- nvmf/common.sh@175 -- # ip link set nvmf_tgt_if2 netns nvmf_tgt_ns_spdk 00:37:02.711 06:05:17 chaining -- nvmf/common.sh@178 -- # ip addr add 10.0.0.1/24 dev nvmf_init_if 00:37:02.711 06:05:17 chaining -- nvmf/common.sh@179 -- # ip netns exec nvmf_tgt_ns_spdk ip addr add 10.0.0.2/24 dev nvmf_tgt_if 00:37:02.711 06:05:17 chaining -- nvmf/common.sh@180 -- # ip netns exec nvmf_tgt_ns_spdk ip addr add 10.0.0.3/24 dev nvmf_tgt_if2 00:37:02.711 06:05:17 chaining -- nvmf/common.sh@183 -- # ip link set nvmf_init_if up 00:37:02.711 06:05:17 chaining -- nvmf/common.sh@184 -- # ip link set nvmf_init_br up 00:37:02.711 06:05:17 chaining -- nvmf/common.sh@185 -- # ip link set nvmf_tgt_br up 00:37:02.711 06:05:17 chaining -- nvmf/common.sh@186 -- # ip link set nvmf_tgt_br2 up 00:37:02.711 06:05:17 chaining -- nvmf/common.sh@187 -- # ip netns exec nvmf_tgt_ns_spdk ip link set nvmf_tgt_if up 00:37:02.711 06:05:17 chaining -- nvmf/common.sh@188 -- # ip netns exec nvmf_tgt_ns_spdk ip link set nvmf_tgt_if2 up 00:37:02.711 06:05:17 chaining -- nvmf/common.sh@189 -- # ip netns exec nvmf_tgt_ns_spdk ip link set lo up 00:37:02.711 06:05:17 chaining -- nvmf/common.sh@192 -- # ip link 
add nvmf_br type bridge 00:37:02.711 06:05:17 chaining -- nvmf/common.sh@193 -- # ip link set nvmf_br up 00:37:02.711 06:05:17 chaining -- nvmf/common.sh@196 -- # ip link set nvmf_init_br master nvmf_br 00:37:02.711 06:05:17 chaining -- nvmf/common.sh@197 -- # ip link set nvmf_tgt_br master nvmf_br 00:37:02.970 06:05:17 chaining -- nvmf/common.sh@198 -- # ip link set nvmf_tgt_br2 master nvmf_br 00:37:02.970 06:05:17 chaining -- nvmf/common.sh@201 -- # iptables -I INPUT 1 -i nvmf_init_if -p tcp --dport 4420 -j ACCEPT 00:37:02.970 06:05:17 chaining -- nvmf/common.sh@202 -- # iptables -A FORWARD -i nvmf_br -o nvmf_br -j ACCEPT 00:37:02.970 06:05:17 chaining -- nvmf/common.sh@205 -- # ping -c 1 10.0.0.2 00:37:02.970 PING 10.0.0.2 (10.0.0.2) 56(84) bytes of data. 00:37:02.970 64 bytes from 10.0.0.2: icmp_seq=1 ttl=64 time=0.102 ms 00:37:02.970 00:37:02.970 --- 10.0.0.2 ping statistics --- 00:37:02.970 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:37:02.970 rtt min/avg/max/mdev = 0.102/0.102/0.102/0.000 ms 00:37:02.970 06:05:17 chaining -- nvmf/common.sh@206 -- # ping -c 1 10.0.0.3 00:37:02.970 PING 10.0.0.3 (10.0.0.3) 56(84) bytes of data. 00:37:02.970 64 bytes from 10.0.0.3: icmp_seq=1 ttl=64 time=0.075 ms 00:37:02.970 00:37:02.970 --- 10.0.0.3 ping statistics --- 00:37:02.970 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:37:02.970 rtt min/avg/max/mdev = 0.075/0.075/0.075/0.000 ms 00:37:02.970 06:05:17 chaining -- nvmf/common.sh@207 -- # ip netns exec nvmf_tgt_ns_spdk ping -c 1 10.0.0.1 00:37:02.970 PING 10.0.0.1 (10.0.0.1) 56(84) bytes of data. 
00:37:02.970 64 bytes from 10.0.0.1: icmp_seq=1 ttl=64 time=0.030 ms 00:37:02.970 00:37:02.970 --- 10.0.0.1 ping statistics --- 00:37:02.970 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:37:02.970 rtt min/avg/max/mdev = 0.030/0.030/0.030/0.000 ms 00:37:02.970 06:05:17 chaining -- nvmf/common.sh@209 -- # NVMF_APP=("${NVMF_TARGET_NS_CMD[@]}" "${NVMF_APP[@]}") 00:37:02.970 06:05:17 chaining -- nvmf/common.sh@433 -- # return 0 00:37:02.970 06:05:17 chaining -- nvmf/common.sh@450 -- # '[' '' == iso ']' 00:37:02.970 06:05:17 chaining -- nvmf/common.sh@454 -- # NVMF_TRANSPORT_OPTS='-t tcp' 00:37:02.970 06:05:17 chaining -- nvmf/common.sh@455 -- # [[ tcp == \r\d\m\a ]] 00:37:02.970 06:05:17 chaining -- nvmf/common.sh@464 -- # [[ tcp == \t\c\p ]] 00:37:02.970 06:05:17 chaining -- nvmf/common.sh@465 -- # NVMF_TRANSPORT_OPTS='-t tcp -o' 00:37:02.970 06:05:17 chaining -- nvmf/common.sh@468 -- # '[' tcp == tcp ']' 00:37:02.970 06:05:17 chaining -- nvmf/common.sh@474 -- # modprobe nvme-tcp 00:37:02.970 06:05:17 chaining -- bdev/chaining.sh@176 -- # nvmfappstart -m 0x2 00:37:02.970 06:05:17 chaining -- nvmf/common.sh@479 -- # timing_enter start_nvmf_tgt 00:37:02.970 06:05:17 chaining -- common/autotest_common.sh@722 -- # xtrace_disable 00:37:02.970 06:05:17 chaining -- common/autotest_common.sh@10 -- # set +x 00:37:02.970 06:05:17 chaining -- nvmf/common.sh@481 -- # nvmfpid=1340774 00:37:02.970 06:05:17 chaining -- nvmf/common.sh@482 -- # waitforlisten 1340774 00:37:02.970 06:05:17 chaining -- nvmf/common.sh@480 -- # ip netns exec nvmf_tgt_ns_spdk ip netns exec nvmf_tgt_ns_spdk /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/nvmf_tgt -i 0 -e 0xFFFF -m 0x2 00:37:02.970 06:05:17 chaining -- common/autotest_common.sh@829 -- # '[' -z 1340774 ']' 00:37:02.970 06:05:17 chaining -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:37:02.970 06:05:17 chaining -- common/autotest_common.sh@834 -- # local max_retries=100 00:37:02.970 06:05:17 
chaining -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:37:02.970 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:37:02.970 06:05:17 chaining -- common/autotest_common.sh@838 -- # xtrace_disable 00:37:02.970 06:05:17 chaining -- common/autotest_common.sh@10 -- # set +x 00:37:02.970 [2024-07-26 06:05:17.854491] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 00:37:02.970 [2024-07-26 06:05:17.854556] [ DPDK EAL parameters: nvmf -c 0x2 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:37:03.228 [2024-07-26 06:05:17.980991] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:37:03.228 [2024-07-26 06:05:18.084856] app.c: 603:app_setup_trace: *NOTICE*: Tracepoint Group Mask 0xFFFF specified. 00:37:03.228 [2024-07-26 06:05:18.084905] app.c: 604:app_setup_trace: *NOTICE*: Use 'spdk_trace -s nvmf -i 0' to capture a snapshot of events at runtime. 00:37:03.228 [2024-07-26 06:05:18.084919] app.c: 609:app_setup_trace: *NOTICE*: 'spdk_trace' without parameters will also work if this is the only 00:37:03.228 [2024-07-26 06:05:18.084939] app.c: 610:app_setup_trace: *NOTICE*: SPDK application currently running. 00:37:03.228 [2024-07-26 06:05:18.084950] app.c: 611:app_setup_trace: *NOTICE*: Or copy /dev/shm/nvmf_trace.0 for offline analysis/debug. 
00:37:03.228 [2024-07-26 06:05:18.084988] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:37:04.160 06:05:18 chaining -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:37:04.160 06:05:18 chaining -- common/autotest_common.sh@862 -- # return 0 00:37:04.160 06:05:18 chaining -- nvmf/common.sh@483 -- # timing_exit start_nvmf_tgt 00:37:04.160 06:05:18 chaining -- common/autotest_common.sh@728 -- # xtrace_disable 00:37:04.160 06:05:18 chaining -- common/autotest_common.sh@10 -- # set +x 00:37:04.160 06:05:18 chaining -- nvmf/common.sh@484 -- # trap 'process_shm --id $NVMF_APP_SHM_ID || :; nvmftestfini' SIGINT SIGTERM EXIT 00:37:04.160 06:05:18 chaining -- bdev/chaining.sh@178 -- # rpc_cmd 00:37:04.160 06:05:18 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:37:04.160 06:05:18 chaining -- common/autotest_common.sh@10 -- # set +x 00:37:04.160 malloc0 00:37:04.160 [2024-07-26 06:05:18.837819] tcp.c: 677:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:37:04.160 [2024-07-26 06:05:18.854024] tcp.c:1006:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 10.0.0.2 port 4420 *** 00:37:04.160 06:05:18 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:37:04.160 06:05:18 chaining -- bdev/chaining.sh@186 -- # trap 'bperfcleanup || :; nvmftestfini || :; exit 1' SIGINT SIGTERM EXIT 00:37:04.160 06:05:18 chaining -- bdev/chaining.sh@189 -- # bperfpid=1340868 00:37:04.160 06:05:18 chaining -- bdev/chaining.sh@191 -- # waitforlisten 1340868 /var/tmp/bperf.sock 00:37:04.160 06:05:18 chaining -- bdev/chaining.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/bperf.sock -t 5 -w verify -o 4096 -q 256 --wait-for-rpc -z 00:37:04.160 06:05:18 chaining -- common/autotest_common.sh@829 -- # '[' -z 1340868 ']' 00:37:04.160 06:05:18 chaining -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/bperf.sock 00:37:04.160 06:05:18 chaining -- common/autotest_common.sh@834 -- # 
local max_retries=100 00:37:04.160 06:05:18 chaining -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/bperf.sock...' 00:37:04.160 Waiting for process to start up and listen on UNIX domain socket /var/tmp/bperf.sock... 00:37:04.160 06:05:18 chaining -- common/autotest_common.sh@838 -- # xtrace_disable 00:37:04.160 06:05:18 chaining -- common/autotest_common.sh@10 -- # set +x 00:37:04.160 [2024-07-26 06:05:18.926596] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 00:37:04.160 [2024-07-26 06:05:18.926670] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1340868 ] 00:37:04.160 [2024-07-26 06:05:19.046062] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:37:04.417 [2024-07-26 06:05:19.150013] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:37:04.980 06:05:19 chaining -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:37:04.980 06:05:19 chaining -- common/autotest_common.sh@862 -- # return 0 00:37:04.980 06:05:19 chaining -- bdev/chaining.sh@192 -- # rpc_bperf 00:37:04.980 06:05:19 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock 00:37:05.542 [2024-07-26 06:05:20.246711] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key0" 00:37:05.542 nvme0n1 00:37:05.542 true 00:37:05.542 crypto0 00:37:05.542 06:05:20 chaining -- bdev/chaining.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/bperf.sock perform_tests 00:37:05.542 Running I/O for 5 seconds... 
00:37:10.794 00:37:10.794 Latency(us) 00:37:10.794 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:37:10.794 Job: crypto0 (Core Mask 0x1, workload: verify, depth: 256, IO size: 4096) 00:37:10.794 Verification LBA range: start 0x0 length 0x2000 00:37:10.794 crypto0 : 5.02 8193.84 32.01 0.00 0.00 31139.80 1923.34 28038.01 00:37:10.794 =================================================================================================================== 00:37:10.794 Total : 8193.84 32.01 0.00 0.00 31139.80 1923.34 28038.01 00:37:10.794 0 00:37:10.794 06:05:25 chaining -- bdev/chaining.sh@205 -- # get_stat_bperf sequence_executed 00:37:10.794 06:05:25 chaining -- bdev/chaining.sh@48 -- # get_stat sequence_executed '' rpc_bperf 00:37:10.794 06:05:25 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:37:10.794 06:05:25 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:37:10.794 06:05:25 chaining -- bdev/chaining.sh@39 -- # opcode= 00:37:10.794 06:05:25 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:37:10.794 06:05:25 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:37:10.794 06:05:25 chaining -- bdev/chaining.sh@41 -- # rpc_bperf accel_get_stats 00:37:10.794 06:05:25 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:37:10.794 06:05:25 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:37:11.053 06:05:25 chaining -- bdev/chaining.sh@205 -- # sequence=82276 00:37:11.053 06:05:25 chaining -- bdev/chaining.sh@206 -- # get_stat_bperf executed encrypt 00:37:11.053 06:05:25 chaining -- bdev/chaining.sh@48 -- # get_stat executed encrypt rpc_bperf 00:37:11.053 06:05:25 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:37:11.053 06:05:25 chaining -- bdev/chaining.sh@39 -- # event=executed 00:37:11.053 06:05:25 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:37:11.053 06:05:25 chaining -- 
bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:37:11.053 06:05:25 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:37:11.053 06:05:25 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:37:11.053 06:05:25 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:37:11.053 06:05:25 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:37:11.311 06:05:25 chaining -- bdev/chaining.sh@206 -- # encrypt=41138 00:37:11.311 06:05:25 chaining -- bdev/chaining.sh@207 -- # get_stat_bperf executed decrypt 00:37:11.311 06:05:25 chaining -- bdev/chaining.sh@48 -- # get_stat executed decrypt rpc_bperf 00:37:11.311 06:05:25 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:37:11.311 06:05:25 chaining -- bdev/chaining.sh@39 -- # event=executed 00:37:11.311 06:05:25 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:37:11.311 06:05:25 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:37:11.311 06:05:25 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:37:11.311 06:05:25 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:37:11.311 06:05:25 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:37:11.311 06:05:25 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:37:11.601 06:05:26 chaining -- bdev/chaining.sh@207 -- # decrypt=41138 00:37:11.601 06:05:26 chaining -- bdev/chaining.sh@208 -- # get_stat_bperf executed crc32c 00:37:11.601 06:05:26 chaining -- bdev/chaining.sh@48 -- # get_stat executed crc32c rpc_bperf 00:37:11.601 06:05:26 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:37:11.601 06:05:26 chaining -- bdev/chaining.sh@39 -- # event=executed 00:37:11.601 06:05:26 chaining -- bdev/chaining.sh@39 -- # opcode=crc32c 00:37:11.601 06:05:26 
chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:37:11.601 06:05:26 chaining -- bdev/chaining.sh@40 -- # [[ -z crc32c ]] 00:37:11.601 06:05:26 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:37:11.601 06:05:26 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "crc32c").executed' 00:37:11.601 06:05:26 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:37:11.601 06:05:26 chaining -- bdev/chaining.sh@208 -- # crc32c=82276 00:37:11.601 06:05:26 chaining -- bdev/chaining.sh@210 -- # (( sequence > 0 )) 00:37:11.601 06:05:26 chaining -- bdev/chaining.sh@211 -- # (( encrypt + decrypt == sequence )) 00:37:11.601 06:05:26 chaining -- bdev/chaining.sh@212 -- # (( encrypt + decrypt == crc32c )) 00:37:11.601 06:05:26 chaining -- bdev/chaining.sh@214 -- # killprocess 1340868 00:37:11.601 06:05:26 chaining -- common/autotest_common.sh@948 -- # '[' -z 1340868 ']' 00:37:11.601 06:05:26 chaining -- common/autotest_common.sh@952 -- # kill -0 1340868 00:37:11.601 06:05:26 chaining -- common/autotest_common.sh@953 -- # uname 00:37:11.601 06:05:26 chaining -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:37:11.601 06:05:26 chaining -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1340868 00:37:11.860 06:05:26 chaining -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:37:11.860 06:05:26 chaining -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:37:11.860 06:05:26 chaining -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1340868' 00:37:11.860 killing process with pid 1340868 00:37:11.860 06:05:26 chaining -- common/autotest_common.sh@967 -- # kill 1340868 00:37:11.860 Received shutdown signal, test time was about 5.000000 seconds 00:37:11.860 00:37:11.860 Latency(us) 00:37:11.860 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:37:11.860 
=================================================================================================================== 00:37:11.860 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:37:11.860 06:05:26 chaining -- common/autotest_common.sh@972 -- # wait 1340868 00:37:11.860 06:05:26 chaining -- bdev/chaining.sh@219 -- # bperfpid=1341869 00:37:11.860 06:05:26 chaining -- bdev/chaining.sh@217 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/bperf.sock -t 5 -w verify -o 65536 -q 32 --wait-for-rpc -z 00:37:11.860 06:05:26 chaining -- bdev/chaining.sh@221 -- # waitforlisten 1341869 /var/tmp/bperf.sock 00:37:11.860 06:05:26 chaining -- common/autotest_common.sh@829 -- # '[' -z 1341869 ']' 00:37:11.860 06:05:26 chaining -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/bperf.sock 00:37:11.860 06:05:26 chaining -- common/autotest_common.sh@834 -- # local max_retries=100 00:37:11.860 06:05:26 chaining -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/bperf.sock...' 00:37:11.860 Waiting for process to start up and listen on UNIX domain socket /var/tmp/bperf.sock... 00:37:11.860 06:05:26 chaining -- common/autotest_common.sh@838 -- # xtrace_disable 00:37:11.860 06:05:26 chaining -- common/autotest_common.sh@10 -- # set +x 00:37:12.119 [2024-07-26 06:05:26.791554] Starting SPDK v24.09-pre git sha1 f6e944e96 / DPDK 24.03.0 initialization... 
00:37:12.119 [2024-07-26 06:05:26.791625] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1341869 ] 00:37:12.119 [2024-07-26 06:05:26.916607] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:37:12.119 [2024-07-26 06:05:27.014465] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:37:13.051 06:05:27 chaining -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:37:13.052 06:05:27 chaining -- common/autotest_common.sh@862 -- # return 0 00:37:13.052 06:05:27 chaining -- bdev/chaining.sh@222 -- # rpc_bperf 00:37:13.052 06:05:27 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock 00:37:13.310 [2024-07-26 06:05:28.126685] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key0" 00:37:13.310 nvme0n1 00:37:13.310 true 00:37:13.310 crypto0 00:37:13.310 06:05:28 chaining -- bdev/chaining.sh@231 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/bperf.sock perform_tests 00:37:13.568 Running I/O for 5 seconds... 
00:37:18.833 00:37:18.833 Latency(us) 00:37:18.833 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:37:18.833 Job: crypto0 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:37:18.833 Verification LBA range: start 0x0 length 0x200 00:37:18.833 crypto0 : 5.01 1684.29 105.27 0.00 0.00 18621.49 1218.11 18692.01 00:37:18.833 =================================================================================================================== 00:37:18.833 Total : 1684.29 105.27 0.00 0.00 18621.49 1218.11 18692.01 00:37:18.833 0 00:37:18.833 06:05:33 chaining -- bdev/chaining.sh@233 -- # get_stat_bperf sequence_executed 00:37:18.833 06:05:33 chaining -- bdev/chaining.sh@48 -- # get_stat sequence_executed '' rpc_bperf 00:37:18.833 06:05:33 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:37:18.833 06:05:33 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:37:18.833 06:05:33 chaining -- bdev/chaining.sh@39 -- # opcode= 00:37:18.833 06:05:33 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:37:18.833 06:05:33 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:37:18.833 06:05:33 chaining -- bdev/chaining.sh@41 -- # rpc_bperf accel_get_stats 00:37:18.833 06:05:33 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:37:18.833 06:05:33 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:37:18.833 06:05:33 chaining -- bdev/chaining.sh@233 -- # sequence=16864 00:37:18.833 06:05:33 chaining -- bdev/chaining.sh@234 -- # get_stat_bperf executed encrypt 00:37:18.833 06:05:33 chaining -- bdev/chaining.sh@48 -- # get_stat executed encrypt rpc_bperf 00:37:18.833 06:05:33 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:37:18.833 06:05:33 chaining -- bdev/chaining.sh@39 -- # event=executed 00:37:18.833 06:05:33 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:37:18.833 06:05:33 chaining -- 
bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:37:18.833 06:05:33 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:37:18.833 06:05:33 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:37:18.833 06:05:33 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:37:18.833 06:05:33 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:37:19.091 06:05:33 chaining -- bdev/chaining.sh@234 -- # encrypt=8432 00:37:19.091 06:05:33 chaining -- bdev/chaining.sh@235 -- # get_stat_bperf executed decrypt 00:37:19.091 06:05:33 chaining -- bdev/chaining.sh@48 -- # get_stat executed decrypt rpc_bperf 00:37:19.091 06:05:33 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:37:19.091 06:05:33 chaining -- bdev/chaining.sh@39 -- # event=executed 00:37:19.091 06:05:33 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:37:19.091 06:05:33 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:37:19.091 06:05:33 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:37:19.091 06:05:33 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:37:19.091 06:05:33 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:37:19.091 06:05:33 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:37:19.348 06:05:34 chaining -- bdev/chaining.sh@235 -- # decrypt=8432 00:37:19.348 06:05:34 chaining -- bdev/chaining.sh@236 -- # get_stat_bperf executed crc32c 00:37:19.348 06:05:34 chaining -- bdev/chaining.sh@48 -- # get_stat executed crc32c rpc_bperf 00:37:19.348 06:05:34 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:37:19.348 06:05:34 chaining -- bdev/chaining.sh@39 -- # event=executed 00:37:19.348 06:05:34 chaining -- bdev/chaining.sh@39 -- # opcode=crc32c 00:37:19.348 06:05:34 
chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:37:19.348 06:05:34 chaining -- bdev/chaining.sh@40 -- # [[ -z crc32c ]] 00:37:19.348 06:05:34 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:37:19.348 06:05:34 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:37:19.348 06:05:34 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "crc32c").executed' 00:37:19.607 06:05:34 chaining -- bdev/chaining.sh@236 -- # crc32c=16864 00:37:19.607 06:05:34 chaining -- bdev/chaining.sh@238 -- # (( sequence > 0 )) 00:37:19.607 06:05:34 chaining -- bdev/chaining.sh@239 -- # (( encrypt + decrypt == sequence )) 00:37:19.607 06:05:34 chaining -- bdev/chaining.sh@240 -- # (( encrypt + decrypt == crc32c )) 00:37:19.607 06:05:34 chaining -- bdev/chaining.sh@242 -- # killprocess 1341869 00:37:19.607 06:05:34 chaining -- common/autotest_common.sh@948 -- # '[' -z 1341869 ']' 00:37:19.607 06:05:34 chaining -- common/autotest_common.sh@952 -- # kill -0 1341869 00:37:19.607 06:05:34 chaining -- common/autotest_common.sh@953 -- # uname 00:37:19.607 06:05:34 chaining -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:37:19.607 06:05:34 chaining -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1341869 00:37:19.607 06:05:34 chaining -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:37:19.607 06:05:34 chaining -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:37:19.607 06:05:34 chaining -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1341869' 00:37:19.607 killing process with pid 1341869 00:37:19.607 06:05:34 chaining -- common/autotest_common.sh@967 -- # kill 1341869 00:37:19.607 Received shutdown signal, test time was about 5.000000 seconds 00:37:19.607 00:37:19.607 Latency(us) 00:37:19.607 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:37:19.607 
=================================================================================================================== 00:37:19.607 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:37:19.607 06:05:34 chaining -- common/autotest_common.sh@972 -- # wait 1341869 00:37:19.865 06:05:34 chaining -- bdev/chaining.sh@243 -- # nvmftestfini 00:37:19.865 06:05:34 chaining -- nvmf/common.sh@488 -- # nvmfcleanup 00:37:19.865 06:05:34 chaining -- nvmf/common.sh@117 -- # sync 00:37:19.865 06:05:34 chaining -- nvmf/common.sh@119 -- # '[' tcp == tcp ']' 00:37:19.865 06:05:34 chaining -- nvmf/common.sh@120 -- # set +e 00:37:19.865 06:05:34 chaining -- nvmf/common.sh@121 -- # for i in {1..20} 00:37:19.865 06:05:34 chaining -- nvmf/common.sh@122 -- # modprobe -v -r nvme-tcp 00:37:19.865 rmmod nvme_tcp 00:37:19.865 rmmod nvme_fabrics 00:37:19.865 rmmod nvme_keyring 00:37:19.865 06:05:34 chaining -- nvmf/common.sh@123 -- # modprobe -v -r nvme-fabrics 00:37:19.865 06:05:34 chaining -- nvmf/common.sh@124 -- # set -e 00:37:19.865 06:05:34 chaining -- nvmf/common.sh@125 -- # return 0 00:37:19.865 06:05:34 chaining -- nvmf/common.sh@489 -- # '[' -n 1340774 ']' 00:37:19.865 06:05:34 chaining -- nvmf/common.sh@490 -- # killprocess 1340774 00:37:19.865 06:05:34 chaining -- common/autotest_common.sh@948 -- # '[' -z 1340774 ']' 00:37:19.865 06:05:34 chaining -- common/autotest_common.sh@952 -- # kill -0 1340774 00:37:19.865 06:05:34 chaining -- common/autotest_common.sh@953 -- # uname 00:37:19.865 06:05:34 chaining -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:37:19.865 06:05:34 chaining -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1340774 00:37:19.865 06:05:34 chaining -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:37:19.865 06:05:34 chaining -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:37:19.865 06:05:34 chaining -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1340774' 00:37:19.866 killing process with pid 
1340774 00:37:19.866 06:05:34 chaining -- common/autotest_common.sh@967 -- # kill 1340774 00:37:19.866 06:05:34 chaining -- common/autotest_common.sh@972 -- # wait 1340774 00:37:20.124 06:05:34 chaining -- nvmf/common.sh@492 -- # '[' '' == iso ']' 00:37:20.124 06:05:34 chaining -- nvmf/common.sh@495 -- # [[ tcp == \t\c\p ]] 00:37:20.124 06:05:34 chaining -- nvmf/common.sh@496 -- # nvmf_tcp_fini 00:37:20.124 06:05:34 chaining -- nvmf/common.sh@274 -- # [[ nvmf_tgt_ns_spdk == \n\v\m\f\_\t\g\t\_\n\s ]] 00:37:20.124 06:05:34 chaining -- nvmf/common.sh@278 -- # remove_spdk_ns 00:37:20.124 06:05:34 chaining -- nvmf/common.sh@628 -- # xtrace_disable_per_cmd _remove_spdk_ns 00:37:20.124 06:05:34 chaining -- common/autotest_common.sh@22 -- # eval '_remove_spdk_ns 13> /dev/null' 00:37:20.124 06:05:34 chaining -- common/autotest_common.sh@22 -- # _remove_spdk_ns 00:37:20.124 06:05:34 chaining -- nvmf/common.sh@279 -- # ip -4 addr flush nvmf_init_if 00:37:20.124 06:05:34 chaining -- bdev/chaining.sh@245 -- # trap - SIGINT SIGTERM EXIT 00:37:20.124 00:37:20.124 real 0m46.455s 00:37:20.124 user 1m0.028s 00:37:20.124 sys 0m13.498s 00:37:20.124 06:05:34 chaining -- common/autotest_common.sh@1124 -- # xtrace_disable 00:37:20.124 06:05:34 chaining -- common/autotest_common.sh@10 -- # set +x 00:37:20.124 ************************************ 00:37:20.124 END TEST chaining 00:37:20.124 ************************************ 00:37:20.382 06:05:35 -- common/autotest_common.sh@1142 -- # return 0 00:37:20.382 06:05:35 -- spdk/autotest.sh@363 -- # [[ 0 -eq 1 ]] 00:37:20.382 06:05:35 -- spdk/autotest.sh@367 -- # [[ 0 -eq 1 ]] 00:37:20.382 06:05:35 -- spdk/autotest.sh@371 -- # [[ 0 -eq 1 ]] 00:37:20.382 06:05:35 -- spdk/autotest.sh@375 -- # [[ 0 -eq 1 ]] 00:37:20.382 06:05:35 -- spdk/autotest.sh@380 -- # trap - SIGINT SIGTERM EXIT 00:37:20.382 06:05:35 -- spdk/autotest.sh@382 -- # timing_enter post_cleanup 00:37:20.382 06:05:35 -- common/autotest_common.sh@722 -- # xtrace_disable 00:37:20.382 
06:05:35 -- common/autotest_common.sh@10 -- # set +x 00:37:20.382 06:05:35 -- spdk/autotest.sh@383 -- # autotest_cleanup 00:37:20.382 06:05:35 -- common/autotest_common.sh@1392 -- # local autotest_es=0 00:37:20.382 06:05:35 -- common/autotest_common.sh@1393 -- # xtrace_disable 00:37:20.382 06:05:35 -- common/autotest_common.sh@10 -- # set +x 00:37:25.648 INFO: APP EXITING 00:37:25.648 INFO: killing all VMs 00:37:25.648 INFO: killing vhost app 00:37:25.648 INFO: EXIT DONE 00:37:28.208 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:37:28.208 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:37:28.208 Waiting for block devices as requested 00:37:28.208 0000:5e:00.0 (8086 0b60): vfio-pci -> nvme 00:37:28.466 0000:00:04.7 (8086 2021): vfio-pci -> ioatdma 00:37:28.466 0000:00:04.6 (8086 2021): vfio-pci -> ioatdma 00:37:28.725 0000:00:04.5 (8086 2021): vfio-pci -> ioatdma 00:37:28.725 0000:00:04.4 (8086 2021): vfio-pci -> ioatdma 00:37:28.725 0000:00:04.3 (8086 2021): vfio-pci -> ioatdma 00:37:28.983 0000:00:04.2 (8086 2021): vfio-pci -> ioatdma 00:37:28.983 0000:00:04.1 (8086 2021): vfio-pci -> ioatdma 00:37:28.983 0000:00:04.0 (8086 2021): vfio-pci -> ioatdma 00:37:29.241 0000:80:04.7 (8086 2021): vfio-pci -> ioatdma 00:37:29.241 0000:80:04.6 (8086 2021): vfio-pci -> ioatdma 00:37:29.241 0000:80:04.5 (8086 2021): vfio-pci -> ioatdma 00:37:29.499 0000:80:04.4 (8086 2021): vfio-pci -> ioatdma 00:37:29.499 0000:80:04.3 (8086 2021): vfio-pci -> ioatdma 00:37:29.499 0000:80:04.2 (8086 2021): vfio-pci -> ioatdma 00:37:29.757 0000:80:04.1 (8086 2021): vfio-pci -> ioatdma 00:37:29.757 0000:80:04.0 (8086 2021): vfio-pci -> ioatdma 00:37:33.938 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:37:33.938 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:37:33.938 Cleaning 00:37:33.938 Removing: /var/run/dpdk/spdk0/config 00:37:33.938 Removing: 
/var/run/dpdk/spdk0/fbarray_memseg-2048k-0-0 00:37:33.938 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-0-1 00:37:33.938 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-0-2 00:37:33.938 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-0-3 00:37:33.938 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-1-0 00:37:33.938 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-1-1 00:37:33.938 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-1-2 00:37:33.938 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-1-3 00:37:33.938 Removing: /var/run/dpdk/spdk0/fbarray_memzone 00:37:33.938 Removing: /var/run/dpdk/spdk0/hugepage_info 00:37:33.938 Removing: /dev/shm/nvmf_trace.0 00:37:33.938 Removing: /dev/shm/spdk_tgt_trace.pid1087241 00:37:33.938 Removing: /var/run/dpdk/spdk0 00:37:33.938 Removing: /var/run/dpdk/spdk_pid1086396 00:37:33.938 Removing: /var/run/dpdk/spdk_pid1087241 00:37:33.938 Removing: /var/run/dpdk/spdk_pid1087771 00:37:33.938 Removing: /var/run/dpdk/spdk_pid1088501 00:37:33.938 Removing: /var/run/dpdk/spdk_pid1088689 00:37:33.938 Removing: /var/run/dpdk/spdk_pid1089449 00:37:33.938 Removing: /var/run/dpdk/spdk_pid1089628 00:37:33.938 Removing: /var/run/dpdk/spdk_pid1089916 00:37:33.938 Removing: /var/run/dpdk/spdk_pid1092472 00:37:33.938 Removing: /var/run/dpdk/spdk_pid1093705 00:37:33.938 Removing: /var/run/dpdk/spdk_pid1093994 00:37:33.938 Removing: /var/run/dpdk/spdk_pid1094337 00:37:33.938 Removing: /var/run/dpdk/spdk_pid1094597 00:37:33.938 Removing: /var/run/dpdk/spdk_pid1094984 00:37:33.938 Removing: /var/run/dpdk/spdk_pid1095187 00:37:33.938 Removing: /var/run/dpdk/spdk_pid1095386 00:37:33.938 Removing: /var/run/dpdk/spdk_pid1095613 00:37:33.938 Removing: /var/run/dpdk/spdk_pid1096191 00:37:33.938 Removing: /var/run/dpdk/spdk_pid1098886 00:37:33.938 Removing: /var/run/dpdk/spdk_pid1099088 00:37:33.938 Removing: /var/run/dpdk/spdk_pid1099323 00:37:33.938 Removing: /var/run/dpdk/spdk_pid1099546 00:37:33.938 Removing: 
/var/run/dpdk/spdk_pid1099725 00:37:33.938 Removing: /var/run/dpdk/spdk_pid1099802 00:37:33.938 Removing: /var/run/dpdk/spdk_pid1100097 00:37:33.938 Removing: /var/run/dpdk/spdk_pid1100350 00:37:33.938 Removing: /var/run/dpdk/spdk_pid1100547 00:37:33.938 Removing: /var/run/dpdk/spdk_pid1100743 00:37:33.938 Removing: /var/run/dpdk/spdk_pid1100944 00:37:33.938 Removing: /var/run/dpdk/spdk_pid1101180 00:37:33.938 Removing: /var/run/dpdk/spdk_pid1101495 00:37:33.938 Removing: /var/run/dpdk/spdk_pid1101697 00:37:33.938 Removing: /var/run/dpdk/spdk_pid1101889 00:37:33.938 Removing: /var/run/dpdk/spdk_pid1102095 00:37:33.938 Removing: /var/run/dpdk/spdk_pid1102287 00:37:33.938 Removing: /var/run/dpdk/spdk_pid1102540 00:37:33.938 Removing: /var/run/dpdk/spdk_pid1102837 00:37:33.938 Removing: /var/run/dpdk/spdk_pid1103042 00:37:33.938 Removing: /var/run/dpdk/spdk_pid1103237 00:37:33.938 Removing: /var/run/dpdk/spdk_pid1103448 00:37:33.938 Removing: /var/run/dpdk/spdk_pid1103653 00:37:33.938 Removing: /var/run/dpdk/spdk_pid1103922 00:37:33.938 Removing: /var/run/dpdk/spdk_pid1104212 00:37:33.938 Removing: /var/run/dpdk/spdk_pid1104401 00:37:33.938 Removing: /var/run/dpdk/spdk_pid1104604 00:37:33.938 Removing: /var/run/dpdk/spdk_pid1104971 00:37:33.938 Removing: /var/run/dpdk/spdk_pid1105339 00:37:33.938 Removing: /var/run/dpdk/spdk_pid1105566 00:37:33.938 Removing: /var/run/dpdk/spdk_pid1105911 00:37:33.938 Removing: /var/run/dpdk/spdk_pid1106278 00:37:33.938 Removing: /var/run/dpdk/spdk_pid1106576 00:37:33.938 Removing: /var/run/dpdk/spdk_pid1106866 00:37:33.938 Removing: /var/run/dpdk/spdk_pid1107138 00:37:33.938 Removing: /var/run/dpdk/spdk_pid1107468 00:37:33.938 Removing: /var/run/dpdk/spdk_pid1108176 00:37:33.938 Removing: /var/run/dpdk/spdk_pid1108682 00:37:33.938 Removing: /var/run/dpdk/spdk_pid1108880 00:37:33.938 Removing: /var/run/dpdk/spdk_pid1112682 00:37:33.938 Removing: /var/run/dpdk/spdk_pid1114448 00:37:33.938 Removing: /var/run/dpdk/spdk_pid1116079 
00:37:33.938 Removing: /var/run/dpdk/spdk_pid1116976 00:37:33.938 Removing: /var/run/dpdk/spdk_pid1118047 00:37:33.938 Removing: /var/run/dpdk/spdk_pid1118408 00:37:33.938 Removing: /var/run/dpdk/spdk_pid1118442 00:37:33.938 Removing: /var/run/dpdk/spdk_pid1118495 00:37:33.938 Removing: /var/run/dpdk/spdk_pid1122304 00:37:33.938 Removing: /var/run/dpdk/spdk_pid1122808 00:37:33.938 Removing: /var/run/dpdk/spdk_pid1123834 00:37:33.938 Removing: /var/run/dpdk/spdk_pid1124065 00:37:33.938 Removing: /var/run/dpdk/spdk_pid1129396 00:37:33.938 Removing: /var/run/dpdk/spdk_pid1130971 00:37:33.938 Removing: /var/run/dpdk/spdk_pid1131843 00:37:33.938 Removing: /var/run/dpdk/spdk_pid1136590 00:37:33.938 Removing: /var/run/dpdk/spdk_pid1138220 00:37:33.938 Removing: /var/run/dpdk/spdk_pid1139025 00:37:33.938 Removing: /var/run/dpdk/spdk_pid1143268 00:37:33.938 Removing: /var/run/dpdk/spdk_pid1145610 00:37:33.938 Removing: /var/run/dpdk/spdk_pid1146507 00:37:33.938 Removing: /var/run/dpdk/spdk_pid1156244 00:37:33.938 Removing: /var/run/dpdk/spdk_pid1158330 00:37:33.938 Removing: /var/run/dpdk/spdk_pid1159310 00:37:33.938 Removing: /var/run/dpdk/spdk_pid1169726 00:37:33.938 Removing: /var/run/dpdk/spdk_pid1171932 00:37:33.938 Removing: /var/run/dpdk/spdk_pid1172904 00:37:33.938 Removing: /var/run/dpdk/spdk_pid1182652 00:37:33.938 Removing: /var/run/dpdk/spdk_pid1186076 00:37:33.938 Removing: /var/run/dpdk/spdk_pid1187057 00:37:33.938 Removing: /var/run/dpdk/spdk_pid1198492 00:37:33.938 Removing: /var/run/dpdk/spdk_pid1200924 00:37:33.938 Removing: /var/run/dpdk/spdk_pid1201905 00:37:33.938 Removing: /var/run/dpdk/spdk_pid1212720 00:37:33.938 Removing: /var/run/dpdk/spdk_pid1215246 00:37:33.938 Removing: /var/run/dpdk/spdk_pid1216918 00:37:33.938 Removing: /var/run/dpdk/spdk_pid1227689 00:37:33.938 Removing: /var/run/dpdk/spdk_pid1231506 00:37:33.938 Removing: /var/run/dpdk/spdk_pid1232540 00:37:33.938 Removing: /var/run/dpdk/spdk_pid1233681 00:37:33.939 Removing: 
/var/run/dpdk/spdk_pid1236730 00:37:33.939 Removing: /var/run/dpdk/spdk_pid1241767 00:37:33.939 Removing: /var/run/dpdk/spdk_pid1244609 00:37:33.939 Removing: /var/run/dpdk/spdk_pid1249102 00:37:33.939 Removing: /var/run/dpdk/spdk_pid1252540 00:37:33.939 Removing: /var/run/dpdk/spdk_pid1257864 00:37:33.939 Removing: /var/run/dpdk/spdk_pid1260580 00:37:33.939 Removing: /var/run/dpdk/spdk_pid1266778 00:37:33.939 Removing: /var/run/dpdk/spdk_pid1269651 00:37:33.939 Removing: /var/run/dpdk/spdk_pid1275618 00:37:33.939 Removing: /var/run/dpdk/spdk_pid1278033 00:37:33.939 Removing: /var/run/dpdk/spdk_pid1284030 00:37:34.196 Removing: /var/run/dpdk/spdk_pid1286449 00:37:34.196 Removing: /var/run/dpdk/spdk_pid1290600 00:37:34.196 Removing: /var/run/dpdk/spdk_pid1290957 00:37:34.196 Removing: /var/run/dpdk/spdk_pid1291313 00:37:34.196 Removing: /var/run/dpdk/spdk_pid1291673 00:37:34.196 Removing: /var/run/dpdk/spdk_pid1292111 00:37:34.196 Removing: /var/run/dpdk/spdk_pid1292874 00:37:34.196 Removing: /var/run/dpdk/spdk_pid1293586 00:37:34.196 Removing: /var/run/dpdk/spdk_pid1293994 00:37:34.196 Removing: /var/run/dpdk/spdk_pid1296093 00:37:34.196 Removing: /var/run/dpdk/spdk_pid1297699 00:37:34.196 Removing: /var/run/dpdk/spdk_pid1299299 00:37:34.196 Removing: /var/run/dpdk/spdk_pid1300606 00:37:34.196 Removing: /var/run/dpdk/spdk_pid1302216 00:37:34.196 Removing: /var/run/dpdk/spdk_pid1303810 00:37:34.196 Removing: /var/run/dpdk/spdk_pid1305348 00:37:34.196 Removing: /var/run/dpdk/spdk_pid1306605 00:37:34.196 Removing: /var/run/dpdk/spdk_pid1307253 00:37:34.196 Removing: /var/run/dpdk/spdk_pid1307622 00:37:34.196 Removing: /var/run/dpdk/spdk_pid1309624 00:37:34.196 Removing: /var/run/dpdk/spdk_pid1311481 00:37:34.196 Removing: /var/run/dpdk/spdk_pid1313329 00:37:34.196 Removing: /var/run/dpdk/spdk_pid1314386 00:37:34.196 Removing: /var/run/dpdk/spdk_pid1315613 00:37:34.196 Removing: /var/run/dpdk/spdk_pid1316159 00:37:34.196 Removing: /var/run/dpdk/spdk_pid1316189 
00:37:34.196 Removing: /var/run/dpdk/spdk_pid1316417
00:37:34.196 Removing: /var/run/dpdk/spdk_pid1316618
00:37:34.196 Removing: /var/run/dpdk/spdk_pid1316809
00:37:34.196 Removing: /var/run/dpdk/spdk_pid1317875
00:37:34.196 Removing: /var/run/dpdk/spdk_pid1319508
00:37:34.196 Removing: /var/run/dpdk/spdk_pid1321386
00:37:34.196 Removing: /var/run/dpdk/spdk_pid1322173
00:37:34.196 Removing: /var/run/dpdk/spdk_pid1322983
00:37:34.196 Removing: /var/run/dpdk/spdk_pid1323186
00:37:34.196 Removing: /var/run/dpdk/spdk_pid1323318
00:37:34.197 Removing: /var/run/dpdk/spdk_pid1323395
00:37:34.197 Removing: /var/run/dpdk/spdk_pid1324342
00:37:34.197 Removing: /var/run/dpdk/spdk_pid1324893
00:37:34.197 Removing: /var/run/dpdk/spdk_pid1325263
00:37:34.197 Removing: /var/run/dpdk/spdk_pid1327429
00:37:34.197 Removing: /var/run/dpdk/spdk_pid1329255
00:37:34.197 Removing: /var/run/dpdk/spdk_pid1330972
00:37:34.197 Removing: /var/run/dpdk/spdk_pid1332031
00:37:34.197 Removing: /var/run/dpdk/spdk_pid1333264
00:37:34.197 Removing: /var/run/dpdk/spdk_pid1333804
00:37:34.197 Removing: /var/run/dpdk/spdk_pid1333861
00:37:34.197 Removing: /var/run/dpdk/spdk_pid1337900
00:37:34.197 Removing: /var/run/dpdk/spdk_pid1338108
00:37:34.197 Removing: /var/run/dpdk/spdk_pid1338144
00:37:34.197 Removing: /var/run/dpdk/spdk_pid1338339
00:37:34.197 Removing: /var/run/dpdk/spdk_pid1338547
00:37:34.197 Removing: /var/run/dpdk/spdk_pid1338755
00:37:34.197 Removing: /var/run/dpdk/spdk_pid1339596
00:37:34.197 Removing: /var/run/dpdk/spdk_pid1340868
00:37:34.197 Removing: /var/run/dpdk/spdk_pid1341869
00:37:34.197 Clean
00:37:34.454 06:05:49 -- common/autotest_common.sh@1451 -- # return 0
00:37:34.454 06:05:49 -- spdk/autotest.sh@384 -- # timing_exit post_cleanup
00:37:34.454 06:05:49 -- common/autotest_common.sh@728 -- # xtrace_disable
00:37:34.454 06:05:49 -- common/autotest_common.sh@10 -- # set +x
00:37:34.454 06:05:49 -- spdk/autotest.sh@386 -- # timing_exit autotest
00:37:34.454 06:05:49 -- common/autotest_common.sh@728 -- # xtrace_disable
00:37:34.454 06:05:49 -- common/autotest_common.sh@10 -- # set +x
00:37:34.454 06:05:49 -- spdk/autotest.sh@387 -- # chmod a+r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/timing.txt
00:37:34.454 06:05:49 -- spdk/autotest.sh@389 -- # [[ -f /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/udev.log ]]
00:37:34.454 06:05:49 -- spdk/autotest.sh@389 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/udev.log
00:37:34.454 06:05:49 -- spdk/autotest.sh@391 -- # hash lcov
00:37:34.454 06:05:49 -- spdk/autotest.sh@391 -- # [[ CC_TYPE=gcc == *\c\l\a\n\g* ]]
00:37:34.454 06:05:49 -- spdk/autotest.sh@393 -- # hostname
00:37:34.454 06:05:49 -- spdk/autotest.sh@393 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -c -d /var/jenkins/workspace/crypto-phy-autotest/spdk -t spdk-wfp-50 -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_test.info
00:37:34.712 geninfo: WARNING: invalid characters removed from testname!
00:38:06.803 06:06:16 -- spdk/autotest.sh@394 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -a /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_base.info -a /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_test.info -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info
00:38:06.803 06:06:20 -- spdk/autotest.sh@395 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info '*/dpdk/*' -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info
00:38:08.181 06:06:23 -- spdk/autotest.sh@396 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info '/usr/*' -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info
00:38:11.467 06:06:25 -- spdk/autotest.sh@397 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info '*/examples/vmd/*' -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info
00:38:13.999 06:06:28 -- spdk/autotest.sh@398 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info '*/app/spdk_lspci/*' -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info
00:38:16.533 06:06:30 -- spdk/autotest.sh@399 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info '*/app/spdk_top/*' -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info
00:38:19.068 06:06:33 -- spdk/autotest.sh@400 -- # rm -f cov_base.info cov_test.info OLD_STDOUT OLD_STDERR
00:38:19.068 06:06:33 -- common/autobuild_common.sh@15 -- $ source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh
00:38:19.068 06:06:33 -- scripts/common.sh@508 -- $ [[ -e /bin/wpdk_common.sh ]]
00:38:19.068 06:06:33 -- scripts/common.sh@516 -- $ [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]]
00:38:19.068 06:06:33 -- scripts/common.sh@517 -- $ source /etc/opt/spdk-pkgdep/paths/export.sh
00:38:19.068 06:06:33 -- paths/export.sh@2 -- $ PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:38:19.068 06:06:33 -- paths/export.sh@3 -- $ PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:38:19.068 06:06:33 -- paths/export.sh@4 -- $ PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:38:19.068 06:06:33 -- paths/export.sh@5 -- $ export PATH
00:38:19.068 06:06:33 -- paths/export.sh@6 -- $ echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:38:19.068 06:06:33 -- common/autobuild_common.sh@446 -- $ out=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output
00:38:19.068 06:06:33 -- common/autobuild_common.sh@447 -- $ date +%s
00:38:19.068 06:06:33 -- common/autobuild_common.sh@447 -- $ mktemp -dt spdk_1721966793.XXXXXX
00:38:19.068 06:06:33 -- common/autobuild_common.sh@447 -- $ SPDK_WORKSPACE=/tmp/spdk_1721966793.qYdRvh
00:38:19.068 06:06:33 -- common/autobuild_common.sh@449 -- $ [[ -n '' ]]
00:38:19.068 06:06:33 -- common/autobuild_common.sh@453 -- $ '[' -n '' ']'
00:38:19.068 06:06:33 -- common/autobuild_common.sh@456 -- $ scanbuild_exclude='--exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/'
00:38:19.068 06:06:33 -- common/autobuild_common.sh@460 -- $ scanbuild_exclude+=' --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/xnvme --exclude /tmp'
00:38:19.068 06:06:33 -- common/autobuild_common.sh@462 -- $ scanbuild='scan-build -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/scan-build-tmp --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/ --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/xnvme --exclude /tmp --status-bugs'
00:38:19.068 06:06:33 -- common/autobuild_common.sh@463 -- $ get_config_params
00:38:19.068 06:06:33 -- common/autotest_common.sh@396 -- $ xtrace_disable
00:38:19.068 06:06:33 -- common/autotest_common.sh@10 -- $ set +x
00:38:19.068 06:06:33 -- common/autobuild_common.sh@463 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --with-vbdev-compress --with-dpdk-compressdev --with-crypto --enable-ubsan --enable-coverage --with-ublk'
00:38:19.068 06:06:33 -- common/autobuild_common.sh@465 -- $ start_monitor_resources
00:38:19.068 06:06:33 -- pm/common@17 -- $ local monitor
00:38:19.068 06:06:33 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:38:19.068 06:06:33 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:38:19.068 06:06:33 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:38:19.068 06:06:33 -- pm/common@21 -- $ date +%s
00:38:19.068 06:06:33 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:38:19.068 06:06:33 -- pm/common@21 -- $ date +%s
00:38:19.068 06:06:33 -- pm/common@25 -- $ sleep 1
00:38:19.068 06:06:33 -- pm/common@21 -- $ date +%s
00:38:19.068 06:06:33 -- pm/common@21 -- $ date +%s
00:38:19.068 06:06:33 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autopackage.sh.1721966793
00:38:19.068 06:06:33 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autopackage.sh.1721966793
00:38:19.068 06:06:33 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autopackage.sh.1721966793
00:38:19.068 06:06:33 -- pm/common@21 -- $ sudo -E /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autopackage.sh.1721966793
00:38:19.068 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autopackage.sh.1721966793_collect-vmstat.pm.log
00:38:19.068 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autopackage.sh.1721966793_collect-cpu-load.pm.log
00:38:19.068 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autopackage.sh.1721966793_collect-cpu-temp.pm.log
00:38:19.068 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autopackage.sh.1721966793_collect-bmc-pm.bmc.pm.log
00:38:20.005 06:06:34 -- common/autobuild_common.sh@466 -- $ trap stop_monitor_resources EXIT
00:38:20.005 06:06:34 -- spdk/autopackage.sh@10 -- $ MAKEFLAGS=-j72
00:38:20.005 06:06:34 -- spdk/autopackage.sh@11 -- $ cd /var/jenkins/workspace/crypto-phy-autotest/spdk
00:38:20.005 06:06:34 -- spdk/autopackage.sh@13 -- $ [[ 0 -eq 1 ]]
00:38:20.005 06:06:34 -- spdk/autopackage.sh@18 -- $ [[ 0 -eq 0 ]]
00:38:20.005 06:06:34 -- spdk/autopackage.sh@19 -- $ timing_finish
00:38:20.005 06:06:34 -- common/autotest_common.sh@734 -- $ flamegraph=/usr/local/FlameGraph/flamegraph.pl
00:38:20.005 06:06:34 -- common/autotest_common.sh@735 -- $ '[' -x /usr/local/FlameGraph/flamegraph.pl ']'
00:38:20.005 06:06:34 -- common/autotest_common.sh@737 -- $ /usr/local/FlameGraph/flamegraph.pl --title 'Build Timing' --nametype Step: --countname seconds /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/timing.txt
00:38:20.005 06:06:34 -- spdk/autopackage.sh@20 -- $ exit 0
00:38:20.005 06:06:34 -- spdk/autopackage.sh@1 -- $ stop_monitor_resources
00:38:20.005 06:06:34 -- pm/common@29 -- $ signal_monitor_resources TERM
00:38:20.005 06:06:34 -- pm/common@40 -- $ local monitor pid pids signal=TERM
00:38:20.005 06:06:34 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:38:20.005 06:06:34 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-cpu-load.pid ]]
00:38:20.005 06:06:34 -- pm/common@44 -- $ pid=1352955
00:38:20.005 06:06:34 -- pm/common@50 -- $ kill -TERM 1352955
00:38:20.005 06:06:34 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:38:20.005 06:06:34 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-vmstat.pid ]]
00:38:20.005 06:06:34 -- pm/common@44 -- $ pid=1352957
00:38:20.005 06:06:34 -- pm/common@50 -- $ kill -TERM 1352957
00:38:20.005 06:06:34 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:38:20.005 06:06:34 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-cpu-temp.pid ]]
00:38:20.005 06:06:34 -- pm/common@44 -- $ pid=1352959
00:38:20.005 06:06:34 -- pm/common@50 -- $ kill -TERM 1352959
00:38:20.005 06:06:34 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:38:20.005 06:06:34 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-bmc-pm.pid ]]
00:38:20.005 06:06:34 -- pm/common@44 -- $ pid=1352999
00:38:20.005 06:06:34 -- pm/common@50 -- $ sudo -E kill -TERM 1352999
00:38:20.005 + [[ -n 971311 ]]
00:38:20.005 + sudo kill 971311
00:38:20.014 [Pipeline] }
00:38:20.031 [Pipeline] // stage
00:38:20.036 [Pipeline] }
00:38:20.052 [Pipeline] // timeout
00:38:20.057 [Pipeline] }
00:38:20.074 [Pipeline] // catchError
00:38:20.079 [Pipeline] }
00:38:20.096 [Pipeline] // wrap
00:38:20.103 [Pipeline] }
00:38:20.118 [Pipeline] // catchError
00:38:20.127 [Pipeline] stage
00:38:20.129 [Pipeline] { (Epilogue)
00:38:20.145 [Pipeline] catchError
00:38:20.147 [Pipeline] {
00:38:20.161 [Pipeline] echo
00:38:20.163 Cleanup processes
00:38:20.168 [Pipeline] sh
00:38:20.478 + sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk
00:38:20.478 1353069 /usr/bin/ipmitool sdr dump /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/sdr.cache
00:38:20.478 1353278 sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk
00:38:20.491 [Pipeline] sh
00:38:20.771 ++ sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk
00:38:20.771 ++ grep -v 'sudo pgrep'
00:38:20.771 ++ awk '{print $1}'
00:38:20.771 + sudo kill -9 1353069
00:38:20.781 [Pipeline] sh
00:38:21.060 + jbp/jenkins/jjb-config/jobs/scripts/compress_artifacts.sh
00:38:33.267 [Pipeline] sh
00:38:33.547 + jbp/jenkins/jjb-config/jobs/scripts/check_artifacts_size.sh
00:38:33.547 Artifacts sizes are good
00:38:33.558 [Pipeline] archiveArtifacts
00:38:33.562 Archiving artifacts
00:38:33.711 [Pipeline] sh
00:38:33.993 + sudo chown -R sys_sgci /var/jenkins/workspace/crypto-phy-autotest
00:38:34.007 [Pipeline] cleanWs
00:38:34.016 [WS-CLEANUP] Deleting project workspace...
00:38:34.016 [WS-CLEANUP] Deferred wipeout is used...
00:38:34.023 [WS-CLEANUP] done
00:38:34.024 [Pipeline] }
00:38:34.044 [Pipeline] // catchError
00:38:34.056 [Pipeline] sh
00:38:34.335 + logger -p user.info -t JENKINS-CI
00:38:34.344 [Pipeline] }
00:38:34.360 [Pipeline] // stage
00:38:34.366 [Pipeline] }
00:38:34.383 [Pipeline] // node
00:38:34.390 [Pipeline] End of Pipeline
00:38:34.415 Finished: SUCCESS